WorldWideScience

Sample records for lbb methodology application

  1. Advanced LBB methodology and considerations

    International Nuclear Information System (INIS)

    Olson, R.; Rahman, S.; Scott, P.

    1997-01-01

    LBB applications have existed in many industries and have more recently been applied in the nuclear industry under limited circumstances. Research over the past 10 years has advanced the technology so that more sophisticated consideration of LBB can now be given. Some of the advanced considerations for nuclear plants subjected to seismic loading evaluations are summarized in this paper.

  2. Advanced LBB methodology and considerations

    Energy Technology Data Exchange (ETDEWEB)

    Olson, R.; Rahman, S.; Scott, P. [Battelle, Columbus, OH (United States)] [and others]

    1997-04-01

    LBB applications have existed in many industries and have more recently been applied in the nuclear industry under limited circumstances. Research over the past 10 years has advanced the technology so that more sophisticated consideration of LBB can now be given. Some of the advanced considerations for nuclear plants subjected to seismic loading evaluations are summarized in this paper.

  3. The LBB methodology application results performed on the safety related piping of NPP V-1 in Jaslovske Bohunice

    Energy Technology Data Exchange (ETDEWEB)

    Kupca, L.; Beno, P. [Nuclear Power Plants Research Institute, Trnava (Slovakia)]

    1997-04-01

    A broad overview of the leak before break (LBB) application to the Slovakian V-1 nuclear power plant is presented in the paper. LBB was applied to the primary cooling circuit and surge lines of both WWER 440 type units, and also used to assess the integrity of safety related piping in the feed water and main steam systems. Experiments and calculations performed included analyses of stresses, material mechanical properties, corrosion, fatigue damage, stability of heavy component supports, water hammer, and leak rates. A list of analysis results and recommendations is included in the paper.

  4. LBB application in the US operating and advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Wichman, K.; Tsao, J.; Mayfield, M.

    1997-04-01

    The regulatory application of leak before break (LBB) for operating and advanced reactors in the U.S. is described. The U.S. Nuclear Regulatory Commission (NRC) has approved the application of LBB for six piping systems in operating reactors: reactor coolant system primary loop piping, pressurizer surge, safety injection accumulator, residual heat removal, safety injection, and reactor coolant loop bypass. The LBB concept has also been applied in the design of advanced light water reactors. LBB applications, and regulatory considerations, for pressurized water reactors and advanced light water reactors are summarized in this paper. Technology development for LBB performed by the NRC and the International Piping Integrity Research Group is also briefly summarized.

  5. Development of crack shape: LBB methodology for cracked pipes

    Energy Technology Data Exchange (ETDEWEB)

    Moulin, D.; Chapuliot, S.; Drubay, B. [Commissariat a l'Energie Atomique, Gif sur Yvette (France)]

    1997-04-01

    For structures like vessels or pipes containing a fluid, the Leak-Before-Break (LBB) assessment requires demonstrating that it is possible, during the lifetime of the component, to detect the rate of leakage due to a possible defect whose growth would result in a leak before break of the component. This LBB assessment can be an important contribution to the overall structural integrity argument for many components. The aim of this paper is to review some practices used for LBB assessment and to describe how some new R & D results have been used to provide a simplified approach to fracture mechanics analysis, and especially to the evaluation of crack shape and size during the lifetime of the component.

  6. Application of LBB to a nozzle-pipe interface

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Y.J.; Sohn, G.H.; Kim, Y.J. [and others]

    1997-04-01

    Typical LBB (Leak-Before-Break) analysis is performed for the highest stress location for each different type of material in the high-energy pipeline. In most cases, the highest stress occurs at the nozzle-pipe interface at the terminal end. The standard finite element approach for calculating J-integral values at the crack tip utilizes symmetry conditions, both near the nozzle and away from the nozzle region, to minimize the model size and simplify the calculation. A factor of two is typically applied to the J-integral value to account for the symmetric conditions. This simplified analysis can lead to conservative results, especially for small diameter pipes where the asymmetry of the nozzle-pipe interface is ignored. The stiffness of the residual piping system and the non-symmetries of geometry, along with the different materials of the nozzle, safe end and pipe, are usually omitted in current LBB methodology. In this paper, the effects of non-symmetries due to geometry and material at the pipe-nozzle interface are presented. Various LBB analyses are performed for a small diameter piping system to evaluate the effect a nozzle has on the J-integral calculation, crack opening area and crack stability. In addition, material differences between the nozzle and pipe are evaluated. A comparison is made between a pipe model and a nozzle-pipe interface model, and an LBB PED (Piping Evaluation Diagram) curve is developed to summarize the results for use by piping designers.
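
    For reference, the J-integral referred to above is Rice's path-independent contour integral evaluated around the crack tip; in a half-symmetry finite element model only half of the crack-tip region is represented, which is why the computed value is doubled as the abstract notes:

    $$ J = \int_{\Gamma} \left( W\,\mathrm{d}y - T_i\,\frac{\partial u_i}{\partial x}\,\mathrm{d}s \right), $$

    where $W$ is the strain energy density, $T_i = \sigma_{ij} n_j$ is the traction on the contour $\Gamma$ and $u_i$ is the displacement field.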

  7. Special problems: LBB, thermal effects

    International Nuclear Information System (INIS)

    Lin Chiwen

    2001-01-01

    This section presents a discussion of special problems in reactor coolant system design, including LBB and thermal effects. First, the categories of fracture mechanics technology applicable to LBB are discussed: linear-elastic fracture mechanics (LEFM) and elastic-plastic fracture mechanics (EPFM). Next, the basic concepts of LEFM are discussed, followed by a discussion of EPFM, with more specific treatment of the methodology currently acceptable to the NRC and an emphasis on the J-integral approach. This is followed by a discussion of the NRC position and of the recommendations and basic requirements laid out by the NRC. A specific example of LBB application to WPWR piping is used to identify the key steps to be followed in order to satisfy the recommendations and requirements of the NRC, and an application of LBB to the WPWR reactor coolant loop piping is provided as further illustration of the methodology. The section also addresses thermal effects not covered earlier, in particular those that have caused concern about potential reactor degradation, such as pressurized thermal shock. The section is organized into the following subsections: linear-elastic fracture mechanics (LEFM); elastic-plastic fracture mechanics (EPFM); J concepts; NRC recommendations and requirements on the application of LBB; two specific applications of LBB to WPWR piping; PWR internals degradation; thermal fatigue considerations; and a case study of pressurized thermal shock.
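
    As a brief reminder of the two categories named above (with $Y$ a geometry factor and $\sigma_f$ the flow stress used as generic symbols): in LEFM the crack driving force is the stress intensity factor, while in the EPFM (J-integral) approach crack initiation and stability are judged from $J$ and its growth rate,

    $$ K_I = Y\sigma\sqrt{\pi a} < K_{Ic} \;\;\text{(LEFM)}, \qquad J_{\mathrm{app}} < J_{Ic}, \quad T_{\mathrm{app}} = \frac{E}{\sigma_f^{2}}\,\frac{\mathrm{d}J_{\mathrm{app}}}{\mathrm{d}a} < T_{\mathrm{mat}} \;\;\text{(EPFM, tearing-modulus stability)}. $$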

  8. Development of piping evaluation diagram for LBB application to KNGR surge line

    International Nuclear Information System (INIS)

    Yoon, K. S.; Park, W. B.; Kim, J. M.; Choi, T. S.; Yang, J. S.; Park, C. Y.

    1998-01-01

    Plant-specific data, such as pipe geometry, material properties and pipe loads, are required in order to evaluate Leak-Before-Break (LBB) applicability to piping systems in a nuclear power plant under construction. However, the existing LBB evaluation method for KSNPs cannot be used for newly developed plants such as the Korean Next Generation Reactor (KNGR), for which material properties are not yet available even though an LBB evaluation is required during the design process. To solve this problem, an LBB Piping Evaluation Diagram (PED) for the KNGR surge line, which is independent of piping geometry and is expressed as a function of the loads applied to the piping system, is developed in this paper. In addition, so that LBB applicability can be evaluated during construction with only a comparative evaluation of the actual and expected material properties, the expected changes of material properties are accounted for in the PED. The PED can therefore be used for quick LBB evaluation of the KNGR surge line during both design and construction. The benefits of using the PED are: 1) LBB applicability can be confirmed very quickly, without calculating any leakage crack length, for all piping locations of concern during both iterative design for optimal routing and construction, and 2) the computing time required for the corresponding LBB analyses is significantly reduced.
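
    As an illustration of how such a diagram is used (this is not the KNGR PED itself; the allowable curve below is purely hypothetical), a candidate location is screened by checking that its applied loads fall on the acceptable side of the PED curve:

```python
# Minimal sketch of a Piping Evaluation Diagram (PED) screen.
# The allowable curve is purely illustrative; a real PED is derived from
# plant-specific leakage-crack and crack-stability analyses.

def allowable_sse_moment(normal_moment_knm: float) -> float:
    """Hypothetical PED curve: allowable SSE moment vs. normal-operation moment."""
    return max(0.0, 800.0 - 1.5 * normal_moment_knm)

def lbb_acceptable(normal_moment_knm: float, sse_moment_knm: float) -> bool:
    """A piping location passes the screen if its SSE moment lies under the curve."""
    return sse_moment_knm <= allowable_sse_moment(normal_moment_knm)

if __name__ == "__main__":
    print(lbb_acceptable(normal_moment_knm=200.0, sse_moment_knm=400.0))  # True
    print(lbb_acceptable(normal_moment_knm=400.0, sse_moment_knm=400.0))  # False
```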

  9. LBB application in Swedish BWR design

    Energy Technology Data Exchange (ETDEWEB)

    Kornfeldt, H.; Bjoerk, K.O.; Ekstroem, P. [ABB Atom, Vaesteras (Sweden)]

    1997-04-01

    The protection against dynamic effects in connection with potential pipe breaks has been implemented in different ways in the development of BWR reactor designs. First-generation plant designs reflect code requirements in effect at that time, which means that no piping restraint systems were designed and built into those plants. Modern designs have, in contrast, implemented full protection against damage in connection with postulated pipe breaks, as required in current codes and regulations. Modern standards and current regulatory demands can be met for the older plants by backfitting pipe whip restraint hardware. This could lead to several practical difficulties, as these installations were not anticipated in the original plant design and layout. Meeting the new demands by analysis would in this situation have great advantages. Application of leak-before-break criteria provides an alternative means of meeting modern standards in reactor safety design. The analysis takes into account data specific to BWR primary system operation, actual pipe material properties, piping loads and leak detection capability. Special attention must be given to ensure that the data used reflect actual plant conditions.

  10. LBB application in Swedish BWR design

    International Nuclear Information System (INIS)

    Kornfeldt, H.; Bjoerk, K.O.; Ekstroem, P.

    1997-01-01

    The protection against dynamic effects in connection with potential pipe breaks has been implemented in different ways in the development of BWR reactor designs. First-generation plant designs reflect code requirements in effect at that time, which means that no piping restraint systems were designed and built into those plants. Modern designs have, in contrast, implemented full protection against damage in connection with postulated pipe breaks, as required in current codes and regulations. Modern standards and current regulatory demands can be met for the older plants by backfitting pipe whip restraint hardware. This could lead to several practical difficulties, as these installations were not anticipated in the original plant design and layout. Meeting the new demands by analysis would in this situation have great advantages. Application of leak-before-break criteria provides an alternative means of meeting modern standards in reactor safety design. The analysis takes into account data specific to BWR primary system operation, actual pipe material properties, piping loads and leak detection capability. Special attention must be given to ensure that the data used reflect actual plant conditions.

  11. Application of the LBB concept to nuclear power plants with WWER 440 and WWER 1000 reactors

    Energy Technology Data Exchange (ETDEWEB)

    Zdarek, J.; Pecinka, L. [Nuclear Research Institute Rez (Czech Republic)]

    1997-04-01

    Leak-before-break (LBB) analysis of WWER type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.

  12. Application of the LBB concept to nuclear power plants with WWER 440 and WWER 1000 reactors

    International Nuclear Information System (INIS)

    Zdarek, J.; Pecinka, L.

    1997-01-01

    Leak-before-break (LBB) analysis of WWER type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.

  13. Development of modified piping evaluation diagram for LBB application to Korean next generation reactor

    International Nuclear Information System (INIS)

    Huh, Nam Su; Kim, Young Jin; Pyo, Chang ryul; Yu, Young Jun; Yang, Jun Seog

    1999-01-01

    Recently, the Piping Evaluation Diagram (PED) has been accepted in the nuclear industry for simple application of the Leak-Before-Break (LBB) concept to piping systems. By utilizing the PED, the LBB concept is applied before the piping layout is finalized. However, the developed PED may have to be modified to account for differences between the material properties at the PED development stage and those at the assembly stage. The objective of this paper is to develop a modified PED which can account for the variation of material properties. For this purpose, a parametric study was performed to investigate the effect of the stress-strain curve on the detectable crack length and the effect of the fracture resistance curve on the LBB allowable load. Finite element analyses were also performed to investigate the effect of the stress-strain curve on the LBB allowable load. Finally, a modified PED is developed as a function of crack length (DLC and 2xDLC) and the allowable Safe Shutdown Earthquake (SSE) load. By adopting the modified PED, the variation of material properties can be considered in the LBB analysis and the computer runs required for the LBB analysis can be considerably reduced.

  14. The use of LBB concept in French fast reactors: Application to SPX plant

    International Nuclear Information System (INIS)

    Turbat, A.; Deschanels, H.; Sperandio, M.

    1997-01-01

    The leak before break (LBB) concept was not used at the design level for SUPERPHENIX (SPX), but different studies have been performed or are in progress on different components: the main vessel (MV) and pipings. These studies were undertaken to improve the defense in depth, an approach used in all French reactors. In a first study, the LBB approach was applied to the MV of the SPX plant to verify the absence of risk as regards the core supporting function and to help in the definition of the in-service inspection (ISI) program. Defining a reference semi-elliptical defect located in the welds of the structure, it is verified that the crack growth is limited and that the end-of-life defect is smaller than the critical one. It is then shown that the hoop welds (those which are the most important for safety) located between the roof and the triple point satisfy the leak-before-break criteria. However, generally speaking, the low level of primary membrane stresses, which is favorable for the integrity of the vessel, makes the application of the leak-before-break concept more difficult because of small crack opening areas. Finally, the extension of the methodology to the secondary pipings of SPX, incorporating recent European work of the DCRC, is briefly presented.

  15. The use of LBB concept in French fast reactors: Application to SPX plant

    Energy Technology Data Exchange (ETDEWEB)

    Turbat, A.; Deschanels, H.; Sperandio, M. [and others]

    1997-04-01

    The leak before break (LBB) concept was not used at the design level for SUPERPHENIX (SPX), but different studies have been performed or are in progress on different components: the main vessel (MV) and pipings. These studies were undertaken to improve the defense in depth, an approach used in all French reactors. In a first study, the LBB approach was applied to the MV of the SPX plant to verify the absence of risk as regards the core supporting function and to help in the definition of the in-service inspection (ISI) program. Defining a reference semi-elliptical defect located in the welds of the structure, it is verified that the crack growth is limited and that the end-of-life defect is smaller than the critical one. It is then shown that the hoop welds (those which are the most important for safety) located between the roof and the triple point satisfy the leak-before-break criteria. However, generally speaking, the low level of primary membrane stresses, which is favorable for the integrity of the vessel, makes the application of the leak-before-break concept more difficult because of small crack opening areas. Finally, the extension of the methodology to the secondary pipings of SPX, incorporating recent European work of the DCRC, is briefly presented.
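
    The remark about small crack opening areas can be made quantitative with a classical elasticity estimate (plane stress, through-wall crack of length $2a$ in a plate under remote membrane stress $\sigma$; the curvature and plasticity corrections used in actual pipe assessments are neglected here):

    $$ A \approx \frac{2\pi\,\sigma\,a^{2}}{E}, $$

    so a low membrane stress $\sigma$ directly gives a small opening area $A$ and hence a small, hard-to-detect leak rate.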

  16. Application of the LBB regulatory approach to the steamlines of advanced WWER 1000 reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kiselyov, V.A.; Sokov, L.M.

    1997-04-01

    The LBB regulatory approach adopted in Russia in 1993 as an extra safety barrier is described for an advanced WWER 1000 reactor steamline. The application of the LBB concept requires the following additional protections. First, the steamline should be a highly qualified piping, built in accordance with the applicable regulations and guidelines and carefully screened to verify that it is not subject to any disqualifying failure mechanism. Second, a deterministic fracture mechanics analysis and a leak rate evaluation have been performed to demonstrate that a postulated through-wall crack yielding 95 l/min at normal operating conditions is stable even under seismic loads. Finally, it has been verified that the leak detection systems are sufficiently reliable, diverse and sensitive, and that adequate margins exist to detect a through-wall crack smaller than the critical size. The results obtained are encouraging and show the possibility of applying the LBB case to the steamline of the advanced WWER 1000 reactor.

  17. Application of the LBB regulatory approach to the steamlines of advanced WWER 1000 reactor

    International Nuclear Information System (INIS)

    Kiselyov, V.A.; Sokov, L.M.

    1997-01-01

    The LBB regulatory approach adopted in Russia in 1993 as an extra safety barrier is described for an advanced WWER 1000 reactor steamline. The application of the LBB concept requires the following additional protections. First, the steamline should be a highly qualified piping, built in accordance with the applicable regulations and guidelines and carefully screened to verify that it is not subject to any disqualifying failure mechanism. Second, a deterministic fracture mechanics analysis and a leak rate evaluation have been performed to demonstrate that a postulated through-wall crack yielding 95 l/min at normal operating conditions is stable even under seismic loads. Finally, it has been verified that the leak detection systems are sufficiently reliable, diverse and sensitive, and that adequate margins exist to detect a through-wall crack smaller than the critical size. The results obtained are encouraging and show the possibility of applying the LBB case to the steamline of the advanced WWER 1000 reactor.

  18. Application of LBB to high energy piping systems in operating PWR

    Energy Technology Data Exchange (ETDEWEB)

    Swamy, S.A.; Bhowmick, D.C. [Westinghouse Nuclear Technology Division, Pittsburgh, PA (United States)]

    1997-04-01

    The amendment to General Design Criterion 4 allows exclusion, from the design basis, of dynamic effects associated with high-energy pipe rupture by application of leak-before-break (LBB) technology. This new approach has resulted in substantial financial savings to utilities when applied to Pressurized Water Reactor (PWR) primary loop piping and auxiliary piping systems made of stainless steel. To date, the majority of applications pertain to piping systems in operating plants. The various steps of evaluation associated with the LBB application to an operating plant are described in this paper.
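
    For orientation, the deterministic margins commonly cited for such evaluations under SRP 3.6.3 can be written, with $\dot m$ the leak rate, $a$ the crack length and $P$ the normal plus SSE loads (the exact factors and load combinations depend on the approved methodology), as

    $$ \dot m\!\left(a_{\mathrm{leak}},\,P\right) \ge 10\,\dot m_{\mathrm{detect}}, \qquad a_{\mathrm{crit}}(P) \ge 2\,a_{\mathrm{leak}}, \qquad a_{\mathrm{crit}}\!\left(\sqrt{2}\,P\right) \ge a_{\mathrm{leak}}. $$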

  19. Defect occurrence, detection, location and characterization; essential variables of the LBB concept application to primary piping

    Energy Technology Data Exchange (ETDEWEB)

    Crutzen, S.; Koble, T.D.; Lemaitre, P. [and others]

    1997-04-01

    Applications of the Leak Before Break (LBB) concept rely on knowledge of flaw presence and characteristics. In-service inspection is given the responsibility of detecting flaws of a given importance, locating them precisely, and classifying them into broad families. Application of LBB concepts often implies knowledge of flaw characteristics such as through-wall depth; length at the inner diameter (ID) or outer diameter (OD) surface; orientation, or tilt and skew angles; branching; surface roughness; opening or width; and crack tip aspect. Besides detection and characterization, LBB evaluations consider it important whether a crack lies in the weld material, in the base material or in the heat affected zone. Cracks in tee junctions, in homogeneous simple welds and in elbows are not treated in the same way. Essential variables of a flaw or defect are illustrated, and examples of flaws found in primary piping as reported by plant operators or service vendors are given. Given the importance of such flaw variables in applications of LBB concepts, it is then essential to know the performance achievable by NDE techniques, during an ISI, in detecting such flaws, locating them and correctly evaluating their characteristics.
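
    A minimal data-structure sketch (the field names are illustrative, not taken from the paper) shows how the essential variables listed above might be recorded for each flaw reported from an ISI:

```python
from dataclasses import dataclass
from enum import Enum

class FlawLocation(Enum):
    """Metallurgical zone containing the flaw; LBB evaluations treat these differently."""
    BASE_METAL = "base metal"
    WELD_METAL = "weld metal"
    HAZ = "heat affected zone"

@dataclass
class FlawRecord:
    """Essential flaw variables, following the list given in the abstract."""
    through_wall_depth_mm: float   # through-wall depth
    length_id_mm: float            # length at the inner-diameter (ID) surface
    length_od_mm: float            # length at the outer-diameter (OD) surface
    tilt_deg: float                # orientation: tilt angle
    skew_deg: float                # orientation: skew angle
    branched: bool                 # branching
    surface_roughness_um: float    # surface roughness (drives leak-rate friction)
    opening_mm: float              # opening / width
    crack_tip_aspect: str          # crack tip aspect (e.g. "sharp", "blunted")
    location: FlawLocation         # weld metal, base metal or heat affected zone
    component: str                 # e.g. "tee junction", "elbow", "straight pipe weld"
```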

  20. LBB considerations for a new plant design

    Energy Technology Data Exchange (ETDEWEB)

    Swamy, S.A.; Mandava, P.R.; Bhowmick, D.C.; Prager, D.E. [Westinghouse Electric Corp., Pittsburgh, PA (United States)]

    1997-04-01

    The leak-before-break (LBB) methodology is accepted as a technically justifiable approach for eliminating postulation of Double-Ended Guillotine Breaks (DEGB) in high energy piping systems. This is the result of extensive research, development, and rigorous evaluations by the NRC and the commercial nuclear power industry since the early 1970s. The DEGB postulation is responsible for the many hundreds of pipe whip restraints and jet shields found in commercial nuclear plants. These restraints and jet shields not only cost many millions of dollars, but also cause plant congestion leading to reduced reliability in inservice inspection and increased man-rem exposure. While use of leak-before-break technology saved hundreds of millions of dollars in backfit costs at many operating Westinghouse plants, the value-impacts resulting from the application of this technology to future plants are greater on a per-plant basis. These benefits will be highlighted in this paper. The LBB technology has been applied extensively to high energy piping systems in operating plants. However, there are differences between the application of LBB technology to an operating plant and to a new plant design. In this paper an approach is proposed which is suitable for application of LBB to a new plant design such as the Westinghouse AP600. The approach is based on generating Bounding Analysis Curves (BAC) for the candidate piping systems. The general methodology and criteria used for developing the BACs are based on modified GDC-4 and Standard Review Plan (SRP) 3.6.3. The BAC allows advance evaluation of the piping system from the LBB standpoint, thereby assuring LBB conformance for the piping system. The piping designer can use the results of the BACs to determine the acceptability of design loads and make modifications (in terms of piping layout and support configurations) as necessary at the design stage to assure LBB for the piping systems under consideration.

  1. Applicability of LBB concept to tokamak-type fusion machine

    International Nuclear Information System (INIS)

    Nakahira, Masataka

    2003-12-01

    A tokamak-type fusion machine has been characterized as having inherent plasma shutdown safety. An extremely small leakage of impurities such as primary cooling water, i.e., less than 0.1 g/s, will cause a plasma disruption. This plasma disruption will induce electromagnetic forces (EM forces) acting on the Vacuum Vessel (VV) and plasma-facing components. The VV forms the physical barrier that encloses tritium and activated dust. If it can be shown that a through crack caused by EM forces will not lead to unstable fracture of the VV, the structural safety will be assured and the inherent safety will be demonstrated. This paper analytically substantiates the Leak-Before-Break (LBB) concept as applied to the VV, based on experimental leak rate data for a through crack having a very small opening. Based on the analysis, the critical crack length needed to terminate the plasma is evaluated as about 2 mm. On the other hand, the critical crack length for unstable fracture is obtained as about 400 mm. It is therefore concluded that the EM forces induced by a leak small enough to terminate the plasma will not cause unstable fracture of the VV, and the inherent safety is thus demonstrated. (author)

  2. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    International Nuclear Information System (INIS)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-01-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure and, hence, the parameter's importance in probabilistic leak-before-break evaluations were determined.

  3. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure and, hence, the parameter's importance in probabilistic leak-before-break evaluations were determined.
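
    The general structure of such a probabilistic evaluation can be illustrated with a deliberately simplified Monte Carlo sketch. This is not the Battelle/NRC methodology or its inputs: the failure criterion is a thin-wall net-section-collapse screen and every distribution and number below is hypothetical.

```python
import math
import random

def limit_moment(sigma_f, radius, thickness, theta, sigma_m):
    """Thin-wall net-section-collapse limit moment for a circumferential
    through-wall crack of half-angle theta (rad) under membrane stress sigma_m."""
    beta = 0.5 * (math.pi - theta - math.pi * sigma_m / sigma_f)
    if beta <= 0.0:
        return 0.0
    return max(0.0, 2.0 * sigma_f * radius**2 * thickness
               * (2.0 * math.sin(beta) - math.sin(theta)))

def conditional_failure_probability(n_samples=100_000, seed=1):
    """Fraction of sampled property/crack/load combinations whose applied
    moment exceeds the limit moment, given that a crack is present."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        sigma_f = rng.gauss(300e6, 20e6)           # flow stress [Pa], hypothetical
        theta = rng.uniform(0.3, 1.5)              # crack half-angle [rad], hypothetical
        applied_moment = rng.gauss(1.5e6, 0.5e6)   # NOP + SSE moment [N*m], hypothetical
        if applied_moment > limit_moment(sigma_f, radius=0.35, thickness=0.03,
                                         theta=theta, sigma_m=50e6):
            failures += 1
    return failures / n_samples

if __name__ == "__main__":
    print(f"Estimated conditional failure probability: {conditional_failure_probability():.3f}")
```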

  4. Overview of LBB implementation for the EPR

    International Nuclear Information System (INIS)

    Cauquelin, C.

    1997-01-01

    This paper presents an overview of the use of leak-before-break (LBB) analysis for EPR reactors. EPR is an evolutionary Nuclear Island of the 4-loop x 1500 MWe class, currently in the design phase. Application of LBB to the main coolant lines and the resulting design impacts are summarized. Background information on LBB analysis in France and Germany is also presented.

  5. Overview of LBB implementation for the EPR

    Energy Technology Data Exchange (ETDEWEB)

    Cauquelin, C.

    1997-04-01

    This paper presents an overview of the use of leak-before-break (LBB) analysis for EPR reactors. EPR is an evolutionary Nuclear Island of the 4-loop x 1500 MWe class, currently in the design phase. Application of LBB to the main coolant lines and the resulting design impacts are summarized. Background information on LBB analysis in France and Germany is also presented.

  6. Experiences with leak rate calculations methods for LBB application

    International Nuclear Information System (INIS)

    Grebner, H.; Kastner, W.; Hoefler, A.; Maussner, G.

    1997-01-01

    In this paper, three leak rate computer programs for the application of leak before break analysis are described and compared. The programs are compared to each other and to results of an HDR Reactor experiment and two real crack cases. The programs analyzed are PIPELEAK, FLORA, and PICEP. Generally, the different leak rate models are in agreement. To obtain reasonable agreement between measured and calculated leak rates, it was necessary to also use data from detailed crack investigations

  7. Experiences with leak rate calculations methods for LBB application

    Energy Technology Data Exchange (ETDEWEB)

    Grebner, H.; Kastner, W.; Hoefler, A.; Maussner, G. [and others]

    1997-04-01

    In this paper, three leak rate computer programs for the application of leak before break analysis are described and compared. The programs are compared to each other and to results of an HDR Reactor experiment and two real crack cases. The programs analyzed are PIPELEAK, FLORA, and PICEP. Generally, the different leak rate models are in agreement. To obtain reasonable agreement between measured and calculated leak rates, it was necessary to also use data from detailed crack investigations.
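
    The codes named above solve two-phase critical-flow models with crack-morphology friction losses. As a purely illustrative contrast, the order of magnitude of a leak rate can be sketched with a single-phase orifice estimate (all values below are hypothetical):

```python
import math

def leak_rate_kg_s(crack_area_m2, delta_p_pa, rho_kg_m3, discharge_coeff=0.6):
    """Single-phase orifice estimate of the mass leak rate through a crack opening.
    Real LBB leak-rate codes use two-phase critical-flow models with crack-morphology
    friction losses; this function is only an order-of-magnitude sketch."""
    velocity = math.sqrt(2.0 * delta_p_pa / rho_kg_m3)
    return discharge_coeff * crack_area_m2 * rho_kg_m3 * velocity

if __name__ == "__main__":
    # Hypothetical: 20 mm^2 opening, 15 MPa differential, subcooled water ~740 kg/m^3.
    m_dot = leak_rate_kg_s(20.0e-6, 15.0e6, 740.0)
    print(f"Leak rate ~ {m_dot:.2f} kg/s (~{m_dot * 60.0 / 0.74:.0f} l/min)")
```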

  8. Application of a finite element method to leak before break (LBB) of a heat exchanger

    International Nuclear Information System (INIS)

    Lee, Choon-Yeol; Kwon, Jae-Do; Lee, Yong-Sun

    2003-01-01

    The leak before break (LBB) concept is difficult to apply to a structure with a thin tube that is immersed in a water environment. A heat exchanger in a nuclear power plant is such a structure. The present paper addresses an application of the LBB concept to a heat exchanger in a nuclear power plant. The minimum amount of leaked coolant containing radioactive material that can activate the radiation detector installed near the heat exchanger is assumed. It is then justified that the postulated initial flaw cannot grow to the critical flaw size within the time period required to activate the radiation detector. In this case, the radiation detector can raise the warning signal caused by coolant leakage from the initially postulated flaws of the heat exchanger, and the nuclear plant can be safely shut down when this occurs. Since the postulated initial flaw cannot grow to the critical flaw size, the structural integrity of the heat exchanger is not compromised. The scenario presented in this paper concerns an actual nuclear plant. (author)
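
    The argument above rests on showing that the postulated initial flaw cannot reach the critical size before the leak is detected. A minimal sketch of such a check, integrating a Paris-type fatigue crack growth law with illustrative constants (not the heat-exchanger data of the paper), is:

```python
import math

def cycles_to_critical(a0_m, a_crit_m, delta_sigma_mpa, C=1.0e-11, m=3.0, Y=1.12,
                       max_cycles=10_000_000):
    """Numerically integrate the Paris law da/dN = C * (delta_K)**m until the crack
    reaches a_crit_m.  delta_K = Y * delta_sigma * sqrt(pi * a) in MPa*sqrt(m) and
    growth is in m/cycle.  C, m and Y are illustrative values, not plant data."""
    a = a0_m
    for n in range(1, max_cycles + 1):
        delta_k = Y * delta_sigma_mpa * math.sqrt(math.pi * a)
        a += C * delta_k ** m
        if a >= a_crit_m:
            return n
    return None  # the crack stays below the critical size within max_cycles

if __name__ == "__main__":
    # Hypothetical flaw: 2 mm initial depth, 10 mm critical depth, 80 MPa stress range.
    n = cycles_to_critical(a0_m=0.002, a_crit_m=0.010, delta_sigma_mpa=80.0)
    print(f"Cycles for the postulated flaw to reach critical size: {n}")
```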

  9. LBB evaluation for a typical Japanese PWR primary loop by using the US NRC approved methods

    Energy Technology Data Exchange (ETDEWEB)

    Swamy, S.A.; Bhowmick, D.C.; Prager, D.E. [Westinghouse Nuclear Technology Division, Pittsburgh, PA (United States)]

    1997-04-01

    The regulatory requirements for postulated pipe ruptures have changed significantly since the first nuclear plants were designed. The Leak-Before-Break (LBB) methodology is now accepted as a technically justifiable approach for eliminating postulation of double-ended guillotine breaks (DEGB) in high energy piping systems. The previous pipe rupture design requirements for nuclear power plant applications are responsible for all the numerous and massive pipe whip restraints and jet shields installed for each plant. This results in significant plant congestion, increased labor costs and radiation dosage for normal maintenance and inspection. Also the restraints increase the probability of interference between the piping and supporting structures during plant heatup, thereby potentially impacting overall plant reliability. The LBB approach to eliminate postulating ruptures in high energy piping systems is a significant improvement to former regulatory methodologies, and therefore, the LBB approach to design is gaining worldwide acceptance. However, the methods and criteria for LBB evaluation depend upon the policy of individual country and significant effort continues towards accomplishing uniformity on a global basis. In this paper the historical development of the U.S. LBB criteria will be traced and the results of an LBB evaluation for a typical Japanese PWR primary loop applying U.S. NRC approved methods will be presented. In addition, another approach using the Japanese LBB criteria will be shown and compared with the U.S. criteria. The comparison will be highlighted in this paper with detailed discussion.

  10. Development of LBB Piping Evaluation Diagram for APR 1000 Main Steam Line Piping

    International Nuclear Information System (INIS)

    Yang, J. S.; Jeong, I. L.; Park, C. Y.; Bai, S. Y.

    2010-01-01

    This paper presents the piping evaluation diagram (PED) used to assess the applicability of Leak-Before-Break (LBB) to APR 1000 main steam line piping. The LBB-PED of the APR 1000 main steam line piping is independent of the piping geometry and is expressed as a function of the loads applied to the piping system. In addition, so that LBB applicability can be evaluated during construction with only a comparative evaluation of the actual and expected material properties, the expected changes of material properties are considered in the LBB-PED. The LBB-PED can therefore be used for quick LBB evaluation of the APR 1000 main steam line piping during both design and construction.

  11. Approach of Czech regulatory body to LBB

    Energy Technology Data Exchange (ETDEWEB)

    Tendera, P.

    1997-04-01

    At present there are two NPPs equipped with PWR units in the Czech Republic. The Dukovany NPP has been in operation for about ten years (four 440 MW units - WWER model 213) and the Temelin NPP is under construction (two 1000 MW units - WWER model 320). Both NPPs were built to Soviet design and according to Soviet regulations and standards, but most of the equipment for the primary circuits was supplied by domestic manufacturers. The objective of the Czech LBB program is to prove the LBB status of the primary piping systems of these NPPs, and the LBB concept is part of a strategy to meet western-style safety standards. The reason for the Czech LBB project is also a lack of some standard safety facilities. For both the Dukovany and Temelin NPPs a full LBB analysis should be carried out. The application of LBB to the piping systems should also be a cost-effective means of avoiding installation of pipe whip restraints and jet shields. The Czech regulatory body issued a non-mandatory requirement, "Leak Before Break", which is in compliance with national legal documents and which is based on the US NRC regulatory procedures and US standards (ASME Code, ANSI). The requirement has been published in the document "Safety of Nuclear Facilities" No. 1/1991 as "Requirements on the Content and Format of Safety Reports and their Supplements" and consists of two parts: (1) a procedure for obtaining proof of evidence of "Leak Before Break", and (2) leak detection systems for the pressurized reactor primary circuit. At present some changes concerning both parts of the above document will be introduced. The reasons for these modifications will be presented.

  12. Approach for Czech regulatory body to LBB

    Energy Technology Data Exchange (ETDEWEB)

    Tendera, P. [State Office for Nuclear Safety (SONS), Prague (Czech Republic)]

    1997-04-01

    At present there are two NPPs equipped with PWR units in the Czech Republic. The Dukovany NPP has been in operation for about ten years (four 440 MW units - WWER model 213) and the Temelin NPP is under construction (two 1000 MW units - WWER model 320). Both NPPs were built to Soviet design and according to Soviet regulations and standards, but most of the equipment for the primary circuits was supplied by domestic manufacturers. The objective of the Czech LBB programme is to prove the LBB status of the primary piping systems of these NPPs, and the LBB concept is part of a strategy to meet western-style safety standards. The reason for the Czech LBB project is also a lack of some standard safety facilities. For both the Dukovany and Temelin NPPs a full LBB analysis should be carried out. The application of LBB to the piping systems should also be a cost-effective means of avoiding installation of pipe whip restraints and jet shields. The Czech regulatory body issued a non-mandatory requirement, "Leak Before Break", which is in compliance with national legal documents and which is based on the US NRC regulatory procedures and US standards (ASME Code, ANSI). The requirement has been published in the document "Safety of Nuclear Facilities" No. 1/1991 as "Requirements on the Content and Format of Safety Reports and their Supplements" and consists of two parts: (1) a procedure for obtaining proof of evidence of "Leak Before Break", and (2) leak detection systems for the pressurized reactor primary circuit. At present some changes concerning both parts of the above document will be introduced. The reasons for these modifications will be presented.

  13. The nature thickness pipe element testing method to validate the application of LBB conception

    Energy Technology Data Exchange (ETDEWEB)

    Vasilchenko, G.S.; Artemyev, V.I.; Merinov, G.N. [and others]

    1997-04-01

    To validate the application of leak before break analysis to the VVER-1000 reactor, a procedure for testing a large-scale specimen on electrohydraulic machinery was developed. A steel pipe with a circular weld and stainless cladding on the inside was manufactured, and large-scale longitudinal cross-sections were cut from it. The remaining parts of the weld after cutting were used to determine the standard tensile mechanical properties and the critical brittleness temperature, and to manufacture compact specimens. The experimental mechanical properties of the weld are summarized.

  14. The nature thickness pipe element testing method to validate the application of LBB conception

    International Nuclear Information System (INIS)

    Vasilchenko, G.S.; Artemyev, V.I.; Merinov, G.N.

    1997-01-01

    To validate the application of leak before break analysis to the VVER-1000 reactor, a procedure for testing a large-scale specimen on electrohydraulic machinery was developed. A steel pipe with a circular weld and stainless cladding on the inside was manufactured, and large-scale longitudinal cross-sections were cut from it. The remaining parts of the weld after cutting were used to determine the standard tensile mechanical properties and the critical brittleness temperature, and to manufacture compact specimens. The experimental mechanical properties of the weld are summarized.

  15. A computing system for LBB considerations

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, K.; Miettinen, J.; Raiko, H.; Keskinen, R.

    1997-04-01

    A computing system has been developed at VTT Energy for making efficient leak-before-break (LBB) evaluations of piping components. The system consists of fracture mechanics and leak rate analysis modules which are linked via an interactive user interface, LBBCAL. The system enables quick tentative analysis of standard geometric and loading situations by means of fracture mechanics estimation schemes such as the R6, FAD, EPRI J, Battelle, plastic limit load and moments methods. Complex situations are handled with a separate in-house finite-element code, EPFM3D, which uses 20-noded isoparametric solid elements, automatic mesh generators and advanced color graphics. Analytical formulas and numerical procedures are available for leak area evaluation. A novel contribution to leak rate analysis is the CRAFLO code, which is based on a nonequilibrium two-phase flow model with phase slip. Its predictions are essentially comparable with those of the well-known SQUIRT2 code; additionally, it provides outputs for the temperature, pressure and velocity distributions in the crack depth direction. An illustrative application to a circumferentially cracked elbow indicates, as expected, that a small margin relative to the saturation temperature of the coolant reduces the leak rate and is likely to influence the LBB implementation for intermediate diameter (300 mm) primary circuit piping of BWR plants.
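
    Among the estimation schemes listed, the R6/FAD screen reduces to checking an assessment point against a failure assessment curve. A minimal sketch using the commonly quoted R6 (Rev. 3) Option 1 curve follows; the cut-off on Lr is material dependent and 1.2 is used here only as an illustrative value:

```python
import math

def r6_option1_kr(lr: float) -> float:
    """Commonly quoted R6 (Rev. 3) Option 1 failure assessment curve."""
    return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def point_is_acceptable(kr: float, lr: float, lr_max: float = 1.2) -> bool:
    """An assessment point (Lr, Kr) is acceptable if it lies inside the FAD.
    Kr = K_I / K_mat and Lr = applied load / plastic limit load; the cut-off
    lr_max is material dependent (1.2 is only an illustrative value)."""
    return lr <= lr_max and kr <= r6_option1_kr(lr)

if __name__ == "__main__":
    print(point_is_acceptable(kr=0.5, lr=0.8))  # inside the diagram -> True
    print(point_is_acceptable(kr=0.9, lr=1.0))  # outside the diagram -> False
```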

  16. LBB in Candu plants

    Energy Technology Data Exchange (ETDEWEB)

    Kozluk, M.J.; Vijay, D.K. [Ontario Hydro Nuclear, Toronto, Ontario (Canada)]

    1997-04-01

    Postulated catastrophic rupture of high-energy piping systems is the fundamental criterion used for the safety design basis of both light and heavy water nuclear generating stations. Historically, the criterion has been applied by assuming a nonmechanistic instantaneous double-ended guillotine rupture of the largest diameter pipes inside of containment. "Nonmechanistic" means that the assumption of an instantaneous guillotine rupture is not based on the stresses in the pipe, the failure mechanisms, the toughness of the piping material, or the dynamics of the ruptured pipe ends as they separate. This postulated instantaneous double-ended guillotine rupture of a pipe was a convenient simplifying assumption that resulted in a conservative accident scenario. This conservative accident scenario has now become entrenched as the design basis accident for: containment design, shutdown system design, emergency fuel cooling systems design, and to establish environmental qualification temperature and pressure conditions. The requirement to address dynamic effects associated with the postulated pipe rupture subsequently evolved. The dynamic effects include: potential missiles, pipe whipping, blowdown jets, and thermal-hydraulic transients. Recent advances in fracture mechanics research have demonstrated that certain pipes under specific conditions cannot crack in ways that result in an instantaneous guillotine rupture. Canadian utilities are now using mechanistic fracture mechanics and leak-before-break assessments on a case-by-case basis, in limited applications, to support licensing cases which seek exemption from the need to consider the various dynamic effects associated with postulated instantaneous catastrophic rupture of high-energy piping systems inside and outside of containment.

  17. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data, in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  18. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-Estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, and to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers that provide comprehensive details about the method can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot' information about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed and a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  19. Practical applications of the R6 leak-before-break procedure

    International Nuclear Information System (INIS)

    Bouchard, P.J.

    1997-01-01

    A forthcoming revision to the R6 Leak-before-Break Assessment Procedure is briefly described. Practical application of the LbB concepts to safety-critical nuclear plant is illustrated by examples covering both low temperature and high temperature (>450 degrees C) operating regimes. The examples highlight a number of issues which can make the development of a satisfactory LbB case problematic: for example, coping with highly loaded components, methodology assumptions and the definition of margins, the effect of crack closure owing to weld residual stresses, complex thermal stress fields or primary bending fields, the treatment of locally high stresses at crack intersections with free surfaces, the choice of local limit load solution when predicting ligament breakthrough, and the scope of calculations required to support even a simplified LbB case for high temperature steam pipe-work systems

  20. Practical applications of the R6 leak-before-break procedure

    Energy Technology Data Exchange (ETDEWEB)

    Bouchard, P.J.

    1997-04-01

    A forthcoming revision to the R6 Leak-before-Break Assessment Procedure is briefly described. Practical application of the LbB concepts to safety-critical nuclear plant is illustrated by examples covering both low temperature and high temperature (>450 degrees C) operating regimes. The examples highlight a number of issues which can make the development of a satisfactory LbB case problematic: for example, coping with highly loaded components, methodology assumptions and the definition of margins, the effect of crack closure owing to weld residual stresses, complex thermal stress fields or primary bending fields, the treatment of locally high stresses at crack intersections with free surfaces, the choice of local limit load solution when predicting ligament breakthrough, and the scope of calculations required to support even a simplified LbB case for high temperature steam pipe-work systems.

  1. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then methodology application examples from the regulatory side and the industry side are described. (author)

  2. Study of DCRC and RCCM-MR LBB procedures

    International Nuclear Information System (INIS)

    Zhang Zhengming; Cabrillat, M.T.; Lejeail, Y.; Michel, B.

    2006-01-01

    The Leak-Before-Break (LBB) technology has been rapidly developed during the past two decades. The Design and Construction Rules Committee (DCRC) had published a document titled 'Leak-Before-Break Procedure for Sodium Boundary Components'. RCC-MR had also published a document titled 'A16: Guide for Defect Assessment and Leak Before Break Analysis'. This paper focuses on the comparison of the above two documents. (authors)

  3. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on the evaluation of the crack growth resistance of materials and the selection of useful AE signals. The efficiency of this methodology is demonstrated through the diagnostics of industrial objects of various purposes. The authors obtained their experimental results with the help of new methods and facilities.

  4. The GPT methodology. New fields of application

    International Nuclear Information System (INIS)

    Gandini, A.; Gomit, J.M.; Abramytchev, V.

    1996-01-01

    The GPT (Generalized Perturbation Theory) methodology is described, and a new application is discussed. The results obtained for a simple model (zero dimension, six parameters of interest) show that the expressions obtained using the GPT methodology lead to results close to those obtained through direct calculations. The GPT methodology is useful for radioactive waste disposal problems. The potential of the method, demonstrated for the zero-dimension model, can be extended to radionuclide migration problems with a spatial description. (K.A.)

  5. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  6. Methodology and applications of eyetracking

    Directory of Open Access Journals (Sweden)

    Arkadiusz Rajs

    2016-05-01

    Eyetracking offers significant capabilities for controlling computer systems and for studying the usability of applications. In this paper we present the construction of an eyetracker and a range of its applications. Key words: eyetracker, computer vision.

  7. Draft Genome Sequence of Lactobacillus delbrueckii subsp. bulgaricus LBB.B5

    NARCIS (Netherlands)

    Urshev, Z.; Hajo, K.; Lenoci, L.; Bron, P.A.; Dijkstra, A.; Alkema, W.; Wels, M.; Siezen, R.J.; Minkova, S.; Hijum, S.A. van

    2016-01-01

    Lactobacillus delbrueckii subsp. bulgaricus LBB.B5 originates from homemade Bulgarian yogurt and was selected for its ability to form a strong association with Streptococcus thermophilus. The genome sequence will facilitate elucidating the genetic background behind the contribution of LBB.B5 to the taste and aroma of yogurt and its exceptional protocooperation with S. thermophilus.

  8. Fatigue flaw growth assessment and inclusion of stratification to the LBB assessment

    Energy Technology Data Exchange (ETDEWEB)

    Samohyl, P.

    1997-04-01

    The application of LBB also requires a fatigue flaw growth assessment. This analysis was performed for PWR nuclear power plants of types VVER 440/230, VVER 440/213c and VVER 1000/320. Since these NPPs were designed according to Russian codes that differ from US codes, it was necessary to compare the two approaches. A comparison with our experimental data was also carried out. The margins of applicability of the US methods and their modifications for the materials used in the construction of Czech and Slovak NPPs are shown. A computer code performing the analysis according to the described method is presented. Some measurements and calculations show that thermal stratification in horizontal pipelines can lead to additional loads that are not negligible and can be dangerous. An attempt to include these loads induced by steady-state stratification was made.

  9. Application of an allocation methodology

    International Nuclear Information System (INIS)

    Youngblood, R.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrated performance) in an optimal way over elements of a conceptual design.

  10. Application of an allocation methodology

    International Nuclear Information System (INIS)

    Youngblood, R.; de Oliveira, L.F.S.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrated performance) in an optimal way over elements of a conceptual design. 6 refs., 3 figs., 2 tabs

  11. Evaluation of LBB margin of nuclear piping systems

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Il Soon; Kim, Ji Hyeon; Oh, Yeong Jin; Lim, Jun [Seoul National Univ., Seoul (Korea, Republic of)]; Kim, In Seob; Kim, Yong Seon; Lee, Joo Seok [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1999-04-15

    Most previous elastic-plastic fracture studies for LBB assessment of low alloy steel piping have focused on base metals and weld metals. In contrast, the heat affected zone of welded pipe has not been studied in detail, primarily because the size of the heat affected zone in a welded pipe is too small to make specimens for mechanical property measurements. When structural members are joined by welding, the base metal is heated to its melting point and then cooled rapidly. As a result of this very severe thermal cycle, mechanical properties in the heat affected zone can be degraded by grain coarsening, precipitation, and the segregation of trace impurities. In this study, a thermal and microstructural analysis is performed, and mechanical properties are measured for the weld heat affected zone of SA106Gr.C low alloy piping steel. In addition, an intercritical annealing treatment in the two-phase (alpha+gamma) region was performed to investigate the possibility of improving the toughness and reducing dynamic strain aging (DSA) susceptibility so as to provide allowable LBB safety margins. From the results, intercritical annealing is shown to give a smaller ductility loss due to DSA than in the as-received material. Furthermore, the intercritical annealing was able to increase the impact toughness by a factor of 1.5 compared to the as-received material.

  12. Evaluation of LBB margin of nuclear piping systems

    International Nuclear Information System (INIS)

    Hwang, Il Soon; Kim, Ji Hyeon; Oh, Yeong Jin; Lim, Jun; Kim, In Seob; Kim, Yong Seon; Lee, Joo Seok

    1999-04-01

    Most previous elastic-plastic fracture studies for LBB assessment of low alloy steel piping have focused on base metals and weld metals. In contrast, the heat affected zone of welded pipe has not been studied in detail, primarily because the heat affected zone in welded pipe is too small to make specimens for mechanical property measurement. When structural members are joined by welding, the base metal is heated to its melting point and then cooled rapidly. As a result of this very severe thermal cycle, mechanical properties in the heat affected zone can be degraded by grain coarsening, precipitation and the segregation of trace impurities. In this study, a thermal and microstructural analysis is performed, and mechanical properties are measured for the weld heat affected zone of SA106Gr.C low alloy piping steel. In addition, an intercritical annealing treatment in the two-phase (alpha+gamma) region was performed to investigate the possibility of improving the toughness and reducing dynamic strain aging (DSA) susceptibility so as to preserve allowable LBB safety margins. The results show that intercritical annealing gives a smaller ductility loss due to DSA than the as-received material. Furthermore, the intercritical annealing increased the impact toughness by a factor of 1.5 compared to the as-received material.

  13. Overview of large scale experiments performed within the LBB project in the Czech Republic

    Energy Technology Data Exchange (ETDEWEB)

    Kadecka, P.; Lauerova, D. [Nuclear Research Institute, Rez (Czechoslovakia)

    1997-04-01

    During several recent years NRI Rez has been performing LBB analyses of safety significant primary circuit piping of NPPs in the Czech and Slovak Republics. The analyses covered NPPs with WWER 440 Type 230 and 213 and WWER 1000 Type 320 reactors. Within the relevant LBB projects, undertaken to demonstrate that the LBB requirements are fulfilled, a series of large scale experiments was performed. The goal of these experiments was to verify the properties of the selected components and to prove the quality and/or conservatism of the assessments used in the LBB analyses. In this poster, a brief overview of the experiments performed in the Czech Republic under the guidance of NRI Rez is presented.

  14. Draft Genome Sequence of Lactobacillus delbrueckii subsp. bulgaricus LBB.B5.

    Science.gov (United States)

    Urshev, Zoltan; Hajo, Karima; Lenoci, Leonardo; Bron, Peter A; Dijkstra, Annereinou; Alkema, Wynand; Wels, Michiel; Siezen, Roland J; Minkova, Svetlana; van Hijum, Sacha A F T

    2016-10-06

    Lactobacillus delbrueckii subsp. bulgaricus LBB.B5 originates from homemade Bulgarian yogurt and was selected for its ability to form a strong association with Streptococcus thermophilus. The genome sequence will facilitate elucidating the genetic background behind the contribution of LBB.B5 to the taste and aroma of yogurt and its exceptional protocooperation with S. thermophilus. Copyright © 2016 Urshev et al.

  15. Proposed Methodology for Establishing Area of Applicability

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This paper presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the data validation tasks of a criticality safety computational study. The S/U methods presented are designed to provide a formal means of establishing the area (or range) of applicability for criticality safety data validation studies. The development of parameters analogous to the standard trending parameters forms the key to the technique. These parameters are the so-called D parameters, which represent the differences by energy group of S/U-generated sensitivity profiles, and the c parameters, which are the c_k correlation coefficients, each of which gives information on the similarity between pairs of selected systems. The use of a Generalized Linear Least-Squares Methodology (GLLSM) tool is also described in this paper. These methods and guidelines are also applied to a sample validation for uranium systems with enrichments greater than 5 wt %
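
    The c parameters described above are, in essence, correlation coefficients built from energy-group sensitivity profiles and a shared nuclear-data covariance matrix. The sketch below only illustrates that idea; the sensitivity vectors and covariance matrix are invented placeholders, not data from the paper.

        import numpy as np

        # Illustrative similarity coefficient between two systems a and b,
        # computed from group-wise k_eff sensitivities and a common relative
        # covariance matrix (all numbers below are placeholders).
        S_a = np.array([0.10, 0.25, 0.05])          # sensitivities, system a
        S_b = np.array([0.12, 0.20, 0.15])          # sensitivities, system b
        C   = np.array([[4.0e-4, 1.0e-4, 0.0],
                        [1.0e-4, 9.0e-4, 2.0e-4],
                        [0.0,    2.0e-4, 1.6e-3]])  # relative covariance matrix

        var_a  = S_a @ C @ S_a        # data-induced variance, system a
        var_b  = S_b @ C @ S_b        # data-induced variance, system b
        cov_ab = S_a @ C @ S_b        # shared uncertainty between the systems

        c_k = cov_ab / np.sqrt(var_a * var_b)   # similarity index in [-1, 1]
        print(f"c_k = {c_k:.3f}")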

  16. Evolving Intelligent Systems Methodology and Applications

    CERN Document Server

    Angelov, Plamen; Kasabov, Nik

    2010-01-01

    From theory to techniques, the first all-in-one resource for EIS. There is a clear demand in advanced process industries, defense, and Internet and communication (VoIP) applications for intelligent yet adaptive/evolving systems. Evolving Intelligent Systems is the first self- contained volume that covers this newly established concept in its entirety, from a systematic methodology to case studies to industrial applications. Featuring chapters written by leading world experts, it addresses the progress, trends, and major achievements in this emerging research field, with a strong emphasis on th

  17. Effects of local mechanical and fracture properties on LBB behavior of a dissimilar metal welded joint in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Du, L.Y.; Wang, G.Z., E-mail: gzwang@ecust.edu.cn; Xuan, F.Z.; Tu, S.T.

    2013-12-15

    Highlights: • Effects of local mechanical and fracture properties on LBB behavior were investigated. • Considering local mechanical properties leads to a slightly higher LBB curve. • Use of the fracture resistance of the base or weld metal will produce non-conservative LBB results. • Local fracture properties of the interface region cannot be ignored in LBB analysis. - Abstract: In this paper, three-dimensional finite element models with and without consideration of local mechanical properties were built for a dissimilar metal welded joint (DMWJ) connecting the safe end to the pipe-nozzle of a reactor pressure vessel. Inner circumferential surface cracks were postulated at the interface of the A508 steel and the buttering Alloy52Mb. Based on the elastic–plastic fracture mechanics theory of the J-integral, the crack growth stability was analyzed, and the effects of the local mechanical and fracture resistance properties on LBB behavior were investigated. The results show that considering local mechanical properties leads to a slightly higher LBB curve. For the A508/Alloy52Mb interface region cracks in the DMWJ, if the fracture resistance curve of the base metal A508 or the buttering Alloy52Mb is used, non-conservative (unsafe) LBB assessment results will be produced. As the applied bending moment increases, the degree of non-conservatism in LBB behavior becomes larger. Therefore, to obtain accurate LBB assessment results, the local fracture resistance properties of the interface region should be used.
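
    The crack growth stability analysis referred to here compares the applied J-integral with the material J-resistance (J-R) curve: growth initiates when the applied J reaches J_R, and becomes unstable when the applied tearing modulus exceeds the material tearing modulus. The sketch below illustrates that comparison with an assumed power-law J-R curve and an assumed applied-J function; E, sigma_f, a0 and the load level are placeholders, and it is not the finite element model of the paper.

        import numpy as np

        # Illustrative J/T tearing stability check (all material and loading data assumed).
        E, sigma_f = 190e3, 300.0             # Young's modulus (MPa), flow stress (MPa)

        def J_R(da):                           # assumed power-law J-resistance curve
            return 250.0 * np.maximum(da, 1e-6) ** 0.5

        def J_applied(a, load):                # assumed applied J vs crack size a and load factor
            return 0.04 * load**2 * a

        a0, load = 40.0, 80.0                  # initial crack size (mm), load level (assumed)
        da = np.linspace(0.01, 5.0, 500)       # postulated crack extension (mm)

        T_mat = np.gradient(J_R(da), da) * E / sigma_f**2                    # material tearing modulus
        T_app = np.gradient(J_applied(a0 + da, load), da) * E / sigma_f**2   # applied tearing modulus

        driving  = J_applied(a0 + da, load) >= J_R(da)   # crack driving force reaches resistance
        unstable = driving & (T_app >= T_mat)            # growth would be unstable here
        print("Unstable tearing predicted:", bool(unstable.any()))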

  18. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  19. Mining software specifications methodologies and applications

    CERN Document Server

    Lo, David

    2011-01-01

    An emerging topic in software engineering and data mining, specification mining tackles software maintenance and reliability issues that cost economies billions of dollars each year. The first unified reference on the subject, Mining Software Specifications: Methodologies and Applications describes recent approaches for mining specifications of software systems. Experts in the field illustrate how to apply state-of-the-art data mining and machine learning techniques to address software engineering concerns. In the first set of chapters, the book introduces a number of studies on mining finite

  20. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography or microCT have increased exponentially. MicroComputed Tomography: Methodology and Applications provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop an understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  1. Regulatory aspects of the Leak Before Break application

    International Nuclear Information System (INIS)

    Korosec, D.; Vojvodic Tuma, J.

    2000-01-01

    In the present paper the Leak Before Break (LBB) methodology is described as it was evaluated by the Slovenian Nuclear Safety Administration during the modernization of the Krsko nuclear power plant (NPP). In the recent decade, a revised regulatory position regarding elimination of the dynamic effects of postulated primary coolant pipe ruptures has been issued in some countries. The basis for this new approach is research achievements in different areas of science such as metallurgy, fracture mechanics, dynamic analysis and materials testing. Under this new regulatory position the utility has the possibility to adopt the LBB concept, but it has to fulfill at least the general prerequisites described in the standard review plan, where the basic principles and objectives of the evaluation process are set out. Worldwide practice shows that a more intensive and detailed evaluation is necessary than that described in the standard review plan. The concerns and requests arising during evaluation of the consequences of the adopted LBB concept generally have some common points when the regulatory experience of different countries which have already accepted this methodology, or are currently in the process of evaluation, is compared. Nevertheless, every nuclear power plant is unique with respect to specific material properties, dynamic analysis assumptions, the safety analyses performed, etc. One of the most important areas in the LBB evaluation process is a reliable evaluation of the primary piping material. Several generic material studies have been performed, and the applicability of these studies for the justification of LBB has to be carefully assessed. Elimination of the double ended guillotine break from the safety analysis has a very strong impact on the safety analysis results and, finally, on the structures, systems and components as well. The nuclear safety administration has made considerable effort to make its position on this methodology clear. The expertise of the authorized institutions is in such process

  2. PET/MRI. Methodology and clinical applications

    Energy Technology Data Exchange (ETDEWEB)

    Carrio, Ignasi [Autonomous Univ. of Barcelona, Hospital Sant Pau (Spain). Dept. Medicina Nuclear; Ros, Pablo (ed.) [Univ. Hospitals Case, Medical Center, Cleveland, OH (United States). Dept. of Radiology

    2014-04-01

    Provides detailed information on the methodology and equipment of MRI-PET. Covers a wide range of clinical applications in oncology, cardiology, and neurology. Written by an international group of experts in MRI and PET. PET/MRI is an exciting novel diagnostic imaging modality that combines the precise anatomic and physiologic information provided by magnetic resonance imaging (MRI) with the molecular data obtained with positron emission tomography (PET). PET/MRI offers the promise of a simplified work flow, reduced radiation, whole-body imaging with superior soft tissue contrast, and time of flight physiologic information. It has been described as the pathway to molecular imaging in medicine. In compiling this textbook, the editors have brought together a truly international group of experts in MRI and PET. The book is divided into two parts. The first part covers methodology and equipment and comprises chapters on basic molecular medicine, development of specific contrast agents, MR attenuation and validation, quantitative MRI and PET motion correction, and technical implications for both MRI and PET. The second part of the book focuses on clinical applications in oncology, cardiology, and neurology. Imaging of major neoplasms, including lymphomas and tumors of the breast, prostate, and head and neck, is covered in individual chapters. Further chapters address functional and metabolic cardiovascular examinations and major central nervous system applications such as brain tumors and dementias. Risks, safety aspects, and healthcare costs and impacts are also discussed. This book will be of interest to all radiologists and nuclear medicine physicians who wish to learn more about the latest developments in this important emerging imaging modality and its applications.

  3. A simplified LBB evaluation procedure for austenitic and ferritic steel piping

    International Nuclear Information System (INIS)

    Gamble, R.M.; Wichman, K.R.

    1997-01-01

    The NRC has previously approved application of LBB analysis as a means to demonstrate that the probability of pipe rupture is extremely low, so that dynamic loads associated with postulated pipe breaks could be excluded from the design basis (1). The purpose of this work was to: (1) define simplified procedures that can be used by the NRC to compute allowable lengths for circumferential throughwall cracks and assess margin against pipe fracture, and (2) verify the accuracy of the simplified procedures by comparison with available experimental data for piping having circumferential throughwall flaws. The development of the procedures was performed using techniques similar to those employed to develop ASME Code flaw evaluation procedures. The procedures described in this report are applicable to pipe and pipe fittings with: (1) wrought austenitic steel (Ni-Cr-Fe alloy) having a specified minimum yield strength less than 45 ksi, and gas metal-arc, submerged arc and shielded metal-arc austenitic welds, and (2) seamless or welded wrought carbon steel having a minimum yield strength not greater than 40 ksi, and associated weld materials. The procedures can be used for cast austenitic steel when adequate information is available to place the cast material toughness into one of the categories identified later in this report for austenitic wrought and weld materials
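
    Simplified evaluations of allowable circumferential through-wall crack lengths commonly rest on a net-section-collapse (limit-load) estimate for combined membrane stress and bending. The sketch below implements that classical estimate only as an illustration; the pipe geometry, flow stress and loads are assumed values, and the report's actual procedures include further provisions (material categories, correction factors) not reproduced here.

        import math

        # Classical net-section-collapse estimate for a circumferential through-wall
        # crack of half-angle theta in a thin-walled pipe (illustrative values only).
        R_m, t  = 0.35, 0.030       # mean radius and wall thickness, m (assumed)
        sigma_f = 310.0e6           # flow stress, Pa (assumed, roughly (Sy + Su)/2)
        P_axial = 2.0e6             # axial force, N (assumed, e.g. pressure end load)

        def collapse_moment(theta):
            """Limit bending moment for a through-wall crack of half-angle theta (rad)."""
            sigma_m = P_axial / (2.0 * math.pi * R_m * t)                 # membrane stress
            beta = 0.5 * (math.pi - theta - math.pi * sigma_m / sigma_f)  # neutral-axis angle
            return 2.0 * sigma_f * R_m**2 * t * (2.0 * math.sin(beta) - math.sin(theta))

        def critical_half_angle(M_applied):
            """Largest crack half-angle that still sustains M_applied (bisection)."""
            lo, hi = 0.0, math.pi * 0.95
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if collapse_moment(mid) > M_applied:
                    lo = mid
                else:
                    hi = mid
            return lo

        theta_c = critical_half_angle(M_applied=1.5e6)   # applied moment, N*m (assumed)
        print(f"Critical total crack length ~ {2 * theta_c * R_m * 1000:.0f} mm")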

  4. A simplified LBB evaluation procedure for austenitic and ferritic steel piping

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, R.M.; Wichman, K.R.

    1997-04-01

    The NRC has previously approved application of LBB analysis as a means to demonstrate that the probability of pipe rupture is extremely low, so that dynamic loads associated with postulated pipe breaks could be excluded from the design basis (1). The purpose of this work was to: (1) define simplified procedures that can be used by the NRC to compute allowable lengths for circumferential throughwall cracks and assess margin against pipe fracture, and (2) verify the accuracy of the simplified procedures by comparison with available experimental data for piping having circumferential throughwall flaws. The development of the procedures was performed using techniques similar to those employed to develop ASME Code flaw evaluation procedures. The procedures described in this report are applicable to pipe and pipe fittings with: (1) wrought austenitic steel (Ni-Cr-Fe alloy) having a specified minimum yield strength less than 45 ksi, and gas metal-arc, submerged arc and shielded metal-arc austenitic welds, and (2) seamless or welded wrought carbon steel having a minimum yield strength not greater than 40 ksi, and associated weld materials. The procedures can be used for cast austenitic steel when adequate information is available to place the cast material toughness into one of the categories identified later in this report for austenitic wrought and weld materials.

  5. Cleansing methodology of sites and its applications

    International Nuclear Information System (INIS)

    De Moura, Patrick; Dubot, Didier; Faure, Vincent; Attiogbe, Julien; Jeannee, Nicolas; Desnoyers, Yvon

    2009-01-01

    The Commissariat a l'Energie Atomique (CEA, French Atomic Energy Commission) has over the last 10 years set up an innovative methodology for characterizing radiological contamination. Its application relies on various tools, such as expertise vehicles with high detection performance (VEgAS) and the recently developed software platform Kartotrak. A Geographic Information System tailored to radiological needs constitutes the heart of the platform; it is surrounded by several modules for sampling optimization (Stratege), data analysis and geostatistical modeling (Krigeo), real-time monitoring (Kartotrak-RT) and validation of cleaning efficiency (Pescar). This paper presents these different tools, which provide a comprehensive instrument set for the follow-up of decontamination projects, from doubt removal to verification of the decontamination process. (authors)

  6. Fracture properties evaluation of stainless steel piping for LBB applications

    International Nuclear Information System (INIS)

    Kim, Y.J.; Seok, C.S.; Chang, Y.S.

    1997-01-01

    The objective of this paper is to evaluate the material properties of SA312 TP316 and SA312 TP304 stainless steels and their associated welds manufactured for the shutdown cooling line and safety injection line of nuclear generating stations. A total of 82 tensile tests and 58 fracture toughness tests on specimens taken from actual pipes were performed, and the effects of various parameters such as the pipe size, the specimen orientation, the test temperature and the welding procedure on the material properties are discussed. Test results show that the effect of the test temperature on the fracture toughness was significant, while the effects of the pipe size and the specimen orientation on the fracture toughness were negligible. The material properties of the GTAW weld metal were in general higher than those of the base metal

  7. Fracture properties evaluation of stainless steel piping for LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Y.J.; Seok, C.S.; Chang, Y.S. [Sung Kyun Kwan Univ., Suwon (Korea, Republic of)

    1997-04-01

    The objective of this paper is to evaluate the material properties of SA312 TP316 and SA312 TP304 stainless steels and their associated welds manufactured for the shutdown cooling line and safety injection line of nuclear generating stations. A total of 82 tensile tests and 58 fracture toughness tests on specimens taken from actual pipes were performed, and the effects of various parameters such as the pipe size, the specimen orientation, the test temperature and the welding procedure on the material properties are discussed. Test results show that the effect of the test temperature on the fracture toughness was significant, while the effects of the pipe size and the specimen orientation on the fracture toughness were negligible. The material properties of the GTAW weld metal were in general higher than those of the base metal.

  8. LBB assessment on ferrite piping structure of large-scale FBR

    OpenAIRE

    兪 淵植

    2002-01-01

    These days, interest in LBB (Leak Before Break) design is rising from the viewpoint of cost reduction and structural integrity for the commercialization of FBR plants. LBB design enables plants to be shut down safely, before unstable fracture occurs, by detecting leak rates even if a crack initiates and penetrates the wall thickness. It is necessary to assess crack growth and penetration behavior considering in-service conditions under operating temperature, leak re...

  9. Qualification of PHT piping of Indian 500 MW PHWR for LBB, using R-6 method

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Bhasin, V.; Kushwaha, H.S.

    1997-01-01

    This document discusses the qualification of the straight pipe portion of the primary heat transport (PHT) piping of the Indian 500 MWe pressurised heavy water reactor (PHWR) for leak before break (LBB). The evaluation is done using the R-6 [1] method. The results presented here are: the safety margins which exist for straight pipe components of the main PHT piping of the 500 MWe plant under leakage size crack (LSC) and design basis accident loads; the sensitivity of the safety margins with respect to different analysis parameters; and the qualification of the PHT piping for LBB based on the criteria given in NUREG-1061 [2] and TECDOC-774 [3]. (author)

  10. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    The paper presents the potential for the development of software using agile methodologies. Special consideration is devoted to the potential and advantages of the Scrum methodology in software development, and to the relationship between the implementation of agile methodologies and software development projects.

  11. Application of Response Surface Methodology for Optimizing Oil ...

    African Journals Online (AJOL)

    Application of Response Surface Methodology for Optimizing Oil Extraction Yield From ... from tropical almond seed by the use of response surface methodology (RSM).

  12. Application opportunities of agile methodology in service company management

    OpenAIRE

    Barauskienė, Diana

    2017-01-01

    Application Opportunities of Agile Methodology in Service Company Management. The main purpose of this master thesis is to identify which methods (or their modified versions) of the Agile methodology can be applied in service company management. This master thesis consists of these parts – scientific literature analysis, the author's research methodology (research methods, the author's research model, essential elements used in the research of the application of the Agile methodology), the research itself (prelimina...

  13. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  14. Developing educational hypermedia applications: a methodological approach

    Directory of Open Access Journals (Sweden)

    Jose Miguel Nunes

    1996-01-01

    This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who will be primarily responsible for the instructional design, with that of software experts, responsible for the software design and development. Hence, it is proposed that the educators and programmers should interact in an integrated and systematic manner following a methodological approach.

  15. Design Methodologies: Industrial and Educational Applications

    NARCIS (Netherlands)

    Tomiyama, T.; Gul, P.; Jin, Y.; Lutters, Diederick; Kind, Ch.; Kimura, F.

    2009-01-01

    The field of Design Theory and Methodology has a rich collection of research results that have been taught at educational institutions as well as applied to design practices. First, this keynote paper describes some methods to classify them. It then illustrates individual theories and methodologies

  16. The analysis of normative requirements to materials of PWR components, basing on LBB concepts

    International Nuclear Information System (INIS)

    Anikovsky, V.V.; Karzov, G.P.; Timofeev, B.T.

    1997-01-01

    The paper discusses the advisability of correcting the Norms to solve, in terms of materials science, the problem of how the normative requirements for materials must be changed in light of the 'leak before break' (LBB) concept

  17. The analysis of normative requirements to materials of PWR components, basing on LBB concepts

    Energy Technology Data Exchange (ETDEWEB)

    Anikovsky, V.V.; Karzov, G.P.; Timofeev, B.T. [CRISM Prometey, St. Petersburg (Russian Federation)

    1997-04-01

    The paper discusses the advisability of correcting the Norms to solve, in terms of materials science, the problem of how the normative requirements for materials must be changed in light of the 'leak before break' (LBB) concept.

  18. Issues in the global applications of methodology in forensic anthropology.

    Science.gov (United States)

    Ubelaker, Douglas H

    2008-05-01

    The project and research reported in this collection of articles follow a long-term historical pattern in forensic anthropology in which new casework and applications reveal methodological issues that need to be addressed. Forensic anthropological analysis in the area of the former Yugoslavia led to questions regarding the applicability of methods developed from samples in other regions. The subsequently organized project reveals that such differences exist, and new methodology and data are presented to facilitate applications in the Balkan area. The effort illustrates how case applications and court testimony can stimulate research advances. The articles also serve as a model for the improvement of methodology available for global applications.

  19. The corrosion and corrosion mechanical properties evaluation for the LBB concept in VVERs

    Energy Technology Data Exchange (ETDEWEB)

    Ruscak, M.; Chvatal, P.; Karnik, D.

    1997-04-01

    One of the conditions required for Leak Before Break application is verification that the influence of the corrosion environment on the material of the component can be neglected. Neither general corrosion nor the initiation and growth of corrosion-mechanical cracks must cause degradation. The primary piping in VVER nuclear power plants is made from austenitic steels (VVER 440) and from low alloy steels protected with austenitic cladding (VVER 1000). Inspection of the base metal and heterogeneous weldments from the VVER 440 showed that the crack growth rates are below 10 m/s if a low oxygen level is kept in the primary environment. No intergranular cracking was observed in low and high oxygen water after any type of testing, with constant or periodic loading. In the framework of the LBB assessment of the VVER 1000, the corrosion and corrosion-mechanical properties were also evaluated. The corrosion and corrosion-mechanical testing was oriented predominantly to three types of tests: stress corrosion cracking tests, corrosion fatigue tests, and evaluation of the resistance against corrosion damage. In this paper, the methods used for these tests are described, and the materials are compared from the point of view of the response of the low alloyed steel 10GN2WA and the weld metal, exposed to the primary circuit environment, to static and periodic mechanical stress. Slow strain rate tests and static loading of both C-rings and CT specimens were performed in order to assess the stress corrosion cracking characteristics. Cyclic loading of CT specimens was done to evaluate the kinetics of crack growth under periodic loading. Results are shown to illustrate the approaches used. The data obtained were also evaluated by comparing the influence of different structures on the appearance of stress corrosion cracking. The results obtained for the base metal and weld metal of the piping are presented here.

  20. Guidance for the application of the leak before break concept. Report of the IAEA extrabudgetary programme on the safety of WWER-440 model 230 nuclear power plants

    International Nuclear Information System (INIS)

    1994-11-01

    This document provides additional guidance on application of the LBB concept to WWER-440/230 NPPs and complements the IAEA-TECDOC-710. The objective of the report is to describe in detail the elements of the LBB concept, the necessary support as well as the condition to be fulfilled, and the verification programme. It should also provide a clear picture of all the activities and resources needed to implement the LBB successfully as a comprehensive concept

  1. Residual radioactive material guidelines: Methodology and applications

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III.

    1989-01-01

    A methodology to calculate residual radioactive material guidelines was developed for the US Department of Energy (DOE). This methodology is coded in a menu-driven computer program, RESRAD, which can be run on IBM or IBM-compatible microcomputers. Seven pathways of exposure are considered: external radiation, inhalation, and ingestion of plant foods, meat, milk, aquatic foods, and water. The RESRAD code has been applied to several DOE sites to calculate soil cleanup guidelines. This experience has shown that the computer code is easy to use and very user-friendly. 3 refs., 8 figs
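
    The general idea behind such guideline calculations is that an allowable residual soil concentration follows from dividing the applicable annual dose limit by the summed dose-to-source ratios of all exposure pathways. The snippet below is only a schematic illustration of that arithmetic with invented numbers; it is not the RESRAD code, its pathway models or its data.

        # Schematic single-radionuclide soil guideline (all numbers invented).
        dose_limit = 100.0   # annual dose limit, mrem/yr (assumed)

        # Dose-to-source ratios per pathway, mrem/yr per pCi/g of soil (placeholders).
        dsr = {
            "external":      0.8,
            "inhalation":    0.05,
            "plant_foods":   0.3,
            "meat":          0.02,
            "milk":          0.01,
            "aquatic_foods": 0.005,
            "water":         0.1,
        }

        total_dsr = sum(dsr.values())
        guideline = dose_limit / total_dsr   # allowable residual concentration, pCi/g
        print(f"Soil guideline ~ {guideline:.1f} pCi/g")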

  2. Crack-tip constraint analyses and constraint-dependent LBB curves for circumferential through-wall cracked pipes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.L.; Wang, G.Z., E-mail: gzwang@ecust.edu.cn; Xuan, F.Z.; Tu, S.T.

    2015-04-15

    Highlights: • Solution of constraint parameter τ* for through-wall cracked pipes has been obtained. • Constraint increases with increasing crack length and radius–thickness ratio of pipes. • Constraint-dependent LBB curve for through-wall cracked pipes has been constructed. • For increasing accuracy of LBB assessments, constraint effect should be considered. - Abstract: The leak-before-break (LBB) concept has been widely applied in the structural integrity assessments of pressured pipes in nuclear power plants. However, the crack-tip constraint effects in LBB analyses and designs cannot be incorporated. In this paper, by using three-dimensional finite element calculations, the modified load-independent T-stress constraint parameter τ* for circumferential through-wall cracked pipes with different geometries and crack sizes has been analyzed under different loading conditions, and the solutions of the crack-tip constraint parameter τ* have been obtained. Based on the τ* solutions and constraint-dependent J–R curves of a steel, the constraint-dependent LBB (leak-before-break) curves have been constructed. The results show that the constraint τ* increases with increasing crack length θ, mean radius R{sub m} and radius–thickness ratio R{sub m}/t of the pipes. In LBB analyses, the critical crack length calculated by the J–R curve of the standard high constraint specimen for pipes with shorter cracks is over-conservative, and the degree of conservatism increases with decreasing crack length θ, R{sub m} and R{sub m}/t. Therefore, the constraint-dependent LBB curves should be constructed to modify the over-conservatism and increase accuracy of LBB assessments.

  3. Application of PRINCE2 Project Management Methodology

    Directory of Open Access Journals (Sweden)

    Vaníčková Radka

    2017-09-01

    The methodology describes the principles of setting up a project under PRINCE2 project management. The main aim of the paper is to implement the PRINCE2 methodology in an enterprise in the service industry. A partial aim is to choose a supplier for the project from among new travel guides. The result of the project activity is a sight-seeing tour/service that is more attractive to customers in the tourism industry, and possible new job opportunities. The added value of the article is the description of applying the principles, processes and topics of PRINCE2 project management so that they might be used in the field.

  4. Applicability of the Directed Graph Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Huszti, Jozsef [Institute of Isotope of the Hungarian Academy of Sciences, Budapest (Hungary); Nemeth, Andras [ESRI Hungary, Budapest (Hungary); Vincze, Arpad [Hungarian Atomic Energy Authority, Budapest (Hungary)

    2012-06-15

    Possible methods to construct, visualize and analyse the 'map' of the State's nuclear infrastructure based on different directed graph approaches are proposed. The transportation and the flow network models are described in detail. The use of the possible evaluation methodologies and the use of available software tools to construct and maintain the nuclear 'map' using pre-defined standard building blocks (nuclear facilities) are introduced and discussed.

  5. "Training plan optimized design" methodology application to IBERDROLA - Power generation

    International Nuclear Information System (INIS)

    Gil, S.; Mendizabal, J.L.

    1996-01-01

    The trend in both Europe and the United States towards the understanding that no training plan may be considered suitable unless backed by the results of application of the S.A.T. (Systematic Approach to Training) methodology led TECNATOM, S.A. to apply the methodology through development of an application specific to the conditions of the Spanish working system. The requirement that the design of the training be coherent with the realities of the working environment is met by systematic application of the SAT methodology as part of the work analysis and job-based task analysis processes, this serving as a basis for the design of the training plans

  6. Application of a methodology for retouching

    Directory of Open Access Journals (Sweden)

    Ana Bailão

    2010-11-01

    Between November 2006 and January 2010, an investigation into retouching methodologies was carried out. The aim of this paper is to describe, in four steps, the retouching methodology for a contemporary painting. The four steps are: chromatic and formal study, considering the use of Gestalt theory and the phenomena of contrast and assimilation; selection of the technique; choice of the materials; and retouching practice.

  7. Novel Biomaterials Methodology, Development and Application

    Science.gov (United States)

    Traditionally, the use of carbohydrate-based wound dressings, including cotton, xerogels, charcoal cloth, alginates, chitosan and hydrogels, has afforded properties such as absorbency, ease of application and removal, bacterial protection, fluid balance, occlusion, and elasticity. Recent efforts in ...

  8. Ecodesign of cosmetic formulae: methodology and application.

    Science.gov (United States)

    L'Haridon, J; Martz, P; Chenéble, J-C; Campion, J-F; Colombe, L

    2018-04-01

    This article describes an easy-to-use ecodesign methodology developed and applied since 2014 by the L'Oréal Group to improve the sustainable performance of its new products without any compromise on their cosmetic efficacy. Cosmetic products, after being used, are often discharged into the sewers and the aquatic compartment. This discharge is considered dispersive and continuous. Consistent progress in reducing the environmental impact of cosmetic products can be achieved by focusing upon three strategic indicators: biodegradability, grey water footprint adapted for ecodesign (GWFE) and a global indicator complementary to these two endpoints. Biodegradability represents the key process in the removal of organic ingredients from the environment. GWFE is defined herein as the theoretical volume of natural freshwater required to dilute a cosmetic formula, after being used by the consumer, down to a concentration without any foreseeable toxic effects upon aquatic species. Finally, the complementary indicator highlights a possible alert on formula ingredients due to an unfavourable environmental profile based on hazard properties: for example a Global Harmonization System/Classification, Labelling and Packaging (GHS/CLP) H410 classification or a potential very persistent and very bioaccumulative (vPvB) classification. The ecodesign of a new cosmetic product can be a challenge, as the cosmetic properties and quality of this new product should at least match the benchmark reference. As shown in the case studies described herein, new methodologies have been developed to maximize the biodegradability of cosmetic formulae, to minimize their GWFE and to limit the use of ingredients that present an unfavourable environmental profile, while reaching the highest standards in terms of cosmetic efficacy. By applying these methodologies, highly biodegradable products (≥ 95% based on ingredient composition) have been developed and marketed, with a low GWFE. This new
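
    Taking the definition above literally, a grey-water-footprint-style figure can be illustrated as the sum, over the ingredients of a formula, of the mass discharged divided by a no-effect concentration in freshwater. The sketch below shows that arithmetic with placeholder ingredients and values; it is an interpretation for illustration only, not L'Oréal's actual indicator implementation.

        # Illustrative grey-water-footprint-style calculation (placeholder data).
        # Each entry: mass per use discharged to the drain (g) and a predicted
        # no-effect concentration in freshwater (g per litre).
        formula = [
            {"name": "surfactant_A", "mass_g": 1.2, "pnec_g_per_L": 2.0e-4},
            {"name": "polymer_B",    "mass_g": 0.5, "pnec_g_per_L": 1.0e-3},
            {"name": "fragrance_C",  "mass_g": 0.1, "pnec_g_per_L": 5.0e-5},
        ]

        # Volume of freshwater needed to dilute each ingredient below its
        # no-effect concentration, summed over the whole formula.
        gwfe_litres = sum(i["mass_g"] / i["pnec_g_per_L"] for i in formula)
        print(f"GWFE ~ {gwfe_litres:,.0f} litres per use")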

  9. Emission computed tomography: methodology and applications

    International Nuclear Information System (INIS)

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development and use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes 18F-2-fluoro-2-deoxyglucose ([18F]-FDG). In this method [18F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and for its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [18F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)

  10. Robust PV Degradation Methodology and Application

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kimball, Greg [SunPower; Anderson, Mike [SunPower

    2017-11-15

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of PV systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this manuscript, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year (YOY) rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
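
    The year-over-year approach described here compares a performance metric with its value one year earlier at many points in time and takes a robust central value of the resulting annual changes, which suppresses seasonality and outliers. The sketch below illustrates the idea on synthetic data; it is not the authors' implementation (openly available tools such as NREL's RdTools provide a maintained one), and the degradation rate, seasonality and noise levels are invented.

        import numpy as np
        import pandas as pd

        # Synthetic daily performance index: -0.5 %/yr true degradation,
        # plus seasonality and noise (all parameters invented for illustration).
        idx = pd.date_range("2015-01-01", periods=6 * 365, freq="D")
        days = np.arange(len(idx))
        pi = (1.0 - 0.005 * days / 365.0
              + 0.02 * np.sin(2 * np.pi * days / 365.0)
              + np.random.default_rng(0).normal(0, 0.01, len(idx)))
        series = pd.Series(pi, index=idx)

        # Year-over-year rate: compare each day with the same day one year later,
        # convert to %/yr, then take the median as a robust estimate.
        shifted = series.shift(-365)
        yoy = 100.0 * (shifted - series) / series
        print(f"Estimated degradation rate: {yoy.median():.2f} %/yr")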

  11. The micro-habitat methodology. Application protocols

    Energy Technology Data Exchange (ETDEWEB)

    Sabaton, C; Valentin, S; Souchon, Y

    1995-06-01

    A strong need has been felt for guidelines to help various entities in applying the micro-habitat methodology, particularly in impact studies on hydroelectric installations. CEMAGREF and Electricite de France have separately developed two protocols with five major steps: reconnaissance of the river, selection of representative units to be studied in greater depth, morpho-dynamic measurements at one or more rates of discharge and hydraulic modeling, coupling of hydraulic and biological models, calculation of habitat-quality scores for fish, and analysis of results. The two approaches give very comparable results and are essentially differentiated by the hydraulic model used. CEMAGREF uses a one-dimensional model requiring measurements at only one discharge rate. Electricite de France uses a simplified model based on measurements at several rates of discharge. This approach is possible when discharge can be controlled in the study area during data acquisition, as is generally the case downstream of hydroelectric installations. The micro-habitat methodology is now a fully operational tool with which to study changes in fish habitat quality in relation to varying discharge. It provides an element of assessment pertinent to the choice of the instream flow to be maintained downstream of a hydroelectric installation; this information is essential when the flow characteristics (velocity, depth) and the nature of the river bed are the preponderant factors governing habitat suitability for trout or salmon. The ultimate decision must nonetheless take into account any other potentially limiting factors for the biocenoses on the one hand, and the target water use objectives on the other. In many cases, compromises must be found among different uses, different species and different stages in the fish development cycle. (Abstract Truncated)
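
    The coupling of hydraulic and biological models is often summarized as a weighted usable area: each hydraulic cell contributes its area weighted by suitability indices for depth and velocity at the discharge of interest. The sketch below is a generic illustration of that calculation with invented suitability curves and cell data; it is neither the CEMAGREF nor the Electricite de France implementation.

        import numpy as np

        # Invented suitability curves for a target species/life stage (values in 0..1).
        def si_depth(d_m):                       # preference for water depth
            return np.clip(d_m / 0.4, 0, 1) * np.clip((1.2 - d_m) / 0.6, 0, 1)

        def si_velocity(v_ms):                   # preference for flow velocity
            return np.clip(v_ms / 0.3, 0, 1) * np.clip((1.0 - v_ms) / 0.5, 0, 1)

        # Hydraulic model output for one discharge: cell areas (m^2), depths (m), velocities (m/s).
        area  = np.array([2.0, 3.5, 1.8, 4.2, 2.6])
        depth = np.array([0.25, 0.60, 0.90, 0.40, 1.10])
        vel   = np.array([0.15, 0.35, 0.55, 0.80, 0.20])

        # Weighted usable area and a normalized habitat-quality score.
        wua = np.sum(area * si_depth(depth) * si_velocity(vel))
        score = wua / area.sum()
        print(f"WUA = {wua:.2f} m^2, habitat score = {score:.2f}")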

  12. The analysis of normative requirements to materials of VVER components, basing on LBB concepts

    Energy Technology Data Exchange (ETDEWEB)

    Anikovsky, V.V.; Karzov, G.P.; Timofeev, B.T. [CRISM Prometey, St. Petersburg (Russian Federation)

    1997-04-01

    The paper demonstrates the insufficiency of some requirements of the native Norms (when comparing them with foreign requirements) for the consideration of calculated situations: (1) leak before break (LBB); (2) short cracks; (3) preliminary loading (warm prestressing). In particular, the paper presents (1) a comparison of native and foreign normative requirements (PNAE G-7-002-86, ASME Code, BS 1515, KTA) on permissible stress levels and specifically on the estimation of crack initiation and propagation; (2) a comparison of RF and USA Norms on pressure vessel material acceptance, together with data from pressure vessel hydrotests; (3) a comparison of Norms on the presence of defects (RF and USA) in NPP vessels, development of defect schematization rules, and the foundation of a calculated defect (semi-axis ratio a/b) for pressure vessel and piping components; (4) the sequence of defect estimation (growth of initial defects and critical crack sizes) proceeding from the LBB concept; (5) an analysis of crack initiation and propagation conditions according to the acting Norms (including crack jumps); (6) the necessity to correct estimation methods for the ultimate states of brittle and ductile fracture and the elastic-plastic region as applied to the calculated situations (a) LBB and (b) short cracks; (7) the necessity to correct estimation methods of ultimate states with consideration of static and cyclic loading (warm prestressing effect) of the pressure vessel, and estimation of the stability of this effect; (8) proposals on corrections to the PNAE G-7-002-86 Norms.

  13. New applications of partial residual methodology

    International Nuclear Information System (INIS)

    Uslu, V.R.

    1999-12-01

    The formulation of a problem of interest in the framework of a statistical analysis starts with collecting the data, choosing a model and making certain assumptions, as described in the basic paradigm by Box (1980). This stage is called model building. Then comes the estimation stage, in which, pretending that the formulation of the problem is true, estimates are obtained and tests and inferences are made. In the final stage, called diagnostic checking, whether there are disagreements between the data and the fitted model is checked by using diagnostic measures and diagnostic plots. It is well known that statistical methods perform best when all assumptions related to the methods are satisfied, but this ideal case is rarely achieved in practice. Diagnostics are therefore becoming important, and so are diagnostic plots, because they provide an immediate assessment. Partial residual plots, which are the main interest of the present study, play a major role among the diagnostic plots in multiple regression analysis. In the statistical literature it is accepted that partial residual plots are more useful than ordinary residual plots in detecting outliers and nonconstant variance, and especially in discovering curvature. In this study we consider the partial residual methodology in statistical methods other than multiple regression. We show that, for the same purposes as in multiple regression, the use of partial residual plots is possible, particularly in autoregressive time series models, transfer function models, linear mixed models and ridge regression. (author)
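
    For readers unfamiliar with the diagnostic, the partial residual for a predictor is the ordinary residual plus that predictor's estimated contribution, plotted against the predictor to reveal curvature. A minimal multiple-regression example on simulated data is sketched below; extending the same idea to the time series, mixed and ridge models studied in the thesis is not shown, and all data are invented.

        import numpy as np

        # Simulated data: y depends linearly on x1 but quadratically on x2.
        rng = np.random.default_rng(1)
        n = 200
        x1, x2 = rng.normal(size=n), rng.uniform(-2, 2, size=n)
        y = 1.0 + 2.0 * x1 + 1.5 * x2**2 + rng.normal(0, 0.5, size=n)

        # Fit an (intentionally misspecified) linear model: y ~ 1 + x1 + x2.
        X = np.column_stack([np.ones(n), x1, x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta

        # Partial residuals for x2: residuals + b2 * x2.  Plotted against x2,
        # these reveal the missed curvature far more clearly than plain residuals.
        partial_res_x2 = residuals + beta[2] * x2
        print("corr(partial residual, x2^2) =",
              round(float(np.corrcoef(partial_res_x2, x2**2)[0, 1]), 2))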

  14. An aspect-oriented methodology for designing secure applications

    NARCIS (Netherlands)

    Georg, Geri; Ray, Indrakshi; Anastasakis, Kyriakos; Bordbar, Behzad; Toahchoodee, Manachai; Houmb, S.H.

    We propose a methodology, based on aspect-oriented modeling (AOM), for incorporating security mechanisms in an application. The functionality of the application is described using the primary model and the attacks are specified using aspects. The attack aspect is composed with the primary model to

  15. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
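
    In its simplest additive form, the multiattribute utility of a candidate program is a weighted sum of single-attribute utilities, and programs are rank-ordered by this overall value. The sketch below illustrates that form with invented attributes, weights and scaled utilities; the actual methodology elicits these from decision-makers via the MAUP program and need not be purely additive.

        # Illustrative additive multiattribute utility ranking (all data invented).
        # Scaling constants (weights) would be elicited from the decision-maker.
        weights = {"safety_benefit": 0.5, "cost": 0.3, "time_to_result": 0.2}

        # Single-attribute utilities already scaled to [0, 1] for each program.
        programs = {
            "Program A": {"safety_benefit": 0.9, "cost": 0.4, "time_to_result": 0.6},
            "Program B": {"safety_benefit": 0.6, "cost": 0.8, "time_to_result": 0.7},
            "Program C": {"safety_benefit": 0.7, "cost": 0.6, "time_to_result": 0.3},
        }

        def overall_utility(u):
            """Additive multiattribute utility: weighted sum of scaled utilities."""
            return sum(weights[a] * u[a] for a in weights)

        for name, u in sorted(programs.items(), key=lambda kv: -overall_utility(kv[1])):
            print(f"{name}: U = {overall_utility(u):.2f}")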

  16. Probabilistic risk assessment methodology for risk management and regulatory applications

    International Nuclear Information System (INIS)

    See Meng Wong; Kelly, D.L.; Riley, J.E.

    1997-01-01

    This paper discusses the development and potential applications of PRA methodology for risk management and regulatory applications in the U.S. nuclear industry. The new PRA methodology centers on the development of the time-dependent configuration risk profile for evaluating the effectiveness of operational risk management programs at U.S. nuclear power plants. Configuration-risk profiles have been used as risk-information tools for (1) a better understanding of the impact of daily operational activities on plant safety, and (2) proactive planning of operational activities to manage risk. Trial applications of the methodology were undertaken to demonstrate that configuration-risk profiles can be developed routinely, and can be useful for various industry and regulatory applications. Lessons learned include a better understanding of the issues and characteristics of PRA models available to industry, and identification of the attributes and pitfalls in the development of risk profiles

  17. Methodology and applications for organizational safety culture

    International Nuclear Information System (INIS)

    Sakaue, Takeharu; Makino, Maomi

    2004-01-01

    The mission of our activity is to make the 'guidance of safety culture for understanding and evaluation' much more useful and substantial by clarifying the positioning of safety culture within the evaluation of quality management. This was pointed out last year in the 'Discussion on how to implement safety culture sufficiently and possible recommendation', prompted by the falsification issue at TEPCO (Tokyo Electric Power Company). We have been developing a safety culture evaluation structured around three elements: one is the safety culture evaluation support tool (SCET), another is the organizational reliability model (ORM), and the third is the system for safety. This paper mainly describes the organizational reliability model (ORM) and its applications, as well as the positioning of the system for safety culture within quality management. (author)

  18. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  19. Soils Activity Mobility Study: Methodology and Application

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2014-09-29

    and labor- and data-intensive methods. For the watersheds analyzed in this report using the Level 1 PSIAC method, the risk of erosion is low. The field reconnaissance surveys of these watersheds confirm the conclusion that the sediment yield of undisturbed areas at the NNSS would be low. The climate, geology, soils, ground cover, land use, and runoff potential are similar among these watersheds. There are no well-defined ephemeral channels except at the Smoky and Plutonium Valley sites. Topography seems to have the strongest influence on sediment yields, as sediment yields are higher on the steeper hill slopes. Lack of measured sediment yield data at the NNSS does not allow for a direct evaluation of the yield estimates by the PSIAC method. Level 2 MUSLE estimates in all the analyzed watersheds except Shasta are a small percentage of the estimates from PSIAC because MUSLE is not inclusive of channel erosion. This indicates that channel erosion dominates the total sediment yield in these watersheds. Annual sediment yields for these watersheds are estimated using the CHAN-SEDI and CHAN-SEDII channel sediment transport models. Both transport models give similar results and exceed the estimates obtained from PSIAC and MUSLE. It is recommended that the total watershed sediment yield of watersheds at the NNSS with flow channels be obtained by adding the washload estimate (rill and inter-rill erosion) from MUSLE to that obtained from channel transport models (bed load and suspended sediment). PSIAC will give comparable results if factor scores for channel erosion are revised towards the high erosion level. Application of the Level 3 process-based models to estimate sediment yields at the NNSS cannot be recommended at this time. Increased model complexity alone will not improve the certainty of the sediment yield estimates. Models must be calibrated against measured data before model results are accepted as certain. Because no measurements of sediment yields at the NNSS are
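
    The recommended combination, washload from MUSLE plus channel transport from a channel model, can be illustrated with the standard MUSLE form in metric units and an assumed channel-load figure. Every number below is a placeholder; the NNSS watershed parameters and the CHAN-SEDI/CHAN-SEDII models are not reproduced here.

        # Illustrative event sediment yield: MUSLE washload + channel transport.
        # MUSLE (Williams): Y = 11.8 * (Q * q_p)**0.56 * K * LS * C * P,
        # with Q in m^3, q_p in m^3/s and Y in metric tonnes (placeholder inputs).
        Q_runoff = 5.0e3     # event runoff volume, m^3
        q_peak   = 1.2       # peak runoff rate, m^3/s
        K, LS, C, P = 0.30, 1.5, 0.20, 1.0   # erodibility, slope-length, cover, practice factors

        washload_t = 11.8 * (Q_runoff * q_peak) ** 0.56 * K * LS * C * P

        # Channel (bed load + suspended sediment) contribution from a transport
        # model, represented here simply by an assumed number.
        channel_t = 4.0      # tonnes, stand-in for a CHAN-SED-type estimate

        total_t = washload_t + channel_t
        print(f"Washload {washload_t:.1f} t + channel {channel_t:.1f} t = {total_t:.1f} t")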

  20. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Weilin Zang (Inspecta Technology AB, Stockholm (SE))

    2007-11-15

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiler Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional

  1. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    International Nuclear Information System (INIS)

    Dillstroem, Peter; Weilin Zang

    2007-11-01

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented in the software ProLBB. The main conclusions from the study presented in this report are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiling Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as a function of leak flow rate. However, when evaluated as a function of the fraction of crack length around the circumference, the larger diameter pipes belong to those with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product of the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as a function of the fraction of crack length around the circumference. This is mainly due to a small leak probability, which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also, to check how off-centre cracks influence crack opening areas (COA), new form factor solutions for the COA were developed, taking plastic deformation into account. - The influence of an off-centre crack position on the conditional

  2. Stable isotope methodology and its application to nutrition and gastroenterology

    International Nuclear Information System (INIS)

    Klein, P.D.; Hachey, D.L.; Wong, W.W.; Abrams, S.A.

    1993-01-01

    This report describes the activities of the Stable Isotope Laboratory in its function as a core resource facility for stable isotope applications in human nutrition research. Three aspects are covered: Training of visitors, assessment of new instrumentation, and development of new methodology. The research achievements of the laboratory are indicated in the publications that appeared during this period. (author). 23 refs

  3. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
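
    For orientation, the following is a minimal sketch of basic SSA (embedding, SVD, grouping, diagonal averaging) applied to a synthetic series. It is not the authors' code; the window length, number of components and test series are arbitrary illustrative choices.

    ```python
    import numpy as np

    def ssa_reconstruct(x, window, n_components):
        """Basic SSA: embed, decompose via SVD, and reconstruct a smoothed series
        from the leading components (diagonal averaging / Hankelization)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        k = n - window + 1
        # 1) Embedding: window x k trajectory (Hankel) matrix
        X = np.column_stack([x[i:i + window] for i in range(k)])
        # 2) Decomposition
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # 3) Grouping: keep the leading components
        Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components, :]
        # 4) Diagonal averaging back to a series
        recon = np.zeros(n)
        counts = np.zeros(n)
        for i in range(window):
            for j in range(k):
                recon[i + j] += Xr[i, j]
                counts[i + j] += 1.0
        return recon / counts

    if __name__ == "__main__":
        t = np.arange(200)
        series = 0.05 * t + np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, t.size)
        trend_plus_cycle = ssa_reconstruct(series, window=24, n_components=3)
        print(trend_plus_cycle[:5])
    ```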

  4. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  5. Interfacing system LOCA risk assessment: Methodology and application

    International Nuclear Information System (INIS)

    Galyean, W.J.; Schroeher, J.A.; Hanson, D.J.

    1991-01-01

    The United States Nuclear Regulatory Commission (NRC) is sponsoring a research program to develop an improved understanding of the human factors, hardware, and accident consequence issues that dominate the risk from an Interfacing Systems Loss-of-Coolant Accident (ISLOCA) at a nuclear power plant. To accomplish this program, a methodology has been developed for estimating the core damage frequency and risk associated with an ISLOCA. The steps of the methodology are described, with emphasis on one unique step: estimation of the probability of rupture of the low-pressure systems. A trial application of the methodology was made for a Pressurized Water Reactor (PWR). The results are believed to be plant-specific and indicate that human errors during startup and shutdown could be significant contributors to ISLOCA risk at the plant evaluated. 10 refs

  6. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    International Nuclear Information System (INIS)

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with the results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)
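
    The toy sketch below illustrates the general idea of combining a probability density (variability) with a fuzzy number evaluated at alpha-cuts (ignorance). The "source times transfer factor" model, the distributions and the exceedance limit are all hypothetical and are not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sketch (not the authors' code): variability represented by a pdf,
    # ignorance by a triangular fuzzy number evaluated at alpha-cuts.

    def alpha_cut(low, mode, high, alpha):
        """Interval of a triangular fuzzy number at membership level alpha."""
        return low + alpha * (mode - low), high - alpha * (high - mode)

    n_samples = 20_000
    source = rng.lognormal(mean=0.0, sigma=0.5, size=n_samples)   # variability (pdf)

    for alpha in (0.0, 0.5, 1.0):
        t_lo, t_hi = alpha_cut(0.1, 0.3, 0.9, alpha)              # ignorance (fuzzy)
        # The toy 'release = source * transfer' model is monotone in the transfer
        # factor, so the extremes over the alpha-cut occur at its endpoints.
        p_lo = np.mean(source * t_lo > 0.5)
        p_hi = np.mean(source * t_hi > 0.5)
        print(f"alpha={alpha:3.1f}: exceedance probability in "
              f"[{min(p_lo, p_hi):.3f}, {max(p_lo, p_hi):.3f}]")
    ```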

  7. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
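
    A minimal sketch of the screening step (step 2) is given below, using simple one-at-a-time perturbations around random base points. The toy model, input ranges and sample sizes are invented for illustration; this is not the PSUADE implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Screening sketch: rank inputs by the mean absolute output change produced by
    # one-at-a-time perturbations. The model stands in for a real multi-physics code.

    def model(x):
        # toy response: strongly driven by x0 and x2, weakly by x1, not at all by x3
        return 3.0 * x[0] + 0.1 * x[1] + 2.0 * x[2] ** 2 + 0.0 * x[3]

    ranges = np.array([[0, 1], [0, 1], [0, 1], [0, 1]], dtype=float)  # step 1: input ranges
    n_inputs, n_repeats, delta = len(ranges), 50, 0.1

    effects = np.zeros(n_inputs)
    for _ in range(n_repeats):
        base = ranges[:, 0] + rng.random(n_inputs) * (ranges[:, 1] - ranges[:, 0])
        y0 = model(base)
        for i in range(n_inputs):
            perturbed = base.copy()
            perturbed[i] = min(perturbed[i] + delta * (ranges[i, 1] - ranges[i, 0]), ranges[i, 1])
            effects[i] += abs(model(perturbed) - y0)
    effects /= n_repeats

    ranking = np.argsort(effects)[::-1]
    print("screening ranking (most to least influential input):", ranking, effects[ranking])
    ```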

  8. Training in radionuclide methodology and applications in biomedical area

    International Nuclear Information System (INIS)

    Signoretta, C.

    1998-01-01

    Full text: Training in radionuclide methodology and applications in the biomedical area is important to ensure that radionuclides are used properly, without risk to patients or to the technicians manipulating them. Since its creation, the National Atomic Energy Commission (CNEA) has given training courses at different technical levels to those working in science and technology. The Course on Radionuclide Methodology and Applications is the most continuous, varied and requested within CNEA. It is a basic course aimed mainly at graduates in biochemistry and medicine. Its goal is to provide both theoretical and practical knowledge for the use and application of radionuclides, bearing in mind radiological safety regulations. Teaching is carried out by personnel from CNEA and the Nuclear Regulatory Authority (ARN). In addition, a course for Technicians in Nuclear Medicine is given, supplying knowledge in this field as well as the expertise and practice needed to assist the responsible medical doctor. These courses comprise radionuclide methodology, anatomy, physiology, instrumentation and practical applications in Nuclear Medicine. Statistics concerning these courses are given. (author) [es

  9. Applications of mixed-methods methodology in clinical pharmacy research.

    Science.gov (United States)

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use: It is best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies have been presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  10. Probabilistic fracture mechanics applied for lbb case study: international benchmark

    International Nuclear Information System (INIS)

    Radu, V.

    2015-01-01

    An application of probabilistic fracture mechanics to evaluate the structural integrity of a case study chosen from the experimental mock-ups of the FP7 STYLE project is described. The reliability model for probabilistic structural integrity, focused on the assessment of a through-wall crack (TWC) in a pipe weld under complex loading (bending moment and residual stress), has been set up. The basic model is the fracture model for a through-wall cracked pipe under elastic-plastic conditions. The corresponding structural reliability approach is developed for the probabilities of failure associated with the maximum load for crack initiation and net-section collapse, as well as the evaluation of instability loads. The probabilities of failure for a through-wall crack in a pipe subject to pure bending are evaluated using crude Monte Carlo simulations. The results from the international benchmark are presented for this case in the context of ageing and lifetime management of pressure boundary/pressure circuit components. (authors)
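
    The sketch below shows a crude Monte Carlo estimate in the same spirit, using a commonly quoted net-section collapse limit moment for a circumferential through-wall crack under pure bending. It is not the benchmark model; all dimensions and distributions are hypothetical placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Crude Monte Carlo sketch: failure probability of a through-wall circumferentially
    # cracked pipe under pure bending, using a commonly quoted net-section collapse
    # limit moment. Inputs are invented for illustration.

    n = 200_000
    r_mean, t_wall = 0.35, 0.030                                  # mean radius and wall thickness, m
    theta = rng.normal(np.radians(50.0), np.radians(5.0), n)      # half crack angle, rad
    sigma_f = rng.normal(300.0e6, 20.0e6, n)                      # flow stress, Pa
    m_applied = rng.lognormal(np.log(1.4e6), 0.20, n)             # applied bending moment, N*m

    # Net-section collapse limit moment for a through-wall crack under pure bending:
    # M_L = 4 * sigma_f * Rm^2 * t * (cos(theta/2) - 0.5 * sin(theta))
    m_limit = 4.0 * sigma_f * r_mean**2 * t_wall * (np.cos(theta / 2.0) - 0.5 * np.sin(theta))

    p_failure = np.mean(m_applied > m_limit)
    print(f"estimated failure probability: {p_failure:.2e} ({n} samples)")
    ```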

  11. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  12. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) for REEs. Some practical applications are presented, and the suitability of PIXE analysis for high-purity REE chemicals is discussed. (author)

  13. Application of Bow-tie methodology to improve patient safety.

    Science.gov (United States)

    Abdi, Zhaleh; Ravaghi, Hamid; Abbasi, Mohsen; Delgoshaei, Bahram; Esfandiari, Somayeh

    2016-05-09

    Purpose - The purpose of this paper is to apply Bow-tie methodology, a proactive risk assessment technique based on systemic approach, for prospective analysis of the risks threatening patient safety in intensive care unit (ICU). Design/methodology/approach - Bow-tie methodology was used to manage clinical risks threatening patient safety by a multidisciplinary team in the ICU. The Bow-tie analysis was conducted on incidents related to high-alert medications, ventilator associated pneumonia, catheter-related blood stream infection, urinary tract infection, and unwanted extubation. Findings - In total, 48 potential adverse events were analysed. The causal factors were identified and classified into relevant categories. The number and effectiveness of existing preventive and protective barriers were examined for each potential adverse event. The adverse events were evaluated according to the risk criteria and a set of interventions were proposed with the aim of improving the existing barriers or implementing new barriers. A number of recommendations were implemented in the ICU, while considering their feasibility. Originality/value - The application of Bow-tie methodology led to practical recommendations to eliminate or control the hazards identified. It also contributed to better understanding of hazard prevention and protection required for safe operations in clinical settings.

  14. Risk-Informed Assessment Methodology Development and Application

    International Nuclear Information System (INIS)

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  15. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However, in some applications enhanced performance is sought at the low end of the range; therefore, expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which is broadly applicable to transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
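
    To illustrate the distinction between percent-of-full-scale and percent-of-reading criteria (and not the paper's statistical methodology), the sketch below fits a linear calibration curve with ordinary least squares and with weights proportional to 1/reading^2, which penalizes relative error and therefore favours low-range performance. The data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic calibration data: applied loads and corresponding instrument readings.
    true_load = np.linspace(5.0, 100.0, 40)                 # applied loads (arbitrary units)
    reading = 1.02 * true_load + 0.5 + rng.normal(0, 0.2, true_load.size)

    A = np.column_stack([reading, np.ones_like(reading)])   # model: load ~ a*reading + b

    # Ordinary least squares (uniform weighting, "percent of full-scale" flavour)
    coef_ols, *_ = np.linalg.lstsq(A, true_load, rcond=None)

    # Weighted least squares, weights ~ 1/reading^2 (relative-error criterion)
    w = 1.0 / reading**2
    Aw = A * np.sqrt(w)[:, None]
    coef_wls, *_ = np.linalg.lstsq(Aw, true_load * np.sqrt(w), rcond=None)

    for name, c in (("OLS", coef_ols), ("WLS (1/y^2)", coef_wls)):
        pred = A @ c
        pct_of_reading = 100.0 * np.abs(pred - true_load) / true_load
        print(f"{name:12s} max %-of-reading error: {pct_of_reading.max():.2f}%")
    ```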

  16. Reversible logic synthesis methodologies with application to quantum computing

    CERN Document Server

    Taha, Saleem Mohammed Ridha

    2016-01-01

    This book opens the door to a new, interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top world universities, companies and government institutions are in a race to develop new methodologies, algorithms and circuits for reversible logic, quantum logic, reversible and quantum computing and nano-technologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single work, together with some new proposals. Also, sequential reversible logic circuitry is discussed for the first time in a book. Reversible logic plays an important role in quantum computing. Any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing. A new implementation of wavelet and multiwavelet transforms using quantum computing is performed for this purpose. Rese...

  17. A gamma heating calculation methodology for research reactor application

    International Nuclear Information System (INIS)

    Lee, Y.K.; David, J.C.; Carcreff, H.

    2001-01-01

    Gamma heating is an important issue in research reactor operation and fuel safety. Heat deposition in irradiation targets and temperature distribution in irradiation facility should be determined so as to obtain the optimal irradiation conditions. This paper presents a recently developed gamma heating calculation methodology and its application on the research reactors. Based on the TRIPOLI-4 Monte Carlo code under the continuous-energy option, this new calculation methodology was validated against calorimetric measurements realized within a large ex-core irradiation facility of the 70 MWth OSIRIS materials testing reactor (MTR). The contributions from prompt fission neutrons, prompt fission γ-rays, capture γ-rays and inelastic γ-rays to heat deposition were evaluated by a coupled (n, γ) transport calculation. The fission product decay γ-rays were also considered but the activation γ-rays were neglected in this study. (author)

  18. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  19. Fast underdetermined BSS architecture design methodology for real time applications.

    Science.gov (United States)

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of greater interest in real-time applications. The DHT architecture is implemented with a sub-matrix multiplication method to compute an M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and a state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out in UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
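
    For readers who want a software reference for the transform being accelerated, the sketch below computes a discrete Hilbert transform via the FFT-based analytic-signal construction. This is a floating-point reference model only, not the sub-matrix multiplication architecture of the paper; it can be useful for checking fixed-point hardware results.

    ```python
    import numpy as np

    def discrete_hilbert(x):
        """Return the DHT of a real sequence via the analytic-signal construction."""
        x = np.asarray(x, dtype=float)
        n = x.size
        Xf = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        if n % 2 == 0:
            h[n // 2] = 1.0
            h[1:n // 2] = 2.0
        else:
            h[1:(n + 1) // 2] = 2.0
        analytic = np.fft.ifft(Xf * h)
        return np.imag(analytic)

    if __name__ == "__main__":
        t = np.arange(256) / 256.0
        x = np.cos(2 * np.pi * 8 * t)
        # The Hilbert transform of cos is sin, so this check should print True.
        print(np.allclose(discrete_hilbert(x), np.sin(2 * np.pi * 8 * t), atol=1e-10))
    ```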

  20. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid-operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs to design engineers who otherwise depend on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications. All of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method for verifying this design using theoretical relationships, which enables optimal SOV design from the point of view of the safety factor on the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe this verification process is useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.

  1. [Nursing methodology applicated in patients with pressure ulcers. Clinical report].

    Science.gov (United States)

    Galvez Romero, Carmen

    2014-05-01

    The application of functional patterns lets us make a systematic and deliberate nursing assessment, with which we obtain a large amount of relevant patient data in an organized way, making it easier to analyze. In our case, we used Marjory Gordon's functional health patterns and the NANDA (North American Nursing Diagnosis Association), NOC (Nursing Outcomes Classification) and NIC (Nursing Interventions Classification) taxonomy. The overall objective of this paper is to present the experience of implementing and developing nursing methodology in the care of patients with pressure ulcers. This article reports the case of a 52-year-old female who presented necrosis of the phalanges of the upper and lower limbs and underwent amputations after being hospitalized in an Intensive Care Unit. She was discharged with pressure ulcers on both heels. GENERAL ASSESSMENT: The nursing theory known as "Gordon's functional health patterns" was implemented and the affected patterns were identified. The Second Pattern (Nutritional-Metabolic) was taken as the reference, since this was the pattern that altered the rest. EVOLUTION OF THE PATIENT: The patient had a favourable evolution, with improvement in all the altered patterns. The infection symptoms disappeared and the pressure ulcers on both heels healed completely. The application of nursing methodology to the care of patients with pressure ulcers, using clinical practice guidelines, standardized procedures and assessment rating scales, improves the evaluation of results and the performance of nurses.

  2. Exploring Methodologies and Indicators for Cross-disciplinary Applications

    Science.gov (United States)

    Bernknopf, R.; Pearlman, J.

    2015-12-01

    Assessing the impact and benefit of geospatial information is a multidisciplinary task that involves the social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple the social sciences including economics, psychology, sociology that incorporate geospatial information. Benefit - cost analysis is an empirical approach that uses money as an indicator for decision making. It is a traditional base for a use case and has been applied to geospatial information and other areas. A new use case that applies indicators is Meta Regression analysis, which is used to evaluate transfers of socioeconomic benefits from different geographic regions into a unifying statistical approach. In this technique, qualitative and quantitative variables are indicators, which provide a weighted average of value for the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can be applied to a specific region. A third use case is the application of Decision Support Systems and Tools that have been used for forecasting agricultural prices and analysis of hazard policies. However, new methods for integrating these disciplines into use cases, an avenue to instruct the development of operational applications of geospatial information, are needed. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed and, especially, applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader applications of use cases to teach the methodologies and use of indicators that have applications across fields of interest.

  3. Application of decision-making methodology to certificate-of-need applications for CT scanners

    International Nuclear Information System (INIS)

    Gottinger, H.W.; Shapiro, P.

    1985-01-01

    This paper describes a case study and application of decision-making methodology to two competing Certificate of Need (CON) applications for CT body scanners. We demonstrate the use of decision-making methodology by evaluating the CON applications. Explicit value judgements reflecting the monetary equivalent of the different categories of benefit are introduced to facilitate this comparison. The difference between the benefits (measured in monetary terms) and costs is called the net social value. Any alternative with positive net social value is judged economically justifiable, and the alternative with the greatest net social value is judged the most attractive. (orig.)

  4. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks, both under normal operating conditions and after an abnormal event. Additionally, the analysis of various historical accidents has found that the human component was a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in the models. By the mid-1990s, what are considered the second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). Its application to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequences of the probabilistic safety analysis and, for this event, the evaluation of dependencies between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. The derivation of the new human error probabilities therefore involves quantifying the nominal scenario and the significant-deviation cases, considering their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the specific factors with the highest contribution to the human error probabilities. (Author)

  5. Methodology for neural networks prototyping. Application to traffic control

    Energy Technology Data Exchange (ETDEWEB)

    Belegan, I.C.

    1998-07-01

    The work described in this report was carried out in the context of the European project ASTORIA (Advanced Simulation Toolbox for Real-World Industrial Application in Passenger Management and Adaptive Control), and concerns the development of an advanced toolbox for complex transportation systems. Our work was focused on the methodology for prototyping a set of neural networks corresponding to specific strategies for traffic control and congestion management. The tool used for prototyping is SNNS (Stuttgart Neural Network Simulator), developed at the University of Stuttgart, Institute for Parallel and Distributed High Performance Systems, and the real data from the field were provided by ZELT. This report is structured into six parts. The introduction gives some insight into traffic control and its approaches. The second chapter discusses the various existing control strategies. The third chapter is an introduction to the field of neural networks. The data analysis and pre-processing are described in the fourth chapter. In the fifth chapter, the methodology for prototyping the neural networks is presented. Finally, conclusions and further work are presented. (author) 14 refs.

  6. Application of theoretical and methodological components of nursing care

    Directory of Open Access Journals (Sweden)

    Rosa del Socorro Morales-Aguilar

    2016-12-01

    Full Text Available Introduction: the theoretical and methodological components constitute nursing's own body of expertise; they refer to models, theories, the care process, the taxonomy of nursing diagnoses, the nursing interventions classification system, and the outcomes classification system, which ground nursing care in professional practice. Methodology: a search was performed on Google Scholar and in the Scielo, Ciberindex, Index Enfermería, Dialnet, Redalyc and Medline databases, identifying 70 articles published between 2005 and 2015 and selecting 52 of them. The keywords used were: nursing care, nursing diagnosis, classification, nursing theory, in Spanish and Portuguese. Results: students in training receive knowledge of the nursing process, NANDA International, the interventions classification, nursing outcomes and theoretical components. The theories of Dorothea Orem, Callista Roy, Nola Pender, Virginia Henderson, Florence Nightingale, and Betty Neuman are applied. The application of the nursing process is limited, and low familiarity with the international taxonomy among nursing professionals in the care setting is noted. Conclusions: the challenge for nursing is to continue to consolidate its scientific knowledge and to close the gap between theory and practice.

  7. Analytical group decision making in natural resources: Methodology and application

    Science.gov (United States)

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
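
    A minimal sketch of the analytic hierarchy process step is shown below: priorities are derived from a pairwise comparison matrix via its principal eigenvector and checked with the consistency ratio. The 4x4 judgment matrix is a made-up example, not data from the fire-research workshop.

    ```python
    import numpy as np

    # Pairwise comparison (reciprocal) matrix for four hypothetical alternatives.
    A = np.array([
        [1.0, 3.0, 5.0, 1.0],
        [1/3, 1.0, 3.0, 1/3],
        [1/5, 1/3, 1.0, 1/5],
        [1.0, 3.0, 5.0, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    weights = w / w.sum()                       # priority vector

    n = A.shape[0]
    lambda_max = eigvals[k].real
    ci = (lambda_max - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's tabulated random index
    cr = ci / ri                                # consistency ratio (< 0.1 is acceptable)

    print("priorities:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
    ```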

  8. AN AUTOMATIC AND METHODOLOGICAL APPROACH FOR ACCESSIBLE WEB APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Lourdes Moreno

    2007-06-01

    Full Text Available Semantic Web approaches aim to achieve interoperability and communication among technologies and organizations. Nevertheless, it is sometimes forgotten that the Web must be useful for every user; consequently, it is necessary to include tools and techniques that make the Semantic Web accessible. Accessibility and usability are two closely associated concepts widely used in web application development, but their meanings are different. Usability concerns making use easy, whereas accessibility concerns the possibility of access in the first place. For the former, there are many approaches well proven in real cases. The accessibility field, however, requires deeper research to make access feasible for disabled people, and also for novice non-disabled people, given the cost of automating and maintaining accessible applications. In this paper, we propose an architecture to achieve accessibility in web environments that deals with the WAI accessibility standard and the Universal Design paradigm. This architecture aims to control accessibility across the web application development life-cycle, following a methodology that starts from a semantic conceptual model and relies on description languages and controlled vocabularies.

  9. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  10. VaR Methodology Application for Banking Currency Portfolios

    Directory of Open Access Journals (Sweden)

    Daniel Armeanu

    2007-02-01

    Full Text Available VaR has become the standard measure that financial analysts use to quantify market risk. VaR measures have many applications, such as in risk management, in evaluating the performance of risk takers, and for regulatory requirements, and hence it is very important to develop methodologies that provide accurate estimates. In particular, the Basel Committee on Banking Supervision at the Bank for International Settlements requires financial institutions such as banks and investment firms to meet capital requirements based on VaR estimates. In this paper we determine VaR for a banking currency portfolio while observing the National Bank of Romania's rules on VaR reporting.
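
    As a generic illustration of the kind of computation involved (not the paper's calculation or the National Bank of Romania's reporting rules), the sketch below estimates a one-day 99% VaR for a two-currency position by historical simulation, using simulated stand-in returns.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical exposures and stand-in "historical" daily FX returns.
    exposures = np.array([1_000_000.0, 750_000.0])            # position values in RON
    returns = rng.normal(0.0, [0.006, 0.009], size=(500, 2))  # daily FX return scenarios

    pnl = returns @ exposures                                 # daily portfolio P&L scenarios
    var_99 = -np.percentile(pnl, 1)                           # 99% one-day VaR, reported as a positive loss

    print(f"99% 1-day historical-simulation VaR: {var_99:,.0f} RON")
    ```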

  11. Application of the leak-before-break concept to the primary circuit piping of the Leningrad NPP

    Energy Technology Data Exchange (ETDEWEB)

    Eperin, A.P.; Zakharzhevsky, Yu.O.; Arzhaev, A.I. [and others]

    1997-04-01

    A two-year Finnish-Russian cooperation program was initiated in 1995 to demonstrate the applicability of the leak-before-break concept (LBB) to the primary circuit piping of the Leningrad NPP. The program includes J-R curve testing of authentic pipe materials at full operating temperature, screening and computational LBB analyses complying with the USNRC Standard Review Plan 3.6.3, and exchange of LBB-related information with emphasis on NDE. Domestic computer codes are mainly used, and all tests and analyses are independently carried out by each party. The results are believed to apply generally to RBMK type plants of the first generation.

  12. Applications of a constrained mechanics methodology in economics

    Science.gov (United States)

    Janová, Jitka

    2011-11-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the undergraduate level and (ii) to enable the students to gain a deeper understanding of the principles and methods routinely used in mechanics by looking at the well-known methodology from the different perspective of economics. Two constrained dynamic economic problems are presented using the economic terminology in an intuitive way. First, the Phillips model of the business cycle is presented as a system of forced oscillations and the general problem of two interacting economies is solved by the nonholonomic dynamics approach. Second, the Cass-Koopmans-Ramsey model of economical growth is solved as a variational problem with a velocity-dependent constraint using the vakonomic approach. The specifics of the solution interpretation in economics compared to mechanics is discussed in detail, a discussion of the nonholonomic and vakonomic approaches to constrained problems in mechanics and economics is provided and an economic interpretation of the Lagrange multipliers (possibly surprising for the students of physics) is carefully explained. This paper can be used by the undergraduate students of physics interested in interdisciplinary physics applications to gain an understanding of the current scientific approach to economics based on a physical background, or by university teachers as an attractive supplement to classical mechanics lessons.
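
    As a purely illustrative companion to the "forced oscillation" reading of the Phillips model mentioned above, the sketch below integrates a damped oscillator with a periodic forcing term. The coefficients are arbitrary stand-ins, not the calibrated economic parameters of the paper, and SciPy is assumed to be available.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy forced, damped oscillator: output deviation y driven by a periodic term
    # (e.g. a cyclic expenditure component). Coefficients are illustrative only.
    omega, zeta, f_amp, f_freq = 1.0, 0.15, 0.5, 0.8

    def rhs(t, state):
        y, v = state
        return [v, -2.0 * zeta * omega * v - omega**2 * y + f_amp * np.cos(f_freq * t)]

    sol = solve_ivp(rhs, (0.0, 60.0), [0.2, 0.0], max_step=0.05)
    print("output deviation at t=60:", round(sol.y[0, -1], 4))
    ```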

  13. Applications of a constrained mechanics methodology in economics

    International Nuclear Information System (INIS)

    Janova, Jitka

    2011-01-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the undergraduate level and (ii) to enable the students to gain a deeper understanding of the principles and methods routinely used in mechanics by looking at the well-known methodology from the different perspective of economics. Two constrained dynamic economic problems are presented using the economic terminology in an intuitive way. First, the Phillips model of the business cycle is presented as a system of forced oscillations and the general problem of two interacting economies is solved by the nonholonomic dynamics approach. Second, the Cass-Koopmans-Ramsey model of economical growth is solved as a variational problem with a velocity-dependent constraint using the vakonomic approach. The specifics of the solution interpretation in economics compared to mechanics is discussed in detail, a discussion of the nonholonomic and vakonomic approaches to constrained problems in mechanics and economics is provided and an economic interpretation of the Lagrange multipliers (possibly surprising for the students of physics) is carefully explained. This paper can be used by the undergraduate students of physics interested in interdisciplinary physics applications to gain an understanding of the current scientific approach to economics based on a physical background, or by university teachers as an attractive supplement to classical mechanics lessons.

  14. Applications of a constrained mechanics methodology in economics

    Energy Technology Data Exchange (ETDEWEB)

    Janova, Jitka, E-mail: janova@mendelu.cz [Department of Theoretical Physics and Astrophysics, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemedelska 1, 613 00 Brno (Czech Republic)

    2011-11-15

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the undergraduate level and (ii) to enable the students to gain a deeper understanding of the principles and methods routinely used in mechanics by looking at the well-known methodology from the different perspective of economics. Two constrained dynamic economic problems are presented using the economic terminology in an intuitive way. First, the Phillips model of the business cycle is presented as a system of forced oscillations and the general problem of two interacting economies is solved by the nonholonomic dynamics approach. Second, the Cass-Koopmans-Ramsey model of economical growth is solved as a variational problem with a velocity-dependent constraint using the vakonomic approach. The specifics of the solution interpretation in economics compared to mechanics is discussed in detail, a discussion of the nonholonomic and vakonomic approaches to constrained problems in mechanics and economics is provided and an economic interpretation of the Lagrange multipliers (possibly surprising for the students of physics) is carefully explained. This paper can be used by the undergraduate students of physics interested in interdisciplinary physics applications to gain an understanding of the current scientific approach to economics based on a physical background, or by university teachers as an attractive supplement to classical mechanics lessons.

  15. Complex basis functions for molecular resonances: Methodology and applications

    Science.gov (United States)

    White, Alec; McCurdy, C. William; Head-Gordon, Martin

    The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions is one way that we may rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area including the methodological extension from single determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation of motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.

  16. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world introduced new methodological principles to city’s spatial development models. These methodologies and spatial planning guidelines are focused not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city by defining main development criteria and estimating possible consequences. Vilnius city is no exception, however the re-establishment of independence of Lithuania caused uncontrolled urbanization process, so most of the city development regulations emerged as a consequence of unmanaged processes of investors’ expectations legalization. The importance of consistent urban fabric as well as conservation and representation of city’s most important objects gained attention only when an actual threat of overshadowing them with new architecture along with unmanaged urbanization in the city center or urban sprawl at suburbia, caused by land-use projects, had emerged. Current Vilnius’ spatial planning documents clearly define urban structure and key development principles, however the definitions are relatively abstract, causing uniform building coverage requirements for territories with distinct qualities and simplifying planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  17. Drug-targeting methodologies with applications: A review

    Science.gov (United States)

    Kleinstreuer, Clement; Feng, Yu; Childress, Emily

    2014-01-01

    Targeted drug delivery to solid tumors is a very active research area, focusing mainly on improved drug formulation and associated best delivery methods/devices. Drug-targeting has the potential to greatly improve drug-delivery efficacy, reduce side effects, and lower the treatment costs. However, the vast majority of drug-targeting studies assume that the drug-particles are already at the target site or at least in its direct vicinity. In this review, drug-delivery methodologies, drug types and drug-delivery devices are discussed with examples in two major application areas: (1) inhaled drug-aerosol delivery into human lung-airways; and (2) intravascular drug-delivery for solid tumor targeting. The major problem addressed is how to efficiently deliver the drug-particles from the entry/infusion point to the target site. So far, most experimental results are based on animal studies. Concerning pulmonary drug delivery, the focus is on the pros and cons of three inhaler types, i.e., pressurized metered dose inhaler, dry powder inhaler and nebulizer, in addition to drug-aerosol formulations. Computational fluid-particle dynamics techniques and the underlying methodology for a smart inhaler system are discussed as well. Concerning intravascular drug-delivery for solid tumor targeting, passive and active targeting are reviewed as well as direct drug-targeting, using optimal delivery of radioactive microspheres to liver tumors as an example. The review concludes with suggestions for future work, considering both pulmonary drug targeting and direct drug delivery to solid tumors in the vascular system. PMID:25516850

  18. Application of Agent Methodology in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Reem Abdalla

    2017-02-01

    Full Text Available This paper presents a case study to describe the features and phases of two agent methodologies: Gaia, a methodology for agent-oriented analysis and design, and Tropos, a detailed agent-oriented software engineering methodology. The case study explores each methodology's ability to provide solutions to small problems, and we also attempt to discover whether each methodology is in fact understandable and usable. In addition, we collected and noted the advantages and weaknesses of these methodologies, and the relationships among their models, during the analysis of each. The Guardian Angel: Patient-Centered Health Information System (GA:PCHIS), a personal system that helps track, manage, and interpret the subject's health history and gives advice to both patient and provider, is used as the case study throughout the paper.

  19. FPGA Design Methodologies Applicable to Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kwong, Yongil; Jeong, Choongheui

    2013-01-01

    In order to solve the above problem, NPPs in some countries such as the US, Canada and Japan have already applied FPGA-based equipment, which has the following advantages: it is easier to verify its performance because, compared to microprocessor-based equipment, it needs only HDL code to configure logic circuits, without other software; it is much cheaper than an ASIC in small quantities; its logic circuits are reconfigurable; it has enough resources, such as logic blocks and memory blocks, to implement I and C functions; multiple functions can be implemented in a single FPGA chip; it is stronger with respect to cyber security than microprocessor-based equipment because its configuration cannot be changed by external access; it is simple to replace with a new one when it becomes obsolete; and its power consumption is lower. However, FPGA-based equipment does not have only merits; there are some issues with its application to NPPs. First of all, experience in applying it to NPPs is much more limited than in other industries, and international standards or guidelines are also very few. In addition, there are only a small number of FPGA platforms for I and C systems. Finally, specific guidelines on FPGA design are required because the design has both hardware and software characteristics. In order to handle the above issues, KINS (Korea Institute of Nuclear Safety) built a test platform last year and has developed regulatory guidelines for FPGA application in NPPs. I and C systems of NPPs have increasingly been using FPGA-based equipment as an alternative to microprocessor-based equipment, which is not simple to evaluate for safety due to its complexity. This paper explains the FPGA design flow and design guidelines. These methodologies can be used as guidelines for FPGA verification for the safety of I and C systems

  20. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    Full Text Available The goal of this work is to present the application of system dynamics and system thinking, as well as the advantages and possible defects of this analytic approach, in order to improve the analysis of complex systems such as population and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analysis practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through a number of combinations that must be continually analyzed in order to understand and consequently provide adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic cause patterns and key inter-relations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.

  1. Artificial Intelligence Methodologies and Their Application to Diabetes.

    Science.gov (United States)

    Rigla, Mercedes; García-Sáez, Gema; Pons, Belén; Hernando, Maria Elena

    2018-03-01

    In the past decade diabetes management has been transformed by the addition of continuous glucose monitoring and insulin pump data. More recently, a wide variety of functions and physiologic variables, such as heart rate, hours of sleep, number of steps walked and movement, have become available through wristbands or watches. New data, such as hydration, geolocation, and barometric pressure, among others, will be incorporated in the future. All these parameters, when analyzed, can be helpful for patients' and doctors' decision support. Similar new scenarios have appeared in most medical fields, in such a way that in recent years there has been an increased interest in the development and application of the methods of artificial intelligence (AI) to decision support and knowledge acquisition. Multidisciplinary research teams integrating computer engineers and doctors are more and more frequent, mirroring the need for cooperation in this new topic. AI, as a science, can be defined as the ability to make computers do things that would require intelligence if done by humans. Increasingly, diabetes-related journals have been incorporating publications focused on AI tools applied to diabetes. In summary, diabetes management scenarios have undergone a deep transformation that forces diabetologists to incorporate skills from new areas. This newly needed knowledge includes AI tools, which have become part of diabetes health care. The aim of this article is to explain in an easy and plain way the most commonly used AI methodologies, to promote the involvement of health care providers, doctors and nurses, in this field.

  2. PROLIFERATION RESISTANCE AND PHYSICAL PROTECTION WORKING GROUP: METHODOLOGY AND APPLICATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Bari R. A.; Whitlock, J.; Therios, I.U.; Peterson, P.F.

    2012-11-14

    We summarize the technical progress and accomplishments on the evaluation methodology for proliferation resistance and physical protection (PR and PP) of Generation IV nuclear energy systems. The results of evaluations performed with the methodology are intended for three types of users: system designers, program policy makers, and external stakeholders. The PR and PP Working Group developed the methodology through a series of demonstration and case studies. Over the past few years various national and international groups have applied the methodology to nuclear energy system designs as well as to developing approaches to advanced safeguards.

  3. Proliferation resistance and physical protection working group: methodology and applications

    International Nuclear Information System (INIS)

    Bari, Robert A.; Whitlock, Jeremy J.; Therios, Ike U.; Peterson, P.F.

    2012-01-01

    We summarize the technical progress and accomplishments on the evaluation methodology for proliferation resistance and physical protection (PR and PP) of Generation IV nuclear energy systems. The results of evaluations performed with the methodology are intended for three types of users: system designers, program policy makers, and external stakeholders. The PR and PP Working Group developed the methodology through a series of demonstration and case studies. Over the past few years various national and international groups have applied the methodology to nuclear energy system designs as well as to developing approaches to advanced safeguards.

  4. Additional requirements for leak-before-break application to primary coolant piping in Belgium

    Energy Technology Data Exchange (ETDEWEB)

    Roussel, G. [AIB Vincotte Nuclear, Brussels (Belgium)

    1997-04-01

    Leak-Before-Break (LBB) technology was not applied in the original design of the seven Pressurized Water Reactors the Belgian utility is currently operating. The design basis of these plants required consideration of the dynamic effects associated with the ruptures to be postulated in the high energy piping. The application of LBB technology to the existing plants has recently been approved by the Belgian Safety Authorities, but with a limitation to the primary coolant loop. LBB analysis was initiated for the Doel 3 and Tihange 2 plants to allow the withdrawal of some of the reactor coolant pump snubbers at both plants and to avoid reinstalling some of the restraints after steam generator replacement at Doel 3. LBB analysis was also found beneficial to demonstrate the acceptability of the primary components and piping under the new conditions resulting from power uprating and stretch-out operation. LBB analysis has subsequently been performed on the primary coolant loop of the Tihange 1 plant and is currently being performed for the Doel 4 plant. Application of LBB to the primary coolant loop is based in Belgium on the U.S. Nuclear Regulatory Commission requirements. However, the Belgian Safety Authorities required some additional analyses and put some restrictions on the benefits of the LBB analysis to maintain the global safety of the plant at a sufficient level. This paper develops the main steps of the safety evaluation performed by the Belgian Safety Authorities in accepting the application of LBB technology to existing plants and summarizes the requirements imposed in addition to the U.S. Nuclear Regulatory Commission rules.

  5. Application of systematic review methodology to the field of nutrition.

    Science.gov (United States)

    Lichtenstein, Alice H; Yetley, Elizabeth A; Lau, Joseph

    2008-12-01

    Systematic reviews represent a rigorous and transparent approach to synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas, and formulate scientific consensus statements. The use of systematic reviews for nutrition-related topics is more recent. Systematic reviews provide independently conducted comprehensive and objective assessments of available information addressing precise questions. This approach to summarizing available data is a useful tool for identifying the state of science including knowledge gaps and associated research needs, supporting development of science-based recommendations and guidelines, and serving as the foundation for updates as new data emerge. Our objective is to describe the steps for performing systematic reviews and highlight areas unique to the discipline of nutrition that are important to consider in data assessment. The steps involved in generating systematic reviews include identifying staffing and planning for outside expert input, forming a research team, developing an analytic framework, developing and refining research questions, defining eligibility criteria, identifying search terms, screening abstracts according to eligibility criteria, retrieving articles for evaluation, constructing evidence and summary tables, assessing methodological quality and applicability, and synthesizing results including performing meta-analysis, if appropriate. Unique and at times challenging, nutrition-related considerations include baseline nutrient exposure, nutrient status, bioequivalence of bioactive compounds, bioavailability, multiple and interrelated biological functions, undefined nature of some interventions, and uncertainties in intake assessment. Systematic reviews are a valuable and independent component of decision-making processes by groups responsible for developing science-based recommendations

  6. Systems selection methodology for civil nuclear power applications

    International Nuclear Information System (INIS)

    Scarborough, J.

    1988-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five US Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the US Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria
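
    The Monte Carlo ranking step described in this record lends itself to a compact numerical illustration. The sketch below is a hypothetical toy example, not the report's data or procedure: three plants and three criteria (instead of fifteen), triangular score distributions and equal weights are all assumptions. Each plant's criterion scores are sampled repeatedly, combined into a weighted composite, and the resulting distribution yields a mean score plus the probability of ranking first, which is the "statement of certainty" idea in miniature.

```python
# Minimal sketch of Monte Carlo ranking of candidate plants (hypothetical data).
import numpy as np

rng = np.random.default_rng(42)
n_samples = 50_000

# Hypothetical per-criterion triangular score distributions (low, mode, high)
# for three candidate plants and three criteria; equal criterion weights assumed.
plants = {
    "Plant A": [(6, 7, 9), (5, 8, 9), (4, 6, 8)],
    "Plant B": [(5, 6, 8), (6, 7, 9), (5, 7, 9)],
    "Plant C": [(4, 7, 10), (3, 6, 9), (6, 8, 9)],
}
weights = np.array([1 / 3, 1 / 3, 1 / 3])

# Sample composite (weighted-sum) scores for each plant.
composites = {}
for name, criteria in plants.items():
    samples = np.column_stack(
        [rng.triangular(lo, mode, hi, n_samples) for lo, mode, hi in criteria]
    )
    composites[name] = samples @ weights

scores = np.column_stack(list(composites.values()))
best = scores.argmax(axis=1)  # index of the top-ranked plant in each trial

for i, name in enumerate(composites):
    print(f"{name}: mean composite = {scores[:, i].mean():.2f}, "
          f"P(ranked first) = {(best == i).mean():.2%}")
```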

  7. Application of precursor methodology in initiating frequency estimates

    International Nuclear Information System (INIS)

    Kohut, P.; Fitzpatrick, R.G.

    1991-01-01

    The precursor methodology developed in recent years provides a consistent technique to identify important accident sequence precursors. It relies on operational events (extracting information from actual experience) and infers core damage scenarios based on expected safety system responses. The ranking or categorization of each precursor is determined by considering the full spectrum of potential core damage sequences. The methodology estimates the frequency of severe core damage based on the approach suggested by Apostolakis and Mosleh, which may lead to a potential overestimation of the severe-accident sequence frequency due to the inherent dependencies between the safety systems and the initiating events. The methodology is an encompassing attempt to incorporate most of the operating information available from nuclear power plants and is an attractive tool from the point of view of risk management. In this paper, a further extension of this methodology is discussed with regard to the treatment of initiating frequency of the accident sequences
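
    As a rough numerical illustration of how precursor information can feed a frequency estimate, the sketch below evaluates the simple point estimate lambda ≈ sum(CCDP_i) / T, i.e. the sum of conditional core damage probabilities of observed precursors divided by the observation period. This is only a commonly used simplified form with invented numbers; it does not reproduce the Apostolakis-Mosleh treatment or the dependency issues discussed in the abstract.

```python
# Minimal sketch: precursor-based core damage frequency point estimate (toy numbers).
ccdp = [3.0e-4, 1.2e-5, 8.0e-5, 2.5e-6]  # hypothetical conditional core damage
                                          # probabilities of observed precursors
observation_years = 10.0                  # reactor-years of observed operation

lam = sum(ccdp) / observation_years       # point estimate of the frequency (per ry)
print(f"Estimated core damage frequency: {lam:.2e} per reactor-year")
```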

  8. Systems selection methodology for civil nuclear power applications

    International Nuclear Information System (INIS)

    Scarborough, J.C.

    1987-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five U.S. Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the U.S. Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria. (author)

  9. Methodology for the collection and application of information on food ...

    African Journals Online (AJOL)

    S Blignaut

    ISSN 0378-5254, Journal of Family Ecology and Consumer Sciences, Vol 26, No 2, 1998.

  10. Methodology of Neural Design: Applications in Microwave Engineering

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-06-01

    In this paper, an original methodology for the automatic creation of neural models of microwave structures is proposed and verified. Following the methodology, neural models of prescribed accuracy are built within minimum CPU time. The validity of the proposed methodology is verified by developing neural models of selected microwave structures. The functionality of the neural models is verified in a design task: a neural model is joined with a genetic algorithm to find the global minimum of a formulated objective function. The objective function is minimized using different versions of genetic algorithms and their mutual combinations. The verified methodology for the automated creation of accurate neural models of microwave structures, and their association with global optimization routines, are the most important original features of the paper.
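
    A compact way to picture the "neural model joined with a global optimizer" step is sketched below. Everything in it is an assumption made for illustration: a toy analytic response stands in for the electromagnetic analysis of a microwave structure, scikit-learn's MLPRegressor stands in for the neural model, and SciPy's differential evolution (an evolutionary optimizer related to, but not identical with, a genetic algorithm) performs the global search over the surrogate.

```python
# Minimal sketch: train a neural surrogate of a (toy) structure response, then
# search it with an evolutionary global optimizer.  Not the paper's actual models.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def toy_response(x):
    """Hypothetical stand-in for a full-wave simulation of a microwave structure."""
    w, l = x[..., 0], x[..., 1]
    return np.sin(3 * w) * np.cos(2 * l) + 0.1 * (w - 1.5) ** 2

# Build a training set by "simulating" random design points.
X_train = rng.uniform([0.5, 0.5], [3.0, 3.0], size=(400, 2))
y_train = toy_response(X_train)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
surrogate.fit(X_train, y_train)

# Objective: minimise the surrogate's predicted response over the design space.
def objective(x):
    return float(surrogate.predict(np.asarray(x).reshape(1, -1))[0])

result = differential_evolution(objective, bounds=[(0.5, 3.0), (0.5, 3.0)], seed=0)
print("Optimal design (surrogate):", result.x, "predicted response:", result.fun)
```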

  11. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial work, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required.

  12. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial work, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required.

  13. Application of theoretical and methodological components of nursing care

    OpenAIRE

    Rosa del Socorro Morales-Aguilar; Gloria Elena Lastre-Amell; Alba Cecilia Pardo-Vásquez

    2016-01-01

    Introduction: the theoretical and methodological components are the expertise proper to nursing; they refer to models, theories, the care process, the taxonomy of nursing diagnoses, the nursing intervention classification system, and the outcomes classification system, which ground nursing care in professional practice. Methodology: a search was performed on Google Scholar, reviewing the databases Scielo, Ciberindex, Index Enfermería, Dialnet, Redalyc and Medline, identifying 70 published articles b...

  14. Applicability and methodology of determining sustainable yield in groundwater systems

    Science.gov (United States)

    Kalf, Frans R. P.; Woolley, Donald R.

    2005-03-01

    There is currently a need for a review of the definition and methodology of determining sustainable yield. The reasons are: (1) current definitions and concepts are ambiguous and non-physically based so cannot be used for quantitative application, (2) there is a need to eliminate varying interpretations and misinterpretations and provide a sound basis for application, (3) the notion that all groundwater systems either are or can be made to be sustainable is invalid, (4) often there are an excessive number of factors bound up in the definition that are not easily quantifiable, (5) there is often confusion between production facility optimal yield and basin sustainable yield, (6) in many semi-arid and arid environments groundwater systems cannot be sensibly developed using a sustained yield policy particularly where ecological constraints are applied. Derivation of sustainable yield using conservation of mass principles leads to expressions for basin sustainable, partial (non-sustainable) mining and total (non-sustainable) mining yields that can be readily determined using numerical modelling methods and selected on the basis of applied constraints. For some cases there has to be recognition that the groundwater resource is not renewable and its use cannot therefore be sustainable. In these cases, its destiny should be the best equitable use.

  15. New quickest transient detection methodology. Nuclear engineering applications

    International Nuclear Information System (INIS)

    Wang, Xin; Jevremovic, Tatjana; Tsoukalas, Lefteri H.

    2003-01-01

    A new intelligent systems methodology for quickest online transient detection is presented. Based on information that includes, but is not limited to, statistical features, the energy of frequency components and wavelet coefficients, the new methodology decides whether a transient has emerged. A fuzzy system makes the final decision; its membership functions are obtained by artificial neural networks and adjusted in an online manner. Comparisons are performed with conventional methods for transient detection using simulated and plant data. The proposed methodology could be useful in power plant operations, diagnostic and maintenance activities. It is also considered a design tool for quick design modifications in a virtual design environment aimed at next generation University Research and Training Reactors (URTRs). (The virtual design environment is pursued as part of the Big-10 Consortium under the new Innovations in Nuclear Infrastructure and Education (INIE) program sponsored by the US Department of Energy.) (author)
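
    The flavour of such an online detector can be conveyed with a much simpler sketch than the fuzzy/neural scheme of the paper. Below, two sliding-window features (a mean-shift statistic and a high-frequency energy from first differences) are mapped through hand-tuned ramp membership functions and combined with a fuzzy OR; a transient is declared when the combined degree exceeds a threshold. The simulated signal, window lengths, ramps and thresholds are illustrative assumptions, not the authors' tuned system.

```python
# Minimal sketch of feature-based online transient detection on a simulated signal.
import numpy as np

rng = np.random.default_rng(1)

# Simulated sensor signal: steady noise, then a slow drift starting at t = 600.
n = 1000
signal = 1.0 + 0.02 * rng.standard_normal(n)
signal[600:] += 0.002 * np.arange(n - 600)          # the "transient"

def ramp_membership(value, low, high):
    """Degree in [0, 1] rising linearly from `low` to `high` (illustrative fuzzy set)."""
    return float(np.clip((value - low) / (high - low), 0.0, 1.0))

window, baseline = 50, signal[:200]
mu0, sigma0 = baseline.mean(), baseline.std()

for t in range(window, n):
    w = signal[t - window:t]
    mean_shift = abs(w.mean() - mu0) / sigma0            # feature 1: level change
    hf_energy = np.mean(np.diff(w) ** 2) / sigma0**2     # feature 2: high-freq energy
    degree = max(ramp_membership(mean_shift, 1.0, 4.0),
                 ramp_membership(hf_energy, 3.0, 10.0))  # simple fuzzy OR (max)
    if degree > 0.8:
        print(f"Transient declared at sample {t} (degree {degree:.2f})")
        break
```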

  16. Application of integrated fuzzy VIKOR & AHP methodology to contractor ranking

    Directory of Open Access Journals (Sweden)

    Mohamad Rahim Ramezaniyan

    2012-08-01

    Contractor selection is a critical activity which plays an important role in the overall success of any construction project. The implementation of fuzzy multiple criteria decision analysis (MCDA) in selecting contractors has the advantage of rendering subjective and implicit decision making more objective and transparent. An additional merit of fuzzy MCDA is the ability to accommodate quantitative and qualitative information. In this paper, an integrated VIKOR-AHP methodology is proposed to make a selection among alternative contractors in an Iranian construction industry project. In the proposed methodology, the weights of the selection criteria are determined by fuzzy pairwise comparison matrices of AHP.
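
    A crisp (non-fuzzy) skeleton of the VIKOR-AHP combination is sketched below: criterion weights come from an AHP pairwise comparison matrix via the geometric-mean approximation of the principal eigenvector, and contractors are then ranked with the standard VIKOR S, R and Q measures. The comparison matrix, the contractor scores and the crisp simplification are all assumptions for illustration; the paper itself uses fuzzy pairwise comparisons.

```python
# Minimal crisp sketch of AHP weighting followed by VIKOR ranking (hypothetical data).
import numpy as np

# AHP pairwise comparison of 3 criteria (Saaty scale); geometric-mean weights.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
w = gm / gm.sum()                          # approximate principal-eigenvector weights

# Decision matrix: rows = contractors, columns = (benefit) criteria scores.
X = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.0, 7.0],
              [6.0, 7.0, 9.0]])

f_best, f_worst = X.max(axis=0), X.min(axis=0)
d = w * (f_best - X) / (f_best - f_worst)  # weighted normalised regret terms
S, R = d.sum(axis=1), d.max(axis=1)

v = 0.5                                    # weight of the "group utility" strategy
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

for i, q in enumerate(Q):
    print(f"Contractor {i + 1}: S={S[i]:.3f}, R={R[i]:.3f}, Q={q:.3f}")
print("VIKOR ranking (best first):", list(np.argsort(Q) + 1))
```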

  17. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally carrying out deployment and tuning tasks. In many applications of this type, the generated components coincide from application to application. Current trends in software engineering such as MDE, MDA and MDD aim to automate the generation of applications by structuring a model and applying transformations to it until the application is obtained. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned in this summary.

  18. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    Science.gov (United States)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of a context-rich corpus from policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.

  19. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
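
    The combination of lognormal multiplicative-chain models and Latin hypercube sampling mentioned in this record is easy to illustrate numerically. The sketch below assumes a hypothetical three-factor transfer chain (the factor medians and geometric standard deviations are invented), draws a Latin hypercube sample with SciPy's qmc module, and compares the sampled product distribution with the exact lognormal result implied by independence and lognormality.

```python
# Minimal sketch: Latin hypercube sampling of a multiplicative chain of
# independent lognormal factors (hypothetical transfer-factor values).
import numpy as np
from scipy.stats import lognorm, qmc

# Hypothetical chain: output = source * transfer * uptake, each lognormal.
medians = np.array([1.0e3, 2.0e-2, 5.0e-4])   # geometric means of the factors
gsd = np.array([1.8, 2.5, 1.6])               # geometric standard deviations
mu, sigma = np.log(medians), np.log(gsd)

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=2000)                    # stratified uniforms in [0, 1)^3
factors = lognorm.ppf(u, s=sigma, scale=np.exp(mu))  # invert each marginal
product = factors.prod(axis=1)

# For independent lognormals the product is lognormal with summed parameters.
exact_median = np.exp(mu.sum())
exact_gsd = np.exp(np.sqrt((sigma**2).sum()))
print(f"LHS median of product : {np.median(product):.3e}")
print(f"Exact median          : {exact_median:.3e}")
print(f"Exact GSD of product  : {exact_gsd:.2f}")
```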

  20. Meta-Analytical Studies in Transport Economics. Methodology and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Brons, M.R.E.

    2006-05-18

    Vast increases in the external costs of transport in the late twentieth century have caused national and international governmental bodies to worry about the sustainability of their transport systems. In this thesis we use meta-analysis as a research method to study various topics in transport economics that are relevant for sustainable transport policymaking. Meta-analysis is a research methodology that is based on the quantitative summarisation of a body of previously documented empirical evidence. In several fields of economics, meta-analysis has become a well-accepted research tool. Despite the appeal of the meta-analytical approach, there are methodological difficulties that need to be acknowledged. We study a specific methodological problem which is common in meta-analysis in economics, viz., within-study dependence caused by multiple sampling techniques. By means of Monte Carlo analysis we investigate the effect of such dependence on the performance of various multivariate estimators. In the applied part of the thesis we use and develop meta-analytical techniques to study the empirical variation in indicators of the price sensitivity of demand for aviation transport, the price sensitivity of demand for gasoline, the efficiency of urban public transport and the valuation of the external costs of noise from rail transport. We focus on the estimation of mean values for these indicators and on the identification of the impact of conditioning factors.

  1. Uncertainty analysis for probabilistic steam generators tube rupture in LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Durbec, V.; Pitner, P.; Pages, D. [Electricite de France, 78 - Chatou (France). Research and Development Div.; Riffard, T. [Electricite de France, 69 - Villeurbanne (France). Engineering and Construction Div.; Flesch, B. [Electricite de France, 92 - Paris la Defense (France). Generation and Transmission Div.

    1997-10-01

    Steam Generators (SG) of Pressurized Water Reactors have experienced worldwide various types of tube degradation, mainly from stress corrosion cracking; because of this damage, primary-secondary leakage or tube rupture can occur. Safety against the risk of tube rupture is achieved through a combination of periodic in-service inspections (eddy current testing), surveillance of leaks during operation (leak before break concept) and tube plugging. In order to optimize SG tube bundle maintenance, Electricite de France has developed a specific software named COMPROMIS. The model, based on probabilistic fracture mechanics, makes it possible to quantify the influence of in-service inspections and maintenance work on the risk of a SG Tube Rupture (SGTR), taking all significant parameters into account as random variables (initial defect size distribution, reliability of non-destructive examinations, crack initiation and propagation, critical sizes, leak before risk of break, etc.). This paper focuses on the leak rate calculation module and presents a sensitivity study of the influence of leak before break on the conditional failure probability. (author) 8 refs.
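
    To convey the kind of calculation such a probabilistic fracture mechanics code performs, the sketch below is a deliberately simplified Monte Carlo: cracks are sampled from an assumed initial size distribution, grown with a simple increment, possibly removed by inspection through a notional probability-of-detection curve, and possibly detected by leakage before an assumed critical size is reached. All distributions, thresholds and the detection model are invented for illustration; they are not EDF's COMPROMIS models or data.

```python
# Toy Monte Carlo sketch of how leak detection (leak-before-break) and in-service
# inspection both reduce the probability of a tube rupture.  All numbers invented.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

crack0 = rng.exponential(scale=2.0, size=n)        # assumed initial crack sizes
growth = rng.normal(1.0, 0.3, size=n).clip(0.1)    # assumed growth over one cycle
crack_end = crack0 + growth

critical = 8.0         # assumed critical size at which the tube ruptures
leak_size = 6.0        # assumed size giving a detectable primary-secondary leak
p_leak_detect = 0.9    # assumed chance the leak is detected before rupture

# Notional inspection probability-of-detection (POD): detected tubes are plugged.
pod = 1.0 / (1.0 + np.exp(-(crack0 - 4.0)))
plugged = rng.random(n) < pod

reaches_critical = crack_end >= critical
leak_caught = (crack_end >= leak_size) & (rng.random(n) < p_leak_detect)

p_rupture_inspection_only = (~plugged & reaches_critical).mean()
p_rupture_with_lbb = (~plugged & reaches_critical & ~leak_caught).mean()
print(f"Rupture probability, inspection only      : {p_rupture_inspection_only:.2e}")
print(f"Rupture probability, with leak monitoring : {p_rupture_with_lbb:.2e}")
```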

  2. Effects of weld residual stresses on crack-opening area analysis of pipes for LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Dong, P.; Rahman, S.; Wilkowski, G. [and others

    1997-04-01

    This paper summarizes four different studies undertaken to evaluate the effects of weld residual stresses on the crack-opening behavior of a circumferential through-wall crack in the center of a girth weld. The effect of weld residual stress on the crack-opening-area and leak-rate analyses of a pipe is not well understood. There are no simple analyses to account for these effects, and, therefore, they are frequently neglected. The four studies involved the following efforts: (1) Full-field thermoplastic finite element residual stress analyses of a crack in the center of a girth weld, (2) A comparison of the crack-opening displacements from a full-field thermoplastic residual stress analysis with a crack-face pressure elastic stress analysis to determine the residual stress effects on the crack-opening displacement, (3) The effects of hydrostatic testing on the residual stresses and the resulting crack-opening displacement, and (4) The effect of residual stresses on crack-opening displacement with different normal operating stresses.

  3. Effects of weld residual stresses on crack-opening area analysis of pipes for LBB applications

    International Nuclear Information System (INIS)

    Dong, P.; Rahman, S.; Wilkowski, G.

    1997-01-01

    This paper summarizes four different studies undertaken to evaluate the effects of weld residual stresses on the crack-opening behavior of a circumferential through-wall crack in the center of a girth weld. The effect of weld residual stress on the crack-opening-area and leak-rate analyses of a pipe is not well understood. There are no simple analyses to account for these effects, and, therefore, they are frequently neglected. The four studies involved the following efforts: (1) Full-field thermoplastic finite element residual stress analyses of a crack in the center of a girth weld, (2) A comparison of the crack-opening displacements from a full-field thermoplastic residual stress analysis with a crack-face pressure elastic stress analysis to determine the residual stress effects on the crack-opening displacement, (3) The effects of hydrostatic testing on the residual stresses and the resulting crack-opening displacement, and (4) The effect of residual stresses on crack-opening displacement with different normal operating stresses

  4. Uncertainty analysis for probabilistic steam generators tube rupture in LBB applications

    International Nuclear Information System (INIS)

    Durbec, V.; Pitner, P.; Pages, D.; Riffard, T.; Flesch, B.

    1997-10-01

    Steam Generators (SG) of Pressurized Water Reactors have experienced worldwide various types of tube degradation, mainly from stress corrosion cracking; because of this damage, primary-secondary leakage or tube rupture can occur. Safety against the risk of tube rupture is achieved through a combination of periodic in-service inspections (eddy current testing), surveillance of leaks during operation (leak before break concept) and tube plugging. In order to optimize SG tube bundle maintenance, Electricite de France has developed a specific software named COMPROMIS. The model, based on probabilistic fracture mechanics, makes it possible to quantify the influence of in-service inspections and maintenance work on the risk of a SG Tube Rupture (SGTR), taking all significant parameters into account as random variables (initial defect size distribution, reliability of non-destructive examinations, crack initiation and propagation, critical sizes, leak before risk of break, etc.). This paper focuses on the leak rate calculation module and presents a sensitivity study of the influence of leak before break on the conditional failure probability. (author)

  5. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  6. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    Science.gov (United States)

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  7. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (Appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)

  8. Tracking and sensor data fusion methodological framework and selected applications

    CERN Document Server

    Koch, Wolfgang

    2013-01-01

    Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing th

  9. Improved FTA methodology and application to subsea pipeline reliability design.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
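
    For readers unfamiliar with the quantitative side of fault (or failure expansion) trees, the short sketch below evaluates a small hypothetical tree bottom-up under the usual independence assumption: OR gates combine basic-event probabilities as 1 - prod(1 - p) and AND gates as prod(p). The tree structure and the probabilities are illustrative only and are not taken from the subsea pipeline case study.

```python
# Minimal sketch: bottom-up quantification of a small hypothetical fault tree,
# assuming independent basic events.
from math import prod

def or_gate(*p):
    """P(at least one input event occurs), independent inputs."""
    return 1.0 - prod(1.0 - x for x in p)

def and_gate(*p):
    """P(all input events occur), independent inputs."""
    return prod(p)

# Hypothetical basic-event probabilities for a pipeline-style top event.
p_corrosion_defect = 1e-3
p_coating_failure = 5e-3
p_anchor_impact = 2e-4
p_detection_miss = 1e-1

wall_degradation = and_gate(p_corrosion_defect, p_coating_failure)   # both needed
external_damage = and_gate(p_anchor_impact, p_detection_miss)        # impact undetected
p_loss_of_containment = or_gate(wall_degradation, external_damage)   # either path

print(f"Top-event probability: {p_loss_of_containment:.3e}")
```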

  10. Intravenous dipyridamole thallium-201 SPECT imaging methodology, applications, and interpretations

    International Nuclear Information System (INIS)

    Rockett, J.F.; Magill, H.L.; Loveless, V.S.; Murray, G.L.

    1990-01-01

    Dipyridamole Tl-201 imaging is an ideal alternative to exercise Tl-201 scintigraphy in patients who are unwilling or unable to perform maximum exercise stress. The use of intravenous dipyridamole, alone or in combination with exercise, has not been approved for clinical practice by the Food and Drug Administration. Once approval is granted, the test will become a widely used and important component of the cardiac work-up. The indications, methodology, side effects, and utility of dipyridamole cardiac imaging in the clinical setting are discussed and a variety of examples presented. 59 references

  11. Application of Six Sigma methodology to a diagnostic imaging process.

    Science.gov (United States)

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of being exposed to more radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
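
    The reported jump from a 3.5 to a 4.2 sigma level corresponds to a large drop in defects per million opportunities (DPMO). The sketch below shows the conventional conversion, assuming the customary 1.5-sigma long-term shift; the DPMO figures are back-calculated for illustration and are not values reported by the authors.

```python
# Minimal sketch: converting defect rates to (long-term shifted) sigma levels.
from scipy.stats import norm

def sigma_level(dpmo, shift=1.5):
    """Sigma level under the conventional 1.5-sigma long-term shift."""
    return norm.ppf(1.0 - dpmo / 1_000_000) + shift

def dpmo_from_sigma(sigma, shift=1.5):
    """Defects per million opportunities implied by a shifted sigma level."""
    return 1_000_000 * norm.sf(sigma - shift)

for s in (3.5, 4.2):
    print(f"sigma {s}: ~{dpmo_from_sigma(s):,.0f} defects per million opportunities")
print(f"check: DPMO 22,750 -> sigma {sigma_level(22_750):.2f}")
```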

  12. Application of the adjoint function methodology for neutron fluence determination

    International Nuclear Information System (INIS)

    Haghighat, A.; Nanayakkara, B.; Livingston, J.; Mahgerefteh, M.; Luoma, J.

    1991-01-01

    In previous studies, the neutron fluence at a reactor pressure vessel has been estimated based on consolidation of transport theory calculations and experimental data obtained from in-vessel capsules and/or cavity dosimeters. Normally, a forward neutron transport calculation is performed for each fuel cycle and the neutron fluxes are integrated over the reactor operating time to estimate the neutron fluence. Such calculations are performed for a geometrical model which is composed of one-eighth (0 to 45 deg) of the reactor core and its surroundings, i.e., core barrel, thermal shield, downcomer, reactor vessel, cavity region, concrete wall, and instrumentation well. Because the model is large, transport theory calculations generally require a significant amount of computer memory and time; hence, more efficient methodologies such as the adjoint transport approach have been proposed. These studies, however, do not address the sensitivity studies needed for adjoint function calculations. The adjoint methodology has been employed to estimate the activity of a cavity dosimeter and that of an in-vessel capsule. A sensitivity study has been performed on the mesh distribution used in and around the cavity dosimeter and the in-vessel capsule. Further, since a major portion of the detector response is due to neutrons originating in the peripheral fuel assemblies, a study on the use of a smaller calculational model has been performed.
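
    The computational appeal of the adjoint approach can be shown with a tiny discrete analogue: for a linear system A phi = s (a stand-in for the discretized transport operator), the detector response d·phi can equally be obtained from the adjoint solution A^T phi* = d as s·phi*, so a single adjoint solve serves any number of source (fuel cycle) configurations. The matrix and vectors below are arbitrary illustrative numbers, not a reactor model.

```python
# Minimal sketch of forward/adjoint reciprocity for a detector response,
# using an arbitrary small linear system as a stand-in for discretized transport.
import numpy as np

rng = np.random.default_rng(3)

n = 6
A = np.eye(n) * 5.0 + rng.random((n, n))      # well-conditioned "transport" operator
d = rng.random(n)                             # detector response cross sections
sources = [rng.random(n) for _ in range(3)]   # three different "fuel cycle" sources

# One adjoint solve, reused for every source:
phi_adj = np.linalg.solve(A.T, d)             # A^T phi* = d

for k, s in enumerate(sources, start=1):
    phi = np.linalg.solve(A, s)               # forward solve, for comparison
    forward_response = d @ phi
    adjoint_response = s @ phi_adj
    print(f"source {k}: forward {forward_response:.6f}  adjoint {adjoint_response:.6f}")
```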

  13. Methodology Declassification of Impacted Buildings. Application of Technology MARSSIM

    International Nuclear Information System (INIS)

    Vico, A.M.; Álvarez, A.; Gómez, J.M.; Quiñones, J.

    2015-01-01

    This work describes the material measurement methodology used to assure the absence of contamination in impacted buildings due to processes related to the first part of the nuclear fuel cycle performed at the former Junta de Energía Nuclear, JEN, currently Centro de Investigaciones Energéticas Medioambientales y Tecnológicas, CIEMAT. The first part of the work covers the identification and quantification of natural isotopes and their proportions on the studied surfaces through different analytical techniques. The experimental study involved the selection of appropriate equipment to carry out the field measurements and the characterization of uranium isotopes and their immediate descendants. According to European Union recommendations and the specifications established for CIEMAT by the CSN (Consejo de Seguridad Nuclear), the Spanish regulatory authority, surface activity reference levels have been established which allow one to decide whether a surface can be classified as a conventional surface. In order to make decisions about compliance with the established clearance criteria, the MARSSIM methodology is applied using the results obtained from field measurements (impacted and non-impacted surfaces).

  14. Methodology of developing a smartphone application for crisis research and its clinical application.

    Science.gov (United States)

    Zhang, Melvyn W B; Ho, Cyrus S H; Fang, Pan; Lu, Yanxia; Ho, Roger C M

    2014-01-01

    Recent advancements in Internet-based technologies have resulted in the growth of a sub-specialized field, termed "Infodemiology" and "Infoveillance". Infoveillance refers to the collation of infodemiology measures for the purpose of surveillance and trending. Previous research has only demonstrated the research potential of the Web 2.0 medium for the collation of data in crisis situations. The objective of the current study is to demonstrate the methodology of implementing a smartphone-based application for the dissemination and collation of information during a crisis situation. The Haze Smartphone application was developed using an online application builder and HTML5 as the core programming language. A five-phase developmental method was adopted, including a) formulation of user requirements, b) system design, c) system development, d) system evaluation and finally e) system application and implementation. The smartphone application was deployed during a one-week period via a self-sponsored Facebook post and via direct dissemination of the web-links by email. A total of 298 respondents took part in the survey within the application. Most of them were between 20 and 29 years old and had a university education. More individuals preferred the option of accessing and providing feedback to a survey on physical and psychological wellbeing via direct access to a Web-based questionnaire. In addition, the participants reported a mean number of 4.03 physical symptoms (SD 2.6). The total Impact of Event Scale-Revised (IES-R) score was 18.47 (SD 11.69), which indicated that the study population did experience psychological stress but not posttraumatic stress disorder. The perceived dangerous Pollutant Standards Index (PSI) level and the number of physical symptoms were associated with a higher IES-R score (P ...). The smartphone application could potentially be used to acquire research data in a crisis situation. However, it is crucial for future research to further

  15. Methodological Note: Neurofeedback: A Comprehensive Review on System Design, Methodology and Clinical Applications

    Directory of Open Access Journals (Sweden)

    Hengameh Marzbani

    2016-04-01

    Neurofeedback is a kind of biofeedback which teaches self-control of brain functions to subjects by measuring brain waves and providing a feedback signal. Neurofeedback usually provides audio and/or video feedback. Positive or negative feedback is produced for desirable or undesirable brain activities, respectively. In this review, we provide clinical and technical information about the following issues: (1) various neurofeedback treatment protocols, i.e. alpha, beta, alpha/theta, delta, gamma, and theta; (2) different EEG electrode placements, i.e. standard recording channels in the frontal, temporal, central, and occipital lobes; (3) electrode montages (unipolar, bipolar); (4) types of neurofeedback, i.e. frequency, power, slow cortical potential, functional magnetic resonance imaging, and so on; (5) clinical applications of neurofeedback, i.e. treatment of attention deficit hyperactivity disorder, anxiety, depression, epilepsy, insomnia, drug addiction, schizophrenia, learning disabilities, dyslexia and dyscalculia, and autistic spectrum disorders, as well as other applications such as pain management and the improvement of musical and athletic performance; and (6) neurofeedback software. To date, many studies have been conducted on neurofeedback therapy and its effectiveness in the treatment of many diseases. Neurofeedback, like other treatments, has its own pros and cons. Although it is a non-invasive procedure, its validity has been questioned in terms of conclusive scientific evidence. For example, it is expensive, time-consuming and its benefits are not long-lasting. Also, it might take months to show the desired improvements. Nevertheless, neurofeedback is known as a complementary and alternative treatment of many brain dysfunctions. However, current research does not support conclusive results about its efficacy.

  16. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  17. Case study application of the IAEA safeguards assessment methodology to a mixed oxide fuel fabrication facility

    International Nuclear Information System (INIS)

    Swartz, J.; McDaniel, T.

    1981-01-01

    Science Applications, Inc. has prepared a case study illustrating the application of an assessment methodology to an international system for safeguarding mixed oxide (MOX) fuel fabrication facilities. This study is the second in a series of case studies which support an effort by the International Atomic Energy Agency (IAEA) and an international Consultant Group to develop a methodology for assessing the effectiveness of IAEA safeguards. 3 refs

  18. Application of a Methodology to calculate logistical cost

    Directory of Open Access Journals (Sweden)

    Joaquín Mock-Díaz

    2017-12-01

    At present, the business environment is constantly becoming more aggressive and unstable. For that reason, companies are forced to improve their management on a regular basis, to increase their economic efficiency and effectiveness and to achieve better performance. Within this context, the objective of this research is to apply a methodology for determining logistical costs in a service-providing company, which allows the behavior of such costs during the year 2016 to be assessed. A financial assessment of the logistical activities showed the existence of a high opportunity cost, an element mainly dependent on inventory rotation. For the purposes of this study, several scientific methods were used: the historical-logical method, to analyze the historical evolution of logistics; and the analysis-synthesis method, to gather the elements and main ideas that characterize it.

  19. Radiation monitoring methodologies and their applications at BARC site

    International Nuclear Information System (INIS)

    Divkar, J.K.; Chatterjee, M.K.; Patra, R.P; Morali, S.; Singh, Rajvir

    2016-01-01

    Radiation monitoring methodology can be planned for various objectives during normal as well as emergency situations. During a radiological emergency, radiation monitoring data provide useful information required for management of the abnormal situation. In order to assess the possible consequences accurately and to implement adequate measures, the emergency management authorities should have a well-prepared monitoring strategy in readiness. Fixed monitoring is useful for analyzing the behavior of the nuclear plant site and for developing a holistic model of it; mobile monitoring is useful for quick impact assessment and will be the backbone of emergency response, particularly in case of non-availability of the fixed monitoring system due to natural disasters such as floods, earthquakes and tsunamis.

  20. Tutorials on emerging methodologies and applications in operations research

    CERN Document Server

    2005-01-01

    Operations Research emerged as a quantitative approach to problem-solving in World War II. Its founders, who were physicists, mathematicians, and engineers, quickly found peace-time uses for this new field. Moreover, we can say that Operations Research (OR) was born in the same incubator as computer science, and through the years, it has spawned many new disciplines, including systems engineering, health care management, and transportation science. Fundamentally, Operations Research crosses discipline domains to seek solutions on a range of problems and benefits diverse disciplines from finance to bioengineering. Many disciplines routinely use OR methods. Many scientific researchers, engineers, and others will find the methodological presentations in this book useful and helpful in their problem-solving efforts. OR’s strengths are modeling, analysis, and algorithm design. It provides a quantitative foundation for a broad spectrum of problems, from economics to medicine, from environmental control to sports,...

  1. Application of TRIZ Methodology in Diffusion Welding System Optimization

    Science.gov (United States)

    Ravinder Reddy, N.; Satyanarayana, V. V.; Prashanthi, M.; Suguna, N.

    2017-12-01

    Welding is used extensively for metal joining in manufacturing. In recent years, the diffusion welding method has significantly increased the quality of welds. Nevertheless, research on and application of diffusion welding have progressed only to a limited extent, so relevant information on welding design, such as fixturing, parameter selection and integrated design, is lacking for the joining of thick and thin materials with or without interlayers. This article intends to combine innovative methods in the application of diffusion welding design. Guided by the theory of inventive problem solving (TRIZ) design method, this will help to decrease trial and error and failure risks in the welding process. This article hopes to provide welding design personnel with innovative design ideas for research and practical application.

  2. Applications of a Constrained Mechanics Methodology in Economics

    Science.gov (United States)

    Janova, Jitka

    2011-01-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the…

  3. Analytical group decision making in natural resources: methodology and application

    Science.gov (United States)

    Daniel L. Schmoldt; David L. Peterson

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups...

  4. Applications of neuroscience in criminal law: legal and methodological issues.

    Science.gov (United States)

    Meixner, John B

    2015-01-01

    The use of neuroscience in criminal law applications is an increasingly discussed topic among legal and psychological scholars. Over the past 5 years, several prominent federal criminal cases have referenced neuroscience studies and made admissibility determinations regarding neuroscience evidence. Despite this growth, the field is exceptionally young, and no one knows for sure how significant a contribution neuroscience will make to criminal law. This article focuses on three major subfields: (1) neuroscience-based credibility assessment, which seeks to detect lies or knowledge associated with a crime; (2) application of neuroscience to aid in assessments of brain capacity for culpability, especially among adolescents; and (3) neuroscience-based prediction of future recidivism. The article briefly reviews these fields as applied to criminal law and makes recommendations for future research, calling for the increased use of individual-level data and increased realism in laboratory studies.

  5. New approaches in intelligent control techniques, methodologies and applications

    CERN Document Server

    Kountchev, Roumen

    2016-01-01

    This volume introduces new approaches in intelligent control area from both the viewpoints of theory and application. It consists of eleven contributions by prominent authors from all over the world and an introductory chapter. This volume is strongly connected to another volume entitled "New Approaches in Intelligent Image Analysis" (Eds. Roumen Kountchev and Kazumi Nakamatsu). The chapters of this volume are self-contained and include summary, conclusion and future works. Some of the chapters introduce specific case studies of various intelligent control systems and others focus on intelligent theory based control techniques with applications. A remarkable specificity of this volume is that three chapters are dealing with intelligent control based on paraconsistent logics.

  6. A Brief overview of neutron activation analyses methodology and applications

    International Nuclear Information System (INIS)

    Ali, M.A.

    2000-01-01

    The primary objective of this talk is to present our new facility for Neutron Activation Analysis to the scientific and industrial communities and show its possibilities. The talk therefore covers the following main items: an overview of neutron activation analysis, the special interest of fast mono-energetic neutrons, the NAA method and its sensitivities, recent scientific and industrial applications using NAA, and an illustrative example measured using our facility. What is NAA? It is a sensitive analytical technique useful for performing both qualitative and quantitative multi-element analyses in samples. Application of NAA is widespread worldwide; it is estimated that several tens of thousands of samples undergo analysis each year, from almost every conceivable field of scientific or technical interest. Why NAA? For many elements and applications, NAA offers sensitivities that are sometimes superior to those attainable by other methods, on the order of the nanogram level; it is accurate and reliable; and it is generally recognized as the referee method of choice when new procedures are being developed or when other methods yield results that do not agree. However, activation analysis at En = 14 MeV is limited by a few factors: low flux, low cross-sections of threshold reactions, short irradiation time due to finite target life, and interfering reactions and gamma-ray spectral interference.

  7. Advances in Artificial Neural Networks – Methodological Development and Application

    Directory of Open Access Journals (Sweden)

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other networks such as radial basis function, recurrent network, feedback network, and unsupervised Kohonen self-organizing network. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review on development history of artificial neural networks is presented and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks will be introduced with support vector machines, and limitations of ANNs will be identified. The future of artificial neural network development in tandem with support vector machines will be discussed in conjunction with further applications to food science and engineering, soil and water relationship for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks will be reviewed as well, especially in the fields of agricultural and biological
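
    To make the backpropagation training mentioned above concrete, the sketch below trains a minimal one-hidden-layer perceptron on the XOR problem with NumPy. The network size, learning rate, and data are illustrative assumptions, not taken from the reviewed work.

```python
import numpy as np

# Minimal one-hidden-layer perceptron trained with backpropagation on XOR
# (illustrative only; sizes, learning rate and data are arbitrary choices).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)    # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)    # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                     # forward pass
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared-error loss w.r.t. weights and biases.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))                  # should approach [0, 1, 1, 0]
```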

  8. RF power harvesting: a review on designing methodologies and applications

    Science.gov (United States)

    Tran, Le-Giang; Cha, Hyouk-Kyu; Park, Woo-Tae

    2017-12-01

    Wireless power transmission was conceptualized nearly a century ago. Certain achievements made to date have made power harvesting a reality, capable of providing alternative sources of energy. This review provides a summary of radio frequency (RF) power harvesting technologies in order to serve as a guide for the design of RF energy harvesting units. Since energy harvesting circuits are designed to operate with relatively small voltages and currents, they rely on state-of-the-art electrical technology for obtaining high efficiency. Thus, comprehensive analysis and discussions of various designs and their tradeoffs are included. Finally, recent applications of RF power harvesting are outlined.

  9. Minimal cut-set methodology for artificial intelligence applications

    International Nuclear Information System (INIS)

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite, but large tree structure. With this approach, on-line expert diagnostic systems whose response time is critical, could determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements
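
    A minimal sketch of the Boolean reduction described above: the cut sets of a small AND/OR fault tree are expanded into disjunctive normal form and then minimised by absorption (any cut set that is a superset of another is discarded). The gate structure and event names are hypothetical.

```python
from itertools import product

# Hypothetical fault tree: TOP = AND(G1, G2), G1 = OR(A, B), G2 = OR(B, AND(C, D))
tree = {
    "TOP": ("AND", ["G1", "G2"]),
    "G1":  ("OR",  ["A", "B"]),
    "G2":  ("OR",  ["B", "G3"]),
    "G3":  ("AND", ["C", "D"]),
}

def cut_sets(node):
    """Return the cut sets of `node` as a list of frozensets of basic events."""
    if node not in tree:                      # basic event
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                          # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND gate: every combination of one cut set per child, merged together
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimise(sets):
    """Drop any cut set that is a superset of another (absorption law)."""
    return [s for s in sets if not any(o < s for o in sets)]

mcs = minimise(set(cut_sets("TOP")))
print(sorted(sorted(s) for s in mcs))   # e.g. [['A', 'C', 'D'], ['B']]
```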

  10. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
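
    To make the continuous-optimization family concrete, here is a small gradient-descent sketch that minimises a toy energy combining a data-fidelity term and a smoothness (regularisation) term for a 1-D signal. The energy definition, regularisation weight and step size are illustrative assumptions rather than any specific method from the survey.

```python
import numpy as np

# Noisy 1-D "signal" to be denoised by minimising
#   E(u) = ||u - f||^2 + lam * ||D u||^2,  with D a finite-difference operator.
rng = np.random.default_rng(1)
f = np.sin(np.linspace(0, 2 * np.pi, 100)) + 0.3 * rng.normal(size=100)
u = f.copy()
lam, step = 1.0, 0.05

def grad_E(u):
    # Gradient of the data term is 2(u - f); gradient of the smoothness term
    # is 2*lam*D^T D u, i.e. a discrete negative Laplacian applied to u.
    lap = np.zeros_like(u)
    lap[1:-1] = 2 * u[1:-1] - u[:-2] - u[2:]
    return 2 * (u - f) + 2 * lam * lap

for _ in range(500):
    u -= step * grad_E(u)              # plain gradient descent

print(float(np.mean((u - f) ** 2)))    # residual between smoothed and noisy signal
```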

  11. Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.

    Science.gov (United States)

    Osterlind, Steven J.; Martois, John S.

    This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…

  12. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1981-10-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  13. Application of Six Sigma methodology to a cataract surgery unit.

    Science.gov (United States)

    Taner, Mehmet Tolga

    2013-01-01

    The article's aim is to focus on the application of Six Sigma to minimise intraoperative and post-operative complication rates in a Turkish public hospital cataract surgery unit. Implementing define-measure-analyse-improve-control (DMAIC) involves process mapping, fishbone diagrams and rigorous data-collection. Failure mode and effect analysis (FMEA), Pareto diagrams, control charts and process capability analysis are applied to redress cataract surgery failure root causes. Inefficient skills of assistant surgeons and technicians, low quality of IOLs used, wrong IOL placement, unsystematic sterilisation of surgery rooms and devices, and the unprioritising network system are found to be the critical drivers of intraoperative and post-operative complications. The sigma level was increased from 2.60 to 3.75 subsequent to extensive training of assistant surgeons, ophthalmologists and technicians, better quality IOLs, systematic sterilisation and air-filtering, and the implementation of a more sophisticated network system. This article shows that Six Sigma measurement and process improvement can become the impetus for cataract unit staff to rethink their process and reduce malpractices. Measuring, recording and reporting data regularly helps them to continuously monitor their overall process and deliver safer treatments. This is the first Six Sigma ophthalmology study in Turkey.
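
    Sigma levels of the kind quoted above can be reproduced from complication counts using the common Six Sigma convention of a 1.5-sigma long-term shift; the defect counts below are hypothetical and only illustrate the conversion, not the hospital's actual data.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Convert a defect rate to a (shifted) sigma level, as in common Six Sigma practice."""
    dpmo = defects / opportunities * 1_000_000          # defects per million opportunities
    yield_frac = 1.0 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_frac) + shift

# Hypothetical counts: complications per surgical opportunities before/after improvement.
print(round(sigma_level(defects=90, opportunities=1000), 2))   # roughly sigma ~ 2.8
print(round(sigma_level(defects=10, opportunities=1000), 2))   # roughly sigma ~ 3.8
```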

  14. Student satisfaction and loyalty in Denmark: Application of EPSI methodology.

    Science.gov (United States)

    Shahsavar, Tina; Sudzina, Frantisek

    2017-01-01

    Monitoring and managing customers' satisfaction are key features to benefit from today's competitive environment. In higher education context, only a few studies are available on satisfaction and loyalty of the main customers who are the students, which signifies the need to investigate the field more thoroughly. The aim of this research is to measure the strength of determinants of students' satisfaction and the importance of antecedents in students' satisfaction and loyalty in Denmark. Our research model is the modification of European Performance Satisfaction Index (EPSI), which takes the university's image direct effects on students' expectations into account from students' perspective. The structural equation model of student satisfaction and loyalty has been evaluated using partial least square path modelling. Our findings confirm that the EPSI framework is applicable on student satisfaction and loyalty among Danish universities. We show that all the relationships among variables of the research model are significant except the relationship between quality of software and students' loyalty. Results further verify the significance of antecedents in students' satisfaction and loyalty at Danish universities; the university image and student satisfaction are the antecedents of student loyalty with a significant direct effect, while perceived value, quality of hardware, quality of software, expectations, and university image are antecedents of student satisfaction. Eventually, our findings may be of an inspiration to maintain and improve students' experiences during their study at the university. Dedicating resources to identified important factors from students' perception enable universities to attract more students, make them highly satisfied and loyal.

  15. Student satisfaction and loyalty in Denmark: Application of EPSI methodology

    Science.gov (United States)

    Shahsavar, Tina

    2017-01-01

    Monitoring and managing customers’ satisfaction are key features to benefit from today’s competitive environment. In higher education context, only a few studies are available on satisfaction and loyalty of the main customers who are the students, which signifies the need to investigate the field more thoroughly. The aim of this research is to measure the strength of determinants of students’ satisfaction and the importance of antecedents in students’ satisfaction and loyalty in Denmark. Our research model is the modification of European Performance Satisfaction Index (EPSI), which takes the university’s image direct effects on students’ expectations into account from students’ perspective. The structural equation model of student satisfaction and loyalty has been evaluated using partial least square path modelling. Our findings confirm that the EPSI framework is applicable on student satisfaction and loyalty among Danish universities. We show that all the relationships among variables of the research model are significant except the relationship between quality of software and students’ loyalty. Results further verify the significance of antecedents in students’ satisfaction and loyalty at Danish universities; the university image and student satisfaction are the antecedents of student loyalty with a significant direct effect, while perceived value, quality of hardware, quality of software, expectations, and university image are antecedents of student satisfaction. Eventually, our findings may be of an inspiration to maintain and improve students’ experiences during their study at the university. Dedicating resources to identified important factors from students’ perception enable universities to attract more students, make them highly satisfied and loyal. PMID:29240801

  16. Strategic environmental assessment methodologies--applications within the energy sector

    International Nuclear Information System (INIS)

    Finnveden, Goeran; Nilsson, Maans; Johansson, Jessica; Persson, Aasa; Moberg, Aasa; Carlsson, Tomas

    2003-01-01

    Strategic Environmental Assessment (SEA) is a procedural tool and within the framework of SEA, several different types of analytical tools can be used in the assessment. Several analytical tools are presented and their relation to SEA is discussed including methods for future studies, Life Cycle Assessment, Risk Assessment, Economic Valuation and Multi-Attribute Approaches. A framework for the integration of some analytical tools in the SEA process is suggested. It is noted that the available analytical tools primarily cover some types of environmental impacts related to emissions of pollutants. Tools covering impacts on ecosystems and landscapes are more limited. The relation between application and choice of analytical tools is discussed. It is suggested that SEAs used to support a choice between different alternatives require more quantitative methods, whereas SEAs used to identify critical aspects and suggest mitigation strategies can suffice with more qualitative methods. The possible and desired degree of site-specificity in the assessment can also influence the choice of methods. It is also suggested that values and world views can be of importance for judging whether different types of tools and results are meaningful and useful. Since values and world views differ between different stakeholders, consultation and understanding are important to ensure credibility and relevance

  17. Flux Measurements in Trees: Methodological Approach and Application to Vineyards

    Directory of Open Access Journals (Sweden)

    Francesca De Lorenzi

    2008-03-01

    Full Text Available In this paper a review of two sap flow methods for measuring transpiration in vineyards is presented. The objective of this work is to examine the potential for detecting transpiration in trees in response to environmental stresses, particularly high concentrations of ozone (O3) in the troposphere. The methods described are the stem heat balance and the thermal dissipation probe; advantages and disadvantages of each method are detailed. Applications of both techniques are shown in two large commercial vineyards in Southern Italy (Apulia and Sicily) subject to a semi-arid climate. Sap flow techniques allow transpiration to be measured at plant scale, so an upscaling procedure is necessary to calculate transpiration at the whole-stand level. Here a general technique to link the value of transpiration at plant level to the canopy value is presented, based on experimental relationships between transpiration and biometric characteristics of the trees. In both vineyards transpiration measured by sap flow methods compares well with evapotranspiration measured by micrometeorological techniques at canopy scale. Moreover, the soil evaporation component has been quantified. In conclusion, comments about the suitability of the sap flow methods for studying the interactions between trees and ozone are given.
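
    For the thermal dissipation probe mentioned above, sap flux density is commonly estimated from the temperature difference between the heated and reference probes using the empirical Granier-type calibration. The calibration constants and the readings below are taken from the widely cited published calibration and are illustrative assumptions, not data from these vineyards.

```python
# Thermal dissipation (Granier-type) probe: sap flux density from probe temperatures.
# Commonly cited empirical calibration: u = 118.99e-6 * K**1.231  [m3 m-2 s-1],
# with K = (dT_max - dT) / dT and dT_max the temperature difference at zero flow.

def sap_flux_density(dT, dT_max):
    K = (dT_max - dT) / dT
    return 118.99e-6 * K ** 1.231          # m3 of sap per m2 of sapwood per second

# Illustrative readings (degrees C) over a morning; dT_max taken from the previous night.
dT_max = 10.0
for dT in (10.0, 9.0, 7.5, 6.0):
    u = sap_flux_density(dT, dT_max)
    # Scale to whole-plant flow with an assumed sapwood area of 25 cm2 (2.5e-3 m2).
    print(f"dT={dT:4.1f} C  flux={u:.2e} m3/m2/s  plant flow={u * 2.5e-3 * 3600 * 1000:.2f} L/h")
```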

  18. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  19. Performance specification methodology: introduction and application to displays

    Science.gov (United States)

    Hopper, Darrel G.

    1998-09-01

    Acquisition reform is based on the notion that DoD must rely on the commercial marketplace insofar as possible rather than solely looking inward to a military marketplace to meet its needs. This reform forces a fundamental change in the way DoD conducts business, including a heavy reliance on private sector models of change. The key to more reliance on the commercial marketplace is the performance specification (PS). This paper introduces some PS concepts and a PS classification principle to help bring some structure to the analysis of risk (cost, schedule, capability) in weapons system development and the management of opportunities for affordable ownership (maintain/increase capability via technology insertion, reduce cost) in this new paradigm. The DoD shift toward commercial components is nowhere better exemplified than in displays. Displays are the quintessential dual-use technology and are used herein to exemplify these PS concepts and principle. The advent of flat panel displays as a successful technology is setting off an epochal shift in cockpits and other military applications. Displays are installed in every DoD weapon system, and are, thus, representative of a range of technologies where issues and concerns throughout industry and government have been raised regarding the increased DoD reliance on the commercial marketplace. Performance specifications require metrics: the overall metrics of 'information-thrust' with units of Mb/s and 'specific info-thrust' with units of Mb/s/kg are introduced to analyze the value of a display to the warfighter and affordability to the taxpayer.
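
    The two metrics introduced above can be illustrated with simple arithmetic: one plausible reading of 'information-thrust' is the raw pixel data rate a display sustains, and 'specific info-thrust' divides that rate by the display mass. The formula and figures below are assumptions for illustration only, not definitions taken from the paper.

```python
# Illustrative reading of the display metrics (assumed definitions, not the paper's).
def information_thrust_mbps(width_px, height_px, bits_per_pixel, refresh_hz):
    return width_px * height_px * bits_per_pixel * refresh_hz / 1e6   # Mb/s

# Hypothetical flat-panel display: 1280x1024, 24-bit colour, 60 Hz refresh, 3.5 kg mass.
it = information_thrust_mbps(1280, 1024, 24, 60)
print(f"information-thrust   ~ {it:.0f} Mb/s")
print(f"specific info-thrust ~ {it / 3.5:.0f} Mb/s/kg")
```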

  20. Application of the integrated safety assessment methodology to the protection of electric systems

    International Nuclear Information System (INIS)

    Hortal, Javier; Izquierdo, Jose M.

    1996-01-01

    The generalization of classical techniques for risk assessment incorporating dynamic effects is the main objective of the Integrated Safety Assessment Methodology, as practical implementation of Protection Theory. Transient stability, contingency analysis and protection setpoint verification in electric power systems are particularly appropriate domains of application, since the coupling of reliability and dynamic analysis in the protection assessment process is being increasingly demanded. Suitable techniques for dynamic simulation of sequences of switching events in power systems are derived from the use of quasi-linear equation solution algorithms. The application of the methodology, step by step, is illustrated in a simple but representative example

  1. Proteomes of Lactobacillus delbrueckii subsp. bulgaricus LBB.B5 Incubated in Milk at Optimal and Low Temperatures.

    Science.gov (United States)

    Yin, Xiaochen; Salemi, Michelle R; Phinney, Brett S; Gotcheva, Velitchka; Angelov, Angel; Marco, Maria L

    2017-01-01

    We identified the proteins synthesized by Lactobacillus delbrueckii subsp. bulgaricus strain LBB.B5 in laboratory culture medium (MRS) at 37°C and milk at 37 and 4°C. Cell-associated proteins were measured by gel-free, shotgun proteomics using high-performance liquid chromatography coupled with tandem mass spectrophotometry. A total of 635 proteins were recovered from all cultures, among which 72 proteins were milk associated (unique or significantly more abundant in milk). LBB.B5 responded to milk by increasing the production of proteins required for purine biosynthesis, carbohydrate metabolism (LacZ and ManM), energy metabolism (TpiA, PgK, Eno, SdhA, and GapN), amino acid synthesis (MetE, CysK, LBU0412, and AspC) and transport (GlnM and GlnP), and stress response (Trx, MsrA, MecA, and SmpB). The requirement for purines was confirmed by the significantly improved cell yields of L. delbrueckii subsp. bulgaricus when incubated in milk supplemented with adenine and guanine. The L. delbrueckii subsp. bulgaricus -expressed proteome in milk changed upon incubation at 4°C for 5 days and included increased levels of 17 proteins, several of which confer functions in stress tolerance (AddB, UvrC, RecA, and DnaJ). However, even with the activation of stress responses in either milk or MRS, L. delbrueckii subsp. bulgaricus did not survive passage through the murine digestive tract. These findings inform efforts to understand how L. delbrueckii subsp. bulgaricus is adapted to the dairy environment and its implications for its health-benefiting properties in the human digestive tract. IMPORTANCE Lactobacillus delbrueckii subsp. bulgaricus has a long history of use in yogurt production. Although commonly cocultured with Streptococcus salivarius subsp. thermophilus in milk, fundamental knowledge of the adaptive responses of L. delbrueckii subsp. bulgaricus to the dairy environment and the consequences of those responses on the use of L. delbrueckii subsp. bulgaricus as

  2. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of these. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology the signal is an on-to-off or off-to-on signal; GO therefore finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point and the definitions of operators are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  3. The Application Strategy of Iterative Solution Methodology to Matrix Equations in Hydraulic Solver Package, SPACE

    International Nuclear Information System (INIS)

    Na, Y. W.; Park, C. E.; Lee, S. Y.

    2009-01-01

    As a part of the Ministry of Knowledge Economy (MKE) project, 'Development of safety analysis codes for nuclear power plants', KOPEC has been developing a hydraulic solver code package applicable to the safety analyses of nuclear power plants (NPPs). The matrices of the hydraulic solver are usually sparse and may be asymmetric. In the earlier stage of this project, the typical direct matrix solver packages MA48 and MA28 had been tested as matrix solvers for the hydraulic solver code, SPACE. The selection was based on the reasonably reliable performance experience with their former version, MA18, in the RELAP computer code. In the later stage of this project, iterative methodologies have been tested in the SPACE code. Among the few candidate iterative solution methodologies tested so far, the biconjugate gradient stabilization methodology (BICGSTAB) has shown the best performance in the applicability test and in the application to the SPACE code. Regardless of all the merits of using the direct solver packages, there are some other aspects to tackling the iterative solution methodologies: the algorithm is much simpler and easier to handle, and the potential problems related to the robustness of the iterative solution methodologies have been resolved by applying pre-conditioning methods adjusted and modified as appropriate to the application in the SPACE code. The application strategy of the conjugate gradient method was introduced in detail by Shewchuk, Golub and Saad in the mid-1990s. The application of this methodology to nuclear engineering in Korea started about the same time and is still going on, and there are quite a few examples of application to neutronics. Besides, Yang introduced a conjugate gradient method programmed in the C++ language. The purpose of this study is to assess the performance and behavior of the iterative solution methodology compared to those of the direct solution methodology, which is still preferred due to its robustness and reliability. The
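
    As an illustration of a preconditioned BiCGSTAB solve of the kind described above (not the SPACE implementation itself), the sketch below solves a sparse, non-symmetric test system with SciPy's bicgstab, using an incomplete-LU factorisation as preconditioner. The matrix is a made-up example.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Made-up sparse, non-symmetric system standing in for a hydraulic-solver matrix.
n = 200
main = 4.0 * np.ones(n)
lower = -1.0 * np.ones(n - 1)
upper = -2.0 * np.ones(n - 1)          # unequal off-diagonals -> asymmetric matrix
A = sp.diags([lower, main, upper], offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

# Incomplete-LU preconditioner, applied through a LinearOperator.
ilu = spla.spilu(A.tocsc())
M = spla.LinearOperator((n, n), matvec=ilu.solve)

x, info = spla.bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}",
      "residual:", np.linalg.norm(A @ x - b))
```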

  4. Application of new design methodologies to very high-temperature metallic components of the HTTR

    International Nuclear Information System (INIS)

    Hada, Kazuhiko; Ohkubo, Minoru; Baba, Osamu

    1991-01-01

    The high-temperature piping and helium-to-helium intermediate heat exchanger of the High-Temperature Engineering Test Reactor (HTTR) are designed to operate at very high temperatures of about 900 deg C, among the class 1 components of the HTTR. At such a high temperature, the mechanical strength of heat-resistant metallic materials is very low and the thermal expansions of structural members are large. Therefore, innovative design methodologies are needed to reduce both the mechanical and thermal loads acting on these components. In the HTTR, design methodologies which separate the heat-resistant function from the pressure-retaining function and allow the members to expand freely are applied to reduce pressure and thermal loads. Since the applicability of these design methodologies needs to be verified, the Japan Atomic Energy Research Institute (JAERI) has been performing extensive design and research work on their verification. The details of the design methodologies and their verifications are given in this paper. (orig.)

  5. An Application of the Methodology for Assessment of the Sustainability of Air Transport System

    Science.gov (United States)

    Janic, Milan

    2003-01-01

    Assessment and operationalization of the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. As part of the academic efforts to properly address the problem, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes a methodology for assessment of sustainability and its potential application. The methodology consists of indicator systems, which relate to the operational, economic, social and environmental dimensions of air transport system performance. The particular indicator systems are relevant for particular actors, such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. In applying the methodology, specific cases are selected to estimate the particular indicators and thus to assess the system's sustainability under given conditions.

  6. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Directory of Open Access Journals (Sweden)

    Alexandr Ivanovich Tatarkin

    2015-03-01

    Full Text Available In the article, the problem of methodological support for the application of regional tax benefits is reviewed. The method of tax benefit assessment adopted in Perm Region was chosen as the object of analysis because the relatively long period of application of the benefits has allowed a sufficient statistical base to be built. The article investigates the reliability of the budget, economic, investment, and social effectiveness assessments of applying benefits based on the Method. Suggestions for its improvement are formulated

  7. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    Science.gov (United States)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  8. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    Science.gov (United States)

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
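
    A small sketch of the RDF "subject-predicate-object" triples mentioned above, built with the rdflib library; the namespace, learner, and property names are hypothetical, purely to show how such triples could encode personalised-learning facts.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

# Hypothetical vocabulary for describing a learner and a preferred resource type.
EDU = Namespace("http://example.org/edu#")

g = Graph()
learner = URIRef("http://example.org/learners/anna")

# Each add() call stores one subject-predicate-object triple.
g.add((learner, RDF.type, FOAF.Person))
g.add((learner, FOAF.name, Literal("Anna")))
g.add((learner, EDU.preferredResourceType, EDU.Video))
g.add((learner, EDU.knowledgeLevel, Literal("beginner")))

print(g.serialize(format="turtle"))
```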

  9. Radiological safety methodology in radioactive tracer applications for hydrodynamics and environmental studies

    International Nuclear Information System (INIS)

    Suarez, R.; Badano, A.; Dellepere, A.; Artucio, G.; Bertolotti, A.

    1995-01-01

    The use of radioactive tracer techniques to control sewage disposal contamination in the Montevideo estuary and at Carrasco beach has been studied for the Nuclear Technology National Direction. Hydrodynamic model simulation has been introduced as the working methodology. Radiological safety and the application of radioactive materials in the environmental studies have also been evaluated, mainly in the conclusions and recommendations of this report.

  10. The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence

    Science.gov (United States)

    2017-09-01

    The use of forensic science and law enforcement (LE) capabilities during the investigation of criminal offenses has become commonplace in the U.S. criminal justice system. ... would likely need to go to their local Army CID or military police ...

  11. CHARACTERIZATION OF SMALL AND MEDIUM ENTERPRISES (SMES) OF POMERANIAN REGION IN SIX SIGMA METHODOLOGY APPLICATION

    Directory of Open Access Journals (Sweden)

    2011-12-01

    Full Text Available Background: Six Sigma relates to product characteristics and to the parameters of the actions needed to obtain those products. At the same time, it is a multi-step, cyclic process aimed at improvements leading towards a global standard close to perfection. There is growing interest in the Six Sigma methodology among smaller organizations, but there are still few publications presenting such applications in the sector of small and medium enterprises, especially ones based on sound empirical results. Preliminary research had already shown that only a small part of the companies from this sector in the Pomeranian region use elements of this methodology. Methods: The companies were divided into groups by the type of their activities as well as by employment size. The questionnaires were sent to 150 randomly selected organizations in two steps and were addressed to senior managers. The questionnaire contained questions about basic company information, the level of knowledge and practical application of the Six Sigma methodology, opinions about improvements of processes occurring in the company, and opinions about training in the Six Sigma methodology. Results: The following hypotheses were proposed, statistically verified and answered: The lack of adequate knowledge of the Six Sigma methodology in SMEs limits the possibility to effectively monitor and improve processes - accepted. The use of statistical tools of the Six Sigma methodology requires broad action to popularize this knowledge among national SMEs - accepted. The level of awareness of the importance as well as practical use of the Six Sigma methodology in manufacturing SMEs is higher than in SMEs providing services - rejected, the level is equal. The level of knowledge and use of the Six Sigma methodology in medium manufacturing companies is significantly higher than in small manufacturing companies - accepted. The level of the knowledge and the application

  12. Environmental and sanitary evaluation of electro-nuclear sites: methodological research and application to prospective scenarios

    International Nuclear Information System (INIS)

    2004-12-01

    In the framework of radioactive waste disposal under the law of 1991, an exchange forum constituted by ANDRA, CEA, COGEMA, EdF, Framatome-ANP and IRSN carried out an environmental and sanitary evaluation of the different methods of radioactive waste management. This report presents the six studied scenarios, the proposed methodology, its application to the six scenarios, the analysis of the results, which showed the effectiveness of the different recycling options in limiting the impacts of the electro-nuclear cycle, and a technical conclusion illustrated by possibilities for improving the methodology. (A.L.B.)

  13. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
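
    To illustrate the time-stratified design highlighted above, the snippet below picks, for a hypothetical event date, the referent days that share the same year, month, and day of week with the event day, which is the usual way control periods are chosen in that design. The dates are invented.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day: date):
    """Referent days sharing year, month and weekday with the event day (event day excluded)."""
    referents = []
    d = date(event_day.year, event_day.month, 1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents

# Hypothetical hospital-admission date; pollutant exposure would then be compared
# between this day and its referent days within the same month.
print(time_stratified_referents(date(2009, 7, 15)))
# -> [datetime.date(2009, 7, 1), datetime.date(2009, 7, 8),
#     datetime.date(2009, 7, 22), datetime.date(2009, 7, 29)]
```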

  14. An Evaluation Methodology Development and Application Process for Severe Accident Safety Issue Resolution

    Directory of Open Access Journals (Sweden)

    Robert P. Martin

    2012-01-01

    Full Text Available A general evaluation methodology development and application process (EMDAP) paradigm is described for the resolution of severe accident safety issues. For the broader objective of complete and comprehensive design validation, severe accident safety issues are resolved by demonstrating comprehensive severe-accident-related engineering through applicable testing programs, process studies demonstrating certain deterministic elements, probabilistic risk assessment, and severe accident management guidelines. The basic framework described in this paper extends the top-down, bottom-up strategy described in the U.S. Nuclear Regulatory Commission Regulatory Guide 1.203 to severe accident evaluations addressing U.S. NRC expectations for plant design certification applications.

  15. Application of 'Process management' methodology in providing financial services of PE 'Post Serbia'

    Directory of Open Access Journals (Sweden)

    Kujačić Momčilo D.

    2014-01-01

    Full Text Available The paper describes the application of the 'Process management' methodology to the provision of financial services in the post office counter hall. An overview of the methodology, one of the most commonly used qualitative methodologies, is given, and Process management techniques are described that can better meet user needs and market demands and help to resist current competition in the postal services market more effectively. One of the main problems pointed out is the long waiting time in the counter hall when financial services are provided, which leads to the formation of queues and thus to customer dissatisfaction. Accordingly, the paper identifies steps that should be taken when providing financial services to customers in a postal network unit, optimizing the time users wait in line and increasing the satisfaction of all participants in the process.

  16. Assessment methodology applicable to safe decommissioning of Romanian VVR-S research reactor

    International Nuclear Information System (INIS)

    Baniu, O.; Vladescu, G.; Vidican, D.; Penescu, M.

    2002-01-01

    The paper contains the results of research activity performed by CITON specialists regarding the assessment methodology intended to be applied to safe decommissioning of the research reactors, developed taking into account specific conditions of the Romanian VVR-S Research Reactor. The Romanian VVR-S Research Reactor is an old reactor (1957) and its Decommissioning Plan is under study. The main topics of paper are as follows: Safety approach of nuclear facilities decommissioning. Applicable safety principles; Main steps of the proposed assessment methodology; Generic content of Decommissioning Plan. Main decommissioning activities. Discussion about the proposed Decommissioning Plan for Romanian Research Reactor; Safety risks which may occur during decommissioning activities. Normal decommissioning operations. Fault conditions. Internal and external hazards; Typical development of a scenario. Features, Events and Processes List. Exposure pathways. Calculation methodology. (author)

  17. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify the learning trend related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme is selected for the methodology. Significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code is developed based on the K-Means algorithm and applied to find the learning period in which error rates decrease monotonically with plant age
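
    A minimal sketch of the clustering step described above, using scikit-learn's KMeans on made-up yearly error-rate features; the data and cluster count are illustrative assumptions, not the TRACE data bank.

```python
import numpy as np
from sklearn.cluster import KMeans

# Made-up feature vectors: (plant age in years, human-error event rate per year).
X = np.array([
    [1, 9.0], [2, 8.5], [3, 7.0], [4, 6.2],    # early life: high but falling error rates
    [8, 2.1], [9, 1.8], [10, 1.9], [12, 1.5],  # mature plants: low, stable error rates
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label, centre in enumerate(km.cluster_centers_):
    print(f"cluster {label}: mean age {centre[0]:.1f} y, mean error rate {centre[1]:.2f}/y")
```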

  18. Application of Master Curve Methodology for Structural Integrity Assessments of Nuclear Components

    Energy Technology Data Exchange (ETDEWEB)

    Sattari-Far, Iradj [Det Norske Veritas, Stockholm (Sweden); Wallin, Kim [VTT, Esbo (Finland)

    2005-10-15

    The objective was to perform an in-depth investigation of the Master Curve methodology and, based on this method, to develop a procedure for fracture assessments of nuclear components. The project has sufficiently illustrated the capabilities of the Master Curve methodology for fracture assessments of nuclear components. Within the scope of this work, the theoretical background of the methodology and its validation on small and large specimens have been studied and presented to a sufficiently large extent, as well as the correlations between Charpy-V data and the Master Curve T0 reference temperature in the evaluation of fracture toughness. The work gives a comprehensive report of the background theory and the different applications of the Master Curve methodology. The main results of the work have shown that the cleavage fracture toughness is characterized by a large amount of statistical scatter in the transition region, that it is specimen-size dependent, and that it should be treated statistically rather than deterministically. The Master Curve methodology is able to make use of statistical data in a consistent way. Furthermore, the Master Curve methodology provides a more precise prediction of the fracture toughness of embrittled materials in comparison with the ASME KIC reference curve, which often gives over-conservative results. The suggested procedure in this study, concerning the application of the Master Curve method in fracture assessments of ferritic steels in the transition and lower-shelf regions, is valid for the temperature range T0 - 50 deg C ≤ T ≤ T0 + 50 deg C. If only approximate information is required, the Master Curve may well be extrapolated outside this temperature range. The suggested procedure has also been illustrated for some examples.
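
    For orientation, the Master Curve median fracture toughness for a 1T specimen is commonly written as K_Jc(med) = 30 + 70*exp[0.019*(T - T0)] MPa*sqrt(m) (the form used in ASTM E1921); the sketch below simply evaluates that expression over the validity window quoted above. The T0 value used is an arbitrary illustration, not a value from this study.

```python
import math

def kjc_median(T, T0):
    """Master Curve median fracture toughness for a 1T specimen, MPa*sqrt(m),
    commonly written as K_Jc(med) = 30 + 70*exp(0.019*(T - T0)); T, T0 in deg C."""
    return 30.0 + 70.0 * math.exp(0.019 * (T - T0))

T0 = -80.0   # illustrative reference temperature, deg C
for T in range(int(T0) - 50, int(T0) + 51, 25):   # the T0 +/- 50 deg C validity window
    print(f"T = {T:6.1f} C   K_Jc(med) = {kjc_median(T, T0):6.1f} MPa*sqrt(m)")
```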

  19. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    Science.gov (United States)

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    Trying to respond to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining boundaries and scope of the evaluation, evaluating environmental and economic impacts and identifying best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste, thus can provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example on the choice of the indicators to express the environmental and economic performance. © The Author(s) 2016.

  20. Application of FIVE methodology in probabilistic risk assessment (PRA) of fire events

    International Nuclear Information System (INIS)

    Lopez Garcia, F.J.; Suarez Alonso, J.; Fiolamengual, M.J.

    1993-01-01

    This paper reflects the experience acquired during the process of evaluation and updating of the fire analysis within the Cofrentes NPP PRA. It determines which points are the least precise, either because of their greater uncertainty or because of their excessive conservatism, as well as the subtasks which have involved a larger work load and could be simplified. These aspects are compared with the steps followed in methodology FIVE (Fire Vulnerability Evaluation Methodology) to assess whether application of this methodology would optimize the task, by making it more systematic and realistic and reducing uncertainties. On the one hand, the FIVE methodology does not have the scope sufficient to carry out a quantitative risk evaluation, but it can easily be complemented -without detriment to its systematic nature- by quantifying core damage in significant areas. On the other hand, certain issues such as definition of the fire growth software program which has to be used, are still not fully closed. Nevertheless, the conclusions derived from this assessment are satisfactory, since it is considered that this methodology would serve to unify the criteria and data of the analysis of fire-induced risks, providing a progressive screening method which would considerably simplify the task. (author)

  1. Application of the accident management information needs methodology to a severe accident sequence

    International Nuclear Information System (INIS)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission (NRC) is conducting an Accident Management Research Program that emphasizes the application of severe accident research results to enhance the capability of plant operating personnel to effectively manage severe accidents. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed as part of the research program designed to resolve this issue. The methodology identifies the information needs of the plant personnel during a wide range of accident conditions; the existing plant measurements capable of supplying these information needs and what minor additions, if any, to instrument and display systems would enhance the capability to manage accidents; known limitations on the capability of these measurements to function properly under the conditions that will be present during a wide range of severe accidents; and areas in which the information systems could mislead plant personnel. This paper presents an application of this methodology to a severe accident sequence to demonstrate its use in identifying the information which is available for management of the event. The methodology has been applied to a severe accident sequence in a Pressurized Water Reactor with a large dry containment. An examination of the capability of the existing measurements was then performed to determine whether the information needs can be supplied

  2. Application of NASA Kennedy Space Center system assurance analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    The Kennedy Space Center (KSC) entered into an agreement with the Nuclear Regulatory Commission (NRC) to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. The conclusion is drawn that nuclear power plant systems and aerospace ground support systems are similar in complexity and design and share common safety and reliability goals. The SAA methodology is readily adaptable to nuclear power plant designs because of its practical application of existing and well-known safety and reliability analytical techniques tied to an effective management information system

  3. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    Science.gov (United States)

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  4. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
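
    A toy Monte Carlo sketch in the spirit of the simulation approach described above (not the TORMIS code itself): missile landing positions are sampled from assumed distributions and the fraction landing on a target footprint estimates a hit probability. All distributions and dimensions are invented for illustration.

```python
import random

# Toy Monte Carlo: estimate the probability that a wind-borne missile lands on a
# target footprint, given assumed (invented) transport statistics.
random.seed(0)
N = 100_000
hits = 0
for _ in range(N):
    downwind = random.gauss(150.0, 60.0)     # downwind travel distance [m]
    lateral = random.gauss(0.0, 25.0)        # lateral scatter [m]
    # Target footprint assumed to span 130-170 m downwind and +/-15 m laterally.
    if 130.0 <= downwind <= 170.0 and abs(lateral) <= 15.0:
        hits += 1

print(f"estimated hit probability: {hits / N:.4f}")
```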

  5. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim

    2015-01-01

    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries, in all continents. After a double blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTEC...

  6. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    International Nuclear Information System (INIS)

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein

  7. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries, in all continents. After a double blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH t...

  8. Managing mixed fisheries in the European western waters: application of Fcube methodology

    DEFF Research Database (Denmark)

    Iriondo, Ane; García, Dorleta; Santurtún, Marina

    2012-01-01

    Fisheries management is moving towards ecosystem based management instead of traditional single species based advice. To progress towards an ecosystem approach, a new methodology called “Fleet and Fisheries Forecast” (Fcube) has been proposed. In the application of the method, a precise initial f... the lowest. In this analysis, Western Waters fleet management results show consistency between stocks and their respective TACs. The study highlights that it is possible to deliver advice within the context of mixed fisheries using the Fcube method...

  9. Application of project management methodology in design management of nuclear safety related structure

    International Nuclear Information System (INIS)

    Chen Mao

    2004-01-01

    This paper focuses on the application of project management methodology in the design management of a Nuclear Safety Related Structure (NSRS), considering the design management features of its civil construction. Based on experience from the management of several projects, the project management triangle is proposed for use in the management, so as to properly treat the position of the design interface in the project management. Some other management methods are also proposed

  10. Methodology and application of 13C breath test in gastroenterology practice

    International Nuclear Information System (INIS)

    Yan Weili; Jiang Yibin

    2002-01-01

    The 13C breath test has been widely used in research on nutrition, pharmacology and gastroenterology for properties such as safety and non-invasiveness. The author describes the principle and methodology of the 13C breath test and its application in the detection of Helicobacter pylori infection in the stomach and of small bowel bacterial overgrowth, and in the measurement of gastric emptying, pancreatic exocrine function and liver function with various substrates

  11. Methodologies and applications for critical infrastructure protection: State-of-the-art

    International Nuclear Information System (INIS)

    Yusta, Jose M.; Correa, Gabriel J.; Lacal-Arantegui, Roberto

    2011-01-01

    This work provides an update of the state-of-the-art on energy security relating to critical infrastructure protection. For this purpose, this survey is based upon the conceptual view of OECD countries, and specifically in accordance with EU Directive 114/08/EC on the identification and designation of European critical infrastructures, and on the 2009 US National Infrastructure Protection Plan. The review discusses the different definitions of energy security, critical infrastructure and key resources, and shows some of the experiences in countries considered as international references on the subject, including some information-sharing issues. In addition, the paper carries out a complete review of current methodologies, software applications and modelling techniques around critical infrastructure protection in accordance with their functionality in a risk management framework. The study of threats and vulnerabilities in critical infrastructure systems shows two important trends in methodologies and modelling. A first trend relates to the identification of methods, techniques, tools and diagrams to describe the current state of infrastructure. The other trend addresses the dynamic behaviour of the infrastructure systems by means of simulation techniques including systems dynamics, Monte Carlo simulation, multi-agent systems, etc. - Highlights: → We examine critical infrastructure protection experiences, systems and applications. → Some international experiences are reviewed, including the EU EPCIP Plan and the US NIPP programme. → We discuss current methodologies and applications on critical infrastructure protection, with emphasis on electric networks.

  12. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    The article is devoted to the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a serious analysis of the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system; the second point of view reflects the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze activity models of a wide range of complex information systems in various aspects. The CASE tool ARIS is a suite of tools for the analysis and modeling of an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling the business processes of information systems. In addition, the article is oriented towards the development of curricula for students in information and management specialties, providing an update of the content and structure of courses on modeling the architecture of information systems and on organization management using models.

  13. On the major ductile fracture methodologies for failure assessment of nuclear reactor components

    International Nuclear Information System (INIS)

    Cruz, Julio R.B.; Andrade, Arnaldo H.P. de; Landes, John D.

    1996-01-01

    In structures like nuclear reactor components there is a special concern with the loads that may occur under postulated accident conditions. These loads can cause the stresses to go well beyond the linear elastic limits, requiring the use of ductile fracture mechanics methods for the prediction of the structure's behavior. Since the use of numerical methods to apply EPFM concepts is expensive and time consuming, the existence of analytical engineering procedures is of great relevance. The lack of precision in detail, as compared with numerical nonlinear analyses, is compensated by the possibility of quick failure assessments. This is a determinant factor in situations where a systematic evaluation of a large range of geometries and loading conditions is necessary, as in the application of the Leak-Before-Break (LBB) concept to nuclear piping. This paper outlines four ductile fracture analytical methods, pointing out positive and negative aspects of each one. The objective is to take advantage of this critical review to conceive a new methodology, one that would gather the strong points of the major existing methods and would try to eliminate some of their drawbacks. (author)

  14. Application of an environmental remediation methodology: theory vs. practice reflections and two Belgian case studies - 59184

    International Nuclear Information System (INIS)

    Blommaert, W.; Mannaerts, K.; Pepin, S.; Dehandschutter, B.

    2012-01-01

    As in many countries, polluted industrial sites also exist in Belgium. Although the contamination is purely chemical in most cases, they may also contain a radioactive component. For chemically contaminated sites, extensive regulations and methodologies were already developed and applied by the different regional authorities. However, and essentially because radioactivity is a federal competence, there was also a need to develop a legal federal framework (including an ER-methodology [1]) for the remediation of radioactively contaminated sites. Most of the so-called radioactive contaminated sites exhibit a mixed contamination (chemical and radiological), and hence the development of such a methodology had to be in line with the existing (regional) ones concerning chemical contamination. Since each authority has its own responsibilities with regard to the type of contamination, finding the best solution satisfying all involved parties becomes more complicated and time-consuming. To overcome these difficulties the legal framework and methodology - including the necessary involvement of the stakeholders and the delineation of each party's responsibilities - have to be transparent, clear and unambiguous. Once the methodology is developed as such and approved, its application is expected to be more or less easy, logical and straightforward. But is this really true? The aim of this document is to investigate the impact of factors such as the type of radioactive contamination - levels of contamination, related to NORM activity or not, homogeneous or heterogeneous, the differences in licensing procedures, ... - on the application of the developed methodology, and what the consequences could be in the long run on the remediation process. Two existing case studies in Belgium will be presented ([2]). The first case deals with a historical radium contaminated site, the second one with a phosphate processing facility still in operation, both with (very) low

  15. TACIS 91: Application of leak-before-break concept in VVER 440-230

    Energy Technology Data Exchange (ETDEWEB)

    Bartholome, G.; Faidy, C.; Franco, C. [and others

    1997-04-01

    The applicability of the leak-before-break (LBB) concept for primary piping in the first generation of WWER type plants in Russia is investigated. The procedures for LBB behavior used in France and Germany are applied, and the evaluation is discussed within the framework of the European Technical Assistance for the Community of Independent States (TACIS) project. Emphasis is placed on experimental validation of national and international engineering practice for evaluating and optimizing existing installations. Design criteria of WWER plants are compared to western standard design.

  16. Motivating Students for Project-based Learning for Application of Research Methodology Skills.

    Science.gov (United States)

    Tiwari, Ranjana; Arya, Raj Kumar; Bansal, Manoj

    2017-12-01

    Project-based learning (PBL) is motivational for students to learn research methodology skills. It is a way to engage them and give them ownership over their own learning. The aim of this study is to use PBL for the application of research methodology skills for better learning by encouraging an all-inclusive approach to teaching and learning rather than an individualized tailored approach. The present study was carried out for MBBS 6th- and 7th-semester students of community medicine. Students and faculty were sensitized about PBL and the components of research methodology skills. They worked in small groups. The students were asked to fill in the student feedback questionnaire and the faculty were asked to fill in the faculty feedback questionnaire. Both questionnaires were assessed on a 5-point Likert scale. After the projects were submitted, document analysis was done. A total of 99 students of the 6th and 7th semester participated in PBL. About 90.91% of students agreed that PBL should be continued in subsequent batches. 73.74% felt satisfied and motivated with PBL, whereas 76.77% felt that they would be able to use research methodology in the near future. PBL requires considerable knowledge, effort, persistence, and self-regulation on the part of the students. They need to devise plans, gather information, and evaluate both their findings and their approach. The facilitator plays a critical role in helping students through the process by shaping opportunities for learning, guiding students' thinking, and helping them construct new understanding.

  17. Production methodologies of polymeric and hydrogel particles for drug delivery applications.

    Science.gov (United States)

    Lima, Ana Catarina; Sher, Praveen; Mano, João F

    2012-02-01

    Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro- and nano-scale spherical materials have been developed as carriers for therapies, using appropriate methodologies, in order to achieve a prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluid precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need for improving the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods to produce polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.

  18. Methodology for biosphere analysis in high level waste disposal. Application to the Mediterranean system

    International Nuclear Information System (INIS)

    Pinedo, P.; Simon, I.; Aguero, A.; Cancio, D.

    2000-01-01

    For several years CIEMAT has been developing for ENRESA a conceptual approach and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water bodies or soils). The model development also includes evaluation of the radiological impacts arising from the resulting distribution of radionuclides in the biosphere. At the time the methodology was proposed, the level of development of its different aspects was quite heterogeneous: while radionuclide transport modelling was already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development and the description of biosphere systems representative of the long term, needed further development. The developments have been performed in parallel with international projects in which there has been active participation, mainly the BIOsphere MOdels Validation Study (BIOMOVS II) international project, within which the so-called Reference Biosphere Methodology was developed, and the International Atomic Energy Agency (IAEA) Programme on BIOsphere Modelling and ASSessment methods (BIOMASS), which is under development at present. The methodology takes account of these international developments. The purpose of the work summarised herein is the application of the methodology to the 1997 performance assessment (PA) exercise made by ENRESA, using from it the general and particular information about the assessment context, the source term, and the geo-biosphere interface data. (author)

  19. Methodology of development and students' perceptions of a psychiatry educational smartphone application.

    Science.gov (United States)

    Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M

    2014-01-01

    The usage of smartphones and smartphone applications in the recent decade has indeed become more prevalent. Previous research has highlighted the lack of critical appraisal of new applications and has described a method of using just an Internet browser and a text editor to create an application, but this does not eliminate the challenges faced by clinicians. In addition, even though there has been a high rate of smartphone application usage and acceptance, it is common knowledge that it would cost clinicians as well as their centers a lot to develop smartphone applications catered to their needs and helpful in their daily educational work. The objectives of the current research are thus to highlight a cost-effective methodology for the development of interactive educational smartphone applications, and to determine whether medical students are receptive towards having smartphone applications and what their perspectives are with regard to the contents within. In this study, we elaborate on how the Mastering Psychiatry Online Portal and web-based mobile application were developed using HTML5 as the core programming language. The online portal and web-based application were launched in July 2012 and usage data were obtained. Subsequently, a native application was developed, as it was funded by an educational grant, and students were recruited after their end-of-posting clinical examination to fill in a survey questionnaire relating to their perspectives. Our initial analytical results showed that, since inception, the online portal has received a total of 15,803 views, with a total of 2,109 copies of the online textbook downloaded. As for the online videos, 5,895 viewers have watched the training videos from start to end. 722 users have accessed the mobile textbook application. A total of 185 students participated in the perspective survey, with the majority having positive perspectives about the

  20. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  1. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented. This is based on site conditions, together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ thermal and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs

  2. A general centroid determination methodology, with application to multilayer dielectric structures and thermally stimulated current measurements

    International Nuclear Information System (INIS)

    Miller, S.L.; Fleetwood, D.M.; McWhorter, P.J.; Reber, R.A. Jr.; Murray, J.R.

    1993-01-01

    A general methodology is developed to experimentally characterize the spatial distribution of occupied traps in dielectric films on a semiconductor. The effects of parasitics such as leakage, charge transport through more than one interface, and interface trap charge are quantitatively addressed. Charge transport with contributions from multiple charge species is rigorously treated. The methodology is independent of the charge transport mechanism(s), and is directly applicable to multilayer dielectric structures. The centroid capacitance, rather than the centroid itself, is introduced as the fundamental quantity that permits the generic analysis of multilayer structures. In particular, the form of many equations describing stacked dielectric structures becomes independent of the number of layers comprising the stack if they are expressed in terms of the centroid capacitance and/or the flatband voltage. The experimental methodology is illustrated with an application using thermally stimulated current (TSC) measurements. The centroid of changes (via thermal emission) in the amount of trapped charge was determined for two different samples of a triple-layer dielectric structure. A direct consequence of the TSC analyses is the rigorous proof that changes in interface trap charge can contribute, though typically not significantly, to thermally stimulated current
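
    For reference, in the familiar single-layer MOS case the flatband-voltage shift produced by an areal trapped-charge density Q_ot with centroid x̄ (measured here from the gate electrode, an assumed convention) can be written as below; the report's contribution is to recast such relations in terms of the centroid capacitance so that their form carries over to multilayer stacks. This is only a textbook reminder, not the report's multilayer formulation:

$$
\Delta V_{fb} \;=\; -\,\frac{Q_{ot}\,\bar{x}}{\varepsilon_{ox}} \;=\; -\,\frac{Q_{ot}}{C_{\bar{x}}},
\qquad C_{\bar{x}} \equiv \frac{\varepsilon_{ox}}{\bar{x}}\quad\text{(per unit area)}
$$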

  3. A methodology for developing strategic municipal solid waste management plans with an application in Greece.

    Science.gov (United States)

    Economopoulos, A P

    2010-11-01

    A rational approach for developing optimal municipal solid waste (MSW) management plans comprises the strategic and the detailed planning phases. The present paper focuses on the former, the objective of which is to screen management alternatives so as to select the ones that are able to fulfil all legal and other management requirements with reasonable cost. The analysis considers the transportation, treatment and final disposal of the commingled wastes that remain after the application of material recovery at the source programmes and comprises 10 elements, four of which are region-dependent and the remaining ones application-dependent. These elements and their inter-dependencies are described and the entire methodology is applied to Greece. The application considers the existing regional plans and shows that they are incompatible with the existing EU Directives, as well as overly expensive. To address this problem, a new plan is developed in accordance with the rational planning principles of the present methodology. The comparative evaluation of the above alternatives shows that the existing regional plans, in addition to being incompatible with the applicable EU Directives, require 4.3 to 4.8 times (3.7 to 4.4 billion €) higher capital investment and their annual cost is at least 2.1 to 2.3 times (590 to 735 million € year(-1)) higher in comparison with the new national plan.

  4. Application of the HGPT methodology of reactor operation problems with a nodal mixed method

    International Nuclear Information System (INIS)

    Baudron, A.M.; Bruna, G.B.; Gandini, A.; Lautard, J.J.; Monti, S.; Pizzigati, G.

    1998-01-01

    The heuristically based generalized perturbation theory (HGPT), to first and higher order, applied to the neutron field of a reactor system, is discussed in relation to quasistatic problems. This methodology is of particular interest in reactor operation. In this application it may allow an on-line appraisal of the main physical responses of the reactor system when it is subject to alterations relevant to normal system exploitation, e.g. control rod movements and/or soluble boron concentration changes introduced, for instance, to compensate power level variations following electrical network demands. In this paper, after describing the main features of the theory, its implementation into MINOS, the 3D mixed dual nodal diffusion code of the SAPHYR system, is presented. The results from a small-scale investigation performed on a simplified PWR system corroborate the validity of the proposed methodology
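
    For orientation, perturbation approaches of this family build on the classical first-order adjoint-weighted expression for a reactivity response, written below with A the removal/transport operator, F the fission production operator, φ the flux, φ† the adjoint and k the multiplication factor. It is shown only as a reminder of the underlying idea, not as the HGPT formulation of the paper, which generalizes to arbitrary responses and quasistatic problems:

$$
\delta\rho \;\approx\; \frac{\left\langle \phi^{\dagger},\,\left(\tfrac{1}{k}\,\delta F - \delta A\right)\phi \right\rangle}{\left\langle \phi^{\dagger},\, F\,\phi \right\rangle}
$$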

  5. Assessment of ISLOCA risk: Methodology and application to a Westinghouse four-loop ice condenser plant

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Westinghouse four-loop ice condenser plant. This document also includes appendices A through I which provide: System descriptions; ISLOCA event trees; human reliability analysis; thermal hydraulic analysis; core uncovery timing calculations; calculation of system rupture probability; ISLOCA consequences analysis; uncertainty analysis; and component failure analysis.

  6. Assessment of ISLOCA risk: Methodology and application to a Westinghouse four-loop ice condenser plant

    International Nuclear Information System (INIS)

    Kelly, D.L.; Auflick, J.L.; Haney, L.N.

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Westinghouse four-loop ice condenser plant. This document also includes appendices A through I which provide: System descriptions; ISLOCA event trees; human reliability analysis; thermal hydraulic analysis; core uncovery timing calculations; calculation of system rupture probability; ISLOCA consequences analysis; uncertainty analysis; and component failure analysis
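
    The event-tree quantification step referred to above reduces, sequence by sequence, to multiplying an initiating-event frequency by the conditional branch probabilities and summing over sequences. The sketch below only illustrates that arithmetic; the sequence names, frequencies and probabilities are hypothetical placeholders, not values from the report.

```python
# Minimal sketch of event-tree sequence quantification for an ISLOCA-type
# analysis; all numerical values are hypothetical placeholders.

sequences = [
    # (sequence label, initiating-event frequency [/yr], [conditional branch probabilities])
    ("interfacing line A", 1.0e-3, [1.0e-2, 5.0e-1]),  # e.g. rupture given overpressure, failure to isolate
    ("interfacing line B", 2.0e-4, [3.0e-2, 2.0e-1]),
]

def sequence_frequency(ie_freq, branch_probs):
    f = ie_freq
    for p in branch_probs:
        f *= p
    return f

core_damage_frequency = sum(sequence_frequency(f, b) for _, f, b in sequences)
print(f"Total ISLOCA core damage frequency ~ {core_damage_frequency:.2e} per reactor-year")
```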

  7. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

    We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two main, complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase proposes to analyze the functional and dysfunctional behavior of the system and then determine its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns, and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
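
    The highlights mention reliability block diagrams (RBD) among the quantitative tools; the sketch below shows the basic series/parallel RBD arithmetic on which such an evaluation rests. The component reliabilities and the architecture are hypothetical, not Pack'Aero data.

```python
# Minimal reliability-block-diagram (RBD) sketch for a mechatronic chain;
# component reliabilities below are hypothetical placeholders.

def series(*reliabilities):
    """All blocks must work: R = product of R_i."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(*reliabilities):
    """At least one block must work: R = 1 - product of (1 - R_i)."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q

# Example: mechanics and sensor in series with a duplicated (redundant) controller.
r_mechanics, r_controller, r_sensor = 0.995, 0.98, 0.99
r_system = series(r_mechanics, parallel(r_controller, r_controller), r_sensor)
print(f"System reliability over the mission: {r_system:.4f}")
```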

  8. Application of the Biosphere Assessment Methodology to the ENRESA, 1997 Performance and Safety Assessment

    International Nuclear Information System (INIS)

    Pinedo, P.; Simon, I.; Aguero, A.

    1998-01-01

    For several years CIEMAT has been developing for ENRESA knowledge and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water bodies or soils). The model development also includes evaluation of the radiological impacts arising from the resulting distribution of radionuclides in the biosphere. In 1996, a Methodology to analyse the biosphere in this context was proposed to ENRESA. The level of development of the different aspects proposed within the Methodology was quite heterogeneous and, while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development and the description of biosphere systems representative of the long term, needed further development. At present, the International Atomic Energy Agency (IAEA) Programme on Biosphere Modelling and Assessment (BIOMASS), in collaboration with several national organizations, ENRESA and CIEMAT among them, is working to complete and augment the Reference Biosphere Methodology and to produce some practical descriptions of Reference Systems. The overall purpose of this document is to apply the Methodology, taking account of on-going developments in biosphere modelling, to the last performance assessment (PA) exercise made by ENRESA (ENRESA, 1997), using from it the general and particular information about the assessment context, radionuclide information, and geosphere and geo-biosphere interface data. There are three particular objectives to this work: (a) to determine the practicability of the Methodology in an application to a realistic assessment situation; (b) to compare and contrast previous biosphere modelling in HLW PA; and (c) to test software developments related to data management and modelling. (Author) 42 refs

  9. Application of Haddon’s matrix in qualitative research methodology: an experience in burns epidemiology

    Directory of Open Access Journals (Sweden)

    Deljavan R

    2012-07-01

    Background: Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon’s matrix) through qualitative research methods to better understand people’s perceptions about burn injuries. Methods: This study applied Haddon’s matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon’s matrix was used to develop an interview guide and also through the analysis phase. Results: The main analysis clusters were the pre-event level/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event level/object, pre-event level/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Conclusion: Haddon’s matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon’s matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries

  10. Application of low-cost methodologies for mobile phone app development.

    Science.gov (United States)

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just an Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have not been any disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions towards the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. This is one of the few studies that have demonstrated the low

  11. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    International Nuclear Information System (INIS)

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

    In NED 119, No. 1 (May 1990) a series of six papers published by a Technical Program Group presented a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of a two-loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. Steam generator plugging level and safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. Results were also compared to the analytical calculation. (orig.)
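
    The response-surface step described above amounts to fitting a low-order polynomial to a set of code runs in the uncertain inputs. The sketch below shows that idea with synthetic numbers; the plugging levels, injection flows and peak cladding temperatures are placeholders, not RELAP5/MOD2 results, and the regression form is only one of the options mentioned in the abstract.

```python
import numpy as np

# Minimal quadratic response-surface sketch: fit peak cladding temperature (PCT)
# as a function of two uncertain inputs. All data points are synthetic placeholders.

# x1: steam generator plugging level (fraction), x2: safety injection flow (kg/s)
x1 = np.array([0.00, 0.00, 0.10, 0.10, 0.05, 0.05, 0.00, 0.10])
x2 = np.array([30.0, 40.0, 30.0, 40.0, 35.0, 30.0, 35.0, 35.0])
pct = np.array([1150., 1100., 1210., 1160., 1155., 1185., 1120., 1190.])  # K

# Design matrix for a full quadratic surface: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeff, *_ = np.linalg.lstsq(X, pct, rcond=None)

def surface(v1, v2):
    """Evaluate the fitted response surface at a new input point."""
    return coeff @ np.array([1.0, v1, v2, v1**2, v2**2, v1 * v2])

print(f"Predicted PCT at 5% plugging, 32 kg/s: {surface(0.05, 32.0):.0f} K")
```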

  12. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-regions hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended for 3D tetra-hedra sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach on discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  13. Application of a methodology based on the Theory of Constraints in the sector of tourism services

    Directory of Open Access Journals (Sweden)

    Reyner Pérez Campdesuñer

    2017-04-01

    Purpose: The research aimed to implement the theory of constraints under the operating conditions of a hotel, which differ from those of the traditional processes to which this method has been applied because of the great heterogeneity of resources needed to meet customer demand. Design/methodology/approach: To achieve this purpose, a method of generating conversion equations was devised to express all the resources of the organization under study as a function of the number of customers to be served, facilitating comparison between the different resources and the demand estimated through traditional forecasting techniques; these features were integrated into the classical theory-of-constraints methodology. Findings: The application of tools designed for hospitality organizations made it possible to demonstrate the applicability of the theory of constraints to entities operating under conditions different from the usual ones, to develop a set of conversion equations for the different resources facilitating comparison with demand, and consequently to improve the levels of efficiency and effectiveness of the organization. Originality/value: The originality of the research lies in the application of the theory of constraints under conditions very different from the usual ones, covering 100% of the processes and resources of hospitality organizations.
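
    The conversion-equation idea described above can be illustrated in a few lines: express each resource's capacity in customers served per day and compare it with forecast demand to locate the constraint. The resources, rates and demand figure below are hypothetical placeholders, not the hotel's data.

```python
# Minimal sketch of the "conversion equation" idea: every resource is converted
# into a capacity in customers served per day. All figures are hypothetical.

resources = {
    # resource: (available units, customers served per unit per day)
    "rooms":            (120, 2.0),
    "restaurant seats": (200, 3.0),
    "reception staff":  (6, 60.0),
    "housekeeping":     (15, 18.0),
}

forecast_demand = 300  # customers per day (hypothetical forecast)

capacity = {name: units * rate for name, (units, rate) in resources.items()}
constraint = min(capacity, key=capacity.get)

for name, cap in sorted(capacity.items(), key=lambda kv: kv[1]):
    print(f"{name:18s} capacity = {cap:6.0f} customers/day")
print(f"System constraint: {constraint} "
      f"(capacity {capacity[constraint]:.0f} vs demand {forecast_demand})")
```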

  14. Defense nuclear energy systems selection methodology for civil nuclear power applications

    International Nuclear Information System (INIS)

    Scarborough, J.C.

    1986-01-01

    A methodology developed to select a preferred nuclear power system for a US Department of Defense (DOD) application has been used to evaluate preferred nuclear power systems for a remote island community in Southeast Asia. The plant would provide ∼10 MW of electric power, possibly low-temperature process heat for the local community, and would supplement existing island diesel electric capacity. The nuclear power system evaluation procedure was evolved from a disciplined methodology for ranking ten nuclear power designs under joint development by the US Department of Energy (DOE) and DOD. These included six designs proposed by industry for the Secure Military Power Plant Program (now termed Multimegawatt Terrestrial Reactor Program), the SP-100 Program, the North Warning System Program, and the Modular Advanced High-Temperature Gas-Cooled Reactor (HTGR) and Liquid-Metal Reactor (LMR) programs. The 15 evaluation criteria established for the civil application were generally similar to those developed and used for the defense energy systems evaluation, except that the weighting factor applied to each individual criterion differed. The criteria and their weighting (importance) functions for the civil application are described
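
    At its core, a ranking procedure of this kind is a weighted multi-criteria scoring. The sketch below shows that mechanism only; the criteria, weights, candidate labels and scores are hypothetical and do not reproduce the 15 criteria or weighting functions of the study.

```python
# Minimal weighted-sum ranking sketch for a multi-criteria evaluation;
# criteria, weights, candidates and scores are hypothetical placeholders.

criteria_weights = {"safety": 0.30, "cost": 0.25, "maturity": 0.20,
                    "logistics": 0.15, "licensability": 0.10}

# Candidate scores on a 0-10 scale for each criterion (illustrative only).
candidates = {
    "System A": {"safety": 9, "cost": 6, "maturity": 7, "logistics": 6, "licensability": 7},
    "System B": {"safety": 8, "cost": 7, "maturity": 6, "logistics": 7, "licensability": 6},
    "System C": {"safety": 7, "cost": 8, "maturity": 9, "logistics": 8, "licensability": 8},
}

def weighted_score(scores):
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name:10s} weighted score = {weighted_score(scores):.2f}")
```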

  15. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  16. Methodology for digital radiography simulation using the Monte Carlo code MCNPX for industrial applications

    International Nuclear Information System (INIS)

    Souza, E.M.; Correa, S.C.A.; Silva, A.X.; Lopes, R.T.; Oliveira, D.F.

    2008-01-01

    This work presents a methodology for digital radiography simulation for industrial applications using the MCNPX radiography tally. In order to perform the simulation, the energy-dependent response of a BaFBr imaging plate detector was modeled and introduced in the MCNPX radiography tally input. In addition, a post-processing program was used to convert the MCNPX radiography tally output into 16-bit digital images. Simulated and experimental images of a steel pipe containing corrosion alveoli and stress corrosion cracking were compared, and the results showed good agreement between both images
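
    The post-processing step mentioned above (converting the radiography tally output into 16-bit digital images) reduces to rescaling the tally array onto the 16-bit dynamic range. The sketch below shows that step on a synthetic array; reading the actual MCNPX tally files and the imaging-plate response model are left out.

```python
import numpy as np

# Minimal post-processing sketch: rescale a radiography-tally array into a
# 16-bit grayscale image. The tally array here is a synthetic placeholder.

tally = np.random.default_rng(0).random((256, 256))  # placeholder fluence map

# Normalise to the full 16-bit dynamic range.
lo, hi = tally.min(), tally.max()
image16 = np.round((tally - lo) / (hi - lo) * 65535).astype(np.uint16)

# image16 can now be written out with any library that supports 16-bit
# grayscale formats (e.g. PNG or TIFF); that I/O step is omitted here.
print(image16.dtype, image16.min(), image16.max())
```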

  17. SystemVerilog assertions and functional coverage guide to language, methodology and applications

    CERN Document Server

    Mehta, Ashok B

    2013-01-01

    This book provides a hands-on, application-oriented guide to the language and methodology of both SystemVerilog Assertions and SystemVerilog Functional Coverage.  Readers will benefit from the step-by-step approach to functional hardware verification, which will enable them to uncover hidden and hard-to-find bugs, point directly to the source of the bug, provide for a clean and easy way to model complex timing checks and objectively answer the question 'have we functionally verified everything'.  Written by a professional end-user of both SystemVerilog Assertions and SystemVerilog Functional Co

  18. Application of NASA Kennedy Space Center System Assurance Analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    In May of 1982, the Kennedy Space Center (KSC) entered into an agreement with the NRC to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. North Carolina's Duke Power Company expressed an interest in the study and proposed the nuclear power facility at CATAWBA for the basis of the study. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports (FSAR) as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. (orig./HP)

  19. Vedic division methodology for high-speed very large scale integration applications

    Directory of Open Access Journals (Sweden)

    Prabir Saha

    2014-02-01

    Transistor-level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The potentiality of the ‘Dhvajanka (on top of the flag)’ formula was adopted from Vedic mathematics to implement such a divider for practical very large scale integration applications. The division methodology was implemented through half of the divisor bits instead of the actual divisor, subtraction and a little multiplication. Propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked and performance parameters like propagation delay and dynamic power consumption were calculated through SPICE Spectre with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16) bit divider circuitry was only ∼300 ns and it consumed ∼32.5 mW power for a layout area of 17.39 mm^2. By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations were eliminated, resulting in ∼47, ∼38 and ∼34% reductions in delay and ∼34, ∼21 and ∼18% reductions in power compared with the most widely used architectures (e.g. digit-recurrence, Newton–Raphson and Goldschmidt).
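
    The ‘Dhvajanka’ flow itself is not reproduced here; for orientation, the sketch below illustrates the Newton–Raphson scheme that the Letter lists among its comparison baselines. Plain floating point is used for clarity, whereas hardware implementations work in fixed point; the seed constants are the classical linear initial approximation.

```python
# Minimal sketch of Newton-Raphson division (one of the baselines cited above),
# not the Vedic Dhvajanka architecture. Floats stand in for fixed-point hardware.

def newton_raphson_divide(n, d, iterations=4):
    """Compute n/d by iterating x_{k+1} = x_k * (2 - d * x_k) towards 1/d."""
    assert d != 0
    # Scale |d| into [0.5, 1) so the simple linear seed converges quickly.
    shift, dd = 0, abs(d)
    while dd >= 1.0:
        dd /= 2.0
        shift += 1
    while dd < 0.5:
        dd *= 2.0
        shift -= 1
    x = 48.0 / 17.0 - 32.0 / 17.0 * dd   # classical initial approximation of 1/dd
    for _ in range(iterations):
        x = x * (2.0 - dd * x)           # quadratic convergence to 1/dd
    recip = x / (2.0 ** shift)           # undo the scaling
    if d < 0:
        recip = -recip
    return n * recip

print(newton_raphson_divide(123456, 789))   # ~156.47
```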

  20. Calculation of t8/5 by response surface methodology for electric arc welding applications

    Directory of Open Access Journals (Sweden)

    Meseguer-Valdenebro José Luis

    2014-01-01

    One of the greatest difficulties traditionally found in stainless steel construction has been the execution of welded parts. At the present time, the available technology allows us to use arc welding processes for that application without any disadvantage. Response surface methodology is used to optimise a process in which the variables that take part are not related to each other by a known mathematical law; therefore, an empirical model must be formulated. With this methodology the optimisation of one selected variable may be carried out. In this work, the cooling time from 800 to 500 °C, t8/5, after a TIG welding operation is modelled by the response surface method. The arc power, the welding velocity and the thermal efficiency factor are considered as the variables that influence the t8/5 value. Different cooling times, t8/5, for different combinations of values of the variables are previously determined by a numerical method. The input values for the variables have been experimentally established. The results indicate that response surface methodology may be considered a valid technique for these purposes.
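
    For context, the cooling time from 800 to 500 °C under thick-plate (three-dimensional) heat flow is commonly estimated with the classical Rykalin-type relation below, written in terms of the same variables named in the abstract (thermal efficiency η, arc power U·I, travel speed v, thermal conductivity λ, preheat temperature T₀). This is a standard textbook expression, not necessarily the numerical model used in the paper:

$$
t_{8/5} \;=\; \frac{\eta\,U I}{2\pi\lambda\,v}\left(\frac{1}{500-T_0}-\frac{1}{800-T_0}\right)
$$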

  1. Water and Carbon Footprint of Wine: Methodology Review and Application to a Case Study

    Directory of Open Access Journals (Sweden)

    Sara Rinaldi

    2016-07-01

    Life cycle assessments (LCAs) play a strategic role in improving the environmental performance of a company and in supporting successful marketing communication. The high impact of the food industry on natural resources, in terms of water consumption and greenhouse gas emissions, has been focusing the attention of consumers and producers on environmentally sustainable products. This work presents a comprehensive approach for the joint evaluation of the carbon (CF) and water (WF) footprints of the wine industry from a cradle-to-grave perspective. The LCA analysis is carried out following the requirements of international standards (ISO/TS 14067 and ISO 14046). A complete review of the water footprint methodology is presented and guidelines for all the phases of the evaluation procedure are provided, including acquisition and validation of input data, allocation, application of analytic models, and interpretation of the results. The strength of this approach is the implementation of a side-by-side CF vs. WF assessment, based on the same system boundaries, functional unit, and input data, that allows a reliable comparison between the two indicators. In particular, a revised methodology is presented for the evaluation of the grey water component. The methodology was applied to a white and a red wine produced in the same company. A comparison between the two products is presented for each LCA phase along with literature results for similar wines.
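
    For reference, the grey water component mentioned above is conventionally defined (Water Footprint Network formulation) as the dilution volume needed to assimilate a pollutant load; the revised methodology of the paper builds on this standard form, where AR is the applied mass of the chemical, α the leaching-runoff fraction, c_max the ambient water quality standard and c_nat the natural background concentration:

$$
WF_{grey} \;=\; \frac{\alpha \times AR}{c_{max} - c_{nat}}
$$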

  2. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013) which was co-organized by the Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS) and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and International Federation for Information Processing (IFIP). This proceedings brings together researchers, engineers, applied mathematicians and practitioners working in the advances and applications in the field of system simulation.

  3. Probablistic risk assessment methodology application to Indian pressurised heavy water reactors

    International Nuclear Information System (INIS)

    Babar, A.K.; Grover, R.B.; Mehra, V.K.; Gangwal, D.K.; Chakraborty, G.

    1987-01-01

    Probabilistic risk assessment in the context of nuclear power plants is associated with models that predict the offsite radiological releases resulting from reactor accidents. Level 1 PRA deals with the identification of accident sequences relevant to the design of a system and with their quantitative estimation, and is characterised by event tree and fault tree analysis. The initiating events applicable to pressurised heavy water reactors have been considered, and the dominant initiating events requiring detailed study are identified in this paper. Reliability analysis and the associated problems encountered during the case studies are mentioned briefly. It is imperative to validate the failure data used for the analysis; a Bayesian technique has been employed for this purpose, and a brief account is included herein. A few important observations made during the application of the probabilistic risk assessment methodology, e.g. the effects of the presence of the moderator, are also discussed. (author)
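
    As a sketch of the Bayesian failure-data treatment mentioned above, the snippet below updates a generic Gamma prior for a component failure rate with plant-specific Poisson evidence (the conjugate Gamma-Poisson pair). The prior parameters and the observed evidence are hypothetical; the paper's actual data and distributional choices may differ.

    # Minimal sketch of a Bayesian update of a component failure rate, of the
    # kind used to validate failure data in a Level 1 PRA.
    from scipy import stats

    # Generic (prior) failure-rate estimate: Gamma(alpha0, beta0), rate in 1/h
    alpha0, beta0 = 0.5, 1.0e5          # hypothetical generic-data prior

    # Plant-specific evidence: n failures observed in T component-hours
    n_failures, T_hours = 2, 4.0e5      # hypothetical operating experience

    # Conjugate update: posterior is Gamma(alpha0 + n, beta0 + T)
    alpha_post, beta_post = alpha0 + n_failures, beta0 + T_hours

    posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
    print("posterior mean failure rate [1/h]:", posterior.mean())
    print("90% credible interval [1/h]:", posterior.interval(0.90))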

  4. Application of the BEPU methodology to assess fuel performance in dry storage

    International Nuclear Information System (INIS)

    Feria, F.; Herranz, L.E.

    2017-01-01

    Highlights: • Application of the BEPU methodology to estimate the cladding stress in dry storage. • The predicted stress is notably affected by the irradiation history. • Improvements in FGR modelling would significantly enhance the stress estimates. • The prediction uncertainty should not be disregarded when assessing clad integrity. - Abstract: The stress to which the fuel cladding is subjected in dry storage is the driving force of the main postulated degradation mechanisms (i.e., embrittlement due to radial hydride reorientation and creep). Therefore, a sound assessment is mandatory to reliably predict fuel performance under the conditions prevailing in dry storage. Best estimate calculations can be conducted with fuel rod thermo-mechanical codes. The precision of the predictions depends on the uncertainties affecting the way the stress is calculated, so by using uncertainty analysis an upper bound of the stress can be determined and compared to the safety limits set. The present work shows the application of the BEPU (Best Estimate Plus Uncertainty) methodology in this field. Concretely, radial hydride reorientation has been assessed based on stress predictions under challenging thermal conditions (400 °C) and a stress limit of 90 MPa. The computational tools used to do that are FRAPCON-3xt (best estimate) and Dakota (uncertainty analysis). The methodology has been applied to a typical PWR fuel rod with high burnup (65 GWd/tU) and different power histories. The study performed allows concluding that neither the power history nor the prediction uncertainty should be disregarded when fuel rod integrity is evaluated in dry storage. On probabilistic bases, a burnup of 60 GWd/tU is found to be an acceptable threshold even under the most challenging irradiation conditions considered.
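
    To make the BEPU idea concrete, the sketch below propagates hypothetical input uncertainties through a simple thin-wall hoop-stress formula and compares the resulting upper bound with the 90 MPa limit quoted in the abstract. The stress model and all distributions are stand-ins for the FRAPCON-3xt/Dakota calculation and are purely illustrative.

    # Minimal sketch of a BEPU-style uncertainty propagation of the cladding
    # hoop stress in dry storage. Numbers and distributions are hypothetical.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 59  # common BEPU sample size (first-order, one-sided Wilks at 95/95)

    # Hypothetical input uncertainties
    p_internal = rng.normal(8.0, 0.8, n)     # rod internal pressure at 400 C [MPa]
    radius     = rng.normal(4.7, 0.05, n)    # cladding mid-wall radius [mm]
    thickness  = rng.normal(0.57, 0.02, n)   # cladding thickness [mm]

    # Thin-wall hoop stress sigma = p * r / t (simplified surrogate model)
    sigma = p_internal * radius / thickness

    print("best-estimate (mean) stress [MPa]:", sigma.mean())
    print("sample maximum (95/95 bound for n=59) [MPa]:", sigma.max())
    print("margin to the 90 MPa limit [MPa]:", 90.0 - sigma.max())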

  5. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    Science.gov (United States)

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
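
    A toy version of the proposed robustness metric can be written down directly: normalize the ultimate pushdown capacity under each sudden column-loss scenario by the applicable service-level gravity load and take the minimum over all scenarios. The scenario names and numbers below are hypothetical, not results from the prototype buildings.

    # Minimal sketch of the robustness metric described above.
    ultimate_capacity = {      # pushdown capacities under sudden column loss [kN]
        "corner":   5200.0,
        "edge":     6100.0,
        "interior": 6800.0,
    }
    service_gravity_load = {   # applicable service-level gravity loading [kN]
        "corner":   3000.0,
        "edge":     3400.0,
        "interior": 3900.0,
    }

    normalized = {k: ultimate_capacity[k] / service_gravity_load[k]
                  for k in ultimate_capacity}
    robustness_index = min(normalized.values())   # minimum over removal scenarios

    print("normalized capacities:", normalized)
    print("robustness index:", robustness_index)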

  6. Assessment of brain perfusion with MRI: methodology and application to acute stroke

    International Nuclear Information System (INIS)

    Grandin, C.B.

    2003-01-01

    We review the methodology of brain perfusion measurements with MRI and their application to acute stroke, with particular emphasis on the work awarded by the 6th Lucien Appel Prize for Neuroradiology. The application of the indicator dilution theory to the dynamic susceptibility-weighted bolus-tracking method is explained, as is the approach to obtaining quantitative measurements of cerebral blood flow (CBF) and volume (CBV). Our contribution to methodological developments, such as CBV measurement with the frequency-shifted burst sequence, development of the PRESTO sequence, comparison of different deconvolution methods and of spin- and gradient-echo sequences, and the validation of MRI measurements against positron emission tomography is summarised. The pathophysiology of brain ischaemia and the role of neuroimaging in the setting of acute stroke are reviewed, with an introduction to the concepts of ischaemic penumbra and diffusion/perfusion mismatch. Our work on the determination of absolute CBF and CBV thresholds for predicting the area of infarct growth, identification of the best perfusion parameters (relative or absolute) for predicting the area of infarct growth and the role of MR angiography is also summarised. We conclude that MRI is a very powerful way to assess brain perfusion and that its use might help in selecting patients who will benefit most from treatment such as thrombolysis. (orig.)
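
    For the bolus-tracking quantification discussed above, a minimal sketch of truncated-SVD deconvolution of a synthetic tissue curve with an arterial input function is given below: the maximum of the recovered residue function estimates CBF, and the ratio of curve areas gives CBV. The synthetic curves, the Toeplitz formulation and the 20% singular-value cutoff are generic textbook choices, not the specific deconvolution methods compared in the paper.

    # Minimal sketch of dynamic susceptibility-contrast deconvolution with
    # truncated SVD. All curves are synthetic and units are arbitrary.
    import numpy as np

    dt = 1.0                                    # sampling interval [s]
    t = np.arange(0, 60, dt)

    # Synthetic AIF (gamma-variate) and a true residue function R(t) = exp(-t/MTT)
    aif = (t / 6.0) ** 3 * np.exp(-t / 1.5)
    cbf_true, mtt = 0.01, 4.0
    residue_true = np.exp(-t / mtt)
    tissue = cbf_true * dt * np.convolve(aif, residue_true)[: len(t)]

    # Lower-triangular convolution matrix A[i, j] = dt * AIF[i - j]
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(len(t))] for i in range(len(t))])

    # Truncated SVD inversion (threshold relative to the largest singular value)
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > 0.2 * s.max(), 1.0 / s, 0.0)
    residue_est = Vt.T @ np.diag(s_inv) @ U.T @ tissue

    print("recovered CBF (max of residue):", residue_est.max(), "true:", cbf_true)
    print("CBV (tissue/AIF area ratio):", tissue.sum() / aif.sum())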

  7. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: When writing the article the following research methods were used: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding a unified investment strategy for the country has not been developed. After analyzing the development of the strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (sectors, regions, etc.), as, through its defined directions and guidelines of activity, it will increase the level of investment in the country and support the national strategy “Ukraine-2020”.

  8. Application of a power plant simplification methodology: The example of the condensate feedwater system

    International Nuclear Information System (INIS)

    Seong, P.H.; Manno, V.P.; Golay, M.W.

    1988-01-01

    A novel framework for the systematic simplification of power plant design is described with a focus on the application for the optimization of condensate feedwater system (CFWS) design. The evolution of design complexity of CFWS is reviewed with emphasis upon the underlying optimization process. A new evaluation methodology which includes explicit accounting of human as well as mechanical effects upon system availability is described. The unifying figure of merit for an operating system is taken to be net electricity production cost. The evaluation methodology is applied to the comparative analysis of three designs. In the illustrative examples, the results illustrate how inclusion in the evaluation of explicit availability related costs leads to optimal configurations. These are different from those of current system design practices in that thermodynamic efficiency and capital cost optimization are not overemphasized. Rather a more complete set of design-dependent variables is taken into account, and other important variables which remain neglected in current practices are identified. A critique of the new optimization approach and a discussion of future work areas including improved human performance modeling and different optimization constraints are provided. (orig.)

  9. Development and application of a hybrid transport methodology for active interrogation systems

    Energy Technology Data Exchange (ETDEWEB)

    Royston, K.; Walters, W.; Haghighat, A. [Nuclear Engineering Program, Department of Mechanical Engineering, Virginia Tech., 900 N Glebe Rd., Arlington, VA 22203 (United States); Yi, C.; Sjoden, G. [Nuclear and Radiological Engineering, Georgia Tech, 801 Ferst Drive, Atlanta, GA 30332 (United States)

    2013-07-01

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of gamma source distribution from (n, γ) interactions; iii) determination of gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values and taking significantly less time than a reference MCNP5 calculation. (authors)
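
    Step iii) above amounts to folding the computed gamma source distribution with a pre-calculated adjoint (importance) function over the spatial mesh and gamma groups. The sketch below shows that inner product; the mesh size, group count and all numerical values are hypothetical, and the real AIMS implementation will of course differ in detail.

    # Minimal sketch of folding a gamma source with a pre-calculated adjoint
    # function to obtain the gamma current at a detector window.
    import numpy as np

    n_cells, n_groups = 1000, 5                 # spatial mesh cells, gamma groups
    rng = np.random.default_rng(0)

    # Step ii) output: gamma source density per cell and group [gammas/(s*cm^3)]
    gamma_source = rng.random((n_cells, n_groups)) * 1.0e4

    # Pre-calculated adjoint function: importance of a source gamma in each cell
    # and group to the current at the detector window
    adjoint = rng.random((n_cells, n_groups)) * 1.0e-6

    cell_volumes = np.full(n_cells, 8.0)        # cm^3, uniform mesh assumed

    # Detector response = inner product of source and importance over the mesh
    gamma_current = np.einsum("c,cg,cg->", cell_volumes, gamma_source, adjoint)
    print("gamma current at detector window [1/s]:", gamma_current)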

  10. Assessment of ISLOCA risk: Methodology and application to a Babcock and Wilcox nuclear power plant

    International Nuclear Information System (INIS)

    Galyean, W.J.; Gertman, D.I.

    1992-04-01

    This report presents information essential to understanding the risk associated with inter-system loss-of-coolant accidents (ISLOCAs). The methodology developed and presented in the report provides a state-of-the-art method for identifying and evaluating plant-specific hardware design, human performance issues, and accident consequence factors relevant to the prediction of ISLOCA risk. This ISLOCA methodology was developed and then applied to a Babcock and Wilcox (B&W) nuclear power plant. The results from this application are described in detail. For this particular B&W reference plant, the assessment indicated that the probability of a severe ISLOCA is approximately 2.2E-06/reactor-year. This document, Volume 3, provides Appendices A-H of the report. Topics are: historical experience related to ISLOCA events; component failure rates; reference B&W plant system descriptions; reference B&W plant ISLOCA event trees; human reliability analysis for the B&W ISLOCA probabilistic risk assessment; thermal hydraulic calculations; bounding core uncovery time calculations; and system rupture probability.

  11. Application of machine learning methodology for pet-based definition of lung cancer

    Science.gov (United States)

    Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.

    2010-01-01

    We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (NSCLC) tumours in positron-emission tomography–computed tomography (PET–CT) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a PET–CT and a treatment-planning CT image. The reference gross tumour volume (GTV) was identified by two experienced radiation oncologists, who also determined reference standardized uptake value (SUV) thresholds that most closely approximated the GTV contour on each slice. A set of uptake distribution-related attributes was calculated for each PET slice. A machine learning algorithm was trained on a subset of the PET slices to cope with slice-to-slice variation in the optimal SUV threshold: that is, to predict the most appropriate SUV threshold from the calculated attributes for each slice. The algorithm’s performance was evaluated using the remainder of the PET slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference SUV thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in NSCLC. PMID:20179802
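
    The sketch below reproduces the idea on synthetic data: learn a per-slice SUV threshold from uptake-distribution attributes and compare the thresholded region against the reference contour with the Jaccard index. The slice generator, the attribute set and the random-forest regressor are assumptions made for illustration and are not the paper's algorithm or data.

    # Minimal sketch: predict a per-slice threshold from image attributes and
    # score the overlap with the reference-threshold region (Jaccard index).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def make_slice():
        """Synthetic PET slice: noisy background plus a hot 'lesion' block."""
        img = rng.gamma(2.0, 0.5, (64, 64))
        img[20:30, 20:30] += rng.uniform(4.0, 8.0)
        return img

    def attributes(img):
        """Uptake-distribution attributes used as regression features."""
        return [img.max(), img.mean(), img.std(), np.percentile(img, 90)]

    slices = [make_slice() for _ in range(60)]
    ref_thresholds = [0.45 * img.max() for img in slices]   # stand-in reference

    X = np.array([attributes(img) for img in slices])
    y = np.array(ref_thresholds)

    train, test = slice(0, 40), slice(40, 60)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[train], y[train])
    pred = model.predict(X[test])

    def jaccard(a, b):
        return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

    scores = [jaccard(img >= t_hat, img >= t_ref)
              for img, t_hat, t_ref in zip(slices[40:], pred, y[test])]
    print("mean Jaccard index on held-out slices:", float(np.mean(scores)))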

  12. An update on technical and methodological aspects for cardiac PET applications

    International Nuclear Information System (INIS)

    PRESOTTO, Luca; BUSNARDO, Elena; GIANOLLI, Luigi; BETTINARDI, Valentino

    2016-01-01

    Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans while very small targets have to be imaged in inflammation/infection and plaques examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of studies has specific requirements from the technical and methodological point of view, thus PET systems with overall high performances are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/Magnetic Resonance Imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved exploiting information from the other modality, using advanced algorithms. In this review we will report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of the hybrid PET/CT and PET/MRI. We will also report the most recent advancements in software, from reconstruction algorithms to image processing and analysis programs.

  13. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Directory of Open Access Journals (Sweden)

    Franco Pierro

    2009-01-01

    Full Text Available The paper deals with the presentation of the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize in an analytical way the performance of a passive system in order to increase the confidence in its operation, and to compare the performance of active and passive systems and of different passive systems. REPAS can be used in the design of passive safety systems to assess their goodness and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. With regard to this, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been taken into account to select the sample size. Failure criteria are defined in terms of non-fulfilment of the defined design targets.
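
    The Wilks sample-size selection mentioned above can be sketched in a few lines: choose the smallest number of Monte Carlo runs such that the sample maximum bounds the desired quantile with the desired confidence (first-order, one-sided case). The function below is a generic statement of that formula, not code from the REPAS implementation.

    # Minimal sketch of Wilks' formula for selecting the Monte Carlo sample size.
    def wilks_sample_size(beta=0.95, gamma=0.95):
        """Smallest n with 1 - beta**n >= gamma (one-sided, first order)."""
        n = 1
        while 1.0 - beta ** n < gamma:
            n += 1
        return n

    print(wilks_sample_size(0.95, 0.95))  # -> 59 runs for a 95%/95% statement
    print(wilks_sample_size(0.99, 0.95))  # -> 299 runs for a 99%/95% statement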

  14. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving...... quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern.The approach consists...... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensors placement, validation of the monitoring solutions, definition of the reference manufacturing performance...

  15. Social Life Cycle Assessment as a Management Tool: Methodology for Application in Tourism

    Directory of Open Access Journals (Sweden)

    Roberto Merli

    2013-08-01

    Full Text Available As is widely known, sustainability is an important factor in competition, increasing the added value of a company in terms of image and credibility. However, it is important that sustainability assessments are addressed effectively in a global perspective. Therefore, life cycle tools are adopted to evaluate environmental and social impacts. Among these, the Social Life Cycle Assessment (SLCA) appears particularly significant; although in an early stage of development, it seems to have extremely promising methodological features. For this reason, it seemed interesting to propose a first application to the tourism sector, which, better than other sectors, can be studied in terms of social sustainability data. The particular characteristics of service delivery lend themselves more than those of other sectors to the development of data related to social sustainability. In this paper the results of a case study carried out using social accounting and business management tools are shown.

  16. A statistical methodology for the estimation of extreme wave conditions for offshore renewable applications

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kalogeri, Christina; Galanis, George

    2015-01-01

    and post-process outputs from a high resolution numerical wave modeling system for extreme wave estimation based on the significant wave height. This approach is demonstrated through the data analysis at a relatively deep water site, FINO 1, as well as a relatively shallow water area, coastal site Horns Rev, which is located in the North Sea, west of Denmark. The post-processing targets at correcting the modeled time series of the significant wave height, in order to match the statistics of the corresponding measurements, including not only the conventional parameters such as the mean and standard... as a characteristic index of extreme wave conditions. The results from the proposed methodology seem to be in good agreement with the measurements at both the relatively deep, open water and the shallow, coastal water sites, providing a potentially useful tool for offshore renewable energy applications. © 2015...

  17. Methodology for validating technical tools to assess customer Demand Response: Application to a commercial customer

    International Nuclear Information System (INIS)

    Alcazar-Ortega, Manuel; Escriva-Escriva, Guillermo; Segura-Heras, Isidoro

    2011-01-01

    The authors present a methodology, demonstrated with some applications to the commercial sector, for validating a Demand Response (DR) evaluation method previously developed and applied to a wide range of industrial and commercial segments, whose flexibility was evaluated by modeling. DR is playing an increasingly important role in the framework of electricity systems management for the effective integration of other distributed energy resources. Consequently, customers must identify what they are using the energy for in order to use their flexible loads for management purposes. Modeling tools are used to predict the impact of flexibility on the behavior of customers, but this result needs to be validated, since both customers and grid operators have to be confident in these flexibility predictions. An easy-to-use two-step method to achieve this goal is presented in this paper.

  18. The application of life cycle assessment to integrated solid waste management. Pt. 1: Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Clift, R.; Doig, A.; Finnveden, G.

    2000-07-01

    Integrated Waste Management is one of the holistic approaches to environmental and resource management which are emerging from applying the concept of sustainable development. Assessment of waste management options requires application of Life Cycle Assessment (LCA). This paper summarizes the methodology for applying LCA to Integrated Waste Management of Municipal Solid Wastes (MSW) developed for and now used by the UK Environment Agency, including recent developments in international fora. Particular attention is devoted to system definition leading to rational and clear compilation of the Life Cycle Inventory, with appropriate 'credit' for recovering materials and/or energy from the waste. LCA of waste management is best seen as a way of structuring information to help decision processes. (Author)

  19. Selection of phage-displayed accessible recombinant targeted antibodies (SPARTA): methodology and applications.

    Science.gov (United States)

    D'Angelo, Sara; Staquicini, Fernanda I; Ferrara, Fortunato; Staquicini, Daniela I; Sharma, Geetanjali; Tarleton, Christy A; Nguyen, Huynh; Naranjo, Leslie A; Sidman, Richard L; Arap, Wadih; Bradbury, Andrew Rm; Pasqualini, Renata

    2018-05-03

    We developed a potentially novel and robust antibody discovery methodology, termed selection of phage-displayed accessible recombinant targeted antibodies (SPARTA). This combines an in vitro screening step of a naive human antibody library against known tumor targets, with in vivo selections based on tumor-homing capabilities of a preenriched antibody pool. This unique approach overcomes several rate-limiting challenges to generate human antibodies amenable to rapid translation into medical applications. As a proof of concept, we evaluated SPARTA on 2 well-established tumor cell surface targets, EphA5 and GRP78. We evaluated antibodies that showed tumor-targeting selectivity as a representative panel of antibody-drug conjugates (ADCs) and were highly efficacious. Our results validate a discovery platform to identify and validate monoclonal antibodies with favorable tumor-targeting attributes. This approach may also extend to other diseases with known cell surface targets and affected tissues easily isolated for in vivo selection.

  20. Improving life cycle assessment methodology for the application of decision support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg

    for the application of decision support and evaluation of uncertainty in LCA. From a decision maker’s (DM’s) point of view there are at least three main “illness” factors influencing the quality of the information that the DM uses for making decisions. The factors are not independent of each other, but it seems......) refrain from making a decision based on an LCA and thus support a decision on other parameters than the LCA environmental parameters. Conversely, it may in some decision support contexts be acceptable to base a decision on highly uncertain information. This all depends on the specific decision support...... the different steps. A deterioration of the quality in each step is likely to accumulate through the statistical value chain in terms of increased uncertainty and bias. Ultimately this can make final decision support problematic. The "Law of large numbers" (LLN) is the methodological tool/probability theory...

  1. Development of a flow structure interaction methodology applicable to a convertible car roof

    International Nuclear Information System (INIS)

    Knight, Jason J.

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible roof of a vehicle using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and convergence are investigated using the coupled code. The three-dimensional problem is addressed by extending the two-dimensional structural solver to represent a surface by a matrix of line elements with constant tension along their length. This has been successfully coupled with the three-dimensional CFD flow-solution technique. Computed deformations show good agreement with the results of wind tunnel experiments for the well prescribed geometry. In both two-and three-dimensional computations, the flow-structure interaction is found to yield a static deformation to within 1% difference in the displacement variable after three iterations between the fluid and structural codes. The same computational methodology is applied to a real-car application using a third-party structural solver. The methodology is shown to be robust even under conditions beyond those likely to be encountered. The full methodology could be used as a design tool. The present work
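
    The coupling strategy described above can be sketched as a partitioned fixed-point iteration: the flow solution supplies a pressure load, the structural model returns an updated deformation, and the exchange repeats until the displacement changes by less than 1% between iterations. Both "solvers" below are toy surrogates chosen only to make the loop runnable; they stand in for the CFD/SPM and line-element codes used in the work.

    # Minimal sketch of a partitioned flow-structure coupling loop with a 1%
    # displacement convergence criterion. All models and numbers are illustrative.
    def fluid_solver(displacement):
        """Toy aerodynamic load: suction decreases as the roof bulges outward."""
        return 250.0 / (1.0 + 4.0 * displacement)     # pressure [Pa]

    def structural_solver(pressure, slackness=0.01):
        """Toy line-element response: displacement grows with load and slackness."""
        return slackness + pressure / 5000.0          # displacement [m]

    displacement = 0.0
    for iteration in range(1, 21):
        pressure = fluid_solver(displacement)
        new_displacement = structural_solver(pressure)
        change = abs(new_displacement - displacement) / max(new_displacement, 1e-12)
        displacement = new_displacement
        print(f"iter {iteration}: p = {pressure:.1f} Pa, w = {displacement:.4f} m")
        if change < 0.01:                             # 1% convergence criterion
            break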

  2. Development of the CPXSD Methodology for Generation of Fine-Group Libraries for Shielding Applications

    International Nuclear Information System (INIS)

    Alpan, F. Arzu; Haghighat, Alireza

    2005-01-01

    Multigroup cross sections are one of the major factors that cause uncertainties in the results of deterministic transport calculations. Thus, it is important to prepare effective cross-section libraries that include an appropriate group structure and are based on an appropriate spectrum. Several multigroup cross-section libraries are available for particular applications. For example, the 47-neutron-group, 20-gamma-group BUGLE library, which is derived from the 199-neutron-group, 42-gamma-group VITAMIN-B6 library, is widely used for light water reactor (LWR) shielding and pressure vessel dosimetry applications. However, there is no publicly available methodology that can construct problem-dependent libraries. Thus, the authors have developed the Contributon and Point-wise Cross Section Driven (CPXSD) methodology for constructing effective fine- and broad-group structures. In this paper, new fine-group structures were constructed using CPXSD, and new fine-group cross-section libraries were generated. The 450-group LIB450 and 589-group LIB589 libraries were developed for problems sensitive to fast and thermal neutrons, respectively, for LWR shielding applications. As compared to a VITAMIN-B6-like library, the new fine-group library developed for fast neutron dosimetry calculations resulted in closer agreement with the continuous-energy predictions. For example, for the fast neutron cavity dosimetry, an improvement of ∼4% was observed for the 237Np(n,f) reaction rate. For the thermal neutron 1H(n,γ) reaction, a maximum improvement of ∼14% was observed in the reaction rate at the mid-downcomer position.
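
    To make the role of such libraries concrete, the sketch below performs a generic flux-weighted collapse of hypothetical fine-group cross sections onto a broad-group structure. This illustrates the standard collapsing operation a fine-group library feeds into, not the CPXSD algorithm itself, and all numbers are made up.

    # Minimal sketch of a flux-weighted group collapse from a fine-group to a
    # broad-group structure.
    import numpy as np

    n_fine = 450
    rng = np.random.default_rng(0)
    sigma_fine = rng.uniform(1.0, 10.0, n_fine)   # fine-group cross sections [b]
    flux_fine = rng.uniform(0.1, 1.0, n_fine)     # problem-dependent weighting spectrum

    # Broad-group boundaries given as fine-group indices (47 broad groups here)
    broad_edges = np.linspace(0, n_fine, 48).astype(int)

    sigma_broad = np.array([
        np.sum(sigma_fine[lo:hi] * flux_fine[lo:hi]) / np.sum(flux_fine[lo:hi])
        for lo, hi in zip(broad_edges[:-1], broad_edges[1:])
    ])
    print("collapsed broad-group cross sections:", sigma_broad[:5], "...")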

  3. Applications of a surveillance and diagnostics methodology using neutron noise from a pressurized-water reactor

    International Nuclear Information System (INIS)

    Wood, R.T.; Miller, L.F.; Perez, R.B.

    1992-01-01

    Two applications of a noise diagnostic methodology were performed with ex-core neutron detector data from a pressurized-water reactor (PWR). A feedback dynamics model of the neutron power spectral density was derived from a low-order whole-plant physical model made stochastic with the Langevin technique. From a functional fit to plant data, the response of the dynamic system to changes in important physical parameters was evaluated by a direct sensitivity analysis. In addition, changes in monitored spectra were related to changes in physical parameters, and detection thresholds using common surveillance discriminants were determined. A resonance model was developed from perturbation theory to give the ex-core neutron detector response for small in-core mechanical motions in terms of a pole-strength factor, a resonance asymmetry (or skewness) factor, a vibration damping factor, and a frequency of vibration. The mechanical motion parameters for several resonances were determined by a functional fit of the model to plant data taken at various times during a fuel cycle and were tracked to determine trends that indicated vibrational changes of reactor internals. In addition, the resonance model gave the ability to separate the resonant components of the power spectral density after the parameters had been identified. As a result, the behavior of several vibration peaks was monitored over a fuel cycle. The noise diagnostic methodology illustrated by these applications can be used in monitoring the condition of the reactor system. Early detection of degraded mechanical components or undesirable operating conditions by using such surveillance and diagnostic techniques would enhance plant safety. 15 refs., 6 figs., 1 tab
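
    The parameter-identification step can be illustrated by fitting a single resonance to a noise spectrum. The parameterization below (pole strength, asymmetry, damping ratio, frequency) is one plausible form, not necessarily the model derived in the paper, and the "measured" spectrum is synthetic.

    # Minimal sketch of fitting a vibration resonance in an ex-core neutron
    # noise power spectral density.
    import numpy as np
    from scipy.optimize import curve_fit

    def resonance_psd(f, A, a, zeta, f0):
        lorentz = A / ((f**2 - f0**2) ** 2 + (2.0 * zeta * f0 * f) ** 2)
        return lorentz * (1.0 + a * (f - f0))       # asymmetry (skewness) factor

    f = np.linspace(1.0, 20.0, 400)                 # frequency axis [Hz]
    true_params = (5.0e3, 0.05, 0.08, 8.0)
    rng = np.random.default_rng(1)
    psd = resonance_psd(f, *true_params) * (1.0 + 0.05 * rng.standard_normal(f.size))

    popt, pcov = curve_fit(resonance_psd, f, psd, p0=(1.0e3, 0.0, 0.1, 7.5))
    print("fitted (pole strength, asymmetry, damping, frequency):", popt)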

  4. A non-linear reduced order methodology applicable to boiling water reactor stability analysis

    International Nuclear Information System (INIS)

    Prill, Dennis Paul

    2013-01-01

    Thermal-hydraulic coupling between power, flow rate and density, intensified by neutronics feedback, is the main driver of boiling water reactor (BWR) stability behavior. High-power, low-flow conditions in connection with unfavorable power distributions can lead the BWR system into unstable regions where power oscillations can be triggered. This important threat to operational safety requires careful analysis for proper understanding. Analyzing an exhaustive parameter space of the non-linear BWR system becomes feasible with methodologies based on reduced order models (ROMs), saving computational cost and improving the physical understanding. Presently, within reactor dynamics, no general and automatic prediction of high-dimensional ROMs based on detailed BWR models is available. In this thesis a systematic, self-contained model order reduction (MOR) technique is derived which is applicable to several classes of dynamical problems, and in particular to BWRs of any degree of detail. Expert knowledge can be given by operational, experimental or numerical transient data and is transferred into an optimal basis function representation. The methodology is mostly automated and provides the framework for the reduction of various different systems of any level of complexity. Only little effort is necessary to attain a reduced version within this in-house code, which is based on the coupling of sophisticated commercial software. The methodology reduces a complex system in a grid-free manner to a small system able to capture even non-linear dynamics. It is based on an optimal choice of basis functions given by the so-called proper orthogonal decomposition (POD). The steps required to achieve a reliable and numerically stable ROM are given by a distinct calibration road-map. In validation and verification steps, a wide spectrum of representative test examples is systematically studied with regard to a later BWR application. The first example is non-linear and has a dispersive character
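
    The basis-construction step at the heart of this approach can be sketched with a few lines of linear algebra: compute the proper orthogonal decomposition of a snapshot matrix via the SVD and keep enough modes to capture a prescribed fraction of the snapshot energy. The snapshot data below are synthetic transients, not BWR results, and the energy criterion is a generic choice.

    # Minimal sketch of POD basis construction from transient snapshot data.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 200)                 # spatial grid
    t = np.linspace(0.0, 10.0, 120)                # snapshot times

    # Synthetic transients: two oscillating spatial structures plus noise
    snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(1.5 * t))
                 + 0.3 * np.outer(np.sin(4 * np.pi * x), np.sin(3.0 * t))
                 + 0.01 * rng.standard_normal((x.size, t.size)))

    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    n_modes = int(np.searchsorted(energy, 0.999) + 1)

    pod_basis = U[:, :n_modes]                     # optimal basis functions
    print("modes kept for 99.9% of snapshot energy:", n_modes)

    # Reduced-order reconstruction of the snapshots with the retained modes
    reconstruction = pod_basis @ (pod_basis.T @ snapshots)
    print("max reconstruction error:", np.abs(snapshots - reconstruction).max())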

  5. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    interval along each timeline. This report presents illustrative examples of the application of the above methodology to achieve this aim. The results of risk calculations and assigned weights are plotted on a 'weight-risk diagram', which is used to judge the relative significance of the different variant scenarios in relation to the base scenario and the regulatory risk target. The application of this methodology is consistent with a staged approach to performance assessment, in which effort is focused initially on scoping calculations of conditional risk. Only those variant scenarios giving a higher conditional risk than the base scenario are subject to more detailed evaluation, including the assignment of an appropriate weight. From the limited trialling that has been undertaken, the indications are that a tractable approach, consistent with the objectives of comprehensiveness, traceability and clarity, has been achieved. (author)

  6. Application of Binomial Model and Market Asset Declaimer Methodology for Valuation of Abandon and Expand Options. The Case Study

    Directory of Open Access Journals (Sweden)

    Paweł Mielcarz

    2007-06-01

    Full Text Available The article presents a case study of the valuation of real options included in an investment project. The main goal of the article is to present the calculation and methodological issues of applying the methodology for real option valuation. To this end, the binomial model and the Market Asset Declaimer methodology are used. The project presented in the article concerns the introduction of a radio station to a new market. It includes two valuable real options: to abandon the project and to expand it.
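
    For readers unfamiliar with the mechanics, the sketch below rolls an expansion option and an abandonment option back through a small binomial lattice, using the project's own present value as the underlying asset, as the methodology assumes. All parameters are hypothetical and are not taken from the case study.

    # Minimal sketch of valuing expand and abandon flexibility on a binomial lattice.
    import math

    V0 = 100.0          # present value of the project without flexibility
    sigma = 0.35        # volatility of the project value
    r = 0.05            # risk-free rate (annual, continuous compounding)
    T, steps = 3.0, 3   # horizon in years, lattice steps
    expand_factor, expand_cost = 1.3, 20.0   # expansion adds 30% of value for 20
    salvage = 70.0                           # abandonment (salvage) value

    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral probability
    disc = math.exp(-r * dt)

    # Terminal project values and option exercise at maturity
    values = [V0 * u**j * d**(steps - j) for j in range(steps + 1)]
    option = [max(v, expand_factor * v - expand_cost, salvage) for v in values]

    # Backward induction, checking both options at every node
    for step in range(steps - 1, -1, -1):
        values = [V0 * u**j * d**(step - j) for j in range(step + 1)]
        option = [max(disc * (p * option[j + 1] + (1 - p) * option[j]),
                      expand_factor * v - expand_cost,
                      salvage)
                  for j, v in enumerate(values)]

    print("project value with flexibility:", option[0])
    print("value of the real options:", option[0] - V0)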

  7. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties because of significantly negative biases, which are quite large. However, the XSCU methodology gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.

  8. SystemVerilog assertions and functional coverage guide to language, methodology and applications

    CERN Document Server

    Mehta, Ashok B

    2016-01-01

    This book provides a hands-on, application-oriented guide to the language and methodology of both SystemVerilog Assertions and SystemVerilog Functional Coverage. Readers will benefit from the step-by-step approach to functional hardware verification using SystemVerilog Assertions and Functional Coverage, which will enable them to uncover hidden and hard to find bugs, point directly to the source of the bug, provide for a clean and easy way to model complex timing checks and objectively answer the question ‘have we functionally verified everything’. Written by a professional end-user of ASIC/SoC/CPU and FPGA design and Verification, this book explains each concept with easy to understand examples, simulation logs and applications derived from real projects. Readers will be empowered to tackle the modeling of complex checkers for functional verification, thereby drastically reducing their time to design and debug. This updated second edition addresses the latest functional set released in IEEE-1800 (2012) L...

  9. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and inter-institutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
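
    The kind of statistical filtering described can be sketched as follows: sweep a candidate dose threshold (an ROC-style step), then apply Fisher exact, Welch t and Kolmogorov-Smirnov tests to flag a dose metric showing dose-response. The outcome data below are synthetic, and the specific sweep and tests are a simplified stand-in for the authors' C#/R pipeline.

    # Minimal sketch of threshold identification plus statistical filtering.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    dose = rng.uniform(0.0, 60.0, 200)                     # dose metric [Gy]
    p_tox = 1.0 / (1.0 + np.exp(-(dose - 35.0) / 5.0))     # latent dose-response
    toxicity = rng.random(200) < p_tox                     # binary outcome

    # Pick the threshold maximizing Youden's J over a simple sweep
    grid = np.linspace(5.0, 55.0, 101)
    youden = [(toxicity & (dose >= c)).sum() / toxicity.sum()
              - (~toxicity & (dose >= c)).sum() / (~toxicity).sum() for c in grid]
    threshold = grid[int(np.argmax(youden))]

    above = dose >= threshold
    table = [[(toxicity & above).sum(), (~toxicity & above).sum()],
             [(toxicity & ~above).sum(), (~toxicity & ~above).sum()]]

    print("threshold [Gy]:", threshold)
    print("Fisher exact p:", stats.fisher_exact(table)[1])
    print("Welch t-test p:", stats.ttest_ind(dose[toxicity], dose[~toxicity],
                                             equal_var=False)[1])
    print("KS test p:", stats.ks_2samp(dose[toxicity], dose[~toxicity])[1])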

  10. Application of Response Surface Methodology to Optimize Malachite Green Removal by Cl-nZVI Nanocomposites

    Directory of Open Access Journals (Sweden)

    Farshid Ghorbani

    2017-09-01

    Full Text Available Disposal of effluents containing dyes into natural ecosystems poses serious threats to both the environment and its aquatic life. Malachite green (MG) is a basic dye that has extensive industrial applications, especially in aquaculture, throughout the world. This study reports on the application of the central composite design (CCD) under the response surface methodology (RSM) for the optimization of MG adsorption from aqueous solutions using clinoptilolite nano-zerovalent iron (Cl-nZVI) nanocomposites. The sorbent structures produced are characterized by means of scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS), and vibrating sample magnetometry (VSM). The effects of different parameters, including pH, initial MG concentration, and sorbent dosage, on the removal efficiency (R) of MG were studied to find the optimum operating conditions. For this purpose, a total of 20 sets of experiments were designed with the Design Expert 7.0 software and the values of removal efficiency were used as input response to the software. The optimum pH, initial MG concentration, and sorbent dosage were found to be 5.6, 49.21 mg.L-1, and 1.43 g.L-1, respectively. A high MG removal efficiency (57.90%) was obtained with the optimal process parameters. Moreover, a desirability value of 0.963 was obtained for the optimization process.

  11. Development of a new damage function model for power plants: Methodology and applications

    International Nuclear Information System (INIS)

    Levy, J.I.; Hammitt, J.K.; Yanagisawa, Y.; Spengler, J.D.

    1999-01-01

    Recent models have estimated the environmental impacts of power plants, but differences in assumptions and analytical methodologies have led to diverging findings. In this paper, the authors present a new damage function model that synthesizes previous efforts and refines components that have been associated with variations in impact estimates. Their model focuses on end-use emissions and quantified the direct human health impacts of criteria air pollutants. To compare their model to previous efforts and to evaluate potential policy applications, the authors assess the impacts of an oil and natural gas-fueled cogeneration power plant in Boston, MA. Impacts under baseline assumptions are estimated to be $0.007/kWh of electricity, $0.23/klb of steam, and $0.004/ton-h of chilled water (representing 2--9% of the market value of outputs). Impacts are largely related to ozone (48%) and particulate matter (42%). Addition of upstream emissions and nonpublic health impacts increases externalities by as much as 50%. Sensitivity analyses demonstrate the importance of plant siting, meteorological conditions, epidemiological assumptions, and the monetary value placed on premature mortality as well as the potential influence of global warming. Comparative analyses demonstrate that their model provides reasonable impact estimates and would therefore be applicable in a broad range of policy settings

  12. Methodology development for dosimetry of 90Sr + 90Y beta therapy applicators

    International Nuclear Information System (INIS)

    Coelho, T.S.; Yoriyaz, H.; Fernandes, M.A.R.

    2009-01-01

    The 90Sr+90Y applicators used in beta therapy for the prevention of keloids and pterygium are imported, and their dosimetric features are only illustrated by the manufacturers. The demanding clinical routine of the medical physicists does not allow procedures to be carried out to confirm these parameters. This work presents the development of a methodology for the dosimetry of two 90Sr+90Y beta therapy applicators of the Amersham brand. The Monte Carlo code MCNP4C was used for the simulation of the percentage depth dose curves. The experimental measurements of the radiation attenuation were made with a mini-extrapolation chamber. The experimental results were compared with the simulated values. Both percentage depth dose curves, the theoretical and the experimental one, presented similar behavior, which may validate the use of MCNP4C for these simulations, strengthening the use of this method in dosimetry procedures for these beta radiation sources. (author)

  13. Electrification of particulate entrained fluid flows-Mechanisms, applications, and numerical methodology

    Science.gov (United States)

    Wei, Wei; Gu, Zhaolin

    2015-10-01

    Particulates in natural and industrial flows have two basic forms: liquid (droplet) and solid (particle). Droplets would be charged in the presence of the applied electric field (e.g. electrospray). Similar to the droplet charging, particles can also be charged under the external electric field (e.g. electrostatic precipitator), while in the absence of external electric field, tribo-electrostatic charging is almost unavoidable in gas-solid two-phase flows due to the consecutive particle contacts (e.g. electrostatic in fluidized bed or wind-blown sand). The particle charging may be beneficial, or detrimental. Although electrostatics in particulate entrained fluid flow systems has been so widely used and studied, the mechanisms of particulate charging still lack a thorough understanding. The motivation of this review is to explore a clear understanding of particulate charging and movement of charged particulate in two-phase flows, by summarizing the electrification mechanisms, physical models of particulate charging, and methods of charging/charged particulate entrained fluid flow simulations. Two effective methods can make droplets charged in industrial applications: corona charging and induction charging. The droplet charge-to-mass ratio by corona charging is higher than that by induction charging. The particle charging through collisions could be attributed to electron transfer, ion transfer, material transfer, and/or aqueous ion shift on particle surfaces. The charges on charged particulate surface can be measured; nevertheless, the charging process in nature or industry is difficult to monitor. The simulation method might build a bridge from the charging process to the finally charged state on particulate surfaces in particulate entrained fluid flows. The methodology combining the interface tracking under the action of the applied electric field with the fluid flow governing equations is applicable to the study of electrohydrodynamics problems. The charge

  14. Electrification of particulate entrained fluid flows—Mechanisms, applications, and numerical methodology

    International Nuclear Information System (INIS)

    Wei, Wei; Gu, Zhaolin

    2015-01-01

    Particulates in natural and industrial flows have two basic forms: liquid (droplet) and solid (particle). Droplets would be charged in the presence of the applied electric field (e.g. electrospray). Similar to the droplet charging, particles can also be charged under the external electric field (e.g. electrostatic precipitator), while in the absence of external electric field, tribo-electrostatic charging is almost unavoidable in gas–solid two-phase flows due to the consecutive particle contacts (e.g. electrostatic in fluidized bed or wind-blown sand). The particle charging may be beneficial, or detrimental. Although electrostatics in particulate entrained fluid flow systems has been so widely used and studied, the mechanisms of particulate charging still lack a thorough understanding. The motivation of this review is to explore a clear understanding of particulate charging and movement of charged particulate in two-phase flows, by summarizing the electrification mechanisms, physical models of particulate charging, and methods of charging/charged particulate entrained fluid flow simulations. Two effective methods can make droplets charged in industrial applications: corona charging and induction charging. The droplet charge-to-mass ratio by corona charging is higher than that by induction charging. The particle charging through collisions could be attributed to electron transfer, ion transfer, material transfer, and/or aqueous ion shift on particle surfaces. The charges on charged particulate surface can be measured; nevertheless, the charging process in nature or industry is difficult to monitor. The simulation method might build a bridge from the charging process to the finally charged state on particulate surfaces in particulate entrained fluid flows. The methodology combining the interface tracking under the action of the applied electric field with the fluid flow governing equations is applicable to the study of electrohydrodynamics problems. The

  15. Electrification of particulate entrained fluid flows—Mechanisms, applications, and numerical methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Wei [School of Energy and Power Engineering, Wuhan University of Technology, Wuhan, Hubei, 430063 (China); School of Human Settlements and Civil Engineering, Xi’an Jiaotong University, Xi’an, Shaanxi, 710049 (China); Gu, Zhaolin, E-mail: guzhaoln@mail.xjtu.edu.cn [School of Human Settlements and Civil Engineering, Xi’an Jiaotong University, Xi’an, Shaanxi, 710049 (China)

    2015-10-28

    Particulates in natural and industrial flows have two basic forms: liquid (droplet) and solid (particle). Droplets would be charged in the presence of the applied electric field (e.g. electrospray). Similar to the droplet charging, particles can also be charged under the external electric field (e.g. electrostatic precipitator), while in the absence of external electric field, tribo-electrostatic charging is almost unavoidable in gas–solid two-phase flows due to the consecutive particle contacts (e.g. electrostatic in fluidized bed or wind-blown sand). The particle charging may be beneficial, or detrimental. Although electrostatics in particulate entrained fluid flow systems has been so widely used and studied, the mechanisms of particulate charging still lack a thorough understanding. The motivation of this review is to explore a clear understanding of particulate charging and movement of charged particulate in two-phase flows, by summarizing the electrification mechanisms, physical models of particulate charging, and methods of charging/charged particulate entrained fluid flow simulations. Two effective methods can make droplets charged in industrial applications: corona charging and induction charging. The droplet charge-to-mass ratio by corona charging is higher than that by induction charging. The particle charging through collisions could be attributed to electron transfer, ion transfer, material transfer, and/or aqueous ion shift on particle surfaces. The charges on charged particulate surface can be measured; nevertheless, the charging process in nature or industry is difficult to monitor. The simulation method might build a bridge from the charging process to the finally charged state on particulate surfaces in particulate entrained fluid flows. The methodology combining the interface tracking under the action of the applied electric field with the fluid flow governing equations is applicable to the study of electrohydrodynamics problems. The

  16. Methodological Interactionism : Theory and Application to the Firm and to the Building of Trust

    NARCIS (Netherlands)

    Nooteboom, B.

    2007-01-01

    Recent insights from the ‘embodied cognition’ perspective in cognitive science, supported by neural research, provide a basis for a ‘methodological interactionism’ that transcends both the methodological individualism of economics and the methodological collectivism of (some) sociology, and is

  17. Application of fault tree methodology to modeling of the AP1000 plant digital reactor protection system

    International Nuclear Information System (INIS)

    Teolis, D.S.; Zarewczynski, S.A.; Detar, H.L.

    2012-01-01

    The reactor trip system (RTS) and engineered safety features actuation system (ESFAS) in nuclear power plants utilize instrumentation and control (I&C) to provide automatic protection against unsafe and improper reactor operation during steady-state and transient power operations. During normal operating conditions, various plant parameters are continuously monitored to assure that the plant is operating in a safe state. In response to deviations of these parameters from pre-determined set points, the protection system will initiate the actions required to maintain the reactor in a safe state. These actions may include shutting down the reactor by opening the reactor trip breakers and actuation of safety equipment based on the situation. The RTS and ESFAS are represented in probabilistic risk assessments (PRAs) to reflect the impact of their contribution to core damage frequency (CDF). The reactor protection systems (RPS) in existing nuclear power plants are generally analog based, and there is general consensus within the PRA community on fault tree modeling of these systems. In new plants, such as the AP1000 plant, the RPS is based on digital technology. Digital systems are more complex combinations of hardware components and software. This combination of complex hardware and software can result in the presence of faults and failure modes unique to a digital RPS. The United States Nuclear Regulatory Commission (NRC) is currently performing research on the development of probabilistic models for digital systems for inclusion in PRAs; however, no consensus methodology exists at this time. Westinghouse is currently updating the AP1000 plant PRA to support initial operation of plants currently under construction in the United States. The digital RPS is modeled using fault tree methodology similar to that used for analog based systems. This paper presents high level descriptions of a typical analog based RPS and of the AP1000 plant digital RPS. Application of current fault
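
    The basic fault tree arithmetic underlying such models can be sketched with a few gate functions: AND/OR combinations of independent basic-event probabilities rolled up to a top event. The structure and probabilities below are generic illustrations, not the AP1000 RPS model.

    # Minimal sketch of fault tree quantification with AND/OR gates, assuming
    # independent basic events.
    def or_gate(*probs):
        """P(at least one input fails), independent inputs."""
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def and_gate(*probs):
        """P(all inputs fail), independent inputs."""
        q = 1.0
        for p in probs:
            q *= p
        return q

    # Hypothetical basic events: two redundant divisions, each failed by a
    # hardware fault, a division-level software fault or an input sensor fault
    division_a = or_gate(1.0e-4, 5.0e-5, 2.0e-4)
    division_b = or_gate(1.0e-4, 5.0e-5, 2.0e-4)
    common_cause_software = 1.0e-5

    # Top event: failure to trip = both divisions fail OR common-cause failure
    top = or_gate(and_gate(division_a, division_b), common_cause_software)
    print("P(failure to trip on demand):", top)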

  18. A tsunami PSA methodology and application for NPP site in Korea

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In-Kil

    2012-01-01

    Highlights: ► A methodology of tsunami PSA was developed in this study. ► The tsunami return period was evaluated by an empirical method using historical tsunami records and tidal gauge records. ► A procedure for tsunami fragility analysis was established and target equipment and structures for the tsunami fragility assessment were selected. ► A sample fragility calculation was performed for equipment in a nuclear power plant. ► The accident sequence of a tsunami event is developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) is determined. - Abstract: A methodology of tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In the tsunami hazard analysis, evaluation of the tsunami return period is a major task. For the evaluation of the tsunami return period, numerical analysis or empirical methods can be applied. In this study, the tsunami return period was evaluated by an empirical method using historical tsunami records and tidal gauge records. For the tsunami fragility analysis, a procedure was established and target equipment and structures for the fragility assessment were selected. A sample fragility calculation was performed for equipment in a nuclear power plant. In the system analysis, the accident sequence of a tsunami event is developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) is determined. For the application to a real nuclear power plant, the Ulchin 56 NPP, located on the east coast of the Korean peninsula, was selected. Through this study, the whole tsunami PSA working procedure was established and an example calculation was performed for one of the operating nuclear power plants in Korea. However, for more accurate tsunami PSA results, further research is needed on the evaluation of hydrodynamic forces, the effect of
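
    The empirical return-period step can be illustrated with a rank-based (Weibull plotting position) estimate from a historical catalogue, as sketched below. The run-up values and the record length are hypothetical, and the simple plotting-position formula is a generic choice rather than the authors' exact procedure.

    # Minimal sketch of an empirical tsunami return-period estimate.
    import numpy as np

    record_years = 400                                   # length of the catalogue
    runups_m = np.array([0.3, 0.5, 0.8, 1.1, 1.5, 2.1, 3.0, 4.2])  # observed run-ups [m]

    sorted_runups = np.sort(runups_m)[::-1]              # largest first
    ranks = np.arange(1, len(sorted_runups) + 1)

    # Weibull plotting position: events with run-up >= h recur every ~T(h) years
    return_period = (record_years + 1) / ranks

    for h, T in zip(sorted_runups, return_period):
        print(f"run-up >= {h:.1f} m: return period ~ {T:.0f} years")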

  19. Application of Genetic Algorithm methodologies in fuel bundle burnup optimization of Pressurized Heavy Water Reactor

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Ramachandran, Suja; Rathakrishnan, S.; Satya Murty, S.A.V.; Sai Baba, M.

    2015-01-01

    Highlights: • We study and compare Genetic Algorithms (GA) in the fuel bundle burnup optimization of an Indian Pressurized Heavy Water Reactor (PHWR) of 220 MWe. • Two Genetic Algorithm methodologies, namely Penalty Functions based GA and Multi Objective GA, are considered. • For the selected problem, Multi Objective GA performs better than Penalty Functions based GA. • In the present study, Multi Objective GA outperforms Penalty Functions based GA in convergence speed and gives better diversity in solutions. - Abstract: The work carried out as part of the application and comparison of GA techniques in a nuclear reactor environment is presented in this study. The nuclear fuel management optimization problem selected for the study aims at arriving at appropriate reference discharge burnup values for the two burnup zones of the 220 MWe Pressurized Heavy Water Reactor (PHWR) core. Two Genetic Algorithm methodologies, namely Penalty Functions based GA and Multi Objective GA, are applied in this study. The study reveals that, for the selected problem of PHWR fuel bundle burnup optimization, Multi Objective GA is more suitable than Penalty Functions based GA in the two aspects considered: it produces more diverse feasible solutions and converges faster, i.e. it is capable of generating more feasible solutions from earlier generations. It is observed that, for the selected problem, Multi Objective GA is 25.0% faster than Penalty Functions based GA with respect to CPU time for generating 80% of the population with feasible solutions. When the average computational times of fixed generations are considered, Penalty Functions based GA is 44.5% faster than Multi Objective GA. In the overall performance, the convergence speed of Multi Objective GA surpasses the computational time advantage of Penalty Functions based GA. The ability of Multi Objective GA to produce more diverse feasible solutions is a desired feature of the problem selected, that helps the

  20. Assessment of critical minerals: Updated application of an early-warning screening methodology

    Science.gov (United States)

    McCullough, Erin A.; Nassar, Nedal

    2017-01-01

    Increasing reliance on non-renewable mineral resources reinforces the need for identifying potential supply constraints before they occur. The US National Science and Technology Council recently released a report that outlines a methodology for screening potentially critical minerals based on three indicators: supply risk (R), production growth (G), and market dynamics (M). This early-warning screening was initially applied to 78 minerals across the years 1996 to 2013 and identified a subset of minerals as “potentially critical” based on the geometric average of these indicators—designated as criticality potential (C). In this study, the screening methodology has been updated to include data for 2014, as well as to incorporate revisions and modifications to the data, where applicable. Overall, C declined in 2014 for the majority of minerals examined largely due to decreases in production concentration and price volatility. However, the results vary considerably across minerals, with some minerals, such as gallium, recording increases for all three indicators. In addition to assessing magnitudinal changes, this analysis also examines the significance of the change relative to historical variation for each mineral. For example, although mined nickel’s R declined modestly in 2014 in comparison to that of other minerals, it was by far the largest annual change recorded for mined nickel across all years examined and is attributable to Indonesia’s ban on the export of unprocessed minerals. Based on the 2014 results, 20 minerals with the highest C values have been identified for further study including the rare earths, gallium, germanium, rhodium, tantalum, and tungsten.
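
    The criticality-potential screening described above reduces to a geometric average of the three indicators. A minimal sketch in Python; the normalized indicator values below are invented placeholders, not figures from the assessment:

      from math import prod

      def criticality_potential(r, g, m):
          # Geometric average of supply risk (R), production growth (G)
          # and market dynamics (M).
          return prod((r, g, m)) ** (1.0 / 3.0)

      # Hypothetical normalized indicator values for two unnamed minerals.
      minerals = {"mineral A": (0.62, 0.55, 0.48), "mineral B": (0.35, 0.20, 0.30)}
      for name, (r, g, m) in minerals.items():
          print(f"{name}: C = {criticality_potential(r, g, m):.3f}")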

  1. A systematic methodology to extend the applicability of a bioconversion model for the simulation of various co-digestion scenarios

    DEFF Research Database (Denmark)

    Kovalovszki, Adam; Alvarado-Morales, Merlin; Fotidis, Ioannis

    2017-01-01

    Detailed simulation of anaerobic digestion (AD) requires complex mathematical models and the optimization of numerous model parameters. By performing a systematic methodology and identifying parameters with the highest impact on process variables in a well-established AD model, its applicability...... was extended to various co-digestion scenarios. More specifically, the application of the step-by-step methodology led to the estimation of a general and reduced set of parameters, for the simulation of scenarios where either manure or wastewater were co-digested with different organic substrates. Validation...... experimental data quite well, indicating that it offers a reliable reference point for future simulations of anaerobic co-digestion scenarios....

  2. The applicability of the Centeno, Chaudhary and Lopez repair time standard methodology in a rail maintenance environment

    Directory of Open Access Journals (Sweden)

    Rommelspacher, Karl Otto

    2015-11-01

    The establishment of labour standards within a production environment has become common practice, and is receiving growing recognition in the maintenance environment. However, the application of labour standards in a transit maintenance organisation has received limited attention. Centeno, Chaudhary and Lopez have developed a repair time standard methodology that has been applied in the transit bus maintenance facilities of three agencies in central Florida in the USA. An investigation into the applicability of this methodology in a rail maintenance environment in South Africa forms the basis for this study.

  3. ASAM - The international programme on application of safety assessment methodologies for near surface radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Batandjieva, B.

    2002-01-01

    The IAEA has launched a new Co-ordinated Research Project (CRP) on Application of Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ASAM). The CRP will focus on the practical application of the safety assessment methodology, developed under the ISAM programme, for different purposes, such as developing design concepts, licensing, upgrading existing repositories, reassessment of operating disposal facilities. The overall aim of the programme is to assist safety assessors, regulators and other specialists involved in the development and review of safety assessment for near surface disposal facilities in order to achieve transparent, traceable and defendable evaluation of safety of these facilities. (author)

  4. Application of Response Surface Methodology in Extraction of Bioactive Component from Palm Leaves (Elaeis guineensis)

    Directory of Open Access Journals (Sweden)

    Nur Afiqah Arham

    2013-10-01

    The hydroxyl groups of the polyphenols are capable of acting as a reducing agent for the reduction reaction. The effects of drying temperature, extraction temperature and extraction duration were evaluated using a central composite design consisting of 20 experimental runs. Response surface methodology (RSM) was used to estimate the optimum parameters for extracting polyphenols from palm leaves. The corresponding analysis of the results yielded a quadratic model which can be used to find the optimum conditions of the extraction process. The optimum drying temperature, extraction temperature and extraction duration are 70°C, 70°C and 10 minutes, respectively. Total polyphenols were determined by application of the Folin-Ciocalteu micro method, and at the optimum conditions the extract was found to contain 8 mg GAE/g dry palm leaves. Doi: 10.12777/ijse.5.2.95-100 [How to cite this article: Arham, N.A., Mohamad, N.A.N., Jai, J., Krishnan, J., Noorsuhana Mohd Yusof, N.M. (2013). Application of Response Surface Methodology in Extraction of Bioactive Component from Palm Leaves (Elaeis guineensis). International Journal of Science and
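
    A minimal sketch of the kind of second-order (quadratic) response-surface fit used with a central composite design, written in Python with NumPy; the coded design points and response values below are invented for illustration and are not the study's data:

      import numpy as np

      # Coded levels of three factors (drying temperature, extraction temperature,
      # extraction time) from a face-centred central composite design, plus the
      # measured response (e.g. total polyphenols, mg GAE/g). Values are invented.
      X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                    [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
                    [-1, 0, 0], [1, 0, 0], [0, -1, 0], [0, 1, 0],
                    [0, 0, -1], [0, 0, 1], [0, 0, 0], [0, 0, 0]], dtype=float)
      y = np.array([5.1, 5.8, 6.2, 6.9, 5.5, 6.1, 6.8, 7.4,
                    6.0, 6.7, 5.9, 7.0, 6.3, 6.6, 7.9, 8.0])

      def quadratic_design_matrix(X):
          # Full second-order model: intercept, linear, two-factor interaction
          # and squared terms.
          x1, x2, x3 = X.T
          return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                  x1 * x2, x1 * x3, x2 * x3,
                                  x1 ** 2, x2 ** 2, x3 ** 2])

      beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
      print("fitted second-order coefficients:", np.round(beta, 3))

    The fitted coefficients define the quadratic surface whose stationary point gives the estimated optimum factor settings.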

  5. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization.

  6. Quantum-Mechanics Methodologies in Drug Discovery: Applications of Docking and Scoring in Lead Optimization.

    Science.gov (United States)

    Crespo, Alejandro; Rodriguez-Granillo, Agustina; Lim, Victoria T

    2017-01-01

    The development and application of quantum mechanics (QM) methodologies in computer-aided drug design have flourished in the last 10 years. Despite the natural advantage of QM methods to predict binding affinities with a higher level of theory than those methods based on molecular mechanics (MM), there are only a few examples where diverse sets of protein-ligand targets have been evaluated simultaneously. In this work, we review recent advances in QM docking and scoring for those cases in which a systematic analysis has been performed. In addition, we introduce and validate a simplified QM/MM expression to compute protein-ligand binding energies. Overall, QM-based scoring functions are generally better at predicting ligand affinities than those based on classical mechanics. However, the agreement between experimental activities and calculated binding energies is highly dependent on the specific chemical series considered. The advantage of more accurate QM methods is evident in cases where charge transfer and polarization effects are important, for example when metals are involved in the binding process or when dispersion forces play a significant role as in the case of hydrophobic or stacking interactions.

  7. Boolean modeling in systems biology: an overview of methodology and applications

    International Nuclear Information System (INIS)

    Wang, Rui-Sheng; Albert, Réka; Saadatpour, Assieh

    2012-01-01

    Mathematical modeling of biological processes provides deep insights into complex cellular systems. While quantitative and continuous models such as differential equations have been widely used, their use is obstructed in systems wherein the knowledge of mechanistic details and kinetic parameters is scarce. On the other hand, a wealth of molecular level qualitative data on individual components and interactions can be obtained from the experimental literature and high-throughput technologies, making qualitative approaches such as Boolean network modeling extremely useful. In this paper, we build on our research to provide a methodology overview of Boolean modeling in systems biology, including Boolean dynamic modeling of cellular networks, attractor analysis of Boolean dynamic models, as well as inferring biological regulatory mechanisms from high-throughput data using Boolean models. We finally demonstrate how Boolean models can be applied to perform the structural analysis of cellular networks. This overview aims to acquaint life science researchers with the basic steps of Boolean modeling and its applications in several areas of systems biology. (paper)
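
    As an illustration of the synchronous Boolean dynamic modeling and attractor analysis described above, a minimal sketch in Python for a small hypothetical three-node regulatory network; the update rules are invented for illustration and do not come from the paper:

      from itertools import product

      # Hypothetical three-node regulatory network; each rule gives the next
      # value of one node as a Boolean function of the current state.
      rules = {
          "A": lambda s: not s["C"],          # A is inhibited by C
          "B": lambda s: s["A"],              # B is activated by A
          "C": lambda s: s["A"] and s["B"],   # C requires both A and B
      }

      def step(state):
          # Synchronous update: every node evaluates its rule on the same state.
          return {node: bool(rule(state)) for node, rule in rules.items()}

      def attractor(state):
          # Iterate until a state repeats; the repeating segment is the attractor.
          seen = []
          while state not in seen:
              seen.append(state)
              state = step(state)
          return seen[seen.index(state):]

      # Enumerate the attractor reached from every possible initial state.
      for bits in product([False, True], repeat=len(rules)):
          init = dict(zip(rules, bits))
          cycle = attractor(init)
          print(tuple(map(int, bits)), "->",
                [tuple(int(v) for v in s.values()) for s in cycle])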

  8. Applications of a damage tolerance analysis methodology in aircraft design and production

    Science.gov (United States)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  9. Partial least squares path modeling basic concepts, methodological issues and applications

    CERN Document Server

    Noonan, Richard

    2017-01-01

    This edited book presents the recent developments in partial least squares-path modeling (PLS-PM) and provides a comprehensive overview of the current state of the most advanced research related to PLS-PM. The first section of this book emphasizes the basic concepts and extensions of the PLS-PM method. The second section discusses the methodological issues that are the focus of the recent development of the PLS-PM method. The third part discusses the real world application of the PLS-PM method in various disciplines. The contributions from expert authors in the field of PLS focus on topics such as the factor-based PLS-PM, the perfect match between a model and a mode, quantile composite-based path modeling (QC-PM), ordinal consistent partial least squares (OrdPLSc), non-symmetrical composite-based path modeling (NSCPM), modern view for mediation analysis in PLS-PM, a multi-method approach for identifying and treating unobserved heterogeneity, multigroup analysis (PLS-MGA), the assessment of the common method b...

  10. APPLICATION OF THE CP METHODOLOGY IN REDUCTION OF WASTE IN THE PROCESSING OF TOBACCO COMPANIES

    Directory of Open Access Journals (Sweden)

    André Luiz Emmel Silva

    2015-01-01

    The production, marketing and processing of tobacco are the base of the economy of the municipalities of Vale do Rio Pardo / RS. Although it is the raw material for various products, in this region tobacco is destined almost exclusively to the production of cigarettes. Dominated by a few large multinationals, this market moves imposing financial values, and tobacco accounts for much of the cost of production. Thus, this paper seeks to demonstrate the efficiency of applying the Cleaner Production (CP) methodology to the reduction of tobacco waste within tobacco processing and cigarette manufacturing companies. The analysis was conducted as a case study, with visits to learn the production process, identification of the points of waste, measurements, and the development of a set of measures to minimize these losses. The Cleaner Production method was chosen because it is a relatively new concept that has shown good results in the companies where it has been applied. Through the measurements, the main points of loss were identified; an analysis was then performed applying the concepts of CP, and a set of measures was proposed to reduce losses. As a result, a reduction of 83% in the rate of tobacco waste in the production process was achieved. It was concluded that CP, within the tobacco processing industry, was efficient, impacting directly on production costs, rationalizing the use of raw materials and reducing the total volume of waste generated.

  11. Application of Direct Assessment Approaches and Methodologies to Cathodically Protected Nuclear Waste Transfer Lines

    International Nuclear Information System (INIS)

    Dahl, Megan M.; Pikas, Joseph; Edgemon, Glenn L.; Philo, Sarah

    2013-01-01

    The U.S. Department of Energy's (DOE) Hanford Site is responsible for the safe storage, retrieval, treatment, and disposal of approximately 54 million gallons (204 million liters) of radioactive waste generated since the site's inception in 1943. Today, the major structures involved in waste management at Hanford include 149 carbon-steel single-shell tanks, 28 carbon-steel double-shell tanks, plus a network of buried metallic transfer lines and ancillary systems (pits, vaults, catch tanks, etc.) required to store, retrieve, and transfer waste within the tank farm system. Many of the waste management systems at Hanford are still in use today. In response to uncertainties regarding the structural integrity of these systems, an independent, comprehensive integrity assessment of the Hanford Site piping system was performed. It was found that regulators do not require the cathodically protected pipelines located within the Hanford Site to be assessed by External Corrosion Direct Assessment (ECDA) or any other method used to ensure integrity. However, a case study is presented discussing the application of the direct assessment process on pipelines in such a nuclear environment. Assessment methodology and assessment results are contained herein. An approach is described for the monitoring, integration of outside data, and analysis of this information in order to identify whether coating deterioration accompanied by external corrosion is a threat for these waste transfer lines.

  12. The Combined ASTER MODIS Emissivity over Land (CAMEL) Part 1: Methodology and High Spectral Resolution Application

    Directory of Open Access Journals (Sweden)

    E. Eva Borbas

    2018-04-01

    As part of a National Aeronautics and Space Administration (NASA) MEaSUREs (Making Earth System Data Records for Use in Research Environments) Land Surface Temperature and Emissivity project, the Space Science and Engineering Center (UW-Madison) and the NASA Jet Propulsion Laboratory (JPL) developed a global monthly mean emissivity Earth System Data Record (ESDR). This new Combined ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and MODIS (Moderate Resolution Imaging Spectroradiometer) Emissivity over Land (CAMEL) ESDR was produced by merging two current state-of-the-art emissivity datasets: the UW-Madison MODIS Infrared emissivity dataset (UW BF) and the JPL ASTER Global Emissivity Dataset Version 4 (GEDv4). The dataset includes monthly global records of emissivity and related uncertainties at 13 hinge points between 3.6–14.3 µm, as well as principal component analysis (PCA) coefficients at 5-km resolution for the years 2000 through 2016. A high spectral resolution (HSR) algorithm is provided for HSR applications. This paper describes the 13 hinge-points combination methodology and the high spectral resolution algorithm, as well as reports the current status of the dataset.

  13. Methodological application so as to obtain digital elevation models DEM in wetland areas

    International Nuclear Information System (INIS)

    Quintero, Deiby A; Montoya V, Diana M; Betancur, Teresita

    2009-01-01

    In order to understand hydrological systems and describe the flow processes that occur among their components, it is essential to have a physiographic description that includes morphometric and relief characteristics. When local studies are performed, the basic cartography available, at best at 1:25,000 scale, tends not to meet the needs of representing the water dynamics that characterize the interactions between streams, aquifers and lenticular water bodies in flat zones, particularly where wetlands lie on the ancient flood plains of rivers. A lack of financial resources is the principal obstacle to acquiring information that is current and sufficient for the scale of the project. The geomorphologic conditions of flat relief zones are a good alternative for the construction of new data. Using the basic cartography available and the new data, it is possible to obtain DEMs that are improved and consistent with the dynamics of surface and groundwater flows in the hydrological system. To accomplish this, one must use spatial modeling tools coupled with a Geographic Information System (GIS). This article presents a methodological application for the region surrounding the catchment of the wetland Cienaga Colombia in the Bajo Cauca region of Antioquia.

  14. Pursuit of new methodology on risk communication - Research assistance program by open application

    International Nuclear Information System (INIS)

    Konoa, N.; Takeshima, K.

    2004-01-01

    In the latter half of the 1990s a series of incidents occurred in Japan, such as the MOX fuel inspection data falsification, the Monju fast breeder reactor sodium leakage accident, and the Tokai nuclear fuel plant (JCO) criticality accident. It was felt that existing measures based on nuclear technology alone do not cope well with such incidents, and the need for countermeasures utilizing new methodologies from the cultural and social sciences was keenly recognized by both administrative agencies and the nuclear industry. Above all, techniques such as risk communication, which convey the influence of an incident correctly and convincingly to residents and the mass media and prevent harm due to rumor, are clearly indispensable. Based on these circumstances, the Japanese NISA (Nuclear and Industrial Safety Agency) initiated in FY2002 a new open-application project in the field of cultural and social sciences, and risk communication was one of the principal subjects of study. Up to now, 6 risk communication studies are in progress. The project was taken over from NISA by JNES (Incorporated Administrative Agency Japan Nuclear Energy Safety Organization) in FY2004. This paper shows the overall structure of the project and an outline of the running studies. (author)

  15. Application of the Coastal Hazard Wheel methodology for coastal multi-hazard assessment and management in the state of Djibouti

    DEFF Research Database (Denmark)

    Appelquist, Lars Rosendahl; Balstrøm, Thomas

    2014-01-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment and management in a changing global climate on the state of Djibouti. The methodology termed the Coastal Hazard Wheel (CHW) is developed for worldwide application and is based on a specially designed coastal classification system that incorporates the main static and dynamic parameters determining the characteristics of a coastal environment. The methodology provides information on the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding and can be used ... to support management decisions at local, regional and national level, in areas with limited access to geophysical data. The assessment for Djibouti applies a geographic information system (GIS) to develop a range of national hazard maps along with relevant hazard statistics and is showcasing the procedure ...

  16. The application of leak before break concept to W7-X target module

    Energy Technology Data Exchange (ETDEWEB)

    Dundulis, G., E-mail: gintas@mail.lei.lt; Janulionis, R.; Karalevičius, R.

    2013-11-15

    Highlights: • LBB application to the Wendelstein 7-X fusion reactor. • R6 method application to crack analysis. • Through-wall crack opening analysis. • Determination of the leak rate function. • Crack growth analysis. -- Abstract: Fusion is an energy production technology which could potentially solve the problem of the growing energy demand of the population in the future. Wendelstein 7-X (W7-X) is an experimental stellarator of the helias type of fusion reactor currently being built in Greifswald, Germany. This experimental stellarator is a complex structure, comparable to a nuclear power plant, and a high level of safety requirements should be used for its structural integrity analysis. It is thus not possible to obtain simple solutions for general cases, and sophisticated methods are therefore necessary for the analysis. Inside the Plasma Vessel (PV) of W7-X there are a number of different components such as pipes, divertors, baffles and targets. A guillotine failure of one component is very dangerous for the structural integrity of the surrounding components located in the PV. For this reason it is very important to evaluate the possibility of applying the "leak before break" (LBB) concept to W7-X. The LBB concept is widely used in the nuclear industry to describe the idea that, in piping carrying the coolant of a power reactor, a leak will occur before a catastrophic break occurs. LBB allows the structural design to be conducted without considering the loads due to postulated line breaks. The LBB analysis was made for the case when the plasma vessel is operating in "baking" mode. "Baking" is the mode when the cooling system works as a warming system and heats the plasma vessel structures up to 160 °C in order to release the absorbed gases from the surfaces and to pump them out of the plasma vessel before plasma operation. The LBB analysis was performed for the most loaded component of the target module. According to the results of the analysis it is possible to conclude that target module 1H
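
    The R6 assessment mentioned in the highlights reduces a postulated flaw to a point on a failure assessment diagram. A minimal sketch of that check in Python, using the commonly published Option 1 curve; the load, toughness and yield inputs (and the Lr cut-off) are illustrative placeholders, not values from the W7-X analysis:

      import math

      def r6_option1_limit(Lr):
          # R6 Option 1 failure assessment curve Kr = f(Lr).
          return (1.0 - 0.14 * Lr ** 2) * (0.3 + 0.7 * math.exp(-0.65 * Lr ** 6))

      def assess(K_I, K_mat, sigma_ref, sigma_y, Lr_max=1.15):
          # Place the assessment point (Lr, Kr) on the diagram.
          Kr = K_I / K_mat          # fracture ratio: applied SIF / material toughness
          Lr = sigma_ref / sigma_y  # plastic collapse ratio: reference stress / yield
          acceptable = Lr <= Lr_max and Kr <= r6_option1_limit(Lr)
          return Lr, Kr, acceptable

      # Hypothetical inputs for a postulated through-wall crack: stress intensity
      # factor and toughness in MPa*sqrt(m), stresses in MPa.
      Lr, Kr, ok = assess(K_I=45.0, K_mat=120.0, sigma_ref=130.0, sigma_y=220.0)
      print(f"Lr = {Lr:.2f}, Kr = {Kr:.2f}, acceptable: {ok}")

    A point lying below the curve and within the Lr cut-off is judged acceptable; the cut-off itself is material dependent, so the 1.15 default above is only a placeholder.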

  17. Optimization of a High Temperature PEMFC micro-CHP System by Formulation and Application of a Process Integration Methodology

    DEFF Research Database (Denmark)

    Arsalis, Alexandros; Nielsen, Mads Pagh; Kær, Søren Knudsen

    2013-01-01

    A 1 kWe micro combined heat and power (CHP) system based on high temperature proton exchange membrane fuel cell (PEMFC) technology is modeled and optimized by formulation and application of a process integration methodology. The system can provide heat and electricity for a single-family household...

  18. Application of the methodology of safety probabilistic analysis to the modelling of the emergency feedwater system of Juragua nuclear power plant

    International Nuclear Information System (INIS)

    Troncoso, M.; Oliva, G.

    1993-01-01

    The application of the methodology developed in the framework of the national plan of safety probabilistic analysis (APS) to the emergency feedwater system, for small LOCA failures and loss of external electrical supply in the nuclear power plant, is illustrated in this work. The facilities provided by the ARCON code for modelling the systems and their documentation are also described.

  19. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    Science.gov (United States)

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  20. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, like TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes, in general, may be used by teams who need to build small satellites, but, in particular, they will be used when we build the on-board software applications for the SATEX-II.

  1. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  2. Geographical targeting of poverty alleviation programs : methodology and applications in rural India

    NARCIS (Netherlands)

    Bigman, D.; Srinivasan, P.V.

    2002-01-01

    The paper presents a methodology for mapping poverty within national borders at the level of relatively small geographical areas and illustrates this methodology for India. Poverty alleviation programs in India are presently targeted only at the level of the state. All states include, however, many

  3. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM combines conservative Appendix K physical models with a statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin than classical conservative LOCA analysis using Appendix K evaluation models; the associated margin can be more than 200 K. To quantify uncertainty in a BELOCA analysis, two kinds of uncertainties generally have to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM can generate about 80-100 K of margin on PCT compared to an Appendix K bounding-state LOCA analysis.
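
    The statistical treatment of plant status uncertainties in methodologies of this kind is commonly implemented with non-parametric order statistics (Wilks' formula) to choose the number of code runs. A minimal sketch of that sample-size calculation in Python, shown as a general illustration rather than as the specific procedure of the cited paper:

      from math import comb

      def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
          # Smallest n such that the `order`-th largest of n results bounds the
          # `coverage` quantile of the output with at least `confidence` (one-sided).
          n = order
          while True:
              # P(at least `order` of the n results exceed the coverage quantile)
              conf = 1.0 - sum(comb(n, k) * coverage ** (n - k) * (1.0 - coverage) ** k
                               for k in range(order))
              if conf >= confidence:
                  return n
              n += 1

      print(wilks_sample_size())         # 59 runs for 95%/95% using the highest value
      print(wilks_sample_size(order=3))  # 124 runs when the 3rd highest value is used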

  4. Scaling up methodology for CO2 emissions in ICT applications in traffic and transport in Europe

    NARCIS (Netherlands)

    Mans, D.; Jonkers, E.; Giannelos, I.; Palanciuc, D.

    2013-01-01

    The Amitran project aims to define a reference methodology for evaluating the effects of ICT measures in traffic and transport on energy efficiency and consequently CO2 emissions. This methodology can be used as a reference by future projects and will address different modes for both passenger and

  5. THE COMPETITIVENESS OF THE SOUTH AFRICAN AND AUSTRALIAN FLOWER INDUSTRIES: An application of three methodologies.

    OpenAIRE

    van Rooyen, I.M.; Kirsten, Johann F.; van Rooyen, C.J.; Collins, Ray

    2001-01-01

    Competitiveness is defined to include both comparative and competitive advantage. Three different methodologies are applied in the analysis of the flower industries of South Africa and Australia: "Determinants of competitive advantage" methodology of Michael Porter (1990) describes the factors influencing competitive advantage; "Revealed comparative advantage" states the relative importance of flower trade in each country; and the "Policy Analyses Matrix" calculates the comparative advantage ...
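
    The "revealed comparative advantage" indicator mentioned above is normally computed as a Balassa index; a minimal sketch in Python with invented trade figures, not data from the study:

      def balassa_rca(country_flower_exports, country_total_exports,
                      world_flower_exports, world_total_exports):
          # RCA = (X_cj / X_c) / (X_wj / X_w): a value above 1 indicates a revealed
          # comparative advantage of country c in commodity j (here, flowers).
          country_share = country_flower_exports / country_total_exports
          world_share = world_flower_exports / world_total_exports
          return country_share / world_share

      # Hypothetical export values in millions of USD, purely illustrative.
      rca_country_1 = balassa_rca(60.0, 30000.0, 8000.0, 6000000.0)
      rca_country_2 = balassa_rca(25.0, 65000.0, 8000.0, 6000000.0)
      print(f"country 1 RCA: {rca_country_1:.2f}, country 2 RCA: {rca_country_2:.2f}")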

  6. Determining Faculty and Student Views: Applications of Q Methodology in Higher Education

    Science.gov (United States)

    Ramlo, Susan

    2012-01-01

    William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…

  7. Development of the methodology for application of revised source term to operating nuclear power plants in Korea

    International Nuclear Information System (INIS)

    Kang, M.S.; Kang, P.; Kang, C.S.; Moon, J.H.

    2004-01-01

    Considering the current trend of applying the revised source term proposed by NUREG-1465 to nuclear power plants in the U.S., it is expected that the revised source term will be applied to Korean operating nuclear power plants in the near future, even though the exact time cannot be estimated. To meet future technical demands, it is necessary to prepare the technical framework, including the related regulatory requirements, in advance. In this research, therefore, the aim is to develop a methodology for applying the revised source term to operating nuclear power plants in Korea. Several principles were established in developing the application methodologies. First, it should not be necessary to modify the existing regulations on the source term (i.e., no back-fitting of operating nuclear plants is required). Second, if the pertinent margin of safety is guaranteed, the revised source term suggested by NUREG-1465 may be adopted in full. Finally, part of the revised source term could be selected for application based on technical feasibility. As the result of this research, several methodologies for applying the revised source term to Korean operating nuclear power plants have been developed, which include: 1) selective (or limited) application, using only some of the characteristics of the revised source term, such as the release timing of fission products and the chemical form of radio-iodine; and 2) full application, using all the characteristics of the revised source term. The developed methodologies are applied to Ulchin 9 and 4 units and their application feasibilities are reviewed. The results of this research can be used either as a manual for establishing the plan and procedure for applying the revised source term to a domestic nuclear plant, from the utility's viewpoint, or as a technical basis for revising the related regulations, from the regulatory body's viewpoint. The application of revised source term to operating nuclear

  8. [Needs assessment to improve the applicability and methodological quality of a German S3 guideline].

    Science.gov (United States)

    Burckhardt, Marion; Hoffmann, Cristina; Nink-Grebe, Brigitte; Sänger, Sylvia

    2018-04-01

    Clinical practice guidelines can change the practice in healthcare only if their recommendations are implemented in a comprehensive way. The German S3 guideline "Local Therapy of Chronic Wounds in Patients with Peripheral Vascular Disease, Chronic Venous Insufficiency, and Diabetes" will be updated in 2017. The emphasis here is on the guideline's validity, user-friendliness and implementation into practice. Therefore, the aim was to identify the improvements required in regard to the guideline's methods and content presentation. The methodological approach used was the critical appraisal of the guideline according to established quality criteria and an additional stakeholder survey. Both were conducted between August and November 2016. The guideline and its related documents were reviewed independently by two researchers according to the criteria of the "Appraisal of Guidelines for Research and Evaluation" (AGREE-II). Published reviews and peer reviews by external experts and organisations were also taken into account. For the stakeholder survey, a questionnaire with open questions was distributed by e-mail and via the Internet to health professionals and organisations involved in the care of patients with leg ulcers in Germany. The questions were aimed at amendments and new topics based on the stakeholders' experience in inpatient and outpatient care. In addition, the survey focused on gathering suggestions to improve the applicability of the guideline. Suggested new topics and amendments were summarised thematically. The stakeholders' suggestions to improve the applicability, the results of the critical appraisal and the relevant aspects of the external reviews were then summarised according to the AGREE-II domains and presented in a cause and effect diagram. 17 questionnaires (out of 864 sent out by e-mail) were returned. Due to high practice relevance, the stakeholders suggested an expansion of the inclusion criteria to patients with infected wounds and

  9. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)

  10. Application of a new methodology for coastal multi-hazard-assessment and management on the state of Karnataka, India

    DEFF Research Database (Denmark)

    Appelquist, Lars Rosendahl; Balstrom, Thomas

    2015-01-01

    This paper presents the application of a new Methodology for coastal multi-hazard assessment & management under a changing global climate on the state of Karnataka, India. The recently published methodology termed the Coastal Hazard Wheel (CHW) is designed for local, regional and national hazard...... at a scale relevant for regional planning purposes. It uses a GIS approach to develop regional and sub-regional hazard maps as well as to produce relevant hazard risk data, and includes a discussion of uncertainties, limitations and management perspectives. The hazard assessment shows that 61 percent...

  11. The methodology of root cause analysis for equipment failure and its application at Guangdong nuclear power stations

    International Nuclear Information System (INIS)

    Gao Ligang; Lu Qunxian

    2004-01-01

    The methodology of Equipment Failure Root Cause Analysis (RCA) is described; as a systematic analysis methodology, it includes 9 steps. Its process is explained with some real examples, and the 6 precautions in applying RCA are pointed out. The paper also summarizes the experience of RCA application at the Daya Bay Nuclear Power Station, and the 7 key factors for RCA success are emphasized, mainly concerning organization, objectives, analysts, analysis techniques, the external technical support system, corrective action development, and the monitoring system for corrective actions. (authors)

  12. Methodological application of Location of service Public Bike. Service MUyBICI of Murcia

    Energy Technology Data Exchange (ETDEWEB)

    LiÑan Ruiz, R.J.; Berenguer Sempere, F.J.; Vera Lopez, J.A.; Pabon Dueñas, A.B.; Merino Cordoba, S.

    2016-07-01

    The use of non-motorized means of transport such as the bicycle brings many benefits to the user and to the city, in terms of costs and health for the former and decreased environmental pollution for the latter. Finding the optimal locations for the public bike stations aims to attract both regular and potential users, to make switching modes feasible without restrictions, and to help balance user demand towards sustainable modes of transport, with special attention to cycling and the public bike loan service. The implementation of this methodology is carried out in the municipality of Murcia (Spain), due to the opening of its public bicycle system MUyBICI, which will have 60 docking stations with a total of 1,200 anchor points and will put 600 public bicycles into circulation. As selection criteria for the optimal location of the stations, the existing network of bike paths, the roads used by all users of the public highway, a description of trips, and a database with different land uses and socioeconomic data for the transport zones were considered. This paper presents an analysis model and its application for the optimal design of station locations for the Murcia MUyBICI service. Specifically, it defines the best locations to attract a larger number of users, in order to change the modal split of the municipality and increase the number of MUyBICI users. This work was carried out under the direction of the Bicycle Office of Murcia, part of the ALEM (Local Agency for Energy and Environment) service under the Department of Environment of the City of Murcia. (Author)

  13. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of a pilot software, incorporating typical features of critical software for nuclear power plants safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  14. Application of six sigma DMAIC methodology to reduce service resolution time in a service organization

    Directory of Open Access Journals (Sweden)

    Virender Narula

    2017-11-01

    The popularity of Six Sigma, as a means for improving quality, has grown exponentially in recent years. It is a proven methodology for achieving breakthrough improvement in process performance that generates significant savings to the bottom line of an organization. This paper illustrates how the Six Sigma methodology may be used to improve service processes. The purpose of this paper is to develop Six Sigma DMAIC methodologies that would help service organizations look into their processes. In addition, it demonstrates the vital linkages between process improvement and process variation. The study identifies critical process parameters and suggests a team structure for a Six Sigma project in service operations.

  15. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
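
    A minimal sketch of the kind of Bayesian combination such a model describes: the 'soft' evidence of the technical justification is encoded as a Beta prior on the probability of detection (POD), and the 'hard' evidence from practical trials updates it through the conjugate Beta-Binomial rule. The prior strength and trial counts below are invented for illustration and are not taken from the pilot study:

      from math import lgamma, exp, log

      def beta_pdf(x, a, b):
          # Beta(a, b) density, computed via log-gamma for numerical stability.
          log_c = lgamma(a + b) - lgamma(a) - lgamma(b)
          return exp(log_c + (a - 1.0) * log(x) + (b - 1.0) * log(1.0 - x))

      def lower_credible_bound(a, b, prob=0.05, steps=200_000):
          # Crude numerical quantile: smallest x with P(POD <= x) >= prob.
          acc, dx = 0.0, 1.0 / steps
          for i in range(1, steps):
              x = i * dx
              acc += beta_pdf(x, a, b) * dx
              if acc >= prob:
                  return x
          return 1.0

      # 'Soft' evidence from the technical justification, encoded as a Beta prior
      # on the probability of detection (POD); the strength chosen here is hypothetical.
      a0, b0 = 8.0, 2.0                      # prior mean POD = 0.8

      # 'Hard' evidence from practical trials: detections and misses (hypothetical).
      detections, misses = 18, 1

      # Conjugate Beta-Binomial update.
      a, b = a0 + detections, b0 + misses
      print(f"posterior mean POD = {a / (a + b):.3f}")
      print(f"5% lower credible bound on POD = {lower_credible_bound(a, b):.3f}")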

  16. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  17. The audit of social administration (AGSC). Methodological proposal for its application in cooperative companies

    Directory of Open Access Journals (Sweden)

    Leonardo Ojeda Mesa

    2014-06-01

    This article explains the bases of a methodological proposal that facilitates the execution of Social Administration Audits (AGS) in cooperative companies, with the objective of evaluating the social management carried out by these companies.

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon

  19. Efficient Substrate Noise Coupling Verification and Failure Analysis Methodology for Smart Power ICs in Automotive Applications

    OpenAIRE

    Moursy, Yasser; Zou, Hao; Khalil, Raouf; Iskander, Ramy; Tisserand, Pierre; Ton, Dieu-My; Pasetti, Giuseppe; Louërat, Marie-Minerve

    2016-01-01

    This paper presents a methodology to analyze substrate noise coupling and reduce its effects in smart power integrated circuits. The methodology considers the propagation of minority carriers in the substrate. Hence, it models the lateral bipolar junction transistors that are layout dependent and are not modeled in conventional substrate extraction tools. It allows the designer to simulate substrate currents and check their effects on circuit functionality. The...

  20. Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions

    Science.gov (United States)

    2016-09-01

    align to a disruption or an associated downtime impacting mission performance. Reliability metrics and models were also used throughout the study to...

  1. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
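
    As a general illustration of the control-chart analysis referred to above (not the authors' data or their specific chart choice), a minimal sketch in Python of an individuals (I-MR) chart, using the conventional 2.66 factor for three-sigma limits; the daily length-of-stay values are invented:

      # Hypothetical daily mean ED length-of-stay values (minutes), purely illustrative.
      values = [182, 175, 190, 168, 177, 181, 173, 169, 160, 158, 152, 149, 155, 147]

      def imr_limits(data):
          # Centre line and three-sigma limits of an individuals (I) chart,
          # estimated from the average moving range (2.66 = 3 / d2 with d2 = 1.128).
          moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
          mr_bar = sum(moving_ranges) / len(moving_ranges)
          centre = sum(data) / len(data)
          return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

      centre, lcl, ucl = imr_limits(values)
      outside = [x for x in values if x < lcl or x > ucl]
      print(f"CL = {centre:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}, signals: {outside}")

    In practice, run rules (e.g. sustained shifts below the centre line) rather than single out-of-limit points are what signal an improvement over time.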

  2. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)
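
    A minimal sketch of the general idea of statistically propagating plant parameter uncertainties to a DNBR figure of merit (Python; the surrogate response, input distributions and the use of the 1.3 correlation limit as a reference are illustrative assumptions, not the mini-RTDP implementation):

      import random

      random.seed(1)

      def mdnbr_surrogate(power_f, flow_f, pressure_f, nominal=2.0):
          # Toy surrogate for the code-calculated minimum DNBR as a function of a
          # few plant parameters expressed as multiplicative factors about nominal.
          return nominal * flow_f ** 0.8 * pressure_f ** 0.5 / power_f ** 1.2

      # Plant status parameters treated as random variables (assumed distributions).
      n = 20000
      samples = sorted(
          mdnbr_surrogate(power_f=random.gauss(1.0, 0.02),
                          flow_f=random.gauss(1.0, 0.03),
                          pressure_f=random.gauss(1.0, 0.01))
          for _ in range(n)
      )

      fifth_percentile = samples[int(0.05 * n)]   # 95% of cases lie above this value
      correlation_limit = 1.3                     # W-3 limit cited in the abstract
      print(f"5th-percentile MDNBR: {fifth_percentile:.3f} "
            f"(margin to the 1.3 correlation limit: {fifth_percentile - correlation_limit:.3f})")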

  3. Methodological issues concerning the application of reliable laser particle sizing in soils

    Science.gov (United States)

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

    During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of measuring particle size distribution (PSD), replacing sedimentation and sieve analysis in many scientific fields, mainly due to its advantages of versatility, fast measurement and high reproducibility. Despite such developments over the last decade, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix, which induces different types of artefacts (aggregates, deflocculating dynamics, etc.), (ii) the difficulties in relating LD results to results obtained through sedimentation techniques, and (iii) the limited size range of most LD equipment. More recently, LD granulometry has slowly been gaining appreciation in soil science, also because of some innovations including an enlarged dynamic size range (0.01-2000 µm) and the ability to implement more powerful algorithms (e.g. Mie theory). Furthermore, LD PSD can be successfully used in the application of physically based pedo-transfer functions (i.e., the Arya and Paris model) for investigations of soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work performed in soil science deals with the comparison with sedimentation techniques and shows a general overestimation of the silt fraction together with a general underestimation of the clay fraction; these well known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is indeed surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and the quality of the soil samples. Our work aims to

  4. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    Science.gov (United States)

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (Gonzalez et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); and far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D. L. George, R. J. LeVeque, and K. T. Mandli, The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34 (2011), pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C. A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E. L., and T. Parsons (2005): Probabilistic Analysis of Tsunami Hazards, Nat. Hazards, 37 (3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (Available at http://pubs.usgs.gov/pp/pp1661f/). González, F
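
    A PTHA ultimately aggregates many sources into an exceedance-probability curve for an inundation metric at the site. A minimal sketch of that aggregation step, assuming independent Poissonian sources with invented rates and single characteristic inundation depths (the real study samples slip realizations and tides and runs GeoClaw for each source):

```python
import numpy as np

# Illustrative source catalogue: annual rate and the maximum inundation depth (m)
# each source produces at the site of interest (placeholders, not the Crescent City results).
sources = [
    {"name": "CSZ full rupture",  "rate": 1 / 500, "depth": 6.0},
    {"name": "CSZ partial",       "rate": 1 / 300, "depth": 3.0},
    {"name": "Far-field (Japan)", "rate": 1 / 100, "depth": 1.5},
]

T = 50.0                                  # exposure time, years
thresholds = np.arange(0.5, 8.0, 0.5)     # inundation depths of interest (m)

for zeta in thresholds:
    # Total annual rate of events exceeding depth zeta (independent Poisson sources).
    lam = sum(s["rate"] for s in sources if s["depth"] >= zeta)
    p_exceed = 1.0 - np.exp(-lam * T)     # probability of at least one exceedance in T years
    print(f"P(depth > {zeta:3.1f} m in {T:.0f} yr) = {p_exceed:.3f}")
```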

  5. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs
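
    The final CSAU element, propagating ranged contributors through the code response to a statistical statement of uncertainty, can be sketched as below; the input ranges and the surrogate response standing in for the thermal-hydraulic code are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Ranged contributors of the kind a PIRT would identify (illustrative placeholders):
gap_conductance = rng.uniform(0.8, 1.2, n)   # fuel-clad gap conductance multiplier
peaking_factor  = rng.normal(1.0, 0.03, n)   # local power peaking uncertainty
break_discharge = rng.uniform(0.9, 1.1, n)   # break discharge coefficient

# Surrogate for the code response (a real CSAU study runs the thermal-hydraulic
# code or a response surface fitted to code runs; this is only a stand-in):
pct = (1000.0
       + 1500.0 * (peaking_factor - 1.0)
       + 80.0 * (1.0 - gap_conductance)
       + 60.0 * (break_discharge - 1.0))     # peak cladding temperature, K

# Statistical statement of uncertainty: compare an upper percentile with the
# 10 CFR 50.46 acceptance limit of 2200 F (about 1478 K).
print(f"95th-percentile PCT: {np.percentile(pct, 95):7.1f} K (limit ~1478 K)")
```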

  6. Analyzing the Feasibility of Using Secure Application Integration Methodology (SAIM) for Integrating DON Enterprise Resource Planning (ERP) Applications

    National Research Council Canada - National Science Library

    Marin, Ramon

    2004-01-01

    ...) would provide useful information about a beneficial methodology. SAIM is analyzed, by accessing its step by step directions, for suitability in the integration of the Enterprise Resource Planning (ERP...

  7. Application of the Coastal Hazard Wheel methodology for coastal multi-hazard assessment and management in the state of Djibouti

    Directory of Open Access Journals (Sweden)

    Lars Rosendahl Appelquist

    2014-01-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment and management in a changing global climate to the state of Djibouti. The methodology, termed the Coastal Hazard Wheel (CHW), is developed for worldwide application and is based on a specially designed coastal classification system that incorporates the main static and dynamic parameters determining the characteristics of a coastal environment. The methodology provides information on the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding, and can be used to support management decisions at local, regional and national levels in areas with limited access to geophysical data. The assessment for Djibouti applies a geographic information system (GIS) to develop a range of national hazard maps along with relevant hazard statistics, and showcases the procedure for applying the CHW methodology to national hazard assessments. The assessment shows that the coastline of Djibouti is characterized by extensive stretches with high or very high hazards of ecosystem disruption, mainly related to coral reefs and mangrove forests, while large sections along the coastlines of especially northern and southern Djibouti have high hazard levels for gradual inundation. The hazard of salt water intrusion is moderate along most of Djibouti's coastline, although groundwater availability is considered to be very sensitive to human groundwater extraction. High or very high erosion hazards are associated with Djibouti's sedimentary plains, estuaries and river mouths, while very high flooding hazards are associated with the dry river mouths.

  8. A methodology for the design and testing of atmospheric boundary layer models for wind energy applications

    Directory of Open Access Journals (Sweden)

    J. Sanz Rodrigo

    2017-02-01

    The GEWEX Atmospheric Boundary Layer Studies (GABLS 1, 2 and 3) are used to develop a methodology for the design and testing of Reynolds-averaged Navier–Stokes (RANS) atmospheric boundary layer (ABL) models for wind energy applications. The first two GABLS cases are based on idealized boundary conditions and are suitable for verification purposes by comparing with results from higher-fidelity models based on large-eddy simulation. Results from three single-column RANS models, of 1st-, 1.5th- and 2nd-order turbulence closure, show high consistency in predicting the mean flow. The third GABLS case is suitable for the study of these ABL models under realistic forcing, such that validation against observations from the Cabauw meteorological tower is possible. The case consists of a diurnal cycle that leads to a nocturnal low-level jet and addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface and the evaluation of model results with observations. The simulations are evaluated in terms of surface-layer fluxes and wind energy quantities of interest: rotor equivalent wind speed, hub-height wind direction, wind speed shear and wind direction veer. The characterization of mesoscale forcing is based on spatially and temporally averaged momentum budget terms from Weather Research and Forecasting (WRF) simulations. These mesoscale tendencies are used to drive the single-column models, which were verified previously in the first two GABLS cases, to first demonstrate that they can produce similar wind profile characteristics to the WRF simulations even though the physics are more simplified. The added value of incorporating different forcing mechanisms into microscale models is quantified by systematically removing forcing terms in the momentum and heat equations. This mesoscale-to-microscale modeling approach is affected, to a large extent, by the input uncertainties of the mesoscale
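
    One of the wind-energy quantities of interest named above, the rotor equivalent wind speed (REWS), is a cube-weighted average of the wind speed over horizontal rotor-area segments. A small sketch with an assumed power-law profile and an illustrative rotor (not the GABLS 3/Cabauw configuration):

```python
import numpy as np

# Illustrative rotor geometry and a sheared wind profile (power law); values are
# placeholders, not the GABLS 3 case setup.
hub_height, radius = 100.0, 60.0
alpha, u_ref, z_ref = 0.2, 8.0, 100.0            # shear exponent, reference speed/height

def wind_speed(z):
    return u_ref * (z / z_ref) ** alpha

# Split the rotor disc into horizontal slices and weight each by its area.
edges = np.linspace(hub_height - radius, hub_height + radius, 21)
centres = 0.5 * (edges[:-1] + edges[1:])
offsets = np.abs(centres - hub_height)
chord = 2.0 * np.sqrt(np.maximum(radius**2 - offsets**2, 0.0))
areas = chord * np.diff(edges)                   # approximate slice areas

u = wind_speed(centres)
rews = (np.sum(areas * u**3) / np.sum(areas)) ** (1.0 / 3.0)
print(f"hub-height wind speed: {wind_speed(hub_height):.2f} m/s, REWS: {rews:.2f} m/s")
```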

  9. A methodology for the characterization and diagnosis of cognitive impairments-Application to specific language impairment.

    Science.gov (United States)

    Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2014-06-01

    The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity and overlap between associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which normal individuals and individuals with impairment present behavioral differences. Next, we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit the behavior of each participant. Finally, we use the optimized parameter values to feed different machine-learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children), using clustering techniques for the characterization and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the DT produced sensitivity, specificity and area-under-the-curve values above 90%, reaching 100% in some cases. The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better
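
    A minimal sketch of the last step, feeding per-child fitted parameter values to a classifier evaluated with leave-one-subject-out; the parameter values are synthetic stand-ins and the nearest-centroid classifier is only a placeholder for the classifiers actually compared in the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for optimized cognitive-model parameters (e.g. decay, activation noise)
# for 24 control and 24 SLI children; real values would come from per-child model fitting.
controls = rng.normal([0.50, 0.25], 0.05, size=(24, 2))
sli      = rng.normal([0.62, 0.32], 0.05, size=(24, 2))
X = np.vstack([controls, sli])
y = np.array([0] * 24 + [1] * 24)

# Leave-one-subject-out evaluation of a nearest-centroid classifier.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c0 = X[mask & (y == 0)].mean(axis=0)
    c1 = X[mask & (y == 1)].mean(axis=0)
    pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
    correct += pred == y[i]

print(f"leave-one-subject-out accuracy: {correct / len(y):.2f}")
```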

  10. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Energy Technology Data Exchange (ETDEWEB)

    Georgieva, Emiliya Lyudmilova

    2016-06-06

    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented in the Kozloduy 6 full-scope replica control room simulator.

  11. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown operation, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.
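
    The risk significance of such a precursor is usually expressed as a conditional core damage probability (CCDP): the risk model is re-quantified with the degraded conditions actually observed during the event. A deliberately simplified sketch with invented probabilities and a toy core-damage logic:

```python
# Minimal illustration of the accident sequence precursor idea: re-quantify a
# (here, drastically simplified) core-damage model with the failures actually
# observed in an operational event. All probabilities are invented placeholders.

def core_damage_probability(p_auxfw, p_bleed_feed, p_recovery):
    # Toy model: core damage occurs if auxiliary feedwater AND bleed-and-feed AND
    # late recovery all fail after a loss of main feedwater.
    return p_auxfw * p_bleed_feed * p_recovery

baseline = core_damage_probability(p_auxfw=1e-3, p_bleed_feed=1e-2, p_recovery=0.1)

# Observed event: loss of feedwater with one auxiliary feedwater train unavailable,
# so the corresponding failure probability is increased for the event conditions.
ccdp = core_damage_probability(p_auxfw=5e-2, p_bleed_feed=1e-2, p_recovery=0.1)

print(f"baseline core damage probability ~ {baseline:.1e}")
print(f"conditional core damage probability (CCDP) ~ {ccdp:.1e}")
print("precursor (typical screening threshold 1e-6):", ccdp >= 1e-6)
```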

  12. Methodology to assess the radiological sensitivity of soils: Application to Spanish soils

    International Nuclear Information System (INIS)

    Trueba Alonso, C.

    2005-01-01

    A methodology, based on standard physical and chemical soil properties, has been developed to estimate the radiological sensitivity of soils to 137Cs and 90Sr contamination. In this framework, the soil radiological sensitivity is defined as the soil's capability to mobilise or to retain these radionuclides. The purpose of this methodology is to assess, in terms of radiological sensitivity indexes, the behaviour of 137Cs and 90Sr in soils and their fluxes to man, considering two exposure pathways: external irradiation exposure and internal exposure from ingestion. The methodology is applied to the great variety of soil types found in Spain, where the soil profile is the reference unit for the assessment. The results for these soil types show that their basic soil properties are the key to categorising the radiological sensitivity according to the risks considered. The final categorisation allows especially sensitive soils to be identified and improves radiological impact assessment predictions. (Author)

  13. Application of Response Surface Methodology in Optimizing a Three Echelon Inventory System

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Razavi Hajiagha

    2014-01-01

    Inventory control is an important subject in supply chain management. In this paper, a three-echelon production, distribution and inventory system composed of one producer, two wholesalers and a set of retailers has been considered. Customers' demands follow a compound Poisson process and the inventory policy is a kind of continuous review (R, Q). Regarding the standard cost structure in an inventory model, the cost function of the system has been approximated using Response Surface Methodology as a combination of designed experiments, simulation, regression analysis and optimization. The proposed methodology can be applied as a novel method for optimizing the inventory policy of supply chains. Also, the joint optimization of the inventory parameters, including the reorder point and batch order size, is another advantage of the proposed methodology.
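
    A sketch of the response-surface step: evaluate the (noisy) simulated cost at designed (R, Q) points, fit a second-order polynomial surrogate, and optimize the surrogate. The cost function below is a toy stand-in for the three-echelon simulation, and the design levels are illustrative:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

def simulated_cost(R, Q):
    # Stand-in for the supply-chain simulation: a noisy bowl-shaped cost in (R, Q).
    return 50 + 0.04 * (R - 60) ** 2 + 0.02 * (Q - 120) ** 2 + rng.normal(0, 2)

# Three-level factorial design over reorder point R and batch size Q, replicated 3 times.
levels_R, levels_Q = [40, 60, 80], [80, 120, 160]
pts = [(R, Q) for R, Q in product(levels_R, levels_Q) for _ in range(3)]
y = np.array([simulated_cost(R, Q) for R, Q in pts])

# Second-order response surface: cost ~ 1 + R + Q + R^2 + Q^2 + R*Q
X = np.array([[1, R, Q, R * R, Q * Q, R * Q] for R, Q in pts], float)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Optimize the fitted surface on a grid (a stationary-point analysis works too).
grid = [(R, Q) for R in range(30, 91) for Q in range(70, 171)]
pred = [beta @ np.array([1, R, Q, R * R, Q * Q, R * Q]) for R, Q in grid]
R_opt, Q_opt = grid[int(np.argmin(pred))]
print(f"approximate optimum: reorder point R = {R_opt}, batch size Q = {Q_opt}")
```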

  14. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siv, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
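
    The race between fire growth and detection/suppression described above can be sketched as a simple Monte Carlo comparison of two characteristic times, with the scenario frequency obtained by multiplying the ignition frequency by the probability that suppression loses the race. All rates and time distributions below are illustrative placeholders, not the switchgear-room study values:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Illustrative inputs for one fire scenario (placeholders):
ignition_freq = 3e-3                                              # fires per room-year
t_damage = rng.lognormal(mean=np.log(15.0), sigma=0.4, size=n)    # minutes to damage the target
t_suppress = rng.exponential(scale=10.0, size=n)                  # minutes to detect and suppress

# Probability that the fire damages the target before it is suppressed.
p_non_suppression = np.mean(t_suppress > t_damage)

scenario_freq = ignition_freq * p_non_suppression
print(f"P(suppression fails before damage) = {p_non_suppression:.2f}")
print(f"scenario frequency = {scenario_freq:.2e} per year (feeds the plant-response analysis)")
```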

  15. Clearance of surface-contaminated objects from the controlled area of a nuclear facility. Application of the SUDOQU methodology

    Energy Technology Data Exchange (ETDEWEB)

    Russo, F.; Mommaert, C. [Bel V, Brussels (Belgium); Dillen, T. van [National Institute for Public Health and the Environment (RIVM), Bilthoven (Netherlands)

    2018-01-15

    The lack of clearly defined surface-clearance levels in the Belgian regulation led Bel V to start a collaboration with the Dutch National Institute for Public Health and the Environment (RIVM) to evaluate the applicability of the SUDOQU methodology for the derivation of nuclide-specific surface-clearance criteria for objects released from nuclear facilities. SUDOQU is a methodology for the dose assessment of exposure to a surface-contaminated object, with the innovative assumption of a time-dependent surface activity whose evolution is influenced by removal and deposition mechanisms. In this work, calculations were performed to evaluate the annual effective dose resulting from the use of a typical office item, e.g. a bookcase. Preliminary results provide an understanding of the interdependencies between the model's underlying mechanisms and show a strong sensitivity to the main input parameters. The results were benchmarked against those from a model described in Radiation Protection 101 to investigate the impact of the model's main assumptions. The results of the two models were in good agreement. The SUDOQU methodology appears to be a flexible and powerful tool, suitable for the proposed application. Therefore, the project will be extended to more generic study cases, to eventually develop surface-clearance levels applicable to objects leaving nuclear facilities.
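
    A sketch of the central idea, a surface activity that evolves under removal and deposition and is then folded into an annual dose; the rates, nuclide and dose-rate coefficient below are illustrative placeholders and not the RIVM parameter set:

```python
import numpy as np

# Sketch of a time-dependent surface-activity balance in the spirit of SUDOQU
# (all rates, the nuclide and the dose coefficient are illustrative placeholders).
lam_decay   = np.log(2) / (30.1 * 365 * 24)   # Cs-137 decay constant, 1/h
lam_removal = 1e-4                            # wear/cleaning removal rate, 1/h
deposition  = 0.0                             # net re-deposition onto the object, Bq/cm2/h

C0 = 1.0                                      # initial surface activity, Bq/cm2
hours = np.arange(0, 24 * 365 + 1)            # one year of office use
lam = lam_decay + lam_removal
# Analytical solution of dC/dt = deposition - lam * C with C(0) = C0:
C = (C0 - deposition / lam) * np.exp(-lam * hours) + deposition / lam

# Convert the time-integrated surface activity to an annual external dose with an
# illustrative dose-rate coefficient (Sv/h per Bq/cm2) and an occupancy fraction.
dose_rate_coeff = 2e-9
occupancy = 2000 / (24 * 365)                 # fraction of the year spent near the object
annual_dose = np.trapz(dose_rate_coeff * C * occupancy, hours)
print(f"surface activity after 1 y: {C[-1]:.2f} Bq/cm2, annual dose ~ {annual_dose * 1e6:.2f} uSv")
```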

  16. An application of Six Sigma methodology to reduce the engine-overheating problem in an automotive company

    Energy Technology Data Exchange (ETDEWEB)

    Antony, J. [Glasgow Caledonian University (United Kingdom). Six Sigma and Process Improvement Research Centre; Kumar, M. [Glasgow Caledonian University (United Kingdom). Division of Management; Tiwari, M.K. [National Institute of Foundry and Forge Technology, Ranchi (India). Department of Manufacturing Engineering

    2005-08-15

    Six Sigma is a systematic methodology for continuous process quality improvement and for achieving operational excellence. The overstatement that often accompanies the presentation and adoption of Six Sigma in industry can lead to unrealistic expectations as to what Six Sigma is truly capable of achieving. This paper deals with the application of a Six Sigma-based methodology to eliminating an engine-overheating problem in an automotive company. The DMAIC (define-measure-analyse-improve-control) approach has been followed here to solve an underlying problem of reducing process variation and the associated high defect rate. This paper explores how a foundry can use a systematic and disciplined approach to move towards the goal of Six Sigma quality level. The application of the Six Sigma methodology resulted in a reduction in the jamming problem encountered in the cylinder head and increased the process capability from 0.49 to 1.28. The application of DMAIC has had a significant financial impact (savings of over US$110,000 per annum) on the bottom line of the company. (author)
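
    The capability figures quoted (0.49 rising to 1.28) are process capability indices; a short sketch of how such an index is computed from specification limits and the measured process spread, using synthetic before/after data rather than the foundry's measurements:

```python
import numpy as np

def cpk(samples, lsl, usl):
    """Process capability index: distance from the mean to the nearer spec limit,
    expressed in units of three standard deviations."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

rng = np.random.default_rng(5)
lsl, usl = 9.0, 11.0                               # illustrative specification limits (mm)

before = rng.normal(10.3, 0.45, 200)               # wide, off-centre process
after  = rng.normal(10.0, 0.26, 200)               # tighter, centred process after DMAIC

print(f"Cpk before: {cpk(before, lsl, usl):.2f}")  # roughly 0.5
print(f"Cpk after:  {cpk(after, lsl, usl):.2f}")   # roughly 1.3
```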

  17. Application of a new methodology to the multicycle analysis for the Laguna Verde NPP in Mexico

    International Nuclear Information System (INIS)

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made to the physical and economic methodologies for the multicycle analysis of the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in our methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 with cycles of 12, 18 and 24 months. The physical and economic results obtained are shown. Furthermore, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs

  18. Putting Foucault to work: an approach to the practical application of Foucault's methodological imperatives

    Directory of Open Access Journals (Sweden)

    DAVID A. NICHOLLS

    2009-01-01

    This paper presents an overview of the methodological approach taken in a recently completed Foucauldian discourse analysis of physiotherapy practice. In keeping with other approaches common to postmodern research, this paper resists the temptation to define a proper or ‘correct’ interpretation of Foucault’s methodological oeuvre, preferring instead to apply a range of Foucauldian propositions to examples drawn directly from the thesis. In the paper I elaborate on the blended archaeological and genealogical approach I took and unpack some of the key imperatives, principles and rules I grappled with in completing the thesis.

  19. Application of the Integrated Safety Assessment methodology to safety margins. Dynamic Event Trees, Damage Domains and Risk Assessment

    International Nuclear Information System (INIS)

    Ibánez, L.; Hortal, J.; Queral, C.; Gómez-Magán, J.; Sánchez-Perea, M.; Fernández, I.; Meléndez, E.; Expósito, A.; Izquierdo, J.M.; Gil, J.; Marrao, H.; Villalba-Jabonero, E.

    2016-01-01

    The Integrated Safety Assessment (ISA) methodology, developed by the Consejo de Seguridad Nuclear, has been applied to an analysis of the Zion NPP for sequences with Loss of the Component Cooling Water System (CCWS). The ISA methodology starts from the unfolding of the Dynamic Event Tree (DET). Results from this first step allow assessing the sequence delineation of standard Probabilistic Safety Analysis results. For some sequences of interest of the outlined DET, ISA then identifies the Damage Domain (DD). This is the region of uncertain times and/or parameters where a safety limit is exceeded, which indicates the occurrence of a certain damage situation. This paper illustrates the application of this concept, obtained by simulating sequences with MAAP and with TRACE. From the simulation results of sequence transients belonging to the DD, and from the time-density probability distributions of the manual actions and of the occurrence of stochastic phenomena, ISA integrates the proposed dynamic reliability equations to obtain the sequence contribution to the global Damage Exceedance Frequency (DEF). Reported results show a slight increase in the DEF for the sequences investigated following a power uprate from 100% to 110%. This demonstrates the potential use of the method to help in the assessment of design modifications. - Highlights: • This paper illustrates an application of the ISA methodology to safety margins. • Dynamic Event Trees are a useful tool for verifying the standard PSA Event Trees. • The ISA methodology takes into account the uncertainties in human action times. • The ISA methodology shows the Damage Exceedance Frequency increase for power uprates.

  20. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    Science.gov (United States)

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  1. 76 FR 21036 - Application of the Prevailing Wage Methodology in the H-2B Program

    Science.gov (United States)

    2011-04-14

    ... Department to ``promulgate new rules concerning the calculation of the prevailing wage rate in the H-2B... wage methodology set forth in this Rule applies only to wages paid for work performed on or after...: Notice. SUMMARY: On January 19, 2011, the Department of Labor (Department) published a final rule, Wage...

  2. A general improved methodology to forecasting future oil production: Application to the UK and Norway

    International Nuclear Information System (INIS)

    Fiévet, L.; Forró, Z.; Cauwels, P.; Sornette, D.

    2015-01-01

    We present a new Monte-Carlo methodology to forecast the crude oil production of Norway and the U.K. based on a two-step process: (i) the nonlinear extrapolation of the current/past performances of individual oil fields and (ii) a stochastic model of the frequency of future oil field discoveries. Compared with the standard methodology that tends to underestimate remaining oil reserves, our method gives a better description of future oil production, as validated by our back-tests starting in 2008. Specifically, we predict remaining reserves extractable until 2030 to be 5.7 ± 0.3 billion barrels for Norway and 3.0 ± 0.3 billion barrels for the UK, which are respectively 45% and 66% above the predictions using an extrapolation of aggregate production. - Highlights: • Two-step methodology to forecast a country's oil production. • Nonlinear extrapolation of the performance of individual fields. • Stochastic model of the frequency of future discoveries. • Back-test of the methodology starting in 2008. • Improvement upon standard extrapolation of aggregate production
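
    The two-step structure can be sketched as follows: decline-extrapolate each existing field, then superpose a stochastic stream of future discoveries and read off percentiles over many scenarios. The field parameters, discovery rates and sizes below are invented, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(6)
years = np.arange(2015, 2031)

# (i) Nonlinear extrapolation of existing fields: exponential decline per field
# (the peak rates and decline constants here are invented stand-ins for fits).
fields = [{"peak": 120.0, "decline": 0.08, "t_peak": 2001},
          {"peak": 60.0,  "decline": 0.12, "t_peak": 2009},
          {"peak": 35.0,  "decline": 0.10, "t_peak": 2012}]     # kb/d

def existing_production(year):
    return sum(f["peak"] * np.exp(-f["decline"] * (year - f["t_peak"])) for f in fields)

# (ii) Stochastic future discoveries: Poisson number per year, lognormal sizes,
# each field producing with the same decline shape after a development delay.
def one_scenario():
    total = np.array([existing_production(y) for y in years], float)
    for y in years:
        for _ in range(rng.poisson(0.7)):                # expected discoveries per year
            peak = rng.lognormal(np.log(20.0), 0.8)      # kb/d at plateau
            start = y + 4                                # development delay, years
            total += np.where(years >= start,
                              peak * np.exp(-0.1 * (years - start)), 0.0)
    return total

scenarios = np.array([one_scenario() for _ in range(2000)])
p10, p50, p90 = np.percentile(scenarios[:, -1], [10, 50, 90])
print(f"production in 2030 (kb/d): P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")
```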

  3. Implementation and training methodology of subcritical reactors neutronic calculations triggered by external neutron source and applications

    International Nuclear Information System (INIS)

    Carluccio, Thiago

    2011-01-01

    This work's goal was to investigate calculation methodologies for subcritical source-driven reactors, such as the Accelerator Driven Subcritical Reactor (ADSR) and the Fusion Driven Subcritical Reactor (FDSR). Intense R&D has been carried out on these subcritical concepts, mainly due to the possibilities for transmutation of Minor Actinides (MA) and Long Lived Fission Products (LLFP). In this work, particular emphasis has been given to: (1) complementing and improving the calculation methodology with neutronic transmutation and decay capabilities and implementing it computationally, (2) utilization of this methodology in the Coordinated Research Project (CRP) of the International Atomic Energy Agency on Analytical and Experimental Benchmark Analysis of ADS and in the Collaborative Work on Use of Low Enriched Uranium in ADS, especially in the reproduction of the experimental results of the Yalina Booster subcritical assembly and the study of a subcritical core of the IPEN/MB-01 reactor, (3) comparison of calculations with different nuclear data libraries for integral parameters, such as k_eff and k_src, differential distributions, such as spectrum and flux, and nuclide inventories, and (4) application of the developed methodology in a study that may help future choices about dedicated transmutation systems. The following tools have been used in this work: MCNP (Monte Carlo N-Particle transport code), MCB (an enhanced version of MCNP that allows burnup calculation) and NJOY to process nuclear data from evaluated nuclear data files. (author)

  4. A social network perspective on teacher collaboration in schools: Theory, methodology, and applications

    NARCIS (Netherlands)

    Moolenaar, Nienke

    2012-01-01

    An emerging trend in educational research is the use of social network theory and methodology to understand how teacher collaboration can support or constrain teaching, learning, and educational change. This article provides a critical synthesis of educational literature on school social networks

  5. Q and you: The application of Q methodology in recreation research

    Science.gov (United States)

    Whitney Ward

    2010-01-01

    Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...

  6. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    Science.gov (United States)

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well-established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Here we carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture following a component-based approach. The architecture is described in depth and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the proposed architecture.

  7. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Science.gov (United States)

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  8. Simplified life cycle assessment models: methodological framework and applications to energy pathways

    International Nuclear Information System (INIS)

    Padey, Pierryves

    2013-01-01

    The energy transition debate is a key issue for today and the coming years. One of the challenges is to limit the environmental impacts of electricity production. Decision-support tools that are sufficiently accurate, simple to use, and account for environmental aspects must be implemented to support future energy choices. However, the environmental assessment of energy pathways is complex, and it requires a characterization at two levels. The first level, the 'energy pathway' level, corresponds to the environmental distribution of the pathway and is used to compare overall pathways. The second level, the 'system' level, compares the environmental impacts of individual systems within each pathway. We have devised a generic methodology covering both characterization levels: it estimates the environmental profile of an energy pathway while allowing a simple comparison of the environmental impacts of its systems. This methodology is based on the definition of a parameterized Life Cycle Assessment model and considers, through a Global Sensitivity Analysis, the environmental impacts of a large sample of systems representative of an energy pathway. As a second step, the methodology defines simplified models based on a few key parameters identified as inducing the largest variability in the energy pathway's environmental impacts. These models assess the systems' environmental impacts in a simple way, avoiding complex LCAs. The reduction methodology has been applied to the onshore wind power pathway in Europe and the photovoltaic pathway in France. (author)
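
    A sketch of the reduction idea: evaluate a parameterized impact model over a sample of systems, rank parameters by the output variability they drive (a crude correlation-based measure here, standing in for a proper global sensitivity analysis), and regress the impact on the few dominant parameters to obtain the simplified model. The wind-power impact function and parameter ranges are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000

# Parameterized "LCA model" of a wind turbine's g CO2-eq/kWh (toy stand-in).
load_factor = rng.uniform(0.20, 0.45, n)     # capacity factor
lifetime    = rng.uniform(15, 30, n)         # years
mass_factor = rng.uniform(0.8, 1.2, n)       # relative material intensity
transport   = rng.uniform(0.9, 1.1, n)       # transport distance multiplier

impact = 12.0 * mass_factor * transport / (load_factor / 0.3 * lifetime / 20)

# Crude sensitivity ranking: squared correlation with the output
# (a stand-in for the Sobol indices a full global sensitivity analysis would use).
params = {"load_factor": load_factor, "lifetime": lifetime,
          "mass_factor": mass_factor, "transport": transport}
r2 = {k: np.corrcoef(v, impact)[0, 1] ** 2 for k, v in params.items()}
print("variance explained (crude):", {k: round(v, 2) for k, v in r2.items()})

# Simplified model: regress log-impact on the two dominant parameters only.
top = sorted(r2, key=r2.get, reverse=True)[:2]
X = np.column_stack([np.ones(n)] + [np.log(params[k]) for k in top])
coef, *_ = np.linalg.lstsq(X, np.log(impact), rcond=None)
print("simplified model uses:", top, "log-linear coefficients:", np.round(coef, 2))
```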

  9. Application of a statistical methodology for the comprehension of corrosion phenomena on Phenix spent fuel pins

    International Nuclear Information System (INIS)

    Pantera, L.

    1992-11-01

    The maximum burnup of Phenix fuel elements is strongly conditioned by the internal corrosion of the steel cladding. This thesis is part of a new study program on these corrosion phenomena. Based on the results of an experimental program carried out during the years 1980-1990, its objective is the use of a statistical methodology for a better understanding of the corrosion phenomena.

  10. Advanced software development workstation: Object-oriented methodologies and applications for flight planning and mission operations

    Science.gov (United States)

    Izygon, Michel

    1993-01-01

    This report summarizes the work accomplished during the past nine months to help three different organizations involved in Flight Planning and Mission Operations systems transition to Object-Oriented Technology by adopting one of the currently most widely used Object-Oriented analysis and design methodologies.

  11. Methodology for estimating biomass energy potential and its application to Colombia

    International Nuclear Information System (INIS)

    Gonzalez-Salazar, Miguel Angel; Morini, Mirko; Pinelli, Michele; Spina, Pier Ruggero; Venturini, Mauro; Finkenrath, Matthias; Poganietz, Witold-Roger

    2014-01-01

    Highlights: • Methodology to estimate the biomass energy potential and its uncertainty at a country level. • Harmonization of approaches and assumptions in existing assessment studies. • The theoretical and technical biomass energy potentials in Colombia are estimated for 2010. - Abstract: This paper presents a methodology to estimate the biomass energy potential and its associated uncertainty at a country level when the quality and availability of data are limited. The current biomass energy potential in Colombia is assessed following the proposed methodology and the results are compared to existing assessment studies. The proposed methodology is a bottom-up, resource-focused approach with statistical analysis that uses a Monte Carlo algorithm to stochastically estimate the theoretical and the technical biomass energy potential. The paper also includes a proposed approach to quantify uncertainty, combining a probabilistic propagation of uncertainty, a sensitivity analysis and a set of disaggregated sub-models to estimate the reliability of predictions and reduce the associated uncertainty. Results predict a theoretical energy potential of 0.744 EJ and a technical potential of 0.059 EJ in 2010, which might account for 1.2% of the annual primary energy production (4.93 EJ)
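
    A sketch of the bottom-up stochastic estimate: for each residue stream, potential = production × residue-to-product ratio × heating value, with uncertain factors sampled Monte Carlo style and a technical potential obtained by applying a sampled recoverable fraction. The streams and numbers are invented and are not the Colombian inventory:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50_000

# Invented residue streams: production (Mt/yr), residue-to-product ratio,
# lower heating value (GJ/t). Availability fractions are sampled as uncertain.
streams = [
    {"name": "sugarcane bagasse", "prod": 22.0, "rpr": 0.25, "lhv": 7.5},
    {"name": "rice husk",         "prod": 2.5,  "rpr": 0.20, "lhv": 13.0},
    {"name": "palm oil residues", "prod": 5.0,  "rpr": 0.40, "lhv": 11.0},
]

theoretical = np.zeros(n)
technical = np.zeros(n)
for s in streams:
    residue = s["prod"] * rng.normal(s["rpr"], 0.1 * s["rpr"], n)   # Mt/yr
    energy = residue * s["lhv"] * 1e-3                              # Mt/yr * GJ/t -> EJ/yr
    theoretical += energy
    technical += energy * rng.uniform(0.05, 0.25, n)                # recoverable fraction

for label, x in [("theoretical", theoretical), ("technical", technical)]:
    lo, med, hi = np.percentile(x, [5, 50, 95])
    print(f"{label:11s} potential: {med:.3f} EJ/yr (90% interval {lo:.3f}-{hi:.3f})")
```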

  12. Methodology for the biosphere evaluation during the RRAA management. Application for the Mediterranean system

    International Nuclear Information System (INIS)

    Pinedo, P.; Simon, I.; Aguero, A.

    1998-01-01

    For several years CIEMAT has been developing for ENRESA knowledge and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water, etc.), and of the impacts arising from the resulting distribution of radionuclides in the biosphere. In 1996, a Methodology to analyse the biosphere in the context of high level waste repositories was proposed to ENRESA, where the issues mentioned above were considered and treated. The level of development of the different aspects proposed within the Methodology was quite heterogeneous and, while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development using the RES matrix and the description of biosphere systems representative of the long term, needed further development. These methodological developments have proceeded in parallel with similar international developments in which there was and is an active participation: the BIOMOVS II international project, finalized in 1996, in which the so-called Reference Biosphere Methodology was developed, and the International Atomic Energy Agency (IAEA) Programme on Biosphere Modelling and Assessment (BIOMASS), which is at present being developed in collaboration with several national organizations, ENRESA and CIEMAT among them. The work described here takes account of these international developments. The overall purpose of this work is to apply the Methodology to the last performance assessment (PA) exercise made by ENRESA, using from it the general and particular information about the assessment context, the source term, and the geo-biosphere interface data. (Author) 6 refs

  13. Application of CASMO-4/MICROBURN-B2 methodology to mixed cores with Westinghouse Optima2 fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hsiao, Ming Yuan; Wheeler, John K.; Hoz, Carlos de la [Nuclear Fuels, Warrenville (United States)

    2008-10-15

    The first application of CASMO-4/MICROBURN-B2 methodology to Westinghouse SVEA-96 Optima2 reload cycle is described in this paper. The first Westinghouse Optima2 reload cycle in the U.S. is Exelon's Quad Cities Unit 2 Cycle 19 (Q2C19). The core contains fresh Optima2 fuel and once burned and twice burned GE14 fuel. Although the licensing analyses for the reload cycle are performed by Westinghouse with Westinghouse methodology, the core is monitored with AREVA's POWERPLEX-III core monitoring system that is based on the CASMO-4/MICROBURN-B2 (C4/B2) methodology. This necessitates the development of a core model based on the C4/B2 methodology for both reload design and operational support purposes. In addition, as expected, there are many differences between the two vendors' methodologies; they differ not only in modeling some of the physical details of the Optima2 bundles but also in the modeling capability of the computer codes. In order to have high confidence that the online core monitoring results during the cycle startup and operation will comply with the Technical Specifications requirements (e.g., thermal limits, shutdown margins), the reload core design generated by Westinghouse design methodology was confirmed by the C4/B2 model. The C4/B2 model also assures that timely operational support during the cycle can be provided. Since this is the first application of C4/B2 methodology to an Optima2 reload in the US, many issues in the lattice design, bundle design, and reload core design phases were encountered. Many modeling issues have to be considered in order to develop a successful C4/B2 core model for the Optima2/GE14 mixed core. Some of the modeling details and concerns and their resolutions are described. The Q2C19 design was successfully completed and the 2 year cycle successfully started up in April 2006 and shut down in March 2008. Some of the operating results are also presented.

  14. Application of CASMO-4/MICROBURN-B2 methodology to mixed cores with Westinghouse Optima2 fuel

    International Nuclear Information System (INIS)

    Hsiao, Ming Yuan; Wheeler, John K.; Hoz, Carlos de la

    2008-01-01

    The first application of CASMO-4/MICROBURN-B2 methodology to Westinghouse SVEA-96 Optima2 reload cycle is described in this paper. The first Westinghouse Optima2 reload cycle in the U.S. is Exelon's Quad Cities Unit 2 Cycle 19 (Q2C19). The core contains fresh Optima2 fuel and once burned and twice burned GE14 fuel. Although the licensing analyses for the reload cycle are performed by Westinghouse with Westinghouse methodology, the core is monitored with AREVA's POWERPLEX-III core monitoring system that is based on the CASMO-4/MICROBURN-B2 (C4/B2) methodology. This necessitates the development of a core model based on the C4/B2 methodology for both reload design and operational support purposes. In addition, as expected, there are many differences between the two vendors' methodologies; they differ not only in modeling some of the physical details of the Optima2 bundles but also in the modeling capability of the computer codes. In order to have high confidence that the online core monitoring results during the cycle startup and operation will comply with the Technical Specifications requirements (e.g., thermal limits, shutdown margins), the reload core design generated by Westinghouse design methodology was confirmed by the C4/B2 model. The C4/B2 model also assures that timely operational support during the cycle can be provided. Since this is the first application of C4/B2 methodology to an Optima2 reload in the US, many issues in the lattice design, bundle design, and reload core design phases were encountered. Many modeling issues have to be considered in order to develop a successful C4/B2 core model for the Optima2/GE14 mixed core. Some of the modeling details and concerns and their resolutions are described. The Q2C19 design was successfully completed and the 2 year cycle successfully started up in April 2006 and shut down in March 2008. Some of the operating results are also presented

  15. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling utilizing physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict the engine response to load variation, regarding both turbocharger performance and closed-cycle parameters as well as NOx emission trends, is demonstrated, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.

  16. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled "Emergency Core Cooling System; Revisions to Acceptance Criteria." The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  17. Methodology for accident analyses of fusion breeder blankets and its application to helium-cooled pebble bed blanket

    International Nuclear Information System (INIS)

    Panayotov, Dobromir; Grief, Andrew; Merrill, Brad J.; Humrickhouse, Paul; Trow, Martin; Dillistone, Michael; Murgatroyd, Julian T.; Owen, Simon; Poitevin, Yves; Peers, Karen; Lyons, Alex; Heaton, Adam; Scott, Richard

    2016-01-01

    Highlights: • Safety demonstration of the Test Blanket Systems (TBS) and DEMO breeding blankets (BB). • Comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena. • Development of accident analysis specifications (AAS) via the use of phenomena identification and ranking tables (PIRT). • PIRT application to identify required physical models for BB accident analysis, code assessment and selection. • Development of MELCOR and RELAP5 TBS models. • Qualification of the models via comparison with finite element calculations, code-to-code comparisons, and sensitivity studies. - Abstract: ‘Fusion for Energy’ (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER, and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. The methodology phases are illustrated in the paper by its application to the EU HCPB TBS using both the MELCOR and RELAP5 codes.

  18. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
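
    A sketch of the evaluation loop: fit a baseline model on a training period, predict a later prediction period, and score the predictions with metrics such as the normalized mean bias error and CV(RMSE). The synthetic meter data and the simple degree-day regression below stand in for the real buildings and the five models evaluated:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic daily meter data: consumption driven by outdoor temperature plus noise.
days = 730
temp = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
load = (500 + 12 * np.maximum(temp - 18, 0)
            + 8 * np.maximum(10 - temp, 0) + rng.normal(0, 25, days))

train, test = slice(0, 365), slice(365, 730)    # training period, prediction period

# Simple baseline model: regression on heating/cooling degree-day style features.
def features(t):
    return np.column_stack([np.ones_like(t), np.maximum(t - 18, 0), np.maximum(10 - t, 0)])

beta, *_ = np.linalg.lstsq(features(temp[train]), load[train], rcond=None)
pred = features(temp[test]) @ beta

resid = load[test] - pred
nmbe = resid.mean() / load[test].mean() * 100
cvrmse = np.sqrt(np.mean(resid ** 2)) / load[test].mean() * 100
print(f"NMBE = {nmbe:+.1f} %, CV(RMSE) = {cvrmse:.1f} % (daily predictions)")
```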

  19. Methodology for accident analyses of fusion breeder blankets and its application to helium-cooled pebble bed blanket

    Energy Technology Data Exchange (ETDEWEB)

    Panayotov, Dobromir, E-mail: dobromir.panayotov@f4e.europa.eu [Fusion for Energy (F4E), Josep Pla, 2, Torres Diagonal Litoral B3, Barcelona E-08019 (Spain); Grief, Andrew [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom); Merrill, Brad J.; Humrickhouse, Paul [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID (United States); Trow, Martin; Dillistone, Michael; Murgatroyd, Julian T.; Owen, Simon [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom); Poitevin, Yves [Fusion for Energy (F4E), Josep Pla, 2, Torres Diagonal Litoral B3, Barcelona E-08019 (Spain); Peers, Karen; Lyons, Alex; Heaton, Adam; Scott, Richard [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom)

    2016-11-01

    Highlights: • Safety demonstration of the Test Blanket Systems (TBS) and DEMO breeding blankets (BB). • Comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena. • Development of accident analysis specifications (AAS) via the use of phenomena identification and ranking tables (PIRT). • PIRT application to identify required physical models for BB accident analysis, code assessment and selection. • Development of MELCOR and RELAP5 TBS models. • Qualification of the models via comparison with finite element calculations, code-to-code comparisons, and sensitivity studies. - Abstract: ‘Fusion for Energy’ (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER, and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. The methodology phases are illustrated in the paper by its application to the EU HCPB TBS using both the MELCOR and RELAP5 codes.

  1. Project Management Methodology for the Development of M-Learning Web Based Applications

    Directory of Open Access Journals (Sweden)

    Adrian VISOIU

    2010-01-01

    M-learning web-based applications are a particular case of web applications designed to be operated from mobile devices, with the purpose of implementing learning activities. Project management of such applications takes into account the identified peculiarities. M-learning web-based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influence on the analysis, design, construction and testing phases. Activities building up a work breakdown structure for the development of m-learning web-based applications are presented. Project monitoring and control techniques are proposed. Resources required for projects are discussed.

  2. Other best-estimate code and methodology applications in addition to licensing

    International Nuclear Information System (INIS)

    Tanarro, A.

    1999-01-01

    Along with their applications for licensing purposes, best-estimate thermal-hydraulic codes allow for a wide scope of additional uses and applications, in which results as realistic and reliable as possible are necessary. Although many of these applications have been successfully developed to date, the use of best-estimate codes for applications other than those associated with licensing processes is not so well known among the nuclear community. This paper presents some of these applications, briefly describing their most significant and specific features. (Author)

  3. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    Science.gov (United States)

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
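
    As a concrete illustration of the data-parallel scheme described in this record, the following Python sketch partitions a training set across worker processes and aggregates partial error and gradient evaluations for a toy single-layer network. It uses Python's multiprocessing as a stand-in for the parallel virtual machine; the network, data and learning rule are invented for illustration and are not the authors' implementation.

```python
# Illustrative data-parallel gradient evaluation (the paper uses PVM; this
# sketch substitutes Python's multiprocessing and a toy one-layer network).
import numpy as np
from multiprocessing import Pool

def partial_loss_grad(args):
    """Return (sum of squared errors, gradient sum) for one data partition."""
    w, X, y = args
    p = 1.0 / (1.0 + np.exp(-X @ w))          # sigmoid outputs
    err = p - y
    loss = np.sum(err ** 2)
    grad = X.T @ (2.0 * err * p * (1.0 - p))  # d(loss)/dw for this chunk
    return loss, grad

def distributed_train(X, y, n_workers=4, lr=0.1, epochs=200):
    w = np.zeros(X.shape[1])
    chunks = np.array_split(np.arange(len(y)), n_workers)
    with Pool(n_workers) as pool:
        for _ in range(epochs):
            parts = pool.map(partial_loss_grad,
                             [(w, X[idx], y[idx]) for idx in chunks])
            grad = sum(g for _, g in parts)   # master aggregates gradients
            w -= lr * grad / len(y)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
    print("trained weights:", distributed_train(X, y))
```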

  4. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The GO methodology is applied to the reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas are derived for the steady-state reliability of a system with shared signals, together with dynamic formulas for the state probability of a two-state unit. A method for obtaining the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The results show that the GO methodology is simple and useful for the steady-state and dynamic reliability analysis of repairable systems
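
    The record mentions dynamic state-probability formulas for a two-state (up/down) repairable unit. The minimal sketch below gives the standard Markov result for such a unit, assuming constant failure and repair rates; it illustrates the kind of quantity computed, not the GO-methodology formulas derived in the paper, and the rates are illustrative.

```python
# Availability of a two-state repairable unit (constant failure rate lam,
# repair rate mu), starting in the working state -- the standard Markov
# result; the GO-methodology formulas of the paper are not reproduced here.
import math

def availability(t, lam, mu):
    """Point availability A(t) for an up/down Markov unit starting 'up'."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

lam, mu = 1e-3, 0.1          # illustrative rates per hour
for t in (0.0, 10.0, 100.0, 1e4):
    print(f"A({t:g} h) = {availability(t, lam, mu):.6f}")
print("steady-state availability:", mu / (lam + mu))
```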

  5. Application of the accident management information needs methodology to a severe accident sequence

    International Nuclear Information System (INIS)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that the plant instrumentation and information systems adequately provide this information to the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel

  6. Application of the accident management information needs methodology to a severe accident sequence

    Energy Technology Data Exchange (ETDEWEB)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R. (Idaho National Engineering Laboratory, Idaho Falls (USA)); Solberg, D.E. (Nuclear Regulatory Commission, Washington, DC (USA))

    1989-11-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that the plant instrumentation and information systems adequately provide this information to the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel.

  7. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    Science.gov (United States)

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.

  8. Prediction Methodology for Proton Single Event Burnout: Application to a STRIPFET Device

    CERN Document Server

    Siconolfi, Sara; Oser, Pascal; Spiezia, Giovanni; Hubert, Guillaume; David, Jean-Pierre

    2015-01-01

    This paper presents a single event burnout (SEB) sensitivity characterization for power MOSFETs, independent of testing, through a prediction model derived from TCAD analysis and knowledge of the device topology. The methodology is applied to a STRIPFET device and compared to proton data obtained at PSI, showing good agreement in the order of magnitude of the proton SEB cross section and thus validating the prediction model as an alternative means of characterizing devices with respect to SEB.

  9. Multi-objective and multi-physics optimization methodology for SFR core: application to CFV concept

    International Nuclear Information System (INIS)

    Fabbris, Olivier

    2014-01-01

    Nuclear reactor core design is a highly multidisciplinary task involving neutronics, thermal-hydraulics, fuel thermo-mechanics and the fuel cycle. The problem is moreover multi-objective (several performance measures) and highly dimensional (several tens of design parameters). As the reference deterministic calculation codes for core characterization require significant computing resources, the classical design method is not well suited to investigating and optimizing new innovative core concepts. To cope with these difficulties, a new methodology has been developed in this thesis. Our work is based on the development and validation of simplified neutronics and thermal-hydraulics calculation schemes allowing the full characterization of a sodium-cooled fast reactor core regarding both neutronics performance and behavior during thermal-hydraulic dimensioning transients. The developed methodology uses surrogate models (or meta-models) able to replace the neutronics and thermal-hydraulics calculation chain. Advanced mathematical methods for the design of experiments and the building and validation of meta-models allow this calculation chain to be substituted by regression models with high prediction capability. The methodology is applied over a very large design space to a challenging core called CFV (French acronym for low-void-effect core) offering a large gain on the sodium void effect. Global sensitivity analysis identifies the design parameters significant for the core design and its behavior during unprotected transients, which can lead to severe accidents. Multi-objective optimizations lead to alternative core configurations with significantly improved performance. Validation results demonstrate the relevance of the methodology at the pre-design stage of a sodium-cooled fast reactor core. (author) [fr
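
    To make the meta-model idea concrete, the sketch below fits a quadratic response surface to a small design of experiments drawn from a stand-in "expensive" function and checks its predictive error on held-out points. The function, sample sizes and polynomial form are assumptions for illustration; the thesis uses far more advanced design-of-experiments and meta-model validation techniques.

```python
# Minimal surrogate-model sketch: replace an "expensive" calculation by a
# quadratic response surface fitted to a small design of experiments.
# expensive_code is a stand-in for the real neutronics/thermal-hydraulics chain.
import numpy as np

def expensive_code(x):                     # placeholder for the real solver
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 1]

def quad_features(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(30, 2))      # small space-filling sample
coef, *_ = np.linalg.lstsq(quad_features(X_train),
                           expensive_code(X_train), rcond=None)

# Validate the meta-model on points not used for the fit.
X_test = rng.uniform(-1, 1, size=(200, 2))
pred = quad_features(X_test) @ coef
rmse = np.sqrt(np.mean((pred - expensive_code(X_test)) ** 2))
print("surrogate RMSE on test sample:", rmse)
```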

  10. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  11. Adjustment methodology for preliminary study on the distribution of bone tissue boron. Potential therapeutic applications

    International Nuclear Information System (INIS)

    Brandizzi, D; Dagrosa, A; Carpano, M.; Olivera, M. S.; Nievas, S; Cabrini, R.L.

    2013-01-01

    Boron is an element that has an affinity for bone tissue and is considered relevant to bone health. Boron compounds are used in Boron Neutron Capture Therapy (BNCT) in the form of sodium borocaptate (BSH) and boronophenylalanine (BPA). The results of clinical trials to date are encouraging but not conclusive. At an experimental level, some groups have applied BNCT to osteosarcomas. We present preliminary methodological adjustments for determining the presence of boron in bone. (author)

  12. APPLICATION OF LOT QUALITY ASSURANCE SAMPLING FOR ASSESSING DISEASE CONTROL PROGRAMMES - EXAMINATION OF SOME METHODOLOGICAL ISSUES

    OpenAIRE

    T. R. RAMESH RAO

    2011-01-01

    Lot Quality Assurance Sampling (LQAS), a statistical tool from the industrial setting, has been in use since 1980 for monitoring and evaluating programs on disease control, immunization status among children, and health workers' performance in the health system. When conducting LQAS in the field there are occasions when, even after due care in design, practical and methodological issues must be addressed before it can be recommended for implementation and intervention. LQAS is applied under the assumpti...

  13. Application of the risk-informed methodology for APR1400 P-T limits curve

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.; Namgung, I. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-07-01

    A reactor pressure vessel (RPV) in a nuclear power plant has operational limits of pressure and temperature to prevent a potential drastic propagation of cracks due to brittle fracture. We call it a pressure-temperature limits curve (P-T limits curve). Appendix G of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI, provides deterministic procedures to develop the P-T limits curve for the reactor pressure vessel. Recently, an alternative risk-informed methodology has been added in the ASME Code. Risk-informed means that we can consider insights from a probabilistic risk assessment by using this methodology. This alternative methodology provides a simple procedure to develop risk-informed P-T limits for heat up, cool down, and hydrostatic test events. The risk-informed P-T limits curve is known to provide more operational flexibility, particularly for reactor pressure vessels with relatively high irradiation levels and radiation sensitive materials. In this paper, we developed both the deterministic and a risk-informed P-T limits curve for an APR1400 reactor using Appendix G of the ASME Code, Section XI and compare the results in terms of additional operational margin. (author)
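
    For orientation, the deterministic Appendix G approach mentioned above is built around the ASME lower-bound fracture toughness curve K_Ic = 33.2 + 20.734·exp[0.02(T − RT_NDT)] (ksi·√in, T in °F). The sketch below turns that curve into an allowable-pressure point using a factor of 2 on the membrane stress intensity factor and neglecting thermal stresses; the membrane correction factor, vessel dimensions and RT_NDT are illustrative placeholders rather than APR1400 values, and the full code procedure involves additional terms.

```python
# Hedged sketch of a deterministic P-T limit point in the spirit of ASME
# Sec. XI, App. G: K_Ic = 33.2 + 20.734*exp[0.02(T - RT_NDT)] (ksi*sqrt(in)),
# with a factor of 2 on the membrane stress intensity factor and thermal
# stresses neglected (steady state).  Mm, Ri, t and RT_NDT are illustrative,
# not design values.
import numpy as np

def k_ic(T, RT_NDT):
    return 33.2 + 20.734 * np.exp(0.02 * (T - RT_NDT))   # ksi*sqrt(in)

def allowable_pressure(T, RT_NDT, Ri=78.0, t=8.0, Mm=2.0):
    """Allowable pressure (ksi) at coolant temperature T (deg F)."""
    # 2*K_Im <= K_Ic  with  K_Im = Mm * (p*Ri/t)  for a postulated flaw
    # (Mm lumps the flaw-size dependence and carries sqrt(in) units).
    return k_ic(T, RT_NDT) / (2.0 * Mm * Ri / t)

RT_NDT = 60.0                       # assumed adjusted reference temperature
for T in range(70, 320, 50):
    print(f"T = {T:3d} F  ->  P_allow ~ {allowable_pressure(T, RT_NDT):.2f} ksi")
```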

  14. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    Science.gov (United States)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
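
    A generic flavor of combining feedback with a tracking feedforward term is sketched below using a discrete-time LQR and a static gain that inverts the closed-loop DC gain. This is a textbook construction with an invented second-order plant, not the stochastic feedforward/feedback methodology or the autoland design described in the record.

```python
# Generic sketch (not the NASA methodology itself): LQR state feedback plus a
# static feedforward gain that inverts the closed-loop DC gain so the output
# tracks a commanded value.  The plant matrices below are illustrative.
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)       # feedback gain

Acl = A - B @ K
dc_gain = (C @ np.linalg.solve(np.eye(2) - Acl, B))[0, 0]
N = 1.0 / dc_gain                                        # feedforward gain

x, r = np.zeros((2, 1)), 1.0                             # track a unit command
for _ in range(50):
    u = -K @ x + N * r
    x = A @ x + B @ u
print("output after 50 steps:", (C @ x).item())
```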

  15. Methodology Development and Applications of Proliferation Resistance and Physical Protection Evaluation

    International Nuclear Information System (INIS)

    Bari, R.A.; Peterson, P.F.; Therios, I.U.; Whitlock, J.J.

    2010-01-01

    We present an overview of the program on the evaluation methodology for proliferation resistance and physical protection (PR and PP) of advanced nuclear energy systems (NESs) sponsored by the Generation IV International Forum (GIF). For a proposed NES design, the methodology defines a set of challenges, analyzes system response to these challenges, and assesses outcomes. The challenges to the NES are the threats posed by potential actors (proliferant States or sub-national adversaries). The characteristics of Generation IV systems, both technical and institutional, are used to evaluate the response of the system and to determine its resistance against proliferation threats and robustness against sabotage and terrorism threats. The outcomes of the system response are expressed in terms of a set of measures, which are the high-level PR and PP characteristics of the NES. The methodology is organized to allow evaluations to be performed at the earliest stages of system design and to become more detailed and more representative as the design progresses. It can thus be used to enable a program in safeguards by design or to enhance the conceptual design process of an NES with regard to intrinsic features for PR and PP.

  16. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

    Full Text Available An urban landslide vulnerability assessment methodology is proposed with major focus on considering urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two popular building structure types, reinforced-concrete frame and nonreinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance level of the vulnerable population, the trigger factors of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate loss of life and indirect damage under landslides, as well as resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that higher-population-density areas with weaker fiscal conditions that are located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.

  17. Influence of activated carbon characteristics on toluene and hexane adsorption: Application of surface response methodology

    Science.gov (United States)

    Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa

    2013-01-01

    The objective of this study was to evaluate the adsorption capacity of toluene and hexane over activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. The response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration, obtaining different values of surface groups from the different techniques because of the limitations of each technique, but similar trends for the activated carbons studied. The exhaustive characterization of the activated carbons allows us to state that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. A low activation temperature and an intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
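
    The response-surface step described above can be illustrated with the following sketch, which fits a second-order model of adsorption capacity versus three coded preparation variables and locates the stationary point of the fitted surface. The design and data are synthetic; the paper's actual experimental design and coefficients are not reproduced.

```python
# Response-surface sketch: fit a second-order model of adsorption capacity
# versus activation temperature, impregnation ratio and activation time
# (coded -1..+1), then locate the stationary point.  Data are synthetic.
import itertools
import numpy as np

def features(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

# Full three-level factorial design in coded units
X = np.array(list(itertools.product([-1, 0, 1], repeat=3)), dtype=float)
rng = np.random.default_rng(7)
y = (200 - 30*X[:, 0] + 10*X[:, 1] - 15*X[:, 0]**2 - 8*X[:, 1]**2
     - 5*X[:, 2]**2 + 6*X[:, 0]*X[:, 2] + rng.normal(0, 2, len(X)))

b, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Stationary point of the fitted quadratic: solve (2*Bq) xs = -b_lin
B = np.array([[2*b[7], b[4],   b[5]],
              [b[4],   2*b[8], b[6]],
              [b[5],   b[6],   2*b[9]]])
xs = np.linalg.solve(B, -b[1:4])
print("stationary point (coded units):", xs)
print("predicted response there:", (features(xs[None, :]) @ b)[0])
```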

  18. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    Science.gov (United States)

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
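
    The four calibration steps can be illustrated with the short sketch below: an exponential flux profile is fitted to low-frequency trap data, a calibration factor is derived from concurrent high-frequency counts, and the factor is applied to a high-frequency count series. Heights, fluxes and counts are synthetic placeholders, not field data.

```python
# Sketch of the four calibration steps: (1) fit q(z) = q0*exp(-z/zq) to
# low-frequency (LF) trap fluxes, (2) derive a calibration factor against
# concurrent high-frequency (HF) particle counts, (3) scale the HF counts,
# (4) aggregate to total flux.  All numbers are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def q_profile(z, q0, zq):
    return q0 * np.exp(-z / zq)

# (1) LF trap data: heights (m) and height-specific fluxes (g/m^2/s)
z = np.array([0.05, 0.10, 0.20, 0.35, 0.50])
q_lf = np.array([12.0, 8.1, 3.9, 1.2, 0.45])
(q0, zq), _ = curve_fit(q_profile, z, q_lf, p0=(10.0, 0.1))
Q_lf = q0 * zq                      # vertically integrated flux, g/m/s

# (2) calibration factor from concurrent HF counts over the same interval
hf_counts = np.array([140.0, 150.0, 160.0])
cal = Q_lf / hf_counts.mean()

# (3)-(4) apply the factor to the HF count series and aggregate
hf_series = np.array([130.0, 155.0, 170.0, 90.0, 210.0])   # e.g. 25 Hz counts
Q_hf = cal * hf_series
print(f"fit: q0={q0:.2f}, zq={zq:.3f} m, Q_LF={Q_lf:.2f} g/m/s")
print("calibrated HF total flux series:", np.round(Q_hf, 2))
```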

  19. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
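
    The idea of parsing model error out of other error sources can be illustrated with a V&V-style decomposition in the spirit of ASME V&V 20 (comparison error versus combined numerical, input and experimental uncertainty). This is offered only as a generic sketch with illustrative numbers; it is not necessarily the authors' exact formulation.

```python
# A V&V-style sketch of separating model error from other error sources
# (in the spirit of ASME V&V 20; not necessarily the authors' formulation).
# E = S - D is the comparison error; numerical, input and experimental
# uncertainties combine into u_val, and the modelling error is bounded by
# E +/- k*u_val.  All values are illustrative.
import math

S = 0.48          # simulated velocity at a validation point (m/s)
D = 0.45          # PIV-measured velocity at the same point (m/s)
u_num = 0.006     # numerical (discretization/iterative) uncertainty
u_input = 0.010   # uncertainty from input parameters (geometry, BCs, ...)
u_D = 0.012       # experimental (PIV) uncertainty

E = S - D
u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)
k = 2.0           # coverage factor

print(f"comparison error E     = {E:+.3f} m/s")
print(f"validation uncertainty = {u_val:.3f} m/s")
print(f"model error interval   = [{E - k*u_val:+.3f}, {E + k*u_val:+.3f}] m/s")
```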

  20. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its more important aspects such as fault tree construction, evaluation techniques and their use in the risk and reliability assessment of a system. In view of their importance, topics such as common mode failures, human errors, the data bases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies in the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author) [pt
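
    As a minimal illustration of fault tree quantification (not the PREP/KITT/SAMPLE codes used in the study), the sketch below evaluates a hypothetical top event from its minimal cut sets using the rare-event approximation and the min-cut upper bound.

```python
# Minimal fault-tree quantification sketch: top-event probability from
# minimal cut sets of a hypothetical system.
import numpy as np

basic = {"pump_fails": 3e-3, "valve_stuck": 1e-3,
         "operator_error": 5e-3, "power_loss": 2e-4}

# Minimal cut sets of an invented top event
cut_sets = [("pump_fails", "valve_stuck"),
            ("operator_error",),
            ("pump_fails", "power_loss")]

cut_probs = [np.prod([basic[e] for e in cs]) for cs in cut_sets]
rare_event = sum(cut_probs)                              # sum of cut-set probabilities
mcub = 1.0 - np.prod([1.0 - p for p in cut_probs])       # min-cut upper bound

print("cut-set probabilities:", [f"{p:.2e}" for p in cut_probs])
print(f"rare-event estimate  : {rare_event:.3e}")
print(f"min-cut upper bound  : {mcub:.3e}")
```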

  1. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

    Full Text Available We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions by following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students’ learning motivation has been significantly enhanced.

  2. Level II Probabilistic Safety Analysis Methodology for the Application to GEN-IV Sodium-cooled Fast Reactor

    International Nuclear Information System (INIS)

    Park, S. Y.; Kim, T. W.; Han, S. H.; Jeong, H. Y.

    2010-03-01

    The Korea Atomic Energy Research Institute (KAERI) has been developing liquid metal reactor (LMR) design technologies under a National Nuclear R and D Program. Nevertheless, there is no domestic experience of probabilistic safety assessment (PSA) for a fast reactor with metal fuel. Therefore, the objective of this study is to establish risk assessment methodologies for the reference design of the GEN-IV sodium-cooled fast reactor (SFR). The applicability of the PSA methodology of the U.S. NRC and of the PRISM plant to the domestic GEN-IV SFR has been studied. The study contains a plant damage state analysis, a containment event tree analysis, and a source-term release category binning process.

  3. Methodology for the economic evaluation of the application of wind and photovoltaic energy to seawater desalination

    International Nuclear Information System (INIS)

    Cisneros Ramirez, Cesar A.

    2007-01-01

    The methodology presented allows a preliminary evaluation of the cost of desalinated seawater ($/m3) for an off-grid system fed by renewable energy (wind and/or a photovoltaic array) or by an electric generator. Production capacities are limited to 100 m 3 /d. The desalination plant can be fed by a single energy source or by more than one of them, the latter case constituting a hybrid-fed system. In all cases the need for energy storage by means of batteries was considered, except when the supply was provided by an electric generator. An annex presents a table with the results of the application of the methodology
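
    The kind of preliminary cost figure such a methodology produces can be sketched with a simple levelized-cost-of-water calculation: annualized investment plus operation and maintenance divided by annual water production. All cost, capacity and lifetime values below are illustrative assumptions, not figures from the report.

```python
# Rough levelized-cost-of-water ($/m^3) screening calculation; the capacity,
# cost and lifetime figures are illustrative placeholders.
def crf(rate, years):
    """Capital recovery factor for annualizing investment cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

capex = 250_000.0        # $  (RO unit + wind turbine/PV array + batteries)
om_per_year = 12_000.0   # $/yr operation and maintenance
capacity = 60.0          # m^3/day plant capacity (< 100 m^3/d as in the study)
availability = 0.85      # fraction of the year the plant actually produces
rate, life = 0.08, 20    # discount rate and project lifetime (years)

annual_water = capacity * 365.0 * availability            # m^3/yr
lcow = (capex * crf(rate, life) + om_per_year) / annual_water
print(f"levelized cost of water ~ {lcow:.2f} $/m^3")
```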

  4. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    International Nuclear Information System (INIS)

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; Trow, Martin; Dillistone, Michael

    2016-01-01

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.

  5. Definition of a methodology for the management of geological heritage. An application to the Azores archipelago (Portugal)

    Science.gov (United States)

    Lima, Eva; Nunes, João; Brilha, José; Calado, Helena

    2013-04-01

    The conservation of the geological heritage requires the support of appropriate policies, which should be the result of the integration of nature conservation, environmental and land-use planning, and environmental education perspectives. There are several papers about inventory methodologies for geological heritage and its scientific, educational and tourism uses (e.g. Cendrero, 2000; Lago et al., 2000; Brilha, 2005; Carcavilla et al., 2007). However, management methodologies for geological heritage are still poorly developed. They should be included in environmental and land-use planning and nature conservation policies, in order to support a holistic approach to natural heritage. This gap is explained by the fact that geoconservation is a new geoscience still in need of more basic scientific research, like any other geoscience (Henriques et al., 2011). It is necessary to establish protocols and mechanisms for the conservation and management of geological heritage. This is a complex type of management because it needs to address not only the fragile natural features to preserve but also legal, economic, cultural, educational and recreational aspects. In addition, a management methodology should ensure geosite conservation, local development and the dissemination of the geological heritage (Carcavilla et al., 2007). This work is part of a PhD project aiming to contribute to filling this gap in the geoconservation domain, specifically by establishing an appropriate methodology for the management of geological heritage, taking into account the natural diversity of geosites and the variety of natural and anthropic threats. The proposed methodology will be applied to the geological heritage of the Azores archipelago, whose management acquires particular importance and urgency after the decision of the Regional Government to create the Azores Geopark and its application to the European and Global Geoparks Networks. Acknowledgment This work is

  6. Safety Assessment Methodologies and Their Application in Development of Near Surface Waste Disposal Facilities--ASAM Project

    International Nuclear Information System (INIS)

    Batandjieva, B.; Metcalf, P.

    2003-01-01

    Safety of near surface disposal facilities is a primary focus and objective of stakeholders involved in the radioactive waste management of low and intermediate level waste, and safety assessment is an important tool contributing to the evaluation and demonstration of the overall safety of these facilities. It plays a significant role in different stages of development of these facilities (site characterization, design, operation, closure), and especially for those facilities for which safety assessment has not been performed or safety has not yet been demonstrated and whose future has not been decided. Safety assessments also create the basis for the safety arguments presented to nuclear regulators, the public and other interested parties in respect of the safety of existing facilities, the measures to upgrade existing facilities, and the development of new facilities. The International Atomic Energy Agency (IAEA) has initiated a number of coordinated research projects in the field of development and improvement of approaches and methodologies for safety assessment of near surface disposal facilities, such as the NSARS (Near Surface Radioactive Waste Disposal Safety Assessment Reliability Study) and ISAM (Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities) projects. These projects were very successful and showed that there is a need to promote the consistent application of the safety assessment methodologies and to explore approaches to the regulatory review of safety assessments and safety cases in order to make safety-related decisions. These objectives have been the basis of the IAEA follow-up coordinated research project ASAM (Application of Safety Assessment Methodologies for Near Surface Disposal Facilities), which will commence in November 2002 and continue for a period of three years

  7. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    The issue of a revised rule for loss-of-coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best-estimate (BE) computer codes in safety analysis, together with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which provides uncertainty bounds in a cost-effective, auditable, rational and practical manner. 8 figs., 2 tabs

  8. History, applications, methodological issues and perspectives for the use of environmental DNA (eDNA) in marine and freshwater environments.

    Science.gov (United States)

    Díaz-Ferguson, Edgardo E; Moyer, Gregory R

    2014-12-01

    Genetic material (short DNA fragments) left behind by species in nonliving components of the environment (e.g. soil, sediment, or water) is defined as environmental DNA (eDNA). This DNA has been previously described as particulate DNA and has been used to detect and describe microbial communities in marine sediments since the mid-1980s and phytoplankton communities in the water column since the early 1990s. More recently, eDNA has been used to monitor invasive or endangered vertebrate and invertebrate species. While there is a steady increase in the applicability of eDNA as a monitoring tool, a variety of eDNA applications are emerging in fields such as forensics, population and community ecology, and taxonomy. This review provides scientists with an understanding of the methods underlying eDNA detection as well as applications, key methodological considerations, and emerging areas of interest for its use in the ecology and conservation of freshwater and marine environments.

  9. Research Activities on Development of Piping Design Methodology of High Temperature Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Nam-Su [Seoul National Univ. of Science and Technology, Seoul(Korea, Republic of); Won, Min-Gu [Sungkyukwan Univ., Suwon (Korea, Republic of); Oh, Young-Jin [KEPCO Engineering and Construction Co. Inc., Gimcheon (Korea, Republic of); Lee, Hyeog-Yeon; Kim, Yoo-Gon [Korea Atomic Energy Research Institute, Daejeon(Korea, Republic of)

    2016-10-15

    An SFR operates at high temperature and low pressure compared with a commercial pressurized water reactor (PWR), and such operating conditions lead to time-dependent damage such as creep rupture, excessive creep deformation, creep-fatigue interaction and creep crack growth. Thus, high temperature design and structural integrity assessment methodology should be developed considering such failure mechanisms. In terms of the design of mechanical components of an SFR, the ASME B and PV Code, Sec. III, Div. 5 and RCC-MRx provide high temperature design and assessment procedures for nuclear structural components operated at high temperature, and a Leak-Before-Break (LBB) assessment procedure for high temperature piping is also provided in RCC-MRx, A16. Three web-based evaluation programs based on the current high temperature codes were developed for structural components of high temperature reactors. Moreover, for the detailed LBB analyses of high temperature piping, new engineering methods for predicting the creep C*-integral and the creep COD rate, based either on GE/EPRI or on reference stress concepts, were proposed. Finally, numerical methods based on Garofalo's model and RCC-MRx have been developed and implemented in ABAQUS. The predictions based on both models were compared with experimental results, and the predictions from Garofalo's model described the deformation behavior of Gr. 91 at elevated temperatures reasonably well.
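
    To give a feel for the quantities estimated by such engineering methods, the sketch below evaluates the textbook reference-stress estimate of the creep C*-integral, C* ≈ σ_ref·ε̇_c(σ_ref)·(K/σ_ref)², with a Norton creep law. The material constants and loading are illustrative, not Gr. 91 data, and the expression is the generic reference-stress estimate rather than the specific methods proposed by the authors.

```python
# Textbook reference-stress estimate of the creep C*-integral,
#   C* ~ sigma_ref * eps_dot_c(sigma_ref) * (K / sigma_ref)^2,
# with a Norton creep law eps_dot = A * sigma^n.  Constants are illustrative.
def cstar_reference_stress(K, sigma_ref, A, n):
    """K in MPa*sqrt(m), sigma_ref in MPa, Norton law in 1/h -> C* in MPa*m/h."""
    eps_dot = A * sigma_ref ** n          # creep strain rate at sigma_ref
    return sigma_ref * eps_dot * (K / sigma_ref) ** 2

K = 25.0            # MPa*sqrt(m), elastic stress intensity factor
sigma_ref = 120.0   # MPa, reference stress (load / limit load * yield)
A, n = 1.0e-20, 6.0 # illustrative Norton constants (1/h, MPa^-n)

print(f"C* ~ {cstar_reference_stress(K, sigma_ref, A, n):.3e} MPa*m/h")
```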

  10. THE PROPOSED METHODOLOGIES FOR THE SIX SIGMA METHOD AND TQM STRATEGY AS WELL AS THEIR APPLICATION IN PRACTICE IN MACEDONIA

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2014-05-01

    Full Text Available This paper presents the proposed methodologies for the Six Sigma method and the TQM strategy as well as their application in practice in Macedonia. Although the philosophy of total quality management (TQM) is deeply embedded in many industries and business areas of European and other countries, it is insufficiently known and present in our country and other developing countries. The same applies to the Six Sigma approach of reducing the dispersion of a process, which is present in only a small fraction of Macedonian companies. The results of the implementation have shown that the application of the Six Sigma approach does not refer merely to the number of defects per million opportunities but to the systematic and systemic lowering of process dispersion. The operation and effect of the implementation of the Six Sigma method engages experts who receive a salary depending on the success of the Six Sigma program. On the other hand, the results of the application of the TQM methodology within Macedonian companies will depend on the commitment of all employees and their motivation.

  11. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management from a practical and effective point of view. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. This work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity

  12. Application of Taguchi methodology to improve the functional quality of a mechanical device

    International Nuclear Information System (INIS)

    Regeai, Awatef Omar

    2005-01-01

    Manufacturing and quality control are recognized branches of engineering management. Special attention has been paid to improving the tools and methods for the purpose of improving product quality and finding solutions for any obstacles and/or problems during the production process. The Taguchi methodology is one of the most powerful techniques for improving product and manufacturing process quality at low cost. It is a strategic and practical method that aims to assist managers and industrial engineers in tackling manufacturing quality problems in a systematic and structured manner. The potential benefit of the Taguchi methodology lies in its ease of use, its emphasis on reducing variability to give more economical products, and hence its accessibility to the engineering fraternity for solving real-life quality problems. This study applies the Taguchi methodology to improve the functional quality of a locally made chain gear by a proposed heat treatment process. The hardness of steel is generally a function not of its composition only, but rather of its heat treatment. The study investigates the effects of various heat treatment parameters, including ramp rate of heating, normalizing holding time, normalizing temperature, annealing holding time, annealing temperature, hardening holding time, hardening temperature, quenching media, tempering temperature and tempering holding time, upon the hardness, which is a measure of resistance to plastic deformation. Both analysis of means (ANOM) and signal-to-noise (S/N) ratio analysis have been carried out to determine the optimal conditions of the process. A significant improvement of the functional quality characteristic (hardness) by more than 32% was obtained. The scanning electron microscopy technique was used in this study to obtain visual evidence of the quality and continuous improvement of the heat treated samples. (author)
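
    The ANOM and S/N analysis mentioned above can be illustrated with the following sketch, which computes larger-the-better signal-to-noise ratios for a small two-factor orthogonal-array experiment and averages them per factor level. The factors, levels and hardness values are invented for illustration and are not the study's data.

```python
# Sketch of the Taguchi analysis steps: larger-the-better signal-to-noise
# ratios and analysis of means (ANOM) over a small two-factor array.
# Factor levels and hardness values are invented for illustration.
import numpy as np

def sn_larger_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# runs: (hardening temperature level, quenching medium level) -> hardness HRC
runs = [((1, 1), [48, 50]), ((1, 2), [52, 53]),
        ((2, 1), [55, 54]), ((2, 2), [58, 60])]

sn = {levels: sn_larger_better(y) for levels, y in runs}

# ANOM on the S/N ratios: average S/N at each level of each factor
for factor in (0, 1):
    for level in (1, 2):
        vals = [v for lv, v in sn.items() if lv[factor] == level]
        print(f"factor {factor + 1}, level {level}: mean S/N = {np.mean(vals):.2f} dB")
```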

  13. Direct potable reuse microbial risk assessment methodology: Sensitivity analysis and application to State log credit allocations.

    Science.gov (United States)

    Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P

    2018-01-01

    Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
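
    A stripped-down version of the kind of calculation behind log-credit evaluation is sketched below: a raw-wastewater pathogen density is reduced by a treatment-train log reduction value, converted to a dose, passed through an exponential dose-response model and annualized. All parameter values, including the dose-response parameter, are illustrative assumptions rather than the paper's inputs.

```python
# Minimal QMRA-style sketch: raw-wastewater pathogen density, a treatment-
# train log reduction value (LRV), an exponential dose-response model and an
# annual risk estimate.  All parameter values are illustrative.
import numpy as np

c_raw = 1.0e5        # pathogens per litre in raw wastewater
lrv = 12.0           # log10 reduction credited to the treatment train
volume = 2.0         # litres of finished water ingested per day
r = 0.1              # exponential dose-response parameter (illustrative)

dose = c_raw * 10.0 ** (-lrv) * volume          # pathogens ingested per day
p_daily = 1.0 - np.exp(-r * dose)               # probability of infection per day
p_annual = 1.0 - (1.0 - p_daily) ** 365         # annual probability of infection

print(f"daily dose            : {dose:.3e}")
print(f"daily infection risk  : {p_daily:.3e}")
print(f"annual infection risk : {p_annual:.3e}")
print("meets 1e-4 benchmark? ", p_annual <= 1e-4)
```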

  14. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    Science.gov (United States)

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

    Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them into a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an ongoing failing regimen. Our aim was also to investigate the impact of using different cross-validation strategies and different loss functions. Four different partitions between training and validation sets were tested with two loss functions. Six statistical methods were compared. We assess performance by evaluating R(2) values and accuracy by calculating the rates of patients being correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided similar results to those of this new predictor. Slight discrepancies arise between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model provided around 80% of patients correctly classified. The difference between the lower and higher rates is around 10 percent. The number of mutations retained in different learners also varies from one to 41. Conclusions. The more recent Super Learner methodology, combining the predictions of many learners, provided good performance on our small dataset.
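
    The core Super Learner step, choosing learner weights by minimizing cross-validated error, can be illustrated as follows with two deliberately simple stand-in learners and non-negative least squares on the cross-validated predictions. This is a generic sketch, not the six learners or the loss functions compared in the study.

```python
# Sketch of the Super Learner idea: combine candidate learners with weights
# chosen by minimizing cross-validated squared error (non-negative least
# squares, weights then normalized).  The learners are simple stand-ins.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n, p = 80, 10                              # small sample, as in the trial setting
X = rng.normal(size=(n, p))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, n)

def fit_ols(Xtr, ytr):                     # learner 1: ordinary least squares
    b, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
    return lambda Xte: Xte @ b

def fit_mean(Xtr, ytr):                    # learner 2: constant prediction
    m = ytr.mean()
    return lambda Xte: np.full(len(Xte), m)

learners = [fit_ols, fit_mean]
folds = np.array_split(rng.permutation(n), 5)

# Level-one matrix Z: cross-validated predictions of each learner
Z = np.zeros((n, len(learners)))
for te in folds:
    tr = np.setdiff1d(np.arange(n), te)
    for j, fit in enumerate(learners):
        Z[te, j] = fit(X[tr], y[tr])(X[te])

w, _ = nnls(Z, y)
w = w / w.sum()
print("Super Learner weights:", np.round(w, 3))
```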

  15. Methodology for the application of probabilistic safety analysis to the cobalt therapy units in Cuba

    International Nuclear Information System (INIS)

    Vilaragut Llanes, Juan Jose; Ferro Fernandez, Ruben; Troncoso Fleitas, Mayra; Lozano Lima, Berta; De la Fuente Puch, Andres; Perez Reyes, Yolanda; Dumenigo Gonzalez, Cruz

    2001-01-01

    In the present work, the main elements taken into account for the use of probabilistic safety analysis in the evaluation of the safety of the cobalt therapy units in Cuba are discussed. As part of the results of the first stage of the study, the methodological guide being used in an IAEA (OIEA) research contract currently carried out by the team of authors from the CNSN, together with other specialists from the Ministry of Public Health (MINSAP), is presented

  16. Applicability of risk-informed criticality methodology to spent fuel repositories

    International Nuclear Information System (INIS)

    Mays, C.; Thomas, D.A.; Favet, D.

    2000-01-01

    An important objective of geologic disposal is keeping the fissionable material in a condition so that a self-sustaining nuclear chain reaction (criticality) is highly unlikely. This objective supports the overall performance objective of any repository, which is to protect the health and safety of the public by limiting radiological exposure. This paper describes a risk-informed, performance-based methodology, which combines deterministic and probabilistic approaches for evaluating the criticality potential of high-level waste and spent nuclear fuel after the repository is sealed and permanently closed (postclosure). (authors)

  17. Application of ASSET methodology and operational experience feedback of NPPs in China

    Energy Technology Data Exchange (ETDEWEB)

    Lan, Ziyong [The National Nuclear Safety Administration, Beijing (China)

    1997-10-01

    The introduction of the ASSET methodology to China started in March 1992, when 3 experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback took place in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996; the NNSA and the GNPP hosted the seminar, and 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs.

  18. Methodology of Integration for Competitive Technical Intelligence with Blue Ocean Strategy: Application to an exotic fruit

    Directory of Open Access Journals (Sweden)

    Marisela Rodríguez Salvador

    2011-12-01

    Full Text Available This article presents a new methodology that integrates Competitive Technical Intelligence with Blue Ocean Strategy. We explore new business niches by taking advantage of the synergy that both areas offer, developing a model based on cyclic interactions through a process organized in two stages: understanding the opportunity, from idea formulation to decision making, and strategic development. The validity of our approach (first stage) was observed in the evaluation of an exotic fruit, Anacardium occidentale, in the south of the State of Veracruz, Mexico, with the support of the university ITESM, Campus Monterrey. We identified critical factors for success, opportunities and threats. Results confirm the attractiveness of this crop.

  19. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): The generalised linear spatial model...... priors for Bayesian inference is discussed. Procedures for parameter estimation and prediction are studied. Theoretical properties of Markov chain Monte Carlo algorithms are investigated, and different algorithms are compared. In addition, the thesis contains a manual for an R-package, geoRglmm, which...

  20. Application of a systematic methodology for sustainable carbon dioxide utilization process design

    DEFF Research Database (Denmark)

    Plaza, Cristina Calvera; Frauzem, Rebecca; Gani, Rafiqul

    than carbon capture and storage. To achieve this a methodology is developed to design sustainable carbon dioxide utilization processes. First, the information on the possible utilization alternatives is collected, including the economic potential of the process and the carbon dioxide emissions...... emission are desired in order to reduce the carbon dioxide emissions. Using this estimated preliminary evaluation, the top processes, with the most negative carbon dioxide emission are investigated by rigorous detailed simulation to evaluate the net carbon dioxide emissions. Once the base case design...

  1. THEORY AND METHODOLOGY OF ICT APPLICATION INTO A SOCIAL STUDY IN ABROAD AND UKRAINE: GENERAL APPROACHES

    Directory of Open Access Journals (Sweden)

    Olena O. Hrytsenchuk

    2011-02-01

    Full Text Available The article presents an analysis of the theoretical and methodological foundations of the implementation of information and communication technology (ICT) in secondary education, and particularly in the social sciences, in Western Europe, the USA and Ukraine today. Materials and documents of the UNDP, the Council of Europe and the Organisation for Economic Co-operation and Development (OECD), as well as documents of the legal and regulatory framework of education and the school curricula and programs of foreign countries and Ukraine, were researched. Approaches to ICT use in the social-science subject areas of secondary school are outlined, and the article covers some directions of national education strategies for using ICT in Western Europe, the USA and Ukraine, as well as prospects for development.

  2. Electrodeposition of Iridium Oxide by Cyclic Voltammetry: Application of Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Kakooei Saeid

    2014-07-01

    Full Text Available The effects of scan rate, temperature, and number of cycles on the coating thickness of IrOx electrodeposited on a stainless steel substrate by cyclic voltammetry were investigated within a statistical framework. The central composite design, combined with response surface methodology, was used to study the electrodeposition conditions. All fabricated electrodes were characterized using electrochemical methods. Field emission scanning electron microscopy and energy-dispersive X-ray spectroscopy were performed for IrOx film characterization. Results showed that scan rate significantly affects the thickness of the electrodeposited layer. Also, the number of cycles has a greater effect than temperature on the IrOx thickness.

  3. Application of frequency domain line edge roughness characterization methodology in lithography

    Science.gov (United States)

    Sun, Lei; Wang, Wenhui; Beique, Genevieve; Wood, Obert; Kim, Ryoung-Han

    2015-03-01

    A frequency domain 3 sigma LER characterization methodology combining the standard deviation and power spectral density (PSD) methods is proposed. In the new method, the standard deviation is calculated in the frequency domain instead of the spatial domain as in the conventional method. The power spectrum of the LER is divided into three regions: low frequency (LF), middle frequency (MF) and high frequency (HF) regions. The frequency region definition is based on process visual comparisons. Three standard deviation numbers are used to characterize the LER in the three frequency regions. Pattern wiggling can be detected quantitatively with a wiggling factor which is also proposed in this paper.
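
    The band-wise 3 sigma idea can be illustrated with the sketch below: the power spectral density of a synthetic edge-deviation profile is computed, and a 3 sigma value is reported per frequency band using Parseval's relation (variance equals the integral of the PSD). The band boundaries and the synthetic edge are illustrative choices, not the paper's process-based definitions.

```python
# Frequency-domain 3-sigma decomposition sketch: PSD of a line-edge deviation
# profile, split into LF/MF/HF bands; band boundaries and the synthetic edge
# are illustrative.
import numpy as np

dx = 1.0                                    # sampling step along the line (nm)
n = 2048
x = np.arange(n) * dx
rng = np.random.default_rng(5)
edge = (2.0 * np.sin(2 * np.pi * x / 400.0)         # low-frequency wiggle
        + rng.normal(0, 1.0, n))                    # high-frequency roughness
edge -= edge.mean()

freq = np.fft.rfftfreq(n, d=dx)                     # cycles per nm
psd = (np.abs(np.fft.rfft(edge)) ** 2) / (n / dx)   # one-sided PSD
psd[1:-1] *= 2.0                                    # fold in negative frequencies

bands = {"LF": (0.0, 1/200.0), "MF": (1/200.0, 1/20.0), "HF": (1/20.0, freq[-1])}
df = freq[1] - freq[0]
for name, (lo, hi) in bands.items():
    mask = (freq > lo) & (freq <= hi)
    sigma = np.sqrt(np.sum(psd[mask]) * df)         # Parseval: var = integral PSD
    print(f"{name}: 3-sigma = {3 * sigma:.2f} nm")
```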

  4. Application of ASSET methodology and operational experience feedback of NPPs in China

    International Nuclear Information System (INIS)

    Ziyong Lan

    1997-01-01

    The introduction of the ASSET methodology to China started in March 1992, when 3 experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback took place in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996; the NNSA and the GNPP hosted the seminar, and 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs

  5. Eliciting and communicating expert judgments: Methodology and application to nuclear safety

    International Nuclear Information System (INIS)

    Winterfeldt, D. von

    1989-01-01

    The most ambitious and certainly the most extensive formal expert judgment process was the elicitation of numerous events and uncertain quantities for safety issues in five nuclear power plants in the U.S. The general methodology for formal expert elicitations is described. An overview of the expert elicitation process of NUREG-1150 is provided, and the elicitation of probabilities for the interfacing systems loss-of-coolant accident (ISL) in PWRs is given as an example of this elicitation process. Some lessons learned from this study are presented. (DG)

  6. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on the specific problems posed by the application of the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of structural behaviour in the case of seismic events; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall of an experimental fusion reactor by means of the Bersafe code. (orig.)

  7. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    Science.gov (United States)

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  8. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    International Nuclear Information System (INIS)

    Acero, R; Pueo, M; Santolaria, J; Aguilar, J J; Brau, A

    2015-01-01

    High-range measuring instruments such as laser trackers need large-dimension calibrated reference artifacts in their calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points and the reference lengths derived from them is linked to the concept of the indexed metrology platform and to knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. It is the measuring instrument, together with the indexed metrology platform, that remains still, while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker proved the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures. (paper)

  9. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    Science.gov (United States)

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring instruments such as laser trackers need large-dimension calibrated reference artifacts in their calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points and the reference lengths derived from them is linked to the concept of the indexed metrology platform and to knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. It is the measuring instrument, together with the indexed metrology platform, that remains still, while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker proved the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures.

  10. Nuclear Forensics: A Methodology Applicable to Nuclear Security and to Non-Proliferation

    International Nuclear Information System (INIS)

    Mayer, K; Wallenius, M; Luetzenkirchen, K; Galy, J; Varga, Z; Erdmann, N; Buda, R; Kratz, J-V; Trautmann, N; Fifield, K

    2011-01-01

    Nuclear Security aims at the prevention and detection of, and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear material. Nuclear Forensics is a key element of nuclear security. Nuclear Forensics is defined as a methodology that aims at re-establishing the history of nuclear material of unknown origin. It is based on indicators that arise from known relationships between material characteristics and process history. Thus, nuclear forensics analysis includes the characterization of the material and correlation with production history. To this end, we can make use of parameters such as the isotopic composition of the nuclear material and accompanying elements, chemical impurities, macroscopic appearance and microstructure of the material. In the present paper, we discuss the opportunities for attribution of nuclear material offered by nuclear forensics as well as its limitations. Particular attention will be given to the role of nuclear reactions. Such reactions include the radioactive decay of the nuclear material, but also reactions with neutrons. When uranium (of natural composition) is exposed to neutrons, plutonium is formed, as well as 236U. We will illustrate the methodology using the example of a piece of uranium metal that dates back to the German nuclear program in the 1940's. A combination of different analytical techniques and model calculations enables a nuclear forensics interpretation, thus correlating the material characteristics with the production history.

  11. Nuclear Forensics: A Methodology Applicable to Nuclear Security and to Non-Proliferation

    Science.gov (United States)

    Mayer, K.; Wallenius, M.; Lützenkirchen, K.; Galy, J.; Varga, Z.; Erdmann, N.; Buda, R.; Kratz, J.-V.; Trautmann, N.; Fifield, K.

    2011-09-01

    Nuclear Security aims at the prevention and detection of, and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear material. Nuclear Forensics is a key element of nuclear security. Nuclear Forensics is defined as a methodology that aims at re-establishing the history of nuclear material of unknown origin. It is based on indicators that arise from known relationships between material characteristics and process history. Thus, nuclear forensics analysis includes the characterization of the material and correlation with production history. To this end, we can make use of parameters such as the isotopic composition of the nuclear material and accompanying elements, chemical impurities, macroscopic appearance and microstructure of the material. In the present paper, we discuss the opportunities for attribution of nuclear material offered by nuclear forensics as well as its limitations. Particular attention will be given to the role of nuclear reactions. Such reactions include the radioactive decay of the nuclear material, but also reactions with neutrons. When uranium (of natural composition) is exposed to neutrons, plutonium is formed, as well as 236U. We will illustrate the methodology using the example of a piece of uranium metal that dates back to the German nuclear program in the 1940's. A combination of different analytical techniques and model calculations enables a nuclear forensics interpretation, thus correlating the material characteristics with the production history.

  12. Application of response surface methodology to optimize uranium biological leaching at high pulp density

    International Nuclear Information System (INIS)

    Fatemi, Faezeh; Arabieh, Masoud; Jahani, Samaneh

    2016-01-01

    The aim of the present study was to carry out uranium bioleaching via optimization of the leaching process using response surface methodology. For this purpose, a native Acidithiobacillus sp. was adapted to different pulp densities, and the optimization process was then carried out at a high pulp density. Response surface methodology based on a Box-Behnken design was used to optimize the uranium bioleaching. The effects of six key parameters on the bioleaching efficiency were investigated. The process was modeled with a mathematical equation including not only first- and second-order terms but also probable interaction effects between each pair of factors. The results showed that the extraction efficiency of uranium dropped from 100% at pulp densities of 2.5, 5, 7.5 and 10% to 68% at a pulp density of 12.5%. Using RSM, the optimum conditions for uranium bioleaching at 12.5% (w/v) were identified as pH = 1.96, temperature = 30.90 °C, stirring speed = 158 rpm, 15.7% inoculum, FeSO₄·7H₂O concentration of 13.83 g/L and (NH₄)₂SO₄ concentration of 3.22 g/L, which achieved 83% uranium extraction efficiency. The uranium bioleaching experiment using the optimized parameters showed 81% uranium extraction over 15 d. The obtained results reveal that RSM is reliable and appropriate for optimization of the parameters involved in the uranium bioleaching process.
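    As an illustration of the second-order response-surface model referred to above, the sketch below fits a full quadratic model (linear, squared and pairwise interaction terms) to a Box-Behnken layout by ordinary least squares. The three coded factors and the recovery values are synthetic stand-ins, not the study's data; the actual design used six factors.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Full second-order RSM model: intercept, linear, squared and pairwise interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                                # first-order terms
    cols += [X[:, i] ** 2 for i in range(k)]                           # pure quadratic terms
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]  # interaction terms
    return np.column_stack(cols)

# Hypothetical coded settings (-1, 0, +1) for three factors (e.g. pH, temperature,
# inoculum) in a Box-Behnken layout, with synthetic recovery responses for illustration.
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([61, 70, 66, 78, 59, 72, 64, 80, 63, 74, 68, 81, 83, 82, 84], dtype=float)

A = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit of the response surface
y_hat = A @ beta
print("R^2 =", 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2))
```

    The fitted coefficients can then be optimized over the coded factor space to locate the predicted maximum recovery, which is the role RSM plays in the study above.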

  13. Application of response surface methodology to optimize uranium biological leaching at high pulp density

    Energy Technology Data Exchange (ETDEWEB)

    Fatemi, Faezeh; Arabieh, Masoud; Jahani, Samaneh [NSTRI, Tehran (Iran, Islamic Republic of). Nuclear Fuel Cycle Research School

    2016-08-01

    The aim of the present study was to carry out uranium bioleaching via optimization of the leaching process using response surface methodology. For this purpose, a native Acidithiobacillus sp. was adapted to different pulp densities, and the optimization process was then carried out at a high pulp density. Response surface methodology based on a Box-Behnken design was used to optimize the uranium bioleaching. The effects of six key parameters on the bioleaching efficiency were investigated. The process was modeled with a mathematical equation including not only first- and second-order terms but also probable interaction effects between each pair of factors. The results showed that the extraction efficiency of uranium dropped from 100% at pulp densities of 2.5, 5, 7.5 and 10% to 68% at a pulp density of 12.5%. Using RSM, the optimum conditions for uranium bioleaching at 12.5% (w/v) were identified as pH = 1.96, temperature = 30.90 °C, stirring speed = 158 rpm, 15.7% inoculum, FeSO₄·7H₂O concentration of 13.83 g/L and (NH₄)₂SO₄ concentration of 3.22 g/L, which achieved 83% uranium extraction efficiency. The uranium bioleaching experiment using the optimized parameters showed 81% uranium extraction over 15 d. The obtained results reveal that RSM is reliable and appropriate for optimization of the parameters involved in the uranium bioleaching process.

  14. Regional Energy Demand Responses To Climate Change. Methodology And Application To The Commonwealth Of Massachusetts

    International Nuclear Information System (INIS)

    Amato, A.D.; Ruth, M.; Kirshen, P.; Horwitz, J.

    2005-01-01

    Climate is a major determinant of energy demand. Changes in climate may alter energy demand as well as energy demand patterns. This study investigates the implications of climate change for energy demand under the hypothesis that impacts are scale dependent due to region-specific climatic variables, infrastructure, socioeconomic, and energy use profiles. In this analysis we explore regional energy demand responses to climate change by assessing temperature-sensitive energy demand in the Commonwealth of Massachusetts. The study employs a two-step estimation and modeling procedure. The first step evaluates the historic temperature sensitivity of residential and commercial demand for electricity and heating fuels, using a degree-day methodology. We find that when controlling for socioeconomic factors, degree-day variables have significant explanatory power in describing historic changes in residential and commercial energy demands. In the second step, we assess potential future energy demand responses to scenarios of climate change. Model results are based on alternative climate scenarios that were specifically derived for the region on the basis of local climatological data, coupled with regional information from available global climate models. We find notable changes with respect to overall energy consumption by, and the energy mix of, the residential and commercial sectors in the region. On the basis of our findings, we identify several methodological issues relevant to the development of climate change impact assessments of energy demand.
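    A minimal sketch of the degree-day regression step described above is given below; the monthly degree-day values, the customer-growth control and the demand series are all synthetic, and the climate scenario shift (fewer heating and more cooling degree-days) is purely illustrative rather than taken from the study's climate models.

```python
import numpy as np

# Illustrative monthly panel: heating degree-days, cooling degree-days and a
# socioeconomic control (customer base index); all values are synthetic.
hdd   = np.array([950, 800, 620, 350, 120,  20,   0,   0,  90, 340, 610, 880], float)
cdd   = np.array([  0,   0,   5,  30, 120, 280, 390, 360, 180,  40,   0,   0], float)
cust  = np.linspace(1.00, 1.02, 12)            # slow growth in the customer base
sales = 500 + 0.9 * hdd + 1.4 * cdd + 200 * cust \
        + np.random.default_rng(1).normal(0, 25, 12)

# OLS fit of demand on degree-days while controlling for the socioeconomic variable
X = np.column_stack([np.ones(12), hdd, cdd, cust])
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(dict(zip(["const", "hdd", "cdd", "customers"], beta.round(2))))

# A warming scenario is then expressed as shifted degree-days (here -10% HDD, +20% CDD)
# and pushed through the fitted model to estimate the demand response.
scenario = np.column_stack([np.ones(12), 0.9 * hdd, 1.2 * cdd, cust])
print("demand change (%):", round(100 * (scenario @ beta).sum() / (X @ beta).sum() - 100, 1))
```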

  15. IoT-Based Information System for Healthcare Application: Design Methodology Approach

    Directory of Open Access Journals (Sweden)

    Damian Dziak

    2017-06-01

    Full Text Available Over the last few decades, life expectancy has increased significantly. However, elderly people who live on their own often need assistance due to mobility difficulties, symptoms of dementia or other health problems. In such cases, an autonomous supporting system may be helpful. This paper proposes an Internet of Things (IoT)-based information system for indoor and outdoor use. Since the conducted survey of related works indicated a lack of methodological approaches to the design process, a Design Methodology (DM), which approaches the design target from the perspective of the stakeholders, contracting authorities and potential users, is introduced. The implemented solution applies a three-axial accelerometer and magnetometer, Pedestrian Dead Reckoning (PDR), thresholding and a decision trees algorithm. Such an architecture enables the localization of a monitored person within four room zones; furthermore, it identifies falls and the activities of lying, standing, sitting and walking. Based on the identified activities, the system classifies current activities as normal, suspicious or dangerous, which is used to notify the healthcare staff about possible problems. The real-life scenarios validated the high robustness of the proposed solution. Moreover, the test results satisfied both stakeholders and future users and ensured further cooperation with the project.

  16. A methodology for optimisation of countermeasures for animal products after a nuclear accident and its application

    International Nuclear Information System (INIS)

    Hwang, Won Tae; Cho, Gyuseong; Han, Moon Hee

    1999-01-01

    A methodology for the optimisation of countermeasures associated with the contamination of animal products was designed based on cost-benefit analysis. Results are discussed for a hypothetical deposition of radionuclides on 15 August, when pastures are fully developed under Korean agricultural conditions. A dynamic food chain model, DYNACON, was used to evaluate the effectiveness of the countermeasures in reducing the ingestion dose. The countermeasures considered were: (1) a ban on food consumption; and (2) the substitution of clean fodder. These are effective in reducing the ingestion dose and are simple and easy to carry out in the first year after deposition. The net benefit of the countermeasures was quantitatively estimated in terms of avertable doses and monetary costs. The benefit depends on a variety of factors, such as the radionuclide concentrations on the ground and the starting time and duration of the countermeasures. It is obvious that a fast reaction after deposition is important in maximising the cost effectiveness of the countermeasures. In most cases, the substitution of clean fodder is more cost effective than a ban on food consumption. The methodology used in this study may serve as a basis for rapid decision-making on the introduction of countermeasures relating to the contamination of animal products after a nuclear accident.

  17. Regional Energy Demand Responses To Climate Change. Methodology And Application To The Commonwealth Of Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    Amato, A.D.; Ruth, M. [Environmental Policy Program, School of Public Policy, University of Maryland, 3139 Van Munching Hall, College Park, MD (United States); Kirshen, P. [Department of Civil and Environmental Engineering, Tufts University, Anderson Hall, Medford, MA (United States); Horwitz, J. [Climatological Database Consultant, Binary Systems Software, Newton, MA (United States)

    2005-07-01

    Climate is a major determinant of energy demand. Changes in climate may alter energy demand as well as energy demand patterns. This study investigates the implications of climate change for energy demand under the hypothesis that impacts are scale dependent due to region-specific climatic variables, infrastructure, socioeconomic, and energy use profiles. In this analysis we explore regional energy demand responses to climate change by assessing temperature-sensitive energy demand in the Commonwealth of Massachusetts. The study employs a two-step estimation and modeling procedure. The first step evaluates the historic temperature sensitivity of residential and commercial demand for electricity and heating fuels, using a degree-day methodology. We find that when controlling for socioeconomic factors, degree-day variables have significant explanatory power in describing historic changes in residential and commercial energy demands. In the second step, we assess potential future energy demand responses to scenarios of climate change. Model results are based on alternative climate scenarios that were specifically derived for the region on the basis of local climatological data, coupled with regional information from available global climate models. We find notable changes with respect to overall energy consumption by, and the energy mix of, the residential and commercial sectors in the region. On the basis of our findings, we identify several methodological issues relevant to the development of climate change impact assessments of energy demand.

  18. Headspace mass spectrometry methodology: application to oil spill identification in soils

    Energy Technology Data Exchange (ETDEWEB)

    Perez Pavon, J.L.; Garcia Pinto, C.; Moreno Cordero, B. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Guerrero Pena, A. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Laboratorio de Suelos, Plantas y Aguas, Campus Tabasco, Colegio de Postgraduados, Cardenas, Tabasco (Mexico)

    2008-05-15

    In the present work we report the results obtained with a methodology based on direct coupling of a headspace generator to a mass spectrometer for the identification of different types of petroleum crudes in polluted soils. With no prior treatment, the samples are subjected to the headspace generation process and the volatiles generated are introduced directly into the mass spectrometer, thereby obtaining a fingerprint of volatiles in the sample analysed. The mass spectrum corresponding to the mass/charge ratios (m/z) contains the information related to the composition of the headspace and is used as the analytical signal for the characterization of the samples. The signals obtained for the different samples were treated by chemometric techniques to obtain the desired information. The main advantage of the proposed methodology is that no prior chromatographic separation and no sample manipulation are required. The method is rapid, simple and, in view of the results, highly promising for the implementation of a new approach for oil spill identification in soils. (orig.)

  19. Reliability Centered Maintenance (RCM) Methodology and Application to the Shutdown Cooling System for APR-1400 Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Faragalla, Mohamed M.; Emmanuel, Efenji; Alhammadi, Ibrahim; Awwal, Arigi M.; Lee, Yong Kwan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    The Shutdown Cooling System (SCS) is a safety-related system that is used in conjunction with the Main Steam and Main or Auxiliary Feedwater Systems to reduce the temperature of the Reactor Coolant System (RCS) in post-shutdown periods from the hot shutdown operating temperature to the refueling temperature. In this paper the RCM methodology is applied to the SCS. The RCM analysis is performed based on a Failure Modes, Effects and Criticality Analysis (FMECA) at the component, system and plant levels. Logic Tree Analysis (LTA) is used to determine the optimum maintenance tasks. The main objectives of RCM are safety, preservation of system function, cost-effective maintenance of the plant components, and increased reliability and availability. The RCM methodology is useful for improving equipment reliability by strengthening the management of equipment condition, and it leads to a significant decrease in the number of periodic maintenance tasks, an extended maintenance cycle, a longer useful life of equipment, and a decrease in overall maintenance cost. It also focuses on the safety of the system by assigning a criticality index to the various components and further selecting maintenance activities based on the risk of failure involved. Therefore, it can be said that RCM introduces a maintenance plan designed for maximum safety in an economical manner, making the system more reliable. For the SCP, increasing the number of condition monitoring tasks will improve the availability of the SCP. It is recommended to reduce the number of periodic maintenance activities.

  20. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks is to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
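    The core Monte Carlo step of the approach described above can be sketched as follows: each alternative network is reduced to the tasks along one path, each task variable is drawn from an expert-assessed distribution, and the alternative with the highest expected utility is preferred. The path structure, the triangular distributions standing in for elicited CDFs, and the square-root utility are illustrative assumptions, not SIMRAND itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def utility(cost):
    """Illustrative concave (risk-averse) cardinal utility over total project cost."""
    return -np.sqrt(cost)

# Each alternative network is reduced here to the tasks on one path; each task cost is
# an expert-assessed distribution (triangular as a stand-in for elicited CDFs).
alternatives = {
    "path_A": [(8, 10, 15), (4, 6, 9)],            # (low, mode, high) per task
    "path_B": [(5, 9, 20), (3, 5, 12), (1, 2, 4)],
}

n_trials = 100_000
for name, tasks in alternatives.items():
    # Monte Carlo: sample every task variable and accumulate the path total
    total = sum(rng.triangular(lo, mode, hi, n_trials) for lo, mode, hi in tasks)
    print(name, "expected utility:", utility(total).mean().round(3))
```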

  1. A Probabilistic Analysis Methodology and Its Application to A Spent Fuel Pool System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyowon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of); Ryu, Ho G. [Daedeok R and D Center, Daejeon (Korea, Republic of)

    2013-05-15

    A similar accident occurred at the 2nd unit of the PAKS nuclear power station in Hungary on 10 April 2003. Insufficient cooling of spent fuel caused the spent fuel to burn up or partly melt. Many previous studies have been performed to analyze and measure the risk of spent fuel damage. In the 1980s, there were changes in conditions, such as the development of high-density storage racks and new information concerning the possibility of cladding fires in drained spent fuel pools. The US NRC assessed the spent fuel pool risk under Generic Issue 82. In the 1990s, under US NRC sponsorship, a risk assessment of the spent fuel pool at the Susquehanna Steam Electric Station (SSES) was performed, and the Analysis and Evaluation of Operational Data (AEOD) office was organized for accumulating reliability data. A methodology for assessing the risk associated with the spent fuel pool facility has been developed and is applied to the reference plant. It is shown that the methodology developed in this study might contribute to assessing these kinds of SFP facilities. In this probabilistic risk analysis, the LINV initiating event has the highest frequency of occurrence. The most dominant cut-sets include human errors. The results of this analysis might contribute to identifying weaknesses in the preventive and mitigating systems of the SFP facility.

  2. A technology-assessment methodology for electric utility planning: With application to nuclear power plant decommissioning

    International Nuclear Information System (INIS)

    Lough, W.T.

    1987-01-01

    Electric utilities and public service commissions have not taken full advantage of the many proven methodologies and techniques available for evaluating complex technological issues. In addition, the evaluations performed are deficient in their use of (1) methods for evaluating public attitudes and (2) formal methods of analysis for decision making. These oversights are substantiated through an examination of the literature relevant to electric utility planning. The assessment process known as technology assessment, or TA, is proposed, and a TA model is developed for routine use in planning by electric utilities and state regulatory commissions. Techniques to facilitate public participation and techniques to aid decision making are integral to the proposed model and are described in detail. Criteria are provided for selecting an appropriate technique on a case-by-case basis. The TA model proved to be an effective methodology for evaluating technological issues associated with electric utility planning, such as decommissioning nuclear power plants. Through the use of the nominal group technique, the attitudes of a group of residential ratepayers were successfully identified and included in the decision-making process.

  3. A New Methodology for 3D Target Detection in Automotive Radar Applications

    Directory of Open Access Journals (Sweden)

    Fabio Baselice

    2016-04-01

    Full Text Available Today there is a growing interest in automotive sensor monitoring systems. One of the main challenges is to make them an effective and valuable aid in dangerous situations, improving transportation safety. The main limitation of visual aid systems is that they do not produce accurate results in critical visibility conditions, such as in the presence of rain, fog or smoke. Radar systems can greatly help in overcoming such limitations. In particular, imaging radar is gaining interest in the framework of Driver Assistance Systems (DAS). In this manuscript, a new methodology able to reconstruct the 3D imaged scene and to detect the presence of multiple targets within each line of sight is proposed. The technique is based on the use of Compressive Sensing (CS) theory and produces an estimate of the multiple targets along each line of sight, their range distances and their reflectivities. Moreover, a fast approach for 2D focusing based on the FFT algorithm is proposed. After the description of the proposed methodology, different simulated case studies are reported in order to evaluate the performance of the proposed approach.
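    A toy version of the sparse reconstruction along one line of sight is sketched below; it uses a generic random sensing matrix and an iterative soft-thresholding (ISTA) solver rather than the authors' actual CS formulation, so the dimensions, noise level, regularization weight and detection threshold are all illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for min ||A x - y||^2 + lam ||x||_1 (sparse range profile)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the quadratic term's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient step
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(3)
n_bins, n_meas = 256, 64                     # range bins per line of sight, compressed samples
x_true = np.zeros(n_bins)
x_true[[40, 41, 180]] = [1.0, 0.6, 0.8]      # two closely spaced targets plus a distant one
A = rng.standard_normal((n_meas, n_bins)) / np.sqrt(n_meas)     # generic CS sensing matrix
y = A @ x_true + 0.01 * rng.standard_normal(n_meas)

x_hat = ista(A, y)
print("detected range bins:", np.flatnonzero(np.abs(x_hat) > 0.1))
```

    The recovered nonzero bins give the target range estimates and their amplitudes approximate the reflectivities, which is the per-line-of-sight output the methodology above aggregates into a 3D scene.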

  4. Adsorptive removal of residual catalyst from palm biodiesel: Application of response surface methodology

    Directory of Open Access Journals (Sweden)

    Mjalli Sabri Farouq

    2012-01-01

    Full Text Available In this work, the residual potassium hydroxide catalyst was removed from palm oil-based methyl esters using an adsorption technique. The produced biodiesel was initially purified through a water washing process. To produce a biodiesel of better quality and to meet standard specifications (EN 14214 and ASTM D6751), batch adsorption on palm shell activated carbon was used for further catalyst removal. A Central Composite Design (CCD) of the Response Surface Methodology (RSM) was used to study the influence of adsorbent amount, time and temperature on the adsorption of potassium species. The maximum catalyst removal was achieved at 40°C using 0.9 g activated carbon for 20 h adsorption time. The results from the Response Surface Methodology are in good agreement with the measured values. The absolute error in prediction at the optimum condition was 3.7%, which is reasonably accurate. This study proves that adsorption post-treatment techniques can be successfully employed to improve the quality of biodiesel fuel for its effective use in diesel engines and to minimize the usage of water.

  5. Environmental integrated impact assessment for waste treatment activity: methodology and case-study application

    International Nuclear Information System (INIS)

    Lonati, G.; Panzeri, A.

    2008-01-01

    A literature method for environmental integrated impact assessment, in accordance with the IPPC Directive, has been critically analysed and adjusted in order to be used for the environmental performance assessment of waste treatment activities. The assessment parameters, sorted into eight treatment and combined pollution categories, have been partly redefined and rebalanced. The adjusted methodology has been applied to a real case study, a chemical-physical waste treatment plant, in order to calculate the current performance (Actual Integrated Index) and the ideal performance (Ideal Integrated Index) achievable through technical and operational improvements. The adjusted methodology has also been used as a decision support system, in order to estimate the expected improvement in environmental performance achievable from the introduction of a single improvement action or a set of actions. The evaluation of the percentage reduction in the Integrated Index achievable with each action allowed the best actions to be identified, both in comparative and in cost-effectiveness terms. The results, 50 as the Actual Integrated Index and 42 as the Ideal Integrated Index on a 10-100 scale, show a medium impact level and point out an appreciable improvement margin in all the environmental performances, especially in air emission control and water consumption.

  6. Design and application of complementary educational resources for self-learning methodology

    Science.gov (United States)

    Andrés Gilarranz Casado, Carlos; Rodriguez-Sinobas, Leonor

    2016-04-01

    The main goal of this work is to enhance students' self-learning in subjects regarding irrigation and its technology. The use of visual media (video recordings) during the lectures (master classes and practicum) helps the students to understand the scope of the course, since they can watch the recorded material at any time and as many times as they wish. The study comprised two parts. In the first, lectures were video-filmed inside the classroom during one semester (16 weeks, four hours per week) in the course "Irrigation Systems and Technology", which is taught at the Technical University of Madrid. In total, 200 videos, approximately 12 min long, were recorded. Since YouTube is a worldwide platform commonly used by students and professors, the videos were uploaded to it. The URLs were then inserted in the Moodle platform, which contains the materials for the course. In the second part, the videos were edited and formatted. Special care was taken to maintain image and audio quality. Finally, thirty videos were produced which focused on the main areas of the course and contained a clear and brief explanation of their basis. Each of these videos lasted between 30 and 45 min. A survey was administered at the end of the semester in order to assess the students' opinion of the methodology. In the questionnaire, the students highlighted the key aspects of the learning process and, in general, they were very satisfied with the methodology.

  7. Application of the SAMINT methodology to the new cross section evaluations of 63Cu and 65Cu∗

    Directory of Open Access Journals (Sweden)

    Sobes Vladimir

    2017-01-01

    Full Text Available The SAMINT methodology allows coupling of differential and integral data evaluations in a continuous-energy framework. Prior to the development of the SAMINT code, integral experimental data such as those in the International Criticality Safety Benchmark Experiments Project remained a tool for validation of completed nuclear data evaluations. Now, SAMINT extracts information from integral benchmarks in the form of sensitivity coefficients calculated by Monte Carlo codes such as CE TSUNAMI-3D or MCNP6 and combines it with the results of experimental cross section measurements to produce an updated cross section evaluation utilizing information from both sets of data. The use of the generalized linear least squares methodology ensures that proper weight is given to both the differential and integral data. SAMINT is not intended to bias nuclear data toward specific integral experiments, but it should be used to supplement the evaluation of differential experimental data. This work demonstrates the application of the SAMINT methodology to the new Oak Ridge National Laboratory (ORNL) evaluations of the resonance parameters for two isotopes of copper: 63Cu and 65Cu.

  8. A methodology for evaluating weighting functions using MCNP and its application to PWR ex-core analyses

    International Nuclear Information System (INIS)

    Pecchia, Marco; Vasiliev, Alexander; Ferroukhi, Hakim; Pautz, Andreas

    2017-01-01

    Highlights: evaluation of the neutron source importance for a given tally; assessment of the ex-core detector response and its uncertainty; direct use of neutron tracks evaluated by a Monte Carlo neutron transport code. Abstract: The ex-core neutron detectors are commonly used to control reactor power in light water reactors. It is therefore relevant to understand the importance of a neutron source to the ex-core detector response. In mathematical terms, this information is conveniently represented by the so-called weighting functions. A new methodology based on the MCNP code for evaluating the weighting functions starting from the neutron history database is presented in this work. The simultaneous evaluation of the weighting functions in a user-given Cartesian coverage mesh is the main advantage of the method. The capability to generate weighting functions simultaneously in both spatial and energy ranges is the innovative part of this work. An interpolation tool then complements the methodology, allowing the generation of weighting functions up to the pin-by-pin fuel segment level, where a direct evaluation is not possible due to low statistical precision. A comparison to reference results provides a verification of the methodology. Finally, an application to investigate the role of ex-core detector spatial location and core burnup for a Swiss nuclear power plant is provided.

  9. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is designing economical floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform, as performed within the European KIC AFOSP project, is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  10. Development of a methodology for the economical analysis of fuel cycles, application to the Laguna Verde central

    International Nuclear Information System (INIS)

    Malfavon, S.M.; Trejo, M.G.; Hernandez, H.; Francois, J.L.; Ortega, R.F.

    2003-01-01

    In this work a methodology developed to carry out the economic analysis of the fuel cycle of a nuclear reactor is presented. The methodology was applied to the Laguna Verde Nuclear Power Station (CNLV). The design of the reload scenarios of the CNLV is made with the Core Master Presto code (CM-Presto), a three-dimensional simulator of the reactor core. The data produced by this code, together with the information from the Energy Use Plan (PUE), allowed reliable results to be obtained by fitting an economic calculation algorithm that brings all the components of the fuel cycle to present worth. With the application of the methodology, the generated energy and the respective cost of each sub-lot type of assemblies were obtained for each operation cycle, from the start-up of the CNLV until September 13, 2002. Using the present worth method, all the values were referred to November 5, 1988, the date of the beginning of operation. At the end of the analysis a levelized cost of 6.188 mills/kWh was obtained for the first 9 cycles of Unit 1 of the CNLV, and it was observed that the costs of the first 3 operation cycles are the highest. Considering only the values from cycle 4 onward, the levelized cost turns out to be 5.96 mills/kWh. The cost per fuel lot was also obtained to evaluate the performance of assemblies with the same physical composition. (Author)

  11. METHODOLOGICAL BASIS FOR CREATING AN ELECTRONIC APPLICATION TO THE TEXTBOOK "PHYSICS 9"

    Directory of Open Access Journals (Sweden)

    L.U. Blagodarenko

    2010-11-01

    Full Text Available The results of an analysis of the educational software existing on the Ukrainian market are presented. The structure, functions and advantages of the electronic application to the textbook "Physics 9" are defined.

  12. METHODOLOGICAL BASIS FOR CREATING AN ELECTRONIC APPLICATION TO THE TEXTBOOK "PHYSICS 9"

    OpenAIRE

    L.U. Blagodarenko

    2010-01-01

    The results of an analysis of the educational software existing on the Ukrainian market are presented. The structure, functions and advantages of the electronic application to the textbook "Physics 9" are defined.

  13. Determination of phase diagrams via computer simulation: methodology and applications to water, electrolytes and proteins

    International Nuclear Information System (INIS)

    Vega, C; Sanz, E; Abascal, J L F; Noya, E G

    2008-01-01

    In this review we focus on the determination of phase diagrams by computer simulation, with particular attention to the fluid-solid and solid-solid equilibria. The methodology to compute the free energy of solid phases will be discussed. In particular, the Einstein crystal and Einstein molecule methodologies are described in a comprehensive way. It is shown that both methodologies yield the same free energies and that free energies of solid phases present noticeable finite size effects. In fact, this is the case for hard spheres in the solid phase. Finite size corrections can be introduced, although in an approximate way, to correct for the dependence of the free energy on the size of the system. The computation of free energies of solid phases can be extended to molecular fluids. The procedure to compute free energies of solid phases of water (ices) will be described in detail. The free energies of ices Ih, II, III, IV, V, VI, VII, VIII, IX, XI and XII will be presented for the SPC/E and TIP4P models of water. Initial coexistence points leading to the determination of the phase diagram of water for these two models will be provided. Other methods to estimate the melting point of a solid, such as the direct fluid-solid coexistence or simulations of the free surface of the solid, will be discussed. It will be shown that the melting points of ice Ih for several water models, obtained from free energy calculations, direct coexistence simulations and free surface simulations agree within their statistical uncertainty. Phase diagram calculations can indeed help to improve potential models of molecular fluids. For instance, for water, the potential model TIP4P/2005 can be regarded as an improved version of TIP4P. Here we will review some recent work on the phase diagram of the simplest ionic model, the restricted primitive model. Although originally devised to describe ionic liquids, the model is becoming quite popular to describe the behavior of charged colloids

  14. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    Science.gov (United States)

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  15. The application of Lean Six Sigma methodology to reduce the risk of healthcare-associated infections in surgery departments.

    Science.gov (United States)

    Montella, Emma; Di Cicco, Maria Vincenza; Ferraro, Anna; Centobelli, Piera; Raiola, Eliana; Triassi, Maria; Improta, Giovanni

    2017-06-01

    Nowadays, the monitoring and prevention of healthcare-associated infections (HAIs) is a priority for the healthcare sector. In this article, we report on the application of the Lean Six Sigma (LSS) methodology to reduce the number of patients affected by sentinel bacterial infections who are at risk of HAI. The LSS methodology was applied in the general surgery department by using a multidisciplinary team of both physicians and academics. Data on more than 20 000 patients who underwent a wide range of surgical procedures between January 2011 and December 2014 were collected to conduct the study using the departmental information system. The most prevalent sentinel bacteria were determined among the infected patients. The preintervention (January 2011 to December 2012) and postintervention (January 2013 to December 2014) phases were compared to analyze the effects of the methodology implemented. The methodology allowed the identification of variables that influenced the risk of HAIs and the implementation of corrective actions to improve the care process, thereby reducing the percentage of infected patients. The improved process resulted in a 20% reduction in the average number of hospitalization days between the preintervention and control phases, with the mean (SD) number of hospitalization days amounting to 36 (15.68) and a data distribution around 3 σ. LSS is a helpful strategy that ensures a significant decrease in the number of HAIs in patients undergoing surgical interventions. The implementation of this intervention in the general surgery departments resulted in a significant reduction in both the number of hospitalization days and the number of patients affected by HAIs. This approach, together with other tools for reducing the risk of infection (surveillance, epidemiological guidelines, and training of healthcare personnel), could be applied to redesign and improve a wide range of healthcare processes. © 2016 John Wiley & Sons, Ltd.

  16. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site

    International Nuclear Information System (INIS)

    Agueero, A.; Pinedo, P.; Simon, I.; Cancio, D.; Moraleda, M.; Trueba, C.; Perez-Sanchez, D.

    2008-01-01

    A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS 'Reference Biospheres Methodology'. The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of ³⁶Cl, ⁷⁹Se, ⁹⁹Tc, ¹²⁹I, ¹³⁵Cs, ²²⁶Ra, ²³¹Pa, ²³⁸U, ²³⁷Np and ²³⁹Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  17. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site.

    Science.gov (United States)

    Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D

    2008-09-15

    A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  18. Application of statistical experimental methodology to optimize bioremediation of n-alkanes in aquatic environment

    International Nuclear Information System (INIS)

    Zahed, Mohammad Ali; Aziz, Hamidi Abdul; Mohajeri, Leila; Mohajeri, Soraya; Kutty, Shamsul Rahman Mohamed; Isa, Mohamed Hasnain

    2010-01-01

    Response surface methodology (RSM) was employed to optimize nitrogen and phosphorus concentrations for the removal of n-alkanes from crude oil contaminated seawater samples in batch reactors. Erlenmeyer flasks were used as bioreactors, each containing 250 mL of dispersed crude oil contaminated seawater, indigenous acclimatized microorganisms, and different amounts of nitrogen and phosphorus based on a central composite design (CCD). Samples were extracted and analyzed according to US-EPA protocols using a gas chromatograph. During 28 days of bioremediation, a maximum of 95% total aliphatic hydrocarbons removal was observed. The obtained model F-value of 267.73 and a probability of F < 0.0001 implied that the model was significant. Numerical condition optimization via a quadratic model predicted 98% n-alkanes removal for a 20-day laboratory bioremediation trial using nitrogen and phosphorus concentrations of 13.62 and 1.39 mg/L, respectively. In actual experiments, 95% removal was observed under these conditions.

  19. [The system theory of aging: methodological principles, basic tenets and applications].

    Science.gov (United States)

    Krut'ko, V N; Dontsov, V I; Zakhar'iashcheva, O V

    2009-01-01

    The paper deals with the system theory of aging, constructed on the basis of present-day scientific methodology: the system approach. The fundamental cause of aging is the discrete existence of individual life forms, i.e. living organisms, which, from the thermodynamic point of view, are not completely open systems. The primary aging process (build-up of chaos and system disintegration of the aging organism) obeys the second law of thermodynamics, or the law of entropy increase in individual, partly open systems. In living organisms the law is exhibited as a synergy of four main aging mechanisms: systemic "pollution" of the organism, loss of non-regenerable elements, accumulation of damage and deformations, and generation of variability at all levels, together with negative changes in regulation processes and consequent degradation of the organism's systemic character. These are the general aging mechanisms; however, the regulatory mechanisms may be equally important for organism aging and for the search for ways to prolong active life.

  20. The application of experimental design methodology for the investigation of liquid radioactive waste treatment

    Directory of Open Access Journals (Sweden)

    Šljivić-Ivanović Marija Z.

    2017-01-01

    Full Text Available The sorption properties of waste facade, brick, and asphalt samples towards Sr(II), Co(II), and Ni(II) ions from single and multicomponent solutions were investigated. The highest sorption capacity was found for Ni(II) ions, while the most effective sorbent was the facade material. A Simplex Centroid Mixture Design was used in order to investigate the sorption processes of ions from solutions with different compositions as well as the competition between the cations. Based on the results of the statistical analysis, equations for data modeling were proposed. According to the observations, the investigated solid matrices can be effectively used for liquid radioactive waste treatment. Furthermore, the applied methodology turned out to be an easy and operational way to investigate multicomponent sorption processes. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 43009 and Grant no. OI 171007]
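    For readers unfamiliar with the design, a simplex-centroid mixture design for q components consists of the 2^q - 1 centroids of all non-empty component subsets (pure components, binary blends, up to the overall centroid), with the proportions summing to 1. The short sketch below generates those design points for a hypothetical three-component solution; it illustrates the design itself, not the study's actual experimental plan.

```python
import numpy as np
from itertools import combinations

def simplex_centroid(n_components=3):
    """Design points of a simplex-centroid mixture design:
    centroids of all non-empty subsets of components (rows sum to 1)."""
    points = []
    for k in range(1, n_components + 1):
        for subset in combinations(range(n_components), k):
            p = np.zeros(n_components)
            p[list(subset)] = 1.0 / k        # equal proportions within the chosen subset
            points.append(p)
    return np.array(points)

# For a three-cation solution (e.g. Sr, Co, Ni) this gives the pure, binary and
# ternary blends at which sorption would be measured: 2**3 - 1 = 7 design points.
print(simplex_centroid(3))
```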

  1. Application of a methodology to determine priorities for nuclear power plant safety issues

    International Nuclear Information System (INIS)

    Daling, P.M.

    1988-01-01

    The Nuclear Regulatory Commission (NRC) Office of Nuclear Regulatory Research (RES) is sponsoring a research program to determine priorities for nuclear power plant safety issues. A methodology has been developed at the Pacific Northwest Laboratory (PNL) to provide technical assistance in the development of risk and cost estimates for implementing resolutions to the safety issues. The information development methods are intended to provide the NRC with a consistent level of information for use in ranking the issues. The NRC uses this information, along with judgmental factors, to rank the issues for further consideration by the NRC staff. The primary purpose of the priority rankings is to assist in the allocation of resources to issues that have a high potential for reducing public risk, as well as to remove from further consideration issues that have little safety significance.

  2. Determining the Environmental Effects of Indirect Subsidies. A Methodological Approach with an Application to the Netherlands

    Energy Technology Data Exchange (ETDEWEB)

    Van Beers, C. [Department of Economics, Delft University of Technology, Delft (Netherlands); Van den Bergh, J.C.J.M. [Faculty of Economics and Business Administration, Vrije Universiteit, Amsterdam (Netherlands); De Moor, A. [National Institute for Public Health and the Environment RIVM, Bilthoven (Netherlands); Oosterhuis, F. [Institute for Environmental Studies IVM, Vrije Universiteit, Amsterdam (Netherlands)

    2004-04-01

    Up to now a clear theoretical and methodological framework for economic-environmental analysis of environmentally damaging subsidies is lacking. Environmentally damaging subsidies are all kinds of direct and indirect subsidies aimed at achieving a certain (often non-environmental) goal that produce negative external effects to the natural environment. This article develops a transparent method to determine the environmental impact of indirect government subsidies and derive policy lessons. This method has been applied to several major subsidies in the Netherlands, namely in agriculture, energy, and transport. The results reveal large environmental effects, which need to be taken seriously by policy makers. The method enables policy makers to evaluate the environmental impacts of indirect government subsidies.

  3. Risk assessment of cryogenic installations – implementation, applicability of methodologies and challenges at CERN

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    For the safe design of a cryogenic installation, it is essential to carry out a comprehensive hazard identification and risk estimate in order to put in place the necessary control measures for an adequate risk mitigation. According to CERN Safety Rules, it is mandatory that the organic unit owning a cryogenic facility conducts and documents a risk assessment. This requirement is also given by the European Directive 2014/68/EU to manufacturers of pressure equipment. During the talk, some of the challenges CERN faces in the development of risk assessments across the broad array of activities involving cryogenic equipment in the organization will be discussed. Challenges such as the choice of the best-suited risk assessment methodology based on the features and complexity of the installation/activities, the efforts to develop tools to facilitate hazard identification, risk analysis and definition of related measures to protect the health and safety of workers, such as streamlined guidelines, forms and check...

  4. Application of response surface methodology (RSM) and genetic algorithm in minimizing warpage on side arm

    Science.gov (United States)

    Raimee, N. A.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.

    2017-09-01

    The plastic injection moulding process produces large numbers of high-quality parts quickly and with great accuracy. It is widely used for the production of plastic parts with various shapes and geometries, and the side arm is one such product. However, adjusting the parameter variables (mould temperature, melt temperature, packing pressure, packing time and cooling time) is difficult, and warpage occurs at the tip of the side arm. Therefore, the work reported herein is about minimizing warpage on the side arm product by optimizing the process parameters using Response Surface Methodology (RSM) together with an artificial intelligence (AI) method, the Genetic Algorithm (GA).
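
    The RSM-plus-GA idea can be sketched as below on a hypothetical quadratic warpage surface in two coded process parameters; the surrogate coefficients, parameter ranges and the small hand-rolled genetic algorithm are all assumptions made for illustration, not the authors' fitted model.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical quadratic response surface for warpage (mm) as a function of two
    # coded process parameters (e.g. melt temperature and packing pressure scaled to [-1, 1]);
    # stands in for the RSM model fitted from the moulding experiments.
    def warpage(x):
        return (0.8 + 0.15 * x[0] - 0.25 * x[1] + 0.10 * x[0] * x[1]
                + 0.30 * x[0] ** 2 + 0.20 * x[1] ** 2)

    # Minimal real-coded genetic algorithm: tournament selection, blend crossover, Gaussian mutation.
    def genetic_minimize(f, bounds, pop_size=40, generations=60):
        lo, hi = bounds
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        for _ in range(generations):
            fitness = np.array([f(ind) for ind in pop])
            new_pop = [pop[np.argmin(fitness)]]                    # elitism: keep the best individual
            while len(new_pop) < pop_size:
                i, j = rng.integers(pop_size, size=2)
                a = pop[i] if fitness[i] < fitness[j] else pop[j]  # tournament parent 1
                i, j = rng.integers(pop_size, size=2)
                b = pop[i] if fitness[i] < fitness[j] else pop[j]  # tournament parent 2
                w = rng.uniform(size=len(lo))
                child = w * a + (1 - w) * b                        # blend crossover
                child += rng.normal(scale=0.05, size=len(lo))      # Gaussian mutation
                new_pop.append(np.clip(child, lo, hi))
            pop = np.array(new_pop)
        fitness = np.array([f(ind) for ind in pop])
        return pop[np.argmin(fitness)], fitness.min()

    best_x, best_warp = genetic_minimize(warpage, (np.array([-1.0, -1.0]), np.array([1.0, 1.0])))
    print("optimal coded parameters:", best_x, "predicted warpage:", best_warp)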

  5. Quantum mechanical reactive scattering theory for simple chemical reactions: Recent developments in methodology and applications

    International Nuclear Information System (INIS)

    Miller, W.H.

    1989-08-01

    It has recently been discovered that the S-matrix version of the Kohn variational principle is free of the "Kohn anomalies" that have plagued other versions and prevented its general use. This has made a major contribution to heavy particle reactive (and also to electron-atom/molecule) scattering, which involves non-local (i.e., exchange) interactions that prevent solution of the coupled channel equations by propagation methods. This paper reviews the methodology briefly and presents a sample of integral and differential cross sections that have been obtained for the H + H₂ → H₂ + H and D + H₂ → HD + H reactions in the high energy region (up to 1.2 eV translational energy) relevant to resonance structures reported in recent experiments. 35 refs., 11 figs

  6. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    Science.gov (United States)

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for determining structural target mode selection and mode selection based on a specific criterion is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). The Root-Sum-Square (RSS) displacement method presented here computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts them to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valves and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
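
    A minimal sketch of the RSS ranking step, assuming a hypothetical random mode-shape matrix in place of the X-33 FEM eigenvectors and invented DOF indices:

    import numpy as np

    # Hypothetical mode-shape matrix: rows = degrees of freedom, columns = modes.
    # In practice this would come from the finite element model's eigenvectors.
    phi = np.random.default_rng(1).normal(size=(1000, 50))   # 1000 DOF, 50 modes

    # Indices of the DOF at locations of interest (e.g. an actuator attachment point).
    selected_dof = [12, 13, 14, 200, 201, 202]

    # Root-sum-square of the modal displacements at the selected DOF, one value per mode.
    rss = np.sqrt(np.sum(phi[selected_dof, :] ** 2, axis=0))

    # Rank the modes by their influence on the selected locations.
    ranked_modes = np.argsort(rss)[::-1]
    print("ten most influential modes (0-based):", ranked_modes[:10])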

  7. Estimating the cost of delaying a nuclear power plant: methodology and application

    International Nuclear Information System (INIS)

    Hill, L.J.; Tepel, R.C.; Van Dyke, J.W.

    1985-01-01

    This paper presents an analysis of an actual 24-month nuclear power plant licensing delay under alternate assumptions about regulatory practice, sources of replacement power, and the cost of the plant. The analysis focuses on both the delay period and periods subsequent to the delay. The methodology utilized to simulate the impacts involved the recursive interaction of a generation-costing program to estimate fuel-replacement costs and a financial regulatory model to concomitantly determine the impact on the utility, its ratepayers, and security issues. The results indicate that a licensing delay has an adverse impact on the utility's internal generation of funds and financial indicators used to evaluate financial soundness. The direction of impact on electricity rates is contingent on the source of fuel used for replacement power. 5 references, 5 tables

  8. APPLICATION OF AN EXPERIMENTAL METHODOLOGY IN THE OPTIMIZATION OF A TUNGSTEN CONCENTRATION PROCESS BY MICROEMULSIONS

    Directory of Open Access Journals (Sweden)

    A.C.S. RAMOS

    1997-06-01

    Full Text Available Abstract - In this work, we applied an experimental planning methodology in order to correlate the mixture compositions with the performance of a tungsten extraction process by microemulsions. The result is a mathematical model, obtained using the Scheffé net method, in which the mixture concentration values are represented inside an equilateral triangle. The tungsten concentration process occurs in two stages: extraction and reextraction. The extraction stage was characterized by monitoring the phase relative volume (Vr), the extraction percentage (%E) and the tungsten concentration in the microemulsion phase (Ctme). The reextraction stage was characterized by monitoring the reextraction percentage (%Re) and the tungsten concentration in the aqueous phase (Ctaq). Finally, we obtained equations that relate the extraction/reextraction properties to the composition of specific points inside the extraction region, obeying the error limits specified for the acceptance of each parameter. The results were evaluated through the construction of isoresponse diagrams and correlation graphs between experimental values and those obtained from the equations.

  9. Methodology for Assessing the Lithium-Sulfur Battery Degradation for Practical Applications

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel-Ioan; Purkayastha, Rajlakshmi

    2017-01-01

    Lithium-Sulfur (Li-S) batteries are an emerging battery technology receiving a growing amount of attention due to their potentially high gravimetric energy density, safety and low production cost. However, there are still some obstacles preventing their swift commercialization. Li-S batteries are driven by different electrochemical processes than commonly used Lithium-ion batteries, which often results in their very different behavior. Therefore, modelling and testing have to be adjusted to reflect this unique behavior to prevent possible biases. A methodology for a reference performance test for Li-S batteries is proposed in this study to point out the Li-S battery features, provide guidance to users on how to deal with them, and possibly lead towards standardization.

  10. Multispecies fisheries management in the Mediterranean Sea: application of the Fcube methodology

    DEFF Research Database (Denmark)

    Maravelias, C.D.; Damalas, D.; Ulrich, Clara

    2012-01-01

    The ecosystem approach (EA) advocates that advice should be given based on a holistic management of the entire marine ecosystem and all fisheries and fleets involved. Recent developments have advanced to multi-species, multi-fisheries advice, rather than a single-species/fleet/area stock basis... and socioeconomic parameters were used for coastal and trawl fisheries in the Aegean Sea. Results pointed out the strengths and weaknesses of alternative management strategies from both a biological and a socioeconomic perspective. Fcube revealed the importance of effort control in the coastal fisheries that are still managed with no effort restrictions. The present findings, although preliminary, revealed that stringent cuts to effort and catch levels are required if EA management goals are to be met. The Fcube methodology, initially developed for mixed fisheries advice in northern European waters...

  11. [Subjectivity of nursing college students' awareness of gender equality: an application of Q-methodology].

    Science.gov (United States)

    Yeun, Eun Ja; Kwon, Hye Jin; Kim, Hyun Jeong

    2012-06-01

    This study was done to identify the awareness of gender equality among nursing college students, and to provide basic data for educational solutions and desirable directions. Q-methodology, which provides a method of analyzing the subjectivity of each item, was used. 34 selected Q-statements from each of 20 women nursing college students were classified into a shape of normal distribution using a 9-point scale. Subjectivity regarding gender equality was analyzed with the pc-QUANL program. Four types of awareness of gender equality in nursing college students were identified. The name for type I was 'pursuit of androgyny'; for type II, 'difference-recognition'; for type III, 'human-relationship emphasis'; and for type IV, 'social-system emphasis'. The results of this study indicate that different approaches to educational programs on gender equality are recommended for nursing college students based on the four types of gender equality awareness.

  12. Multiple human schemas and the communication-information sources use: An application of Q-methodology

    Directory of Open Access Journals (Sweden)

    Mansour Shahvali

    2014-12-01

    Full Text Available This study was conducted with the aim of developing a communication and information model for greenhouse farmers in Yazd city using schema theory. Using the Q methodology together with factor analysis, the different variables were loaded onto five schematic factors: the human philosophical nature and ideological, economic, social, and environmental-conservation beliefs. Running AMOS, it was also revealed that the philosophical, ideological, social, economic and environmental schemas directly influence the use of personal communication-information sources. Furthermore, the environmental-conservation schema affects the use of personal communication-information sources both directly and indirectly. More importantly, this study indicated the important role that indigenous sources play in constructing, evaluating and retrieving environmental knowledge among respondents. The research provides a suitable context for policymakers who seek to draw up more effective and appropriate communication and information strategies to address the needs of specific target groups.

  13. Application of methodology for calibration of instruments utilized in dosimetry of high energy beams, for radiodiagnosis

    International Nuclear Information System (INIS)

    Potiens, Maria P.A.; Caldas, Linda V.E.

    2000-01-01

    The radiation qualities recommended by the IEC 1267 standard for the calibration of instruments used in diagnostic radiology measurements were established using a neo-diagnomax X-ray system (125 kV). The RQR radiation qualities are recommended to test ionization chambers used in non-attenuated beams, and the RQA radiation qualities in attenuated beams (behind a phantom). To apply the methodology, 6 ionization chambers commonly used in diagnostic radiology were tested. The highest energy dependence (17%) was obtained for an ionization chamber recommended for mammography beams, which is not the type of beam produced by the X-ray system used in this work. The other ionization chambers presented good performance in terms of energy dependence (a maximum of 5%), and are therefore within the limits of the international recommendations for this kind of instrument. (author)

  14. Failure detection by adaptive lattice modelling using Kalman filtering methodology : application to NPP

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1991-03-01

    Detection of failure in the operational status of a NPP is described. The method uses a lattice form of signal modelling established by means of Kalman filtering methodology. In this approach each lattice parameter is considered to be a state, and the minimum-variance estimate of the states is performed adaptively by optimal parameter estimation, with fast convergence and favourable statistical properties. In particular, the state covariance is also the covariance of the error committed by that estimate of the state value, and the Mahalanobis distance formed for pattern comparison follows a χ2 distribution for normally distributed signals. Failure detection is performed through a decision-making process based on probabilistic assessments of the statistical information provided. The failure detection system is implemented in the multi-channel signal environment of the Borssele NPP and its favourable features are demonstrated. (author). 29 refs.; 7 figs.
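
    A compact sketch of the decision step described above (squared Mahalanobis distance tested against a chi-square threshold), using hypothetical lattice-parameter snapshots rather than plant signals:

    import numpy as np
    from scipy.stats import chi2

    # Hypothetical reference (healthy) lattice-parameter vectors estimated during normal operation.
    rng = np.random.default_rng(2)
    reference = rng.normal(size=(500, 4))              # 500 snapshots of 4 lattice parameters
    mean = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

    def mahalanobis_sq(x):
        d = x - mean
        return float(d @ cov_inv @ d)

    # For normally distributed parameters the squared distance follows a chi-square
    # distribution with as many degrees of freedom as there are parameters.
    threshold = chi2.ppf(0.999, df=reference.shape[1])

    current = rng.normal(loc=[0.0, 0.0, 0.0, 3.0], size=4)   # a drifted snapshot
    if mahalanobis_sq(current) > threshold:
        print("failure / anomaly flagged")
    else:
        print("operation normal")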

  15. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.
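
    To make the stock-and-flow idea behind SDM concrete, the toy sketch below integrates a two-stock feedback structure with a simple Euler scheme; the stocks, rates and coefficients are hypothetical and are not taken from any validated work disability model.

    # Toy system dynamics sketch: "workers off work" and "workplace accommodation
    # capacity" linked by a balancing feedback loop (hypothetical structure and rates).
    dt, horizon = 0.1, 52.0                        # time step and horizon, in weeks
    steps = int(horizon / dt)
    off_work, capacity = 100.0, 10.0               # initial stock levels

    inflow_rate = 2.0                              # new WD cases per week
    history = []
    for _ in range(steps):
        return_rate = 0.02 * off_work * (capacity / (capacity + 20.0))  # RTW limited by capacity
        capacity_growth = 0.05 * off_work - 0.1 * capacity              # employer responds to caseload
        off_work += dt * (inflow_rate - return_rate)
        capacity += dt * capacity_growth
        history.append(off_work)

    print("off-work cases after one year:", round(history[-1], 1))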

  16. A Comparison between Standard and Functional Clustering Methodologies: Application to Agricultural Fields for Yield Pattern Assessment

    Directory of Open Access Journals (Sweden)

    Simone Pascucci

    2018-04-01

    Full Text Available The recognition of spatial patterns within agricultural fields, presenting similar yield potential areas that are stable through time, is very important for optimizing agricultural practices. This study proposes the evaluation of different clustering methodologies applied to multispectral satellite time series for retrieving temporally stable (constant) patterns in agricultural fields, related to within-field yield spatial distribution. The ability of different clustering procedures for the recognition and mapping of constant patterns in fields of cereal crops was assessed. Crop vigor patterns, considered to be related to soil characteristics, and possibly indicative of yield potential, were derived by applying the different clustering algorithms to time series of Landsat images acquired on 94 agricultural fields near Rome (Italy). Two different approaches were applied and validated using Landsat 7 and 8 archived imagery. The first approach automatically extracts and calculates for each field of interest (FOI) the Normalized Difference Vegetation Index (NDVI), then exploits the standard K-means clustering algorithm to derive constant patterns at the field level. The second approach applies novel clustering procedures directly to spectral reflectance time series, in particular: (1) standard K-means; (2) functional K-means; (3) multivariate functional principal components clustering analysis; (4) hierarchical clustering. The different approaches were validated through cluster accuracy estimates on a reference set of FOIs for which yield maps were available for some years. Results show that multivariate functional principal components clustering, with an a priori determination of the optimal number of classes for each FOI, provides a better accuracy than that of standard clustering algorithms. The proposed novel functional clustering methodologies are effective and efficient for constant pattern retrieval and can be used for a sustainable management of
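
    A rough sketch of the two clustering routes on a hypothetical NDVI stack: plain K-means on the raw trajectories, and K-means on principal-component scores as a crude stand-in for the multivariate functional principal components clustering used in the paper (the data, cluster count and component count are all assumptions).

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Hypothetical stack of NDVI time series for one field of interest:
    # rows = pixels, columns = acquisition dates. Real data would be extracted
    # from the co-registered Landsat image time series.
    rng = np.random.default_rng(3)
    ndvi = np.clip(0.5 + 0.2 * rng.normal(size=(5000, 24)).cumsum(axis=1) / 10, 0, 1)

    # Approach 1: plain K-means directly on the NDVI trajectories.
    labels_kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(ndvi)

    # Approach 2 (a rough stand-in for functional PCA clustering): project each
    # trajectory onto its leading principal components, then cluster the scores.
    scores = PCA(n_components=4).fit_transform(ndvi)
    labels_fpca = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

    print(np.bincount(labels_kmeans), np.bincount(labels_fpca))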

  17. Methodology for the application of probabilistic safety assessment techniques (PSA) to the cobalt-therapy units in Cuba

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Troncoso Fleitas, M.; Lozano Lima, B.; Fuente Puch, A. de la; Perez Reyes, Y.; Dumenigo Gonzalez, C.

    2001-01-01

    The application of PSA techniques in nuclear power plants during the last two decades, and the positive results obtained for decision making in relation to safety as a complement to deterministic methods, have increased their use in the rest of the nuclear applications. At present a large set of documents from international institutions can be found summarizing the investigations carried out in this field and promoting their use in radioactive facilities. Although still not mandatory, the new regulations on radiological safety also promote the complete or partial application of PSA techniques in the safety assessment of radiological practices. Also the IAEA, through various programs in which Cuba has participated, is taking a group of actions so that the nuclear community will encourage the application of probabilistic risk methods for evaluations and decision making with respect to safety. However, the fact that a complete PSA study has not yet been carried out in any radioactive installation means that certain methodological aspects need to be improved and adapted for the application of these techniques. This work presents the main elements for the use of PSA in the evaluation of the safety of cobalt-therapy units in Cuba. Also presented, as part of the results of the first stage of the study, are the guidelines that are being applied in a Research Contract with the Agency by the authors themselves, who belong to the CNSN, together with other specialists from the Cuban Ministry of Public Health. (author) [es

  18. Step-by-step application methodology in practical KEPCO 22.9kV bus-bar system

    International Nuclear Information System (INIS)

    Yoon, Jae-young; Lee, Seung-yeol

    2010-01-01

    With the increase of power demand and the progress of power industry deregulation, the transmission and distribution systems will face more complicated problems due to curtailed investment and the NIMBY phenomenon across power systems. [1-2] It is expected that the route length per MW of demand in South Korea will decrease gradually from 0.6 [C-km/MW] to 0.53 [C-km/MW] in 2010. [3] This is a serious problem from system planning and operation viewpoints. HTS technologies related to power systems have the potential to solve these complex transmission and distribution constraints in the future, especially in metropolitan areas. As HTS technology has developed, HTS cable technology can be the most effective alternative for solving the expected future power network constraints. This paper describes the application methodology of the 22.9kV HTS cable being developed by CAST for a practical distribution system. The 22.9kV HTS cable under development, with a step-by-step application methodology, can substitute for existing and planned conventional 154kV cable. [4-5] If this scheme is applied, part of a downtown 154kV substation in a metropolitan city such as Seoul can be converted into a 22.9kV switching station. Additionally, it can give huge economic and environmental benefits to all of the concerned authorities.

  19. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.

    Science.gov (United States)

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego

    2017-09-22

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. Mainly, the cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  20. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators

    Directory of Open Access Journals (Sweden)

    Borja Bordel Sánchez

    2017-09-01

    Full Text Available Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. Mainly, the cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and various additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  1. Issues and insights of PRA methodology in nuclear and space applications

    International Nuclear Information System (INIS)

    Hsu, F.

    2005-01-01

    This paper presents some important issues and technical insights on the scope, conceptual framework, and essential elements of nuclear power plant Probabilistic Risk Assessments (PRAs) and of PRAs in general applications in the aerospace industry, such as the Space Shuttle PRA being conducted by NASA. Discussions are focused on various lessons learned in nuclear power plant PRA applications and their potential applicability to PRAs for aerospace and launch vehicle systems. Based on insights gained from PRA projects for nuclear power plants and from the current Space Shuttle PRA effort, the paper explores the commonalities and the differences between the conduct of the different PRAs and the key issues and risk insights derived from extensive modeling practices in both the nuclear and space industries. (author)

  2. The importance of the application context for the design of Social LCA methodology

    DEFF Research Database (Denmark)

    Jørgensen, Andreas

    The application context comprises users of the Social LCA, their target audience, ethical considerations of relevance to the application of Social LCA, and legal limitations. The users of the Social LCA are either companies, NGOs or GOs, and the target audience is the recipients of the assessment results from the above mentioned user groups. Both users’ and their target audience’s views are found through analysis of interviews with potential user groups. Ethical considerations concern negative social impacts that may occur in relation to the application of Social LCA. These are included here, as it is assumed that the overall...

  3. Towards a methodology for the engineering of event-driven process applications

    NARCIS (Netherlands)

    Baumgrass, A.; Botezatu, M.; Di Ciccio, C.; Dijkman, R.M.; Grefen, P.W.P.J.; Hewelt, M.; Mendling, J.; Meyer, A.; Pourmirza, S.; Völzer, H.; Reijers, H.; Reichert, M.

    2016-01-01

    Successful applications of the Internet of Things such as smart cities, smart logistics, and predictive maintenance, build on observing and analyzing business-related objects in the real world for business process execution and monitoring. In this context, complex event processing is increasingly

  4. Training in methodology and application of radionuclides in the biomedical area

    International Nuclear Information System (INIS)

    Signoretta, C.

    1998-01-01

    The CNEA, together with the Argentine Biological Association and the Nuclear Medicine Association, submits for the consideration of the Nuclear Regulatory Authority the program of a radiotherapy course intended to train medical professionals so that they can obtain the permit for the application of radiopharmaceuticals in therapy

  5. The Team Climate Inventory: Application in hospital teams and methodological considerations

    NARCIS (Netherlands)

    Ouwens, M.A.; Hulscher, M.E.J.L.; Hermens, R.P.M.G.; Akkermans, R.P.; Grol, R.P.T.M.; Wollersheim, H.C.H.

    2008-01-01

    Understanding the feasibility of applying the Team Climate Inventory (TCI) in non-Western cultures is essential for researchers attempting to understand the influence of culture on workers’ perceived climate. This study describes the application of the TCI in such a setting using data from 203

  6. A novel methodology towards a trusted environment in mashup web applications

    DEFF Research Database (Denmark)

    Patel, Ahmed; Al-Janabi, Samaher; AlShourbaji, Ibrahim

    2015-01-01

    A mashup is a web-based application developed through aggregation of data from different public external or internal sources (including trusted and untrusted). Mashup introduces an open environment that is exposed to many security vulnerabilities, threats and risks. These weaknesses will bring se...

  7. Application of Bayesian network methodology to the probabilistic risk assessment of nuclear waste disposal facility

    International Nuclear Information System (INIS)

    Lee, Chang Ju

    2006-02-01

    The scenario in a risk analysis can be defined as the propagating feature of a specific initiating event which can lead to a wide range of undesirable consequences. If one takes various scenarios into consideration, the risk analysis becomes more complex than it would be without them. A lot of risk analyses have been performed to estimate a risk profile under both uncertain future states of hazard sources and undesirable scenarios. Unfortunately, for stochastic passive systems such as a radioactive waste disposal facility, since the behaviour of future scenarios can hardly be predicted without a special reasoning process, their risk cannot be estimated with a traditional risk analysis methodology alone. Moreover, it is believed that the sources of uncertainty about future states can be reduced pertinently by setting up dependency relationships interrelating geological, hydrological, and ecological aspects of the site with all the scenarios. The current methodology for uncertainty analysis of waste disposal facilities therefore needs to be revisited in light of this. In order to consider the effects predicted from an evolution of environmental conditions of waste disposal facilities, this study proposes a quantitative assessment framework integrating the inference process of a Bayesian network into traditional probabilistic risk analysis. In this study an approximate probabilistic inference program for the specific Bayesian network was developed and verified using a bounded-variance likelihood weighting algorithm. Ultimately, specific models, including a Monte-Carlo model for uncertainty propagation of relevant parameters, were developed with a comparison of variable-specific effects due to the occurrence of diverse altered evolution scenarios (AESs). After providing supporting information to obtain a variety of quantitative expectations about the dependency relationship between domain variables and AESs, this study could connect the results of probabilistic
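
    The likelihood-weighting idea can be sketched on a toy two-node network (a hypothetical altered-evolution-scenario node and a release node); the probabilities and structure are invented for illustration and bear no relation to the actual disposal-facility model or to the bounded-variance variant used in the thesis.

    import random

    # Toy network: P(AES), and P(high release | AES).
    P_scenario = 0.1
    P_release_given = {True: 0.4, False: 0.01}

    def likelihood_weighting(evidence_release=True, n=100_000, seed=0):
        rng = random.Random(seed)
        num = den = 0.0
        for _ in range(n):
            scenario = rng.random() < P_scenario        # sample the non-evidence variable
            # weight each sample by the likelihood of the observed evidence
            weight = P_release_given[scenario] if evidence_release else 1 - P_release_given[scenario]
            den += weight
            if scenario:
                num += weight
        return num / den                                 # estimate of P(AES | release evidence)

    print("P(scenario | high release) ~", round(likelihood_weighting(), 3))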

  8. Application of a safety assessment methodology to a hypothetical surface disposal at Serpong site, Indonesia

    International Nuclear Information System (INIS)

    Lubis, E.; Mallants, D.; Volckaert, G.; Marivoet, J.; Neerdael, B.

    2000-01-01

    A preliminary and generic safety assessment of a candidate shallow land burial (SLB) repository at Serpong site, Indonesia, has been performed. The step-by-step safety assessment methodology included an analysis of features, events, and processes (FEPs), and mathematical modelling of radionuclide migration in the near field, geosphere and biosphere. On the basis of an extensive FEP catalogue the most relevant scenarios to be considered in the consequence analysis were selected. Both the normal evolution scenario (NES) and the alternative scenarios were identified. On the basis of these scenarios a conceptual model that included all the important physical-chemical processes was built for the near field and geosphere. A two-dimensional numerical model was then used to solve the governing flow and transport equations for appropriate initial and boundary conditions. The calculations were performed using a repository-specific value for the total disposed activity in combination with hypothetical values for radionuclide composition based on a typical radionuclide content of low level waste in Belgium. Site-specific data on hydrogeological properties were used for the geosphere calculations. Typical results of the consequence analysis in terms of radionuclide fluxes to the geosphere and radionuclide concentrations in the groundwater are discussed. (author)

  9. Application of response surface methodology for optimization of parameters for microwave heating of rare earth carbonates

    Science.gov (United States)

    Yin, Shaohua; Lin, Guo; Li, Shiwei; Peng, Jinhui; Zhang, Libo

    2016-09-01

    Microwave heating has been applied in the field of drying rare earth carbonates to improve drying efficiency and reduce energy consumption. The effects of power density, material thickness and drying time on the weight reduction (WR) are studied using response surface methodology (RSM). The results show that RSM is suitable for describing the relationship between the independent variables and the weight reduction. Based on the analysis of variance (ANOVA), the model is in accordance with the experimental data. The optimum experimental conditions are a power density of 6 W/g, a material thickness of 15 mm and a drying time of 15 min, resulting in an experimental weight reduction of 73%. Comparative experiments show that microwave drying has the advantages of rapid dehydration and energy conservation. Particle analysis shows that the size distribution of rare earth carbonates after microwave drying is more uniform than that after oven drying. Based on these findings, microwave heating technology offers significant energy savings and improved production efficiency for rare earth smelting enterprises and is a green heating process.

  10. Application of Response Surface Methodology for the Technological Improvement of Solid Lipid Nanoparticles.

    Science.gov (United States)

    Dal Pizzol, Carine; O'Reilly, Andre; Winter, Evelyn; Sonaglio, Diva; de Campos, Angela Machado; Creczynski-Pasa, Tânia Beatriz

    2016-02-01

    Solid lipid nanoparticles (SLN) are colloidal particles consisting of a matrix composed of lipids that are solid at room and body temperatures, dispersed in an aqueous emulsifier solution. During manufacture, their physicochemical properties may be affected by several formulation parameters, such as the type and concentration of lipid, the proportion of emulsifiers and the amount of solvent. Thus, the aim of this work was to study the influence of these variables on the preparation of SLN. A D-optimal Response Surface Methodology design was used to establish a mathematical model for the optimization of SLN. A total of 30 SLN formulations were prepared using the ultrasound method, and then characterized on the basis of their physicochemical properties, including particle size, polydispersity index (PI) and zeta potential (ζ). Particle sizes ranged between 107 and 240 nm. All SLN formulations showed negative ζ values and PI values below 0.28. Prediction of the optimal conditions was performed using the desirability function targeting the reduction of all responses. The optimized SLN formulation showed similar theoretical and experimental values, confirming the robustness and predictive ability of the mathematical model for SLN optimization.

  11. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various categories of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and street lighting) and examine their relation to variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012, with annual temporal resolution and spatial resolution down to the level of the prefecture. We visualize the results of the analysis, perform cluster and outlier analysis using the Anselin local Moran's I statistic, and perform hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and a better understanding of the regional development model in Greece, and provides the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
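
    A bare-bones illustration of the local Moran's I computation on hypothetical prefecture-level demand with an invented contiguity structure; in practice, libraries such as PySAL/esda provide these statistics together with permutation-based significance tests, which this sketch omits.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 50
    demand = rng.gamma(shape=2.0, scale=100.0, size=n)       # hypothetical demand per prefecture (e.g. GWh)

    # Hypothetical binary neighbour structure standing in for real spatial weights.
    W = (rng.random((n, n)) < 0.1).astype(float)
    np.fill_diagonal(W, 0.0)
    W = np.maximum(W, W.T)                                   # make the contiguity symmetric
    row_sums = W.sum(axis=1, keepdims=True)
    W = W / np.where(row_sums > 0, row_sums, 1.0)            # row-standardise

    z = (demand - demand.mean()) / demand.std()
    local_I = z * (W @ z)                                    # Anselin's local Moran statistic

    hot_candidates = np.where((local_I > 0) & (z > 0))[0]    # high values surrounded by high values
    print("candidate hot-spot prefectures:", hot_candidates[:10])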

  12. Continuous electrocoagulation of cheese whey wastewater: an application of Response Surface Methodology.

    Science.gov (United States)

    Tezcan Un, Umran; Kandemir, Ayse; Erginel, Nihal; Ocal, S Eren

    2014-12-15

    In this study, treatment of cheese whey wastewater was performed using a uniquely-designed continuous electrocoagulation reactor, not previously encountered in the literature. An iron horizontal rotating screw type anode was used in continuous mode. An empirical model in terms of effective operational factors, such as current density (40, 50, 60 mA/cm²), pH (3, 5, 7) and retention time (20, 40, 60 min), was developed through Response Surface Methodology. An optimal region characterized by low values of Chemical Oxygen Demand (COD) was determined. As a result of the experiments, a linear effect on the COD removal efficiency was obtained for current density and retention time, while the initial pH of the wastewater was found to have a quadratic effect on the COD removal efficiency. The best-fit nonlinear mathematical model, with a coefficient of determination (R²) of 85%, was defined. An initial COD concentration of 15,500 mg/L was reduced to 2112 mg/L, a removal efficiency of 86.4%. In conclusion, it can be said that electrocoagulation was successfully applied for the treatment of cheese whey wastewater.

  13. Application of risk based inspection methodology of API 581 BRD to oil pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Pezzi Filho, M. [Petrobras, Rio de Janeiro (Brazil); Freire, J.L.F.; Maragone, F. [Pontifica Univ. Catolica, Rio de Janeiro (Brazil)

    2004-07-01

    In response to public concerns regarding the safe operation of pipelines, operators are relying on risk management to balance the demands of pipeline integrity and service competitiveness. This paper presented a procedure for calculating the probability of failure according to the API 581 BRD risk-based inspection (RBI) methodology, using data collected from 3 existing oil pipelines subject to internal corrosion. The risk exposure that the pipelines may be subjected to during their remaining operating life was evaluated. Risk is a function of the probability of failure and the consequences of that failure. Therefore, if the consequences of failure are assumed to remain unchanged for a given piece of equipment under specific operational conditions, then risk reduction will only be achieved by decreasing the probability of failure and through management of the inspection process. It was shown that if the confidence in the damage rate and in the inspection effectiveness is known, then it is possible to design an alternative to an existing inspection plan. This calls for pig inspections at 5-year intervals to ensure that the probability of failure for the pipelines remains under a tolerable level of risk. 9 refs., 4 tabs., 3 figs.

  14. Soundscape design guidelines through noise mapping methodologies: An application to medium urban agglomerations

    Directory of Open Access Journals (Sweden)

    Vogiatzis Konstantinos

    2017-03-01

    Full Text Available In the framework of the European Directive 2002/49/EC, from 2012 to 2016, several cities in Greece have completed strategic noise maps with noise action plans that usually define the main strategies to reduce the noise residents are exposed to and introduce and preserve “quiet zones”. Several medium urban agglomerations in Greece (Volos, Larissa, Chania, Heraklion, Corfu, Agrinio, Thessaloniki) have been chosen to also analyse the sound qualities of the soundscapes of specific urban neighbourhoods in order to generate recommendations for the urban design of the soundscapes of these agglomerations in a manner that complements conventional noise mitigation measures. The general principle of this approach is to relate quantitative data (e.g., from measurements, acoustic simulations, urban forms, topography, and traffic models) with qualitative data (e.g., from types of sources, interviews, and reports on environmental noise perception) by creating quantitative and qualitative maps. The aim of this study is to propose possible action tools to the relevant authorities aiming at diminishing noise levels in affected areas and also to provide solutions towards a sustainable sound environment both in space and time. This paper presents the main current methodology and selected important results proposed for the urban agglomerations of a typical Southeast Mediterranean country such as Greece.

  15. Planning Minimum Interurban Fast Charging Infrastructure for Electric Vehicles: Methodology and Application to Spain

    Directory of Open Access Journals (Sweden)

    Antonio Colmenar-Santos

    2014-02-01

    Full Text Available The goal of the research is to assess the minimum requirement of fast charging infrastructure to allow country-wide interurban electric vehicle (EV) mobility. Charging times comparable to fueling times in conventional internal combustion vehicles are nowadays feasible, given the current availability of fast charging technologies. The main contribution of this paper is the analysis of the planning method and the investment requirements for the necessary infrastructure, including the definition of the Maximum Distance between Fast Charge (MDFC) and the Basic Highway Charging Infrastructure (BHCI) concepts. According to the calculations, the distance between stations will be region-dependent, influenced primarily by weather conditions. The study considers that the initial investment should be sufficient to promote EV adoption, proposing an initial state-financed public infrastructure; once the adoption rate for EVs increases, additional infrastructure will likely be developed through private investment. The Spanish network of state highways is used as a case study to demonstrate the methodology and calculate the investment required. Further, the results are discussed and quantitatively compared to other incentives and policies supporting EV technology adoption in the light-vehicle sector.

  16. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  17. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling or PCE based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimensionality associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte-Carlo method and a solely spectral method.
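
    A small sketch contrasting a one-dimensional polynomial chaos expansion (probabilists' Hermite basis, built by Gauss quadrature) with plain Monte Carlo for a hypothetical scalar model of a single standard-normal input; it illustrates only the spectral idea, not the partitioned multiphysics machinery or the conditional PCE of the report.

    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    def f(x):
        # Hypothetical model output as a function of one standard-normal input.
        return np.exp(0.3 * x) + 0.1 * x**2

    # --- Polynomial chaos expansion in the probabilists' Hermite basis He_n ---
    order = 6
    nodes, weights = hermegauss(20)            # quadrature for weight exp(-x^2/2)
    norm = np.sqrt(2 * np.pi)                  # the weights sum to sqrt(2*pi)
    coeffs = []
    for n in range(order + 1):
        basis = np.zeros(n + 1)
        basis[n] = 1.0
        He_n = hermeval(nodes, basis)
        # c_n = E[f(X) He_n(X)] / E[He_n(X)^2], with E[He_n^2] = n!
        c_n = np.sum(weights * f(nodes) * He_n) / norm / math.factorial(n)
        coeffs.append(c_n)

    pce_mean = coeffs[0]
    pce_var = sum(math.factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))

    # --- Plain Monte Carlo for comparison ---
    samples = f(np.random.default_rng(0).standard_normal(100_000))
    print("PCE mean/var:", pce_mean, pce_var)
    print("MC  mean/var:", samples.mean(), samples.var())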

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models such as the accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing values increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
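
    The censoring idea can be sketched as a left-censored log-normal maximum-likelihood fit in Python with scipy; the data, detection limit and single-group setting are hypothetical, and the paper's actual analyses use AFT regression models with R code supplied by the authors.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Hypothetical peak intensities; values below the detection limit are left-censored.
    rng = np.random.default_rng(1)
    true = rng.lognormal(mean=2.0, sigma=0.8, size=200)
    limit = 4.0
    observed = np.maximum(true, limit)          # censored values recorded at the limit
    censored = true < limit

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        z = (np.log(observed) - mu) / sigma
        # density contribution for fully observed points (log-normal pdf)
        ll_obs = norm.logpdf(z[~censored]) - np.log(sigma) - np.log(observed[~censored])
        # probability contribution for left-censored points: P(X <= limit)
        ll_cens = norm.logcdf(z[censored])
        return -(ll_obs.sum() + ll_cens.sum())

    fit = minimize(neg_loglik, x0=[1.0, 0.0], method="Nelder-Mead")
    mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
    print("estimated log-scale mean and sd:", mu_hat, sigma_hat)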

  19. Artificial neural network methodology: Application to predict magnetic properties of nanocrystalline alloys

    International Nuclear Information System (INIS)

    Hamzaoui, R.; Cherigui, M.; Guessasma, S.; ElKedim, O.; Fenineche, N.

    2009-01-01

    This paper is dedicated to the optimization of the magnetic properties of iron-based magnetic materials with regard to milling and coating process conditions, using artificial neural network methodology. Fe-20 wt.% Ni and Fe-6.5 wt.% Si alloys were obtained using two high-energy ball milling technologies, namely a P4 vario planetary ball mill from Fritsch and a planetary ball mill from Retsch. Further processing of the Fe-Si powder allowed spraying of the feedstock material using the high-velocity oxy-fuel (HVOF) process to obtain a relatively dense coating. Input parameters were the disc (Ω) and vial (ω) rotation speeds for the milling technique, and the spray distance and oxygen flow rate in the case of the coating process. Two main magnetic parameters are optimized, namely the saturation magnetization and the coercivity. The predicted results clearly depict coupled effects of the input parameters on the magnetic parameters. In particular, the increase of saturation magnetization is correlated with the increase of the product Ωω (shock power) and the product of the spray parameters. The largest coercivity values are correlated with the increase of the ratio Ω/ω (shock mode process) and the increase of the product of the spray parameters.
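
    A minimal sketch of the network idea using scikit-learn's MLPRegressor to map the two milling inputs to the two magnetic outputs; the training data, units and network size are hypothetical and do not reproduce the paper's measurements or its exact network architecture.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Hypothetical training data: disc speed and vial speed (rpm) -> saturation
    # magnetization (emu/g) and coercivity (Oe). Values are illustrative only.
    rng = np.random.default_rng(5)
    omega_disc = rng.uniform(150, 400, size=80)
    omega_vial = rng.uniform(300, 800, size=80)
    X = np.column_stack([omega_disc, omega_vial])
    Ms = 180 + 0.05 * omega_disc * omega_vial / 1000 + rng.normal(scale=2, size=80)
    Hc = 20 + 3.0 * omega_disc / omega_vial + rng.normal(scale=1, size=80)
    y = np.column_stack([Ms, Hc])

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0))
    model.fit(X, y)

    print(model.predict([[300.0, 600.0]]))   # predicted [Ms, Hc] for one milling condition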

  20. Application of Response Surface Methodology in Development of Sirolimus Liposomes Prepared by Thin Film Hydration Technique

    Directory of Open Access Journals (Sweden)

    Saeed Ghanbarzadeh

    2013-04-01

    Full Text Available Introduction: The present investigation was aimed at optimizing the formulation process of sirolimus liposomes by the thin film hydration method. Methods: In this study, a 3² factorial design was used to investigate the influence of two independent variables in the preparation of sirolimus liposomes. The dipalmitoylphosphatidylcholine (DPPC)/Cholesterol (Chol) and dioleoyl phosphoethanolamine (DOPE)/DPPC molar ratios were selected as the independent variables. Particle size (PS) and Encapsulation Efficiency (EE %) were selected as the dependent variables. To separate the un-encapsulated drug, a dialysis method was used. Drug analysis was performed with a validated RP-HPLC method. Results: Using response surface methodology and based on the coefficient values obtained for the independent variables in the regression equations, it was clear that the DPPC/Chol molar ratio was the major contributing variable for particle size and EE %. The use of a statistical approach allowed us to see individual and/or interaction effects of the influencing parameters in order to obtain liposomes with desired properties and to determine the optimum experimental conditions leading to enhanced characteristics. In the prediction of PS and EE % values, the average percent errors were found to be 3.59% and 4.09%. These values are sufficiently low to confirm the high predictive power of the model. Conclusion: Experimental results show that the observed responses were in close agreement with the predicted values, demonstrating the reliability of the optimization procedure in the prediction of PS and EE % in sirolimus liposome preparation.