LBLOCA sensitivity analysis using meta models
International Nuclear Information System (INIS)
Villamizar, M.; Sanchez-Saez, F.; Villanueva, J.F.; Carlos, S.; Sanchez, A.I.; Martorell, S.
2014-01-01
This paper presents an approach for performing sensitivity analysis of thermal-hydraulic code simulation results within a BEPU framework. The sensitivity analysis is based on the computation of Sobol' indices using a meta-model. It also presents an application to a Large Break Loss of Coolant Accident (LBLOCA) in the cold leg of a pressurized water reactor (PWR), addressing the results of the BEMUSE programme and using the thermal-hydraulic code TRACE. (authors)
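The meta-model-based Sobol' computation described in the abstract can be sketched generically. This is not the authors' implementation: the `metamodel` below is a cheap analytic stand-in (an assumption) for a surrogate trained on TRACE runs, and the first-order indices are estimated with the standard pick-freeze (Saltelli) estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained meta-model: the paper fits a surrogate to TRACE
# runs; here a cheap analytic function with known indices (assumption),
# f(x1, x2) = x1 + 2*x2 on uniform [0, 1] inputs -> S1 = 0.2, S2 = 0.8.
def metamodel(x):
    return x[:, 0] + 2.0 * x[:, 1]

n, d = 200_000, 2
A = rng.random((n, d))            # two independent input-sample matrices
B = rng.random((n, d))
fA, fB = metamodel(A), metamodel(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]           # replace only column i ("pick-freeze")
    # Saltelli's estimator of the first-order Sobol' index S_i
    S.append(np.mean(fB * (metamodel(ABi) - fA)) / var)
```

For f(x1, x2) = x1 + 2·x2 on uniform inputs the exact indices are S1 = 0.2 and S2 = 0.8, so the estimate can be checked against a known answer.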
Analysis of a PWR LBLOCA without SCRAM
International Nuclear Information System (INIS)
Tyler, T.N.; Macian-Juan, R.; Mahaffy, J.H.
1996-01-01
The authors analyze a conservative recriticality scenario to explore the potential risk of fuel damage during a large-break loss-of-coolant accident in a typical U.S. pressurized-water reactor. No SCRAM is assumed, and no credit is taken for injected boron in the core neutronics calculations. Although the scenario is conservative, the analysis is best estimate, using TRAC-PF1/MOD2 to model the thermal-hydraulics, coupled with a three-dimensional, transient neutronic model of the core. The simulation can follow complex system interactions during reflood, which influence the neutronic feedback in the core. In all cases examined, the return of cold water to the core is limited by increased steam production from a marginal (local) return to power. A quasi-steady state is established during low-pressure safety injection cooling in which sufficient core flow exists to maintain rod temperatures well below the fuel damage limit, but insufficient total inventory is present to result in a full return to power.
Evaluations of the CCFL and critical flow models in TRACE for PWR LBLOCA analysis
Energy Technology Data Exchange (ETDEWEB)
Yang, Jung-Hua; Lin, Hao Tzu [National Tsing Hua Univ., HsinChu, Taiwan (China). Dept. of Engineering and System Science; Wang, Jong-Rong [Atomic Energy Council, Taoyuan County, Taiwan (China). Inst. of Nuclear Energy Research; Shih, Chunkuan [National Tsing Hua Univ., HsinChu, Taiwan (China). Inst. of Nuclear Engineering and Science
2012-12-15
This study develops a Maanshan Pressurized Water Reactor (PWR) analysis model using the TRACE (TRAC/RELAP Advanced Computational Engine) code. A Large Break Loss of Coolant Accident (LBLOCA) sequence is analyzed and the results are compared with the Maanshan Final Safety Analysis Report (FSAR) data. The critical flow and Counter-Current Flow Limitation (CCFL) models play an important role in the overall performance of the TRACE LBLOCA prediction; therefore, a sensitivity study on the discharge coefficients of the critical flow model and on CCFL modeling in different regions is also presented. The current conclusions show that modeling CCFL in the downcomer has a more significant impact on the peak cladding temperature than modeling CCFL in the hot legs does, and that no CCFL phenomena occurred in the pressurizer surge line. The best value for the critical flow model multipliers is 0.5, with which TRACE consistently predicts the break flow rate reported in the FSAR. (orig.)
Mechanical analysis of a β = 0.09, 162.5 MHz taper HWR cavity
Fan, Peiliang; Zhu, Feng; Zhong, Hutianxiang; Quan, Shengwen; Liu, Kexin
2015-01-01
A superconducting taper-type half-wave resonator (HWR) with a frequency of 162.5 MHz and β of 0.09 has been developed at Peking University to accelerate high-current protons (~100 mA) and D⁺ (~50 mA). The radio frequency (RF) design of the cavity has been completed. Herein, we present the mechanical analysis of the cavity, which is also an important aspect of superconducting cavity design: the frequency shift caused by helium bath pressure and Lorentz force, and...
International Nuclear Information System (INIS)
Kang, Dong Gu
2017-01-01
Highlights: • The nodalization of APR-1400 was modified to reflect the characteristics of the upper region temperature. • The effect of the nodalization and temperature of the reactor upper region on the LBLOCA consequences was evaluated. • The modification of the nodalization is an essential prerequisite in APR-1400 LBLOCA analysis. - Abstract: In best estimate (BE) calculation, the definition of the system nodalization is an important step influencing the prediction accuracy for specific thermal-hydraulic phenomena. The upper region of the reactor is defined as the region of the upper guide structure (UGS) and the upper dome. In most large break loss of coolant accident (LBLOCA) analyses it has been assumed that the temperature of the upper region is close to the average temperature. However, a review of detailed design data recently showed that the upper region temperature of the APR-1400 reactor may be a little lower than or similar to the hot leg temperature. In this study, the nodalization of APR-1400 was modified to reflect this characteristic, and the effect of the nodalization and temperature of the reactor upper region on the LBLOCA consequences was evaluated by sensitivity analysis, including a best estimate plus uncertainty (BEPU) calculation. In the base-case calculation with the modified nodalization, the peak cladding temperature (PCT) in the blowdown phase became higher and the blowdown quenching (or cooling) was significantly deteriorated compared with the original case; as a result, the cladding temperature in the reflood phase became higher and the final quenching was also delayed. In addition, thermal-hydraulic parameters were compared and analyzed to investigate the effect of the change in the upper region on the cladding temperature. In the BEPU analysis, the 95th percentile PCT used in current regulatory practice was increased by the modification of the upper region nodalization, and it occurred in the reflood phase, unlike in the original case.
Application of status uncertainty analysis methods for AP1000 LBLOCA calculation
International Nuclear Information System (INIS)
Zhang Shunxiang; Liang Guoxing
2012-01-01
Parameter uncertainty analysis establishes, by suitable methods, the response relations between input parameter uncertainties and output uncertainties. Applying parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy while providing reasonable safety assurance. The AP1000 LBLOCA is analyzed in this paper, and the results indicate that the random sampling statistical method, the sensitivity analysis numerical method and the traditional error propagation method all provide a sizable peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. The random sampling statistical method, which applies mathematical statistics theory, yields the largest safety margin owing to its reduced conservatism: compared with the traditional conservative bounding parameter analysis method, it provides a PCT margin of 100 K, while the other two methods provide only 50-60 K. (authors)
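Why statistically combining uncertainties leaves a larger PCT margin than stacking conservative bounds can be seen in a toy Monte Carlo sketch. Every number here is hypothetical (the response surface and sensitivities are illustrations, not the AP1000 analysis):

```python
import numpy as np

rng = np.random.default_rng(1)

LIMIT_K = 1477.0   # peak cladding temperature acceptance limit, K

# Hypothetical linearized PCT response (illustration only): nominal PCT
# plus linear sensitivities to three standardized input uncertainties.
def pct(power_peaking, gap_conductance, discharge_coeff):
    return (1290.0 + 40.0 * power_peaking + 30.0 * gap_conductance
            + 20.0 * discharge_coeff)

n = 50_000
samples = pct(rng.normal(size=n), rng.normal(size=n), rng.normal(size=n))

# Random-sampling statistical method: margin to the 95th-percentile PCT.
margin_statistical = LIMIT_K - np.percentile(samples, 95)

# Bounding method: every input penalized at its own one-sided 95% value,
# so the deviations add linearly instead of in quadrature.
margin_bounding = LIMIT_K - pct(1.645, 1.645, 1.645)
```

Because the statistical method combines the three deviations in quadrature (≈ 1.645·√(40² + 30² + 20²) ≈ 89 K) rather than summing them (≈ 148 K), it recovers roughly 60 K of margin in this toy setup.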
Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor
International Nuclear Information System (INIS)
Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.
2008-01-01
The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which damage of the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied', and 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties enter the calculations through the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and the numerical solution algorithm. Conservative approaches, still widely used, were introduced in the early stages of safety analysis to cover uncertainties due to the limited capability for modelling and understanding physical phenomena. The results obtained by this approach are quite unrealistic, and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions; if this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best estimate codes with some conservative assumptions, realistic input data and uncertainty analysis. BE analysis with evaluation of uncertainties also offers a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper
Analysis on Containment Response Following a LBLOCA of APR1400
Energy Technology Data Exchange (ETDEWEB)
Han, Kyu Hyun [KINS, Daejeon (Korea, Republic of)]
2016-05-15
The predictions are in good agreement with the final safety analysis report, which implies that containment integrity is maintained during and after an accident such as a loss of coolant accident. In this study, CONTEMPT-LT/028 was used to calculate the pressure and temperature; in a follow-up study, CONTAIN 2.0 will be used for the pressure and temperature predictions in APR1400 reactors. Shin-Hanul Units 1 and 2 may exhibit different peak pressure and temperature characteristics in containment following a large break loss-of-coolant accident. Assessing the containment performance independently and comparing with the prediction results presented in the final safety analysis report (FSAR) of Shin-Hanul Units 1 and 2 may assist the regulatory review in establishing the validity of the FSAR. The end of blowdown (EOB) time during a LOCA can strongly affect the peak pressure and temperature in the containment. This paper provides a CONTEMPT-LT/028 prediction of the peak pressure and temperature of Shin-Hanul Units 1 and 2 following a large break loss-of-coolant accident and compares it with the licensee's prediction results.
Simulation of advanced accumulator and its application in CPR1000 LBLOCA analysis
International Nuclear Information System (INIS)
Hu, Hongwei; Shan, Jianqiang; Gou, Junli; Cao, Jianhua; Shen, Yonggang; Fu, Xiangang
2014-01-01
Highlights: • An analysis model was developed for the advanced accumulator. • A sensitivity analysis of each key parameter was performed. • The LBLOCA was analyzed for the CPR1000 with the advanced accumulator. • The analysis shows that the advanced accumulator can improve CPR1000 safety performance. - Abstract: The advanced accumulator is designed to improve the safety and reliability of the CPR1000 by providing a small injection flow that keeps the reactor core flooded. The core thus remains cooled without the intervention of low pressure safety injection, and the startup grace time of the low pressure safety injection pump can be greatly extended. A new model for the advanced accumulator, based on the basic conservation equations, is developed and incorporated into RELAP5/MOD3.3. Simulations of the advanced accumulator show that its behavior satisfies the primary design target: there is a large flow at the initial stage, and when the accumulator water level falls below the standpipe, a vortex forms in the damper, resulting in a large pressure drop and a small flow. A sensitivity analysis then identifies the major factors affecting the flow rate of the advanced accumulator: the damper diameter, the initial water-to-nitrogen volume ratio, and the diameter ratio of the standpipe to the small pipe. Additionally, the primary coolant loop cold leg double-ended guillotine break LBLOCA in a CPR1000 with the advanced accumulator is analyzed. The results show that the maximum cladding temperature limit (1477 K) (NRC, 1992) can be met even with a 200 s delay in the startup of the low pressure safety injection. From this point of view, the passive advanced accumulator provides a longer grace time for LPSI, improving the reliability, safety and economy of the reactor system.
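The flow behavior described above, a large initial flow that drops sharply once the level uncovers the standpipe and a vortex forms in the damper, can be caricatured in a few lines. This is an illustrative sketch, not the RELAP5/MOD3.3 model from the paper; all coefficients, areas and volumes are assumed values.

```python
import math

# Toy advanced-accumulator discharge (assumed values throughout):
# polytropic nitrogen expansion drives water through a flow damper treated
# as an orifice whose discharge coefficient drops once the water level
# uncovers the standpipe and a vortex forms.
RHO = 1000.0       # water density, kg/m^3
CD_LARGE = 0.9     # damper discharge coefficient, standpipe covered
CD_SMALL = 0.3     # effective coefficient after the vortex forms
AREA = 5.0e-3      # damper flow area, m^2
P_BACK = 0.4e6     # downcomer back-pressure, Pa
GAMMA = 1.4        # polytropic exponent for nitrogen

def discharge(p_gas0, v_gas0, v_water0, v_standpipe, dt=0.01, t_end=400.0):
    """Explicit-Euler history of mass flow (kg/s) out of the accumulator."""
    v_gas, v_water, t, flows = v_gas0, v_water0, 0.0, []
    while t < t_end and v_water > 0.0:
        p_gas = p_gas0 * (v_gas0 / v_gas) ** GAMMA   # polytropic expansion
        if p_gas <= P_BACK:
            break                                    # flow stalls
        cd = CD_LARGE if v_water > v_standpipe else CD_SMALL
        w = cd * AREA * math.sqrt(2.0 * RHO * (p_gas - P_BACK))
        v_water -= (w / RHO) * dt                    # water leaves the tank
        v_gas += (w / RHO) * dt                      # nitrogen expands
        flows.append(w)
        t += dt
    return flows

flows = discharge(p_gas0=4.5e6, v_gas0=10.0, v_water0=30.0, v_standpipe=10.0)
```

Running it shows the two regimes from the abstract: a several-hundred-kg/s initial injection, then an abrupt step down by roughly the ratio of the discharge coefficients once the standpipe uncovers.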
Containment pressure analysis methodology during a LBLOCA with iteration between RELAP5 and COCOSYS
Energy Technology Data Exchange (ETDEWEB)
Silva, Dayane Faria; Sabundjian, Gaianê; Souza, Ana Cecília Lima, E-mail: dayanefs@ipen.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]
2017-07-01
The pressure conditions inside the containment during a Large Break Loss of Coolant Accident (LBLOCA) are most severe for a hot leg rupture, owing to the large amount of mass and energy released from the break, which lies just downstream of the pressure vessel. This work presents a methodology for pressure analysis within the containment of a Brazilian PWR, Angra 2, using an iterative process between the code that simulates the guillotine rupture, RELAP5, and the COCOSYS code, which analyzes the containment pressure from the accident conditions. The results show that the iterative process between the codes allows the pressure data to converge to a more realistic solution. (author)
Post-test analysis for the APR1400 LBLOCA DVI performance test using MARS
International Nuclear Information System (INIS)
Bae, Kyoo Hwan; Lee, Y. J.; Kim, H. C.; Bae, Y. Y.; Park, J. K.; Lee, W.
2002-03-01
Post-test analyses using a multi-dimensional best-estimate analysis code, MARS, are performed for the APR1400 LBLOCA DVI (Direct Vessel Injection) performance tests. This report describes the code evaluation results for the test data of the various void height tests and direct bypass tests performed at the MIDAS test facility. MIDAS is a scaled test facility of the APR1400, built with a modified linear scale ratio, with the objective of identifying multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. The major thermal-hydraulic parameters such as the ECC bypass fraction, steam condensation fraction, and temperature distributions in the downcomer are compared and evaluated. The evaluation results of the MARS code for the various test cases show that: (a) MARS has an advanced modeling capability that predicts well the major multi-dimensional thermal-hydraulic phenomena occurring in the downcomer; (b) MARS under-predicts the steam condensation rates, which in turn causes it to over-predict the ECC bypass rates. However, the trends of decreasing steam condensation rate and increasing ECC bypass rate with increasing steam flow rate, and the calculated ECC bypass rates under the EM analysis conditions, generally agree with the test data.
Analysis of the return to power scenario following a LBLOCA in a PWR
Energy Technology Data Exchange (ETDEWEB)
Macian, R.; Tyler, T.N.; Mahaffy, J.H. [Pennsylvania State Univ., University Park, PA (United States)
1995-09-01
The risk of reactivity accidents has been considered an important safety issue since the beginning of the nuclear power industry. In particular, several events leading to such scenarios for PWRs have been recognized and studied to assess the potential risk of fuel damage. The present paper analyzes one such event: the possible return to power during the reflooding phase following a LBLOCA. TRAC-PF1/MOD2, coupled with a three-dimensional neutronic model of the core based on the Nodal Expansion Method (NEM), was used to perform the analysis. The system computer model contains a detailed representation of a complete typical 4-loop PWR. Thus, the simulation can follow complex system interactions during reflooding, which may influence the neutronics feedback in the core. Analyses were made with core models based on cross sections generated by LEOPARD. A standard case and a potentially more limiting case, with increased pressurizer and accumulator inventories, were run. In both simulations, the reactor reaches a stable state after the reflooding is completed. The lower core region, filled with cold water, generates enough power to boil part of the incoming liquid, thus preventing the core average liquid fraction from reaching a value high enough to cause a return to power. At the same time, the mass flow rate through the core is adequate to maintain the rod temperature well below the fuel damage limit.
Analysis of the LBLOCAs in the HANARO pool for the 3-pin fuel test loop
International Nuclear Information System (INIS)
Park, S. K.; Chi, D. Y.; Sim, B. S.; Park, K. N.; Ahn, S. H.; Lee, J. M.; Lee, C. Y.; Kim, Y. J.
2004-12-01
The Fuel Test Loop (FTL) has been developed to meet the increasing demand for the fuel irradiation and burn-up tests required for the development of new fuels in Korea. It is designed to provide test conditions of high pressure and temperature like those of commercial PWR and CANDU power plants. The FTL also has the cooling capability to remove the thermal power of the in-pile test section for normal operation, Anticipated Operational Occurrences (AOOs), and Design Basis Accidents (DBAs). This report deals with Large Break Loss of Coolant Accidents (LBLOCAs) in the HANARO pool for the 3-pin fuel test loop. The MARS code has been used to predict the emergency core cooling capability of the FTL and the peak cladding temperature of the test fuels for the LBLOCAs. The pipe break is assumed at the hill taps connecting the cold and hot legs in the HANARO pool to the inlet and outlet nozzles of the In-Pile test Section (IPS), with a double-ended guillotine break assumed for the large break loss of coolant accidents. Discharge coefficients of 0.1, 0.33, 0.67 and 1.0 are investigated. The test fuels for the PWR and CANDU test modes are not heated up for the LBLOCAs caused by the double-ended guillotine break in the HANARO pool, because sufficient emergency cooling water is supplied continuously to the in-pile test section. The PCTs for the LBLOCAs in the HANARO pool therefore meet the design criterion of commercial PWR fuel that the maximum PCT be lower than 1204 °C.
Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis
Energy Technology Data Exchange (ETDEWEB)
Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2014-05-15
KINS (Korea Institute of Nuclear Safety) has also performed an audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty, as required for BE calculations by Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model or code should be obtained by comparison with relevant integral- and separate-effect tests at different scales. However, such uncertainties are not easy to determine because of the difficulty of evaluating the various experiments accurately; expert judgment has therefore been used in many cases, with the limitation that the uncertainty ranges of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of the models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the reflood model parameters must be determined and their effect evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of uncertainty parameters in the blowdown and reflood models.
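Propagating such model uncertainty in practice means sampling the heat-transfer multipliers over their ranges. A minimal Latin hypercube sampler is a common choice for such studies (not necessarily the KINS-REM procedure; the log-uniform range below is an assumption):

```python
import numpy as np

rng = np.random.default_rng(3)

def latin_hypercube(n_samples, n_params, rng):
    """Stratified sampling on [0, 1]^d: each parameter's range is split into
    n_samples equal-probability strata, one sample per stratum, with the
    strata shuffled independently for each parameter."""
    u = (rng.random((n_samples, n_params))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

# Hypothetical log-uniform range for six heat-transfer multipliers
# (the actual KINS-REM ranges are not given in the abstract).
lo, hi = 0.5, 2.0
u = latin_hypercube(59, 6, rng)        # 59 runs, per first-order Wilks
multipliers = lo * (hi / lo) ** u      # map [0, 1] to log-uniform [0.5, 2]
```

Each of the 59 rows would then define one code run, with every multiplier guaranteed to be sampled once from each of its 59 equal-probability strata.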
Energy Technology Data Exchange (ETDEWEB)
Son, Seong Min; Lee, You Ho; Cho, Jae Wan; Lee, Jeong Ik [KAERI, Daejeon (Korea, Republic of)
2016-05-15
This study proposes an integrated thermal-hydraulic and mechanical, stress-based methodology for analyzing the behavior of accident-tolerant fuel (ATF) claddings, demonstrated by an LBLOCA simulation of a SiC-duplex cladding, and shows that the methodology can reproduce experimental results well. The ATF cladding concept for enhanced LWR safety is being actively researched; however, current fuel cladding design criteria, which are mainly based on limiting zircaloy oxidation, cannot be applied to ATF directly, so a new methodology for ATF design criteria is necessary. In this study, a stress-based analysis methodology for ATF cladding design criteria is suggested by simulating the LBLOCA scenario for SiC cladding, one of the most promising ATF candidates, and the results are briefly compared with findings from other experiments. This result is now being validated; some results show good agreement with a 1-D failure analysis code for SiC fuel cladding already developed and validated by Lee et al., and will be presented at the meeting. Furthermore, this simulation offers the possibility of understanding the cladding behavior more deeply: if the designer can predict the dangerous region and time precisely, it will help in designing the nuclear fuel cladding geometry and setting safety criteria.
Energy Technology Data Exchange (ETDEWEB)
Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta
2015-04-15
Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in an EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The method builds on nonparametric statistics, the Wilks' formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance and thermal hydraulics code FRAPTRAN-GENFLO, varying the parameter values and the thermal hydraulic and power history boundary conditions between simulations. The number of global scenarios is 59 (given by the Wilks' formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. In the worst global scenario, 1.2% of the simulated rods failed, so the Finnish safety regulations (at most 10% of the rods allowed to fail) are met.
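The 59-scenario count follows from Wilks' formula for a one-sided 95%/95% nonparametric tolerance bound; a short sketch reproduces the standard sample sizes:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95, order=1):
    """Smallest N such that the order-th largest of N i.i.d. runs is a
    one-sided upper tolerance bound covering the `coverage` quantile with
    the given confidence (Wilks' nonparametric formula)."""
    n = order
    while True:
        # P(at least `order` of n samples exceed the coverage quantile)
        conf = 1.0 - sum(math.comb(n, k) * (1.0 - coverage) ** k
                         * coverage ** (n - k) for k in range(order))
        if conf >= confidence:
            return n
        n += 1

print(wilks_n())          # 59 runs for a 95%/95% first-order statement
print(wilks_n(order=2))   # 93 runs if the second-largest value is used
```

For first order the loop simply finds the smallest N with 1 − 0.95^N ≥ 0.95, i.e. N = 59, matching the number of global scenarios in the abstract.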
Analysis of the LBLOCAs in the room 1 for the 3-pin fuel test loop
International Nuclear Information System (INIS)
Park, S. K.; Chi, D. Y.; Sim, B. S.; Park, K. N.; Ahn, S. H.; Lee, J. M.; Lee, C. Y.; Kim, Y. J.
2004-12-01
The Fuel Test Loop (FTL) has been developed to meet the increasing demand for the fuel irradiation and burn-up tests required for the development of new fuels in Korea. It is designed to provide test conditions of high pressure and temperature like those of commercial PWR and CANDU power plants. The FTL also has the cooling capability to remove the thermal power of the in-pile test section for normal operation, Anticipated Operational Occurrences (AOOs), and Design Basis Accidents (DBAs). This report deals with Large Break Loss of Coolant Accidents (LBLOCAs) in Room 1 for the 3-pin fuel test loop. The MARS code has been used to predict the emergency core cooling capability of the FTL and the peak cladding temperature of the test fuels for the LBLOCAs. The pipe break is assumed downstream of the main cooling water pump and upstream of the main cooler in Room 1, with a double-ended guillotine break assumed for the large break loss of coolant accidents. Discharge coefficients of 0.1, 0.33, 0.67 and 1.0 are investigated. The maximum Peak Cladding Temperature (PCT) is predicted to be about 734.7 °C for the PWR fuel test mode and 850.4 °C for the CANDU fuel test mode. Both meet the design criterion of commercial PWR fuel that the maximum PCT be lower than 1204 °C.
International Nuclear Information System (INIS)
Wulff, W.; Boyack, B.E.; Duffey, R.B.
1988-01-01
Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters, have been used to determine the uncertainty ranges of the code input and modeling parameters that dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs, in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. They are an important part of the work needed to implement the Code Scaling, Applicability and Uncertainty (CSAU) methodology, which is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs
Energy Technology Data Exchange (ETDEWEB)
Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Ikonen, Timo
2016-08-15
Highlights: • A sensitivity analysis using the data from EPR LB-LOCA simulations is done. • A procedure to analyze such complex data is outlined. • Both visual and quantitative methods are used. • Input factors related to core design are identified as most significant. - Abstract: In this paper, a sensitivity analysis for the data originating from a large break loss-of-coolant accident (LB-LOCA) analysis of an EPR-type nuclear power plant is presented. In the preceding LOCA analysis, the number of failing fuel rods in the accident was established (Arkoma et al., 2015); however, the underlying causes for rod failures were not addressed. It is essential to bring out which input parameters and boundary conditions have significance for the outcome of the analysis, i.e. the ballooning and burst of the rods. Due to the complexity of the existing data, the first part of the analysis consists of defining the relevant input parameters for the sensitivity analysis. Then, selected sensitivity measures are calculated between the chosen input and output parameters. The ultimate goal is to develop a systematic procedure for the sensitivity analysis of statistical LOCA simulation that takes into account the various sources of uncertainty in the calculation chain. In the current analysis, the most relevant parameters with respect to cladding integrity are the decay heat power during the transient, the thermal hydraulic conditions at the rod's location in the reactor, and the steady-state irradiation history of the rod. Meanwhile, the tolerances in fuel manufacturing parameters were found to have a negligible effect on cladding deformation.
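One of the simplest quantitative measures usable for this kind of screening is the Spearman rank correlation between each input and the output. The sketch below uses hypothetical stand-in data; the variable names and magnitudes are assumptions, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(2)

def spearman(x, y):
    """Spearman rank correlation (continuous data, no ties): Pearson
    correlation of the ranks; a common measure of monotonic sensitivity."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return np.corrcoef(rank(x), rank(y))[0, 1]

# Hypothetical stand-ins for the statistical LB-LOCA outputs: cladding
# burst strain driven strongly by decay heat, weakly by a manufacturing
# tolerance (mirroring the qualitative finding in the abstract).
n = 2000
decay_heat = rng.normal(1.0, 0.05, n)
tolerance = rng.normal(0.0, 1.0, n)
strain = 5.0 * decay_heat + 0.05 * tolerance + rng.normal(0.0, 0.02, n)

s_heat = spearman(decay_heat, strain)   # close to 1: dominant input
s_tol = spearman(tolerance, strain)     # small: negligible input
```

Ranking the inputs by such coefficients (often complemented by scatter plots, the "visual methods" of the highlights) separates dominant drivers like decay heat from negligible ones like manufacturing tolerances.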
Impact of spatial kinetics in severe accident analysis for a large HWR
International Nuclear Information System (INIS)
Morris, E.E.
1994-01-01
The impact of spatial kinetics on the analysis of severe accidents initiated by the unprotected withdrawal of one or more control rods is investigated for a large heavy water reactor. Large inter- and intra-assembly power shifts are observed, and the importance of detailed geometrical modeling of fuel assemblies is demonstrated. Neglect of space-time effects is shown to lead to erroneous estimates of safety margins, and of accident consequences in the event that safety margins are exceeded. The results and conclusions are typical of what would be expected for any large, loosely coupled core
International Nuclear Information System (INIS)
Hyun-Sik Park; Ki-Yong Choi; Dong-Jin Euh; Tae-Soon Kwon; Won-Pil Baek
2005-01-01
Full text of publication follows: The simulation capability of the KAERI integral effect test facility ATLAS (Advanced Thermalhydraulic Test Loop for Accident Simulation) has been assessed for a large-break loss-of-coolant accident (LBLOCA) transient. The ATLAS facility is a 1/2 height-scaled, 1/144 area-scaled (1/288 volume-scaled), full-pressure test loop based on the design features of the APR1400, an evolutionary pressurized water reactor developed by Korean industry. The APR1400 has four mechanically separated hydraulic trains for the emergency core cooling system (ECCS) with direct vessel injection (DVI). The APR1400 design features have raised several new safety issues related to the LBLOCA, including steam-water interaction, ECC bypass, and boiling in the reactor vessel downcomer. The ATLAS facility will be used to investigate the multiple responses between systems or between components during various anticipated transients. The ATLAS facility has been designed according to a scaling method mainly based on the model suggested by Ishii and Kataoka, and it is being evaluated against the prototype plant APR1400 with the same control logics and accident scenarios using the best-estimate code MARS. This paper briefly introduces the basic design features of the ATLAS facility and presents the results of a pre-test analysis for a postulated LBLOCA of a cold leg. The LBLOCA analyses have been conducted to assess the validity of the applied scaling law and the similarity between the ATLAS facility and the APR1400. As the core simulator of the ATLAS facility has only 10% of the scaled full power, the blowdown phase cannot be simulated, and the starting point of the accident scenario is around the end of blowdown; finding the correct initial conditions is therefore an important problem. For the analyzed LBLOCA scenario, the ATLAS facility showed thermal-hydraulic characteristics very similar to those of the APR1400
International Nuclear Information System (INIS)
Park, Chang Hwan; Lee, Un Chul; Park, Goon Cherl; Suh, Kune Yull
2004-01-01
The RELAP5 (1995) and MAAP4 (1994) linked-analysis system was designed and an illustrative calculation was performed. A large-break loss-of-coolant accident (LBLOCA) was taken as the reference case for the APR1400 (Advanced Power Reactor, 1400 MWe). For the early phase of this case, the results of the two codes show some deviations in water level depletion and hot assembly temperature. Ordinarily, RELAP5 is considered sufficiently accurate in calculating the thermal-hydraulic behavior of a typical PWR during design basis accidents. If the RELAP5 thermal-hydraulic state data set can be transferred properly to MAAP4 as an initial condition, the overall transients given by the linked analysis become more reliable. In this study, the linked analysis system of RELAP5 and MAAP4 does not mean a mechanically integrated code structure. The objective of this study is to formulate the linked analysis system of RELAP5 and MAAP4, which should precede construction of a mechanically integrated analyzer. Thus, the main scope of this work covers development of the methodology for data linkage and determination of the transfer timing
Fuel cycle in Japanese Fugen - HWR
International Nuclear Information System (INIS)
1979-04-01
This paper describes the use of plutonium-bearing fuel in the Japanese Fugen-HWR. The Fugen-HWR is a pressure tube type, boiling light water cooled, heavy water moderated reactor, which by using plutonium fuel (MOX) achieves the advantage of high neutron economy. The characteristics of the reactor are discussed, particularly its ability to operate with several different types of fuel: Pu-natural U MOX, Pu-depleted U (from spent LWR fuel) MOX, Pu-depleted U (from enrichment tails) MOX, and enriched UO2. The natural U and separative work units saved are given, and the fuel management and control of the reactor are discussed. Non-proliferation and safety considerations are addressed. The Fugen-HWR achieved 100% power rating in the autumn of 1979
Assessment of Effect on LBLOCA PCT for Change in Upper Head Nodalization
International Nuclear Information System (INIS)
Kang, Dong Gu; Huh, Byung Gil; Yoo, Seung Hun; Bang, Youngseok; Seul, Kwangwon; Cho, Daehyung
2014-01-01
In this study, the best estimate plus uncertainty (BEPU) analysis of LBLOCA for the original and modified nodalizations was performed, and the effect of the change in upper head nodalization on the LBLOCA PCT was assessed. It is confirmed that modification of the upper head nodalization influences PCT behavior, especially in the reflood phase. In conclusion, the nodalization should be modified to reflect the design characteristics of the upper head temperature in order to predict PCT behavior accurately in LBLOCA analysis. In the best estimate (BE) method with uncertainty evaluation, the system nodalization is determined by comparative studies of the experimental data. Until now, it was assumed that the temperature of the upper dome in the OPR-1000 was close to that of the cold leg. However, evaluation of the detailed design data showed that the temperature of the upper head/dome might be slightly lower than or similar to that of the hot leg. Since a higher upper head temperature affects blowdown quenching and the peak cladding temperature in the reflood phase, the nodalization of the upper head should be modified
Energy Technology Data Exchange (ETDEWEB)
Seo, Si Won; Kim, Jong Hyun; Choi, Sung Soo [Atomic Creative Technology Co., Daejeon (Korea, Republic of); Kim, Sung Min [Central Research Institute, Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)
2016-05-15
Moderator subcooling margin has been analyzed using the MODTURC_CLAS code in the Large LOCA FSAR Parts C and F. The performance of the moderator heat exchangers depends on the RCW (Raw reCirculated Water) temperature, which in turn is affected by the sea water temperature. Sea water temperature is gradually increasing due to global warming, which will inevitably increase the RCW temperature. Although this is an important problem, there is no assessment of moderator subcooling with increasing RCW temperature. Therefore, a sensitivity analysis is performed in the present study to characterize the relation between RCW temperature and moderator subcooling. The moderator subcooling margin has to be ensured to establish moderator heat removal when a Large LOCA with LOECI and Loss of Class IV Power occurs. It is therefore necessary to perform a sensitivity analysis of RCW temperature on the moderator subcooling margin to estimate the availability of moderator heat removal. In the present paper, the moderator subcooling analysis is performed using the same methodology and assumptions as FSAR Large LOCA Part F, except for the RCW temperature.
International Nuclear Information System (INIS)
Seo, Si Won; Kim, Jong Hyun; Choi, Sung Soo; Kim, Sung Min
2016-01-01
Moderator subcooling margin has been analyzed using the MODTURC_CLAS code in the Large LOCA FSAR Parts C and F. The performance of the moderator heat exchangers depends on the RCW (Raw reCirculated Water) temperature, which in turn is affected by the sea water temperature. Sea water temperature is gradually increasing due to global warming, which will inevitably increase the RCW temperature. Although this is an important problem, there is no assessment of moderator subcooling with increasing RCW temperature. Therefore, a sensitivity analysis is performed in the present study to characterize the relation between RCW temperature and moderator subcooling. The moderator subcooling margin has to be ensured to establish moderator heat removal when a Large LOCA with LOECI and Loss of Class IV Power occurs. It is therefore necessary to perform a sensitivity analysis of RCW temperature on the moderator subcooling margin to estimate the availability of moderator heat removal. In the present paper, the moderator subcooling analysis is performed using the same methodology and assumptions as FSAR Large LOCA Part F, except for the RCW temperature.
International Nuclear Information System (INIS)
Chojnacki, E.; Benoit, J.P.
2007-01-01
Best estimate computer codes are increasingly used in the nuclear industry for accident management procedures and are planned to be used for licensing procedures. Contrary to conservative codes, which are supposed to give penalizing results, best estimate codes attempt to calculate accident transients in a realistic way. It therefore becomes of prime importance, in particular for technical organizations such as IRSN, in charge of safety assessment, to know the uncertainty on the results of such codes. Thus, CSNI sponsored some years ago the Uncertainty Methods Study (UMS) program (published in 1998) on uncertainty methodologies, applied to an SBLOCA transient (LSTF-CL-18), and is now supporting the BEMUSE program, applied to an LBLOCA transient (LOFT-L2-5). The large majority of BEMUSE participants (9 out of 10) use uncertainty methodologies based on probabilistic modelling, and all of them use Monte Carlo simulation to propagate the uncertainties through their computer codes. All of the 'probabilistic participants' also intend to use order statistics to determine the sampling size of the Monte Carlo simulation and to derive the uncertainty ranges associated with their computer calculations. The first aim of this paper is to recall the advantages and also the assumptions of probabilistic modelling, and more specifically of order statistics (such as Wilks' formula), in uncertainty methodologies. Indeed, Monte Carlo methods provide flexible and extremely powerful techniques for solving many of the uncertainty propagation problems encountered in nuclear safety analysis. However, it is important to keep in mind that probabilistic methods are data intensive: they cannot produce robust results unless a considerable body of information has been collected. A main interest of order statistics results is that they allow an unlimited number of uncertain parameters to be taken into account and, from a restricted number of code calculations, to provide statistical
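The order-statistics result the abstract refers to (Wilks' formula) fixes the number of Monte Carlo code runs needed so that the k-th largest output bounds a given quantile at a given one-sided confidence, independently of how many uncertain parameters are sampled. A minimal sketch of the sample-size calculation (the 59- and 93-run values for 95% coverage / 95% confidence are the standard first- and second-order results):

```python
import math

def wilks_sample_size(coverage, confidence, order=1):
    """Smallest number N of code runs such that the `order`-th largest
    output bounds the `coverage` quantile with one-sided `confidence`.
    Wilks: confidence = 1 - sum_{i<order} C(N,i) (1-g)^i g^(N-i), g=coverage."""
    n = order
    while True:
        conf = 1.0 - sum(
            math.comb(n, i) * (1.0 - coverage) ** i * coverage ** (n - i)
            for i in range(order)
        )
        if conf >= confidence:
            return n
        n += 1

print(wilks_sample_size(0.95, 0.95, order=1))  # 59: take the maximum of 59 runs
print(wilks_sample_size(0.95, 0.95, order=2))  # 93: take the 2nd largest of 93 runs
```

Higher orders cost more runs but give a less conservative (sharper) bound, which is why several BEMUSE-style analyses use 93 or 124 runs instead of the minimal 59.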
Audit Calculations of LBLOCA for Ulchin Units 1 and 2 Power Uprate
Energy Technology Data Exchange (ETDEWEB)
Kang, Donggu; Huh, Byunggil; Yoo, Seunghunl; Yang, Chaeyong; Seul, Kwangwon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2013-05-15
The KINS-Realistic Evaluation Model (KINS-REM) was developed for independent audit calculations in 1991, and its code accuracy and statistical method have since been improved. To support the licensing review and to confirm the validity of the licensee's calculations, regulatory audit calculations have also been conducted. Currently, the modification of the Ulchin 1 and 2 operating license for a 4.5% power uprate is under review. In this study, the regulatory audit calculation for LBLOCA of Ulchin Units 1 and 2 with the 4.5% power uprate was performed by applying KINS-REM. It is confirmed that the LBLOCA analysis results for the Ulchin 1 and 2 power uprate meet the PCT acceptance criteria.
Energy Technology Data Exchange (ETDEWEB)
Hwang, Min Jeong; Marigomena, Ralph; Yoo, Tae Ho; Kim, Y. S.; Sim, S. K. [Environment and Energy Technology, Inc., Daejeon (Korea, Republic of); Bang, Young Seok [KINS, Daejeon (Korea, Republic of)
2014-05-15
As a part of its regulatory safety research, the Korea Institute of Nuclear Safety (KINS) also developed a best estimate safety analysis regulatory audit code, MARS-KS, to realistically predict and better understand the physical phenomena of design basis accidents. KINS improved its uncertainty propagation methodology using MARS-KS and applied the improved uncertainty evaluation method to the Shinkori Units 3 and 4 LBLOCA. This study evaluates the uncertainty propagation of a postulated LBLOCA and quantifies the safety margin using KINS-REM and the MARS-KS code for the APR+ (Advanced Power Reactor Plus) Standard Safety Analysis Report (SSAR), which is under regulatory review by KINS for its design approval. The KINS-REM LBLOCA realistic evaluation methodology was used for the regulatory assessment of the APR+ LBLOCA, using MARS-KS to evaluate the propagation of the uncertainty variables and to assess the safety margin during the limiting case of the APR+ double-ended guillotine cold leg LBLOCA. The uncertainty evaluation for the APR+ LBLOCA shows that the reflood PCT with an upper limit of 95% probability at the 95% confidence level is 1363.2 K, higher than the blowdown PCT95/95 of 1275.3 K. The result shows that the current evaluation of the APR+ LBLOCA PCT is within the ECCS acceptance criterion of 1477 K.
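The PCT95/95 figures quoted above come from order statistics over a set of best-estimate runs. The sketch below is a hedged illustration of that workflow with a stand-in algebraic PCT response in place of MARS-KS (the response function, input names, and distributions are all hypothetical): with 59 randomly sampled runs, the sample maximum bounds the 95th percentile of PCT at 95% confidence (first-order Wilks), and is then compared against the 1477 K criterion.

```python
import random

random.seed(42)

def pct_response(decay_heat, gap_conductance):
    """Hypothetical monotone PCT response in kelvin (NOT MARS-KS physics)."""
    return 1100.0 + 250.0 * decay_heat - 80.0 * gap_conductance

# Sample the uncertain inputs and run the (toy) code 59 times.
runs = []
for _ in range(59):
    dh = random.gauss(1.0, 0.05)    # normalized decay heat multiplier
    gc = random.uniform(0.8, 1.2)   # normalized gap conductance multiplier
    runs.append(pct_response(dh, gc))

# First-order Wilks 95/95 bound: the maximum of 59 runs.
pct_95_95 = max(runs)
print(pct_95_95 < 1477.0)  # compare against the acceptance criterion
```

In a real analysis each "run" is a full system-code calculation, so the whole cost of the uncertainty evaluation is the 59 (or 93, for a second-order bound) code executions.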
International Nuclear Information System (INIS)
Bang, Young Seok; Cheong, Aeju; Woo, Sweng Woong
2014-01-01
Confirmation of the performance of the SIT with FD should be based on thermal-hydraulic analysis of the LBLOCA, and an adequate physical model simulating the SIT/FD should be used in the LBLOCA calculation. To develop such a physical model of the SIT/FD, simulation of the major phenomena, including the flow distribution by the standpipe and FD, should be justified by full-scale experiments and/or plant preoperational testing. The authors' previous study indicated that an approximation of the SIT/FD phenomena could be obtained with a typical system transient code, MARS-KS, using the 'accumulator' component model, but that additional improvement of the modeling scheme for the FD and standpipe flow paths was needed for a reasonable prediction. One problem was the depressurizing behavior after switchover to the low flow injection phase. The potential release of nitrogen gas from the SIT to the downstream pipe, and then to the reactor core through the flow paths of the FD and standpipe, has also been a concern, since the intrusion of noncondensable gas may affect the LBLOCA thermal response. Therefore, a more reliable SIT/FD model has been requested to obtain a more accurate prediction and confidence in the evaluation of the LBLOCA. The present paper discusses an improvement of the modeling scheme over the previous study, and the effect of the present modeling scheme on the LBLOCA cladding thermal response is compared against the existing modeling. The present study discussed the modeling scheme of the SIT with FD for a realistic simulation of the LBLOCA of the APR1400. Currently, the SIT blowdown test is best simulated by the modeling scheme using a 'pipe' component with dynamic area reduction. The LBLOCA analysis adopting this modeling scheme showed a PCT increase of 23 K compared to the case of the 'accumulator' component model, due to the flow rate decrease at the transition to the low flow injection phase and the intrusion of nitrogen gas into the core. Accordingly, the effect of SIT/FD modeling
High density UO2 powder preparation for HWR fuel
International Nuclear Information System (INIS)
Hwang, S. T.; Chang, I. S.; Choi, Y. D.; Cho, B. R.; Kwon, S. W.; Kim, B. H.; Moon, B. H.; Kim, S. D.; Phyu, K. M.; Lee, K. A.
1992-01-01
The objective of this project is to study the preparation method of high-density UO2 powder for HWR fuel. Accordingly, it is necessary to characterize the AUC-processed UO2 powder and to search for a method for the preparation of high-density UO2 powder for HWR fuel. It is expected that the results of this study can contribute to producing AUC-processed UO2 powder with good sinterability. (Author)
International Nuclear Information System (INIS)
Cognet, C.; Gandrille, P.
1999-01-01
In-vessel reflooding, ex-vessel reflooding, or both simultaneously can be envisaged as Accident Management Measures to stop a Severe Accident (SA) in-vessel. This paper addresses the possibility of in-vessel core melt retention by RPV external flooding for a high power PWR (4250 MWth). The reactor vessel is assumed to have no lower head penetrations, and thermal insulation is neglected. The effects of external cooling of high-power-density debris, where the margin for such a strategy is low, are investigated with the MAAP4 code. MAAP4 is used to verify the system capability to flood the reactor pit and to predict simultaneously the corium relocation into the lower head, together with the thermal and mechanical response of the RPV in transient conditions. The cooling and holding of the corium pool in the RPV lower head are analysed. Attention is paid to the internal heat exchanges between corium components. This paper focuses particularly on the heat transfer between the oxidic and metallic phases, as well as between the molten metallic phase and the RPV wall, which is of utmost importance for challenging RPV integrity in the vicinity of the metallic phase. The metal segregation has a decisive influence upon the attack of the vessel wall due to a very strong peaking of the lateral flux ('focusing effect'). Thus, the dynamics of the formation of the metallic layer, characterized by a growing inventory of steel both from partial vessel ablation and from the degradation of the steel internals by the radiative heat flux from the debris, is described. The analysed sequence is a surge line rupture near the hot leg (LBLOCA), leading to the fastest accident progression
Surrogate Model for Recirculation Phase LBLOCA and DET Application
International Nuclear Information System (INIS)
Fynan, Douglas A; Ahn, Kwang-Il; Lee, John C.
2014-01-01
In the nuclear safety field, response surfaces were used in the first demonstration of the code scaling, applicability, and uncertainty (CSAU) methodology to quantify the uncertainty of the peak clad temperature (PCT) during a large-break loss-of-coolant accident (LBLOCA). Surrogates could also have applications in other nuclear safety areas, such as dynamic probabilistic safety assessment (PSA). Dynamic PSA attempts to couple the probabilistic nature of failure events, component transitions, and human reliability to deterministic calculations of time-dependent nuclear power plant (NPP) responses, usually through the use of thermal-hydraulic (TH) system codes. The overall mathematical complexity of dynamic PSA architectures, with many embedded, computationally expensive TH code calculations and large input/output data streams, has limited realistic studies of NPPs. This paper presents a time-dependent surrogate model for the recirculation phase of a hot leg LBLOCA in the OPR-1000. The surrogate model is developed with the ACE algorithm, a powerful nonparametric regression technique, trained on RELAP5 simulations of the LBLOCA. Benchmarking of the surrogate and an application to a simplified dynamic event tree (DET) are presented. A time-dependent surrogate model to predict core subcooling during the recirculation phase of a hot leg LBLOCA in the OPR-1000 has been developed. The surrogate assumes the structure of a general discrete-time dynamic model and learns the nonlinear functional form through nonparametric regression on RELAP5 simulations with the ACE algorithm. The surrogate model input parameters represent mass and energy flux terms to the RCS that appear as user-supplied or code-calculated boundary conditions in the RELAP5 model. The surrogate accurately predicted the TH behavior of the core for a variety of HPSI system performance and containment conditions when compared with RELAP5 simulations. The surrogate was applied in a DET application replacing
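The surrogate structure described above, a discrete-time dynamic model trained by regression on code runs, can be illustrated without the ACE algorithm itself. The sketch below fits a linear discrete-time surrogate to trajectories from a toy one-state "plant" by least squares; the plant dynamics, coefficients, and variable names are hypothetical stand-ins for RELAP5 and its boundary conditions, and the linear fit replaces ACE's nonparametric transforms.

```python
import random

def step(T, q, a=0.2, T_inf=300.0, dt=1.0):
    """Toy 'plant' standing in for a RELAP5 run: temperature T relaxing
    toward T_inf with an external heat input q (hypothetical model)."""
    return T + dt * (-a * (T - T_inf) + q)

# Generate training trajectories: (current T, input q) -> next T.
random.seed(1)
X, y = [], []
T = 550.0
for _ in range(200):
    q = random.uniform(0.0, 20.0)
    T_next = step(T, q)
    X.append((1.0, T, q))   # bias term, state, boundary-condition input
    y.append(T_next)
    T = T_next

# Fit the discrete-time surrogate T_{k+1} ~ c0 + c1*T_k + c2*q_k by
# solving the 3x3 normal equations with Gaussian elimination.
n = 3
A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
b = [sum(r[i] * yk for r, yk in zip(X, y)) for i in range(n)]
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(A[r][col]))  # partial pivot
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(col + 1, n):
        f = A[r][col] / A[col][col]
        for c in range(col, n):
            A[r][c] -= f * A[col][c]
        b[r] -= f * b[col]
coef = [0.0] * n
for r in range(n - 1, -1, -1):
    coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, n))) / A[r][r]

# Here the surrogate reproduces the plant step almost exactly, because
# the toy dynamics are themselves linear in (T, q).
pred = coef[0] + coef[1] * 500.0 + coef[2] * 10.0
print(abs(pred - step(500.0, 10.0)))
```

Once trained, the surrogate advances the state in microseconds instead of a full code run, which is what makes embedding it inside a dynamic event tree tractable.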
Numerical simulation of AP1000 LBLOCA with SCDAP/RELAP 4.0 code
International Nuclear Information System (INIS)
Xie Heng
2017-01-01
The risk of a large-break loss-of-coolant accident (LBLOCA) is that the core will be uncovered once the accident occurs, which may cause core damage. New phenomena may occur in a LBLOCA due to the passive safety injection adopted by the AP1000. This paper uses SCDAP/RELAP5 4.0 to build a numerical model of the AP1000, and a double-ended guillotine break of a cold leg is simulated. The reactor coolant system and the passive core cooling system were modeled with RELAP5 modules, and the HEAT STRUCTURE component of RELAP5 was used to simulate the fuel rods. The reflood option in RELAP5 was either activated or not, to study the effect of axial heat conduction. Results show that axial heat conduction plays an important role in the reflood phase and can effectively shorten the reflood process. An alternative core model was built with the SCDAP module. It is found that the SCDAP model predicts a higher maximum peak cladding temperature and a longer reflood process than the RELAP5 model. Analysis shows that cladding oxidation heat plays a key role in the reflood. From the simulation results, it can be concluded that the cladding will remain intact and fission products will not be released from the fuel to the coolant in a LBLOCA. (author)
International Nuclear Information System (INIS)
Ruggles, A.E.; Cheng, L.Y.; Dimenna, R.A.; Griffith, P.; Wilson, G.E.
1994-06-01
A team of experts in reactor analysis conducted a phenomena identification and ranking (PIR) exercise for a large break loss-of-coolant accident (LBLOCA) in the Advanced Neutron Source Reactor (ANSR). The LBLOCA transient is broken into two separate parts for the PIR exercise. The first part considers the initial depressurization of the system that follows the opening of the break. The second part of the transient includes long-term decay heat removal after the reactor is shut down and the system is depressurized. A PIR is developed for each part of the LBLOCA. The ranking results are reviewed to establish whether models in the RELAP5-MOD3 thermal-hydraulic code are adequate for use in ANSR LBLOCA simulations. Deficiencies in the RELAP5-MOD3 code are identified, and existing data or models are recommended to improve the code for this application. Experiments were also suggested to establish models for situations judged to be beyond current knowledge. The applicability of the ANSR PIR results is reviewed for the entire set of transients important to the ANSR safety analysis
CARA design criteria for HWR fuel burnup extension
International Nuclear Information System (INIS)
Florido, P.C.; Cirimello, R.O.; Bergallo, J.E.; Marino, A.C.; Delmastro, D.F.; Brasnarof, D.O.; Gonzalez, J.H.; Juanico, L.A.
2002-01-01
A new concept for HWR fuel bundles, namely CARA, is presented. The CARA design improves all the major performance measures in PHWR fuel technology. Among others, it reaches higher burnup and thermohydraulic safety margins, together with lower fuel pellet temperatures and a lower Zry/HM mass ratio. Moreover, it keeps the fuel mass content per unit length and the channel pressure drop unchanged by using a single diameter of fuel rod. (author)
The importance of proper feedback modeling in HWR
Energy Technology Data Exchange (ETDEWEB)
Saphier, D; Gorelik, Z; Shapira, M [Israel Atomic Energy Commission, Yavne (Israel). Soreq Nuclear Research Center
1996-12-01
The DSNP simulation language was applied to study the effect of different modeling approximations of feedback phenomena in nuclear power plants. The different methods of modeling the feedback effects are presented and discussed. It is shown that HWRs are most sensitive to correct modeling, since they usually have at least three feedback effects acting at different time scales; to achieve correct kinetics, a one-dimensional representation is needed, with correct modeling of the in-core time delays. The simulation methodology of lumped parameters and one-dimensional models using the DSNP simulation language is presented. (authors)
The importance of proper feedback modeling in HWR
International Nuclear Information System (INIS)
Saphier, D.; Gorelik, Z.; Shapira, M.
1996-01-01
The DSNP simulation language was applied to study the effect of different modeling approximations of feedback phenomena in nuclear power plants. The different methods of modeling the feedback effects are presented and discussed. It is shown that HWRs are most sensitive to correct modeling, since they usually have at least three feedback effects acting at different time scales; to achieve correct kinetics, a one-dimensional representation is needed, with correct modeling of the in-core time delays. The simulation methodology of lumped parameters and one-dimensional models using the DSNP simulation language is presented. (authors)
IAEA ICSP on HWR moderator subcooling requirements to demonstrate backup heat sink
International Nuclear Information System (INIS)
Choi, J.; Nitheanandan, T.
2013-01-01
The IAEA launched a new International Collaborative Standard Problem (ICSP) on 'HWR Moderator Subcooling Requirements to Demonstrate Backup Heat Sink Capabilities of Moderator during Accidents'. The purpose of the ICSP is to benchmark analysis computer codes in simulating contact boiling experimental data to assess the subcooling requirements for an overheated pressure tube, plastically deforming into contact with the calandria tube during a postulated large break loss of coolant accident. The experimental data obtained for the ICSP blind simulation can be used to assess safety analysis computer codes simulating thermal radiation heat transfer to the pressure tube, pressure tube deformation or failure, pressure tube to calandria tube heat transfer, calandria tube to moderator heat transfer, and calandria tube deformation or failure. (author)
The dupic fuel cycle synergism between LWR and HWR
International Nuclear Information System (INIS)
Lee, J.S.; Yang, M.S.; Park, H.S.; Lee, H.H.; Kim, K.P.; Sullivan, J.D.; Boczar, P.G.; Gadsby, R.D.
1999-01-01
The DUPIC fuel cycle can be developed as an alternative to the conventional spent fuel management options of direct disposal or plutonium recycle. Spent LWR fuel can be burned again in a HWR by direct refabrication into CANDU-compatible DUPIC fuel bundles. Such a linkage between LWR and HWR can result in a multitude of synergistic effects, ranging from savings of natural uranium to reductions in the amount of spent fuel to be buried in the earth, for a given amount of nuclear electricity generated. A special feature of the DUPIC fuel cycle is its compliance with the 'Spent Fuel Standard' criteria for diversion resistance, throughout the entire fuel cycle. The DUPIC cycle thus has a very high degree of proliferation resistance. The cost penalty due to this technical factor needs to be considered in balance with the overall benefits of the DUPIC fuel cycle. The DUPIC alternative may be able to make a significant contribution to reducing spent nuclear fuel burial in the geosphere, in a manner similar to the contribution of the nuclear energy alternative in reducing atmospheric pollution from fossil fuel combustion. (author)
Effect of spray on performance of the hydrogen mitigation system during LB-LOCA for CPR1000 NPP
International Nuclear Information System (INIS)
Huang, X.G.; Yang, Y.H.; Cheng, X.; Al-Hawshabi, N.H.A.; Casey, S.P.
2011-01-01
Highlights: → This paper presents the spray effect on the HMS during a LB-LOCA, using GASFLOW. → The positive and negative effects of spray are summarized. → The combination of the DIS and PAR systems is suggested as a reasonable countermeasure. → This research is an important step in the study of spray and hydrogen mitigation. → The contents of this paper should become a required part of the safety analysis of Chinese NPPs. - Abstract: During the course of a hypothetical large break loss-of-coolant accident (LB-LOCA) in a nuclear power plant (NPP), hydrogen is generated by the reaction between steam and the fuel cladding inside the reactor pressure vessel (RPV). It is then ejected from the break into the containment along with a large amount of steam. Management of hydrogen safety and prevention of over-pressurization can be implemented through a hydrogen mitigation system (HMS) and a spray system in the CPR1000 NPP. The computational fluid dynamics (CFD) code GASFLOW is utilized in this study to analyze the spray effect on the performance of the HMS during a LB-LOCA. Results show that, as one kind of HMS, a deliberate igniter system (DIS) can initiate hydrogen combustion immediately after the flammability limit of the gas mixture has been reached; however, it increases the temperature and pressure drastically. Operating the DIS under spray conditions can result in hydrogen combustion being suppressed by suspended droplets inside the containment, and the droplets can also mitigate the local temperature rise. Operation of a PAR system, another kind of HMS, consumes hydrogen steadily with a lower recombination rate that is not affected noticeably by the spray system. Numerical results indicate that the dual concept, namely the integrated application of the DIS and PAR systems, is a constructive improvement for hydrogen safety under spray conditions during a LB-LOCA.
Phenomena identification and ranking tables (PIRT) for LBLOCA
International Nuclear Information System (INIS)
Shaw, R.A.; Dimenna, R.A.; Larson, T.K.; Wilson, G.E.
1988-01-01
The US Nuclear Regulatory Commission is sponsoring a program to provide validated reactor safety computer codes with quantified uncertainties. The intent is to quantify the accuracy of the codes for use in best estimate licensing applications. One of the tasks required to complete this program involves the identification and ranking of thermal-hydraulic phenomena that occur during particular accidents. This paper provides detailed tables of phenomena and importance ranks for a PWR LBLOCA. The phenomena were identified and ranked according to perceived impact on peak cladding temperature. Two approaches were used to complete this task. First, a panel of experts identified the physical processes considered to be most important during LBLOCA. A second team of experienced analysts then, in parallel, assembled complete tables of all plausible LBLOCA phenomena, regardless of perceived importance. Each phenomenon was then ranked in importance against every other phenomenon associated with a given component. The results were placed in matrix format and solved for the principal eigenvector. The results as determined by each method are presented in this report
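The ranking step described above, where pairwise importance comparisons are placed in a matrix and solved for the principal eigenvector, can be sketched as a power iteration. The comparison values below are hypothetical judgments for four unnamed phenomena, not the report's actual PIRT entries:

```python
# Hypothetical pairwise-comparison matrix: entry [i][j] > 1 means
# phenomenon i was judged more important than phenomenon j, with
# reciprocal entries on the other side of the diagonal.
M = [
    [1.0,     3.0,     5.0, 7.0],
    [1.0 / 3, 1.0,     3.0, 5.0],
    [1.0 / 5, 1.0 / 3, 1.0, 3.0],
    [1.0 / 7, 1.0 / 5, 1.0 / 3, 1.0],
]

def principal_eigenvector(A, iters=100):
    """Power iteration: the dominant eigenvector of a positive matrix
    gives the relative importance weights used for ranking."""
    v = [1.0] * len(A)
    for _ in range(iters):
        w = [sum(a * x for a, x in zip(row, v)) for row in A]
        s = sum(w)
        v = [x / s for x in w]  # normalize so the weights sum to 1
    return v

weights = principal_eigenvector(M)
print([round(w, 3) for w in weights])  # weights decrease with rank
```

Reading the result, phenomenon 0 carries the largest weight and phenomenon 3 the smallest, reproducing the ordering implied by the pairwise judgments.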
CARA, new concept of advanced fuel element for HWR
International Nuclear Information System (INIS)
Florido, P.C.; Crimello, R.O.; Bergallo, J.E.; Marino, A.C.; Delmastro, D.F.; Brasnarof, D.O.; Gonzalez, J.H.
1999-01-01
All Argentinean NPPs (two in operation, one under construction) use heavy water as coolant and moderator. With very different reactor concepts (pressure vessel and CANDU type designs), the fuel elements are completely different in concept as well. Argentina produces both types of fuel element at a fuel manufacturing company called CONUAR. The very different fuel element designs produce very complex economics for this company, due to the low production scale. The competitiveness of the Argentinean electric system (Argentina has a market-driven electric system) adds further pressure to increase the economic competitiveness of the nuclear fuel cycle. At present, Argentina has a very active Slightly Enriched Uranium (SEU) program for the pressure vessel HWR type, but without strong changes to the fuel concept itself. The Atomic Energy Commission of Argentina (CNEA) has therefore developed a new fuel element concept, named CARA, which attempts to achieve very ambitious goals and to substantially improve the competitiveness of the nuclear option. The ambitious targets for the CARA fuel element are compatibility (a single fuel element for all Argentinean HWRs) using a single-diameter fuel rod, improved safety margins, increased burnup, and fabrication costs not exceeding those of CANDU fuel. In this paper, the CARA concept is presented, in order to explain how all these goals are achieved together. The design attracted the interest of the nuclear power operating utility (NASA) and the fuel manufacturing company (CONUAR). A new project is now being planned with the cooperation of the three parties (CNEA, NASA, CONUAR) to complete the whole development program in the shortest time, finishing with the commercial production of the CARA fuel bundle. At the end of this paper, the future CARA development program is described. (author)
Experience in construction and operation of HWR plants in Argentina
International Nuclear Information System (INIS)
Madero, C.C.; Cosentino, J.O.
1982-01-01
''ATUCHA I'', the first nuclear power plant in Argentina, has been in commercial operation since 1974 with a high capacity factor. The reactor is based on the MZFR prototype designed by SIEMENS, using natural uranium, heavy water and PWR technology. The plant was built by SIEMENS on a turnkey contract and was rated 340 MWe. The offer presented on that occasion by KWU was based on two reactors (ATUCHA I type) inside one single containment, due to the limitation in power of the reactor. Subsequent changes in the nature of the contract resulted in an active participation of CNEA engineering groups in the erection and commissioning of the reactor. In 1978 the national government approved a nuclear power plan to install four 600 MWe HWR plants by 1995. To start implementing this program, CNEA called for tenders for the supply of components and services for the ATUCHA II plant, in connection with the establishment of a local engineering company and the supply and construction of a heavy water production plant. In 1980 a contract was signed with KWU and the local company ENACE was formed to act as architect engineer and site coordinator. The plant will be located on the ATUCHA I site and the reactor will be similar to, but double the power of, that one. Following the schedule of the nuclear plan, CNEA has just started preliminary studies for the next nuclear plant. ENACE will be responsible for the preparation of an offer for an ATUCHA reactor type. Local engineering and manufacturing firms, upon request and coordination from CNEA, are evaluating the local capacity to participate in the design and construction of a CANDU type nuclear plant. The final decision on this fourth nuclear plant in Argentina will be taken in mid-1983. (J.P.N.)
International Nuclear Information System (INIS)
Rodriguez, S.B.; Unal, C.; Pasamehmetoglu, K.O.; Motley, F.E.
1992-01-01
In a postulated large-break loss-of-coolant accident (LBLOCA), the core of the reactor is uncovered quickly as the liquid that drains out of the tank is replaced by air. During the LBLOCA, the reactor is scrammed, the moderator tank is drained, and the fuel and control rod tubes are cooled internally by forced convection via the emergency cooling system (ECS) water. However, the safety rods, reflector assemblies, tank wall, and instrument rods continue to heat up as a result of gamma deposition. These components are cooled primarily by natural/mixed convection and radiation heat transfer. In this paper, the thermal-hydraulic analysis of a reactor moderator tank exposed to air during an LBLOCA is discussed. The analysis was performed using a special version of the Transient Reactor Analysis Code (TRAC). TRAC input and code modifications considered the appropriate modeling of ECS cooling, thermal radiation heat transfer, and natural convection. The major objective of the model was to calculate the limiting component temperature (which establishes the maximum operating power) resulting from gamma heating. In addition, the nature of the moderator tank air-circulation pattern and its effects on the limiting temperature under various conditions were analyzed. None of the components were found to exceed their structural limits when the pre-scram power level was 50% of historical power.
A Procedure to Address the Fuel Rod Failures during LB-LOCA Transient in Atucha-2 NPP
Directory of Open Access Journals (Sweden)
Martina Adorni
2011-01-01
Depending on the specific event scenario and on the purpose of the analysis, calculation methods that are not implemented in standard system thermal-hydraulic codes might be required. This may imply the use of a dedicated fuel rod thermomechanical computer code. This paper provides an outline of the methodology for the analysis of the 2A LB-LOCA accident in Atucha-2 NPP and describes the procedure adopted for the use of the fuel rod thermomechanical code. The methodology implies the application of best estimate thermal-hydraulics, neutron physics, and fuel pin performance computer codes, with the objective of verifying compliance with the specific acceptance criteria. The fuel pin performance code is applied with the main objective of evaluating the extent of cladding failures during the transient. The procedure consists of a deterministic calculation by the fuel performance code of each individual fuel rod during its lifetime and in the subsequent LB-LOCA transient calculations. The boundary and initial conditions are provided by core physics and three-dimensional neutron kinetic coupled thermal-hydraulic system code calculations. The procedure is completed by the sensitivity calculations and the application of the probabilistic method, which are outside the scope of the current paper.
International Nuclear Information System (INIS)
Tri-Yulianto
1996-01-01
Based on the TOR BATAN for PELITA VI, one of the BATAN programs in the fuel element production technology section is the acquisition of fuel element fabrication technology for research reactors as well as power reactors. The acquisition can be achieved using different strategies, e.g. by utilizing the facilities owned for research and development of the desired technology or by transferring the technology directly from the source. With regard to the above, PEBN, through its facility in BEBE, has started the acquisition of fuel element fabrication technology for power reactors by developing the existing equipment initially designed to fabricate the HWR Cinere fuel element. The development, by way of modifying the equipment, is intended for the production of HWR (Candu) and LWR (PWR and BWR) fuel elements. To achieve the above objective, at the early stage of the activity an assessment was made of the fabrication equipment for pelletizing, component production and assembly. The assessment was made by comparing the shape and the size of the existing fuel element with those used in operating reactors such as Candu reactors, PWRs and BWRs. The equipment having the potential to be modified for the production of HWR fuel elements is as follows. For the pelletizing equipment, the punch and dies of the pressing machine for making green pellets can be modified so that different sizes of punch and dies can be used, depending upon the size of the HWR and LWR pellets. The equipment for component production has good potential for modification to produce the HWR Candu fuel element, which has a similar shape and size to those of the existing fuel element, while the possibility of producing LWR fuel element components is small because only a limited number of the required components can be made with the existing equipment. The assembly equipment is in a similar situation to that of the component production; that is, to assemble the HWR fuel element, modification of a few assembly units is very probable
Preliminary Study on Effect of the Crud Deposits during LBLOCA Condition
Energy Technology Data Exchange (ETDEWEB)
Huh, Byung Gil; Lee, Joo Suk; Bang, Young Seok; Oh, Deog Yeon; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2008-10-15
Severe crud deposits on fuel rods could result in an increase of the fuel temperature during the operation of nuclear power plants. The crud inhibits heat transfer, causing the cladding temperature to increase due to its low thermal conductivity. In the event of an LBLOCA at an NPP operated with heavy crud layers, the peak cladding temperature (PCT) will be higher than it would be if the cladding were clean. Therefore, the NRC has been reviewing a petition for rulemaking to set a limit on crud and/or oxide thickness, which would require that LOCA analyses be calculated by factoring in the role of crud. Since the crud deposits cause the stored energy in the fuel to increase, they could affect the requirement on the initial stored energy in the fuel in Appendix K to Part 50 (ECCS Evaluation Models), which requires that the steady-state temperature distribution and stored energy in the fuel be calculated at the onset of a postulated LOCA, as well as the safety review guidelines in NUREG-0800.
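The temperature penalty described here can be illustrated with a one-dimensional conduction estimate across the crud layer. The heat flux, layer thickness and conductivity below are illustrative, order-of-magnitude assumptions, not values from the paper:

```python
# Extra cladding temperature rise across a crud layer, from 1-D conduction.
# All three input numbers are assumed values for illustration only.
q_flux = 1.0e6       # rod surface heat flux, W/m^2
thickness = 50.0e-6  # crud layer thickness, m (50 um)
k_crud = 0.5         # crud thermal conductivity, W/(m K)

delta_t = q_flux * thickness / k_crud  # delta-T = q'' * t / k
print(f"temperature rise across the crud layer: {delta_t:.0f} K")  # prints 100 K
```

Even a 50 um layer with these assumed properties adds on the order of 100 K, which is why a heavy crud layer raises both the steady-state stored energy and the LBLOCA PCT.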
Direct ECC bypass phenomena in the MIDAS test facility during LBLOCA reflood phase
International Nuclear Information System (INIS)
Yun, B. J.; Kweon, T. S.; Ah, D. J.; Ju, I. C.; Song, C. H.; Park, J. K.
2001-01-01
This paper describes the experimental results on ECC direct bypass phenomena in the downcomer during the late reflood phase of an LBLOCA in a reactor that adopts Direct Vessel Injection (DVI) as its ECC system. The experiments have been performed in the MIDAS test facility using superheated steam and water. The test conditions were determined, based on a preliminary TRAC code analysis, from a modified linear scaling method with a 1/4.93 length scale. To measure the direct bypass fraction according to the nozzle location, separate effect tests have been performed for DVI-4 (farthest from the broken cold leg) injection, DVI-2 (closest to the broken cold leg) injection, and combined DVI-2 and 4 injection, respectively. The tests were also carried out with widely varying steam flow rates to investigate the effect of the steam flow rate on the direct bypass fraction of ECC water. Test results show that the direct bypass fraction of ECC water depends significantly on the injected steam mass flow rate. The DVI-4 tests show that the direct bypass fraction increases drastically as the steam flow rate increases. However, in the DVI-2 tests most of the injected ECC water penetrates into the lower downcomer. The direct bypass characteristic of each of the DVI-2 and DVI-4 tests is reflected in the direct bypass characteristic curve of the combined DVI-2 and 4 test. The steam condensation reaches the theoretically allowable maximum value
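For reference, the ratios implied by a classic linear scaling law at the quoted 1/4.93 length scale can be sketched as follows. This is a generic illustration of unmodified linear scaling; the "modified" variant actually used for MIDAS may differ in detail:

```python
from math import sqrt

def linear_scaling_ratios(length_ratio):
    """Model-to-prototype ratios under classic linear (length) scaling."""
    lr = length_ratio
    return {
        "length": lr,
        "flow_area": lr ** 2,            # areas scale with length squared
        "velocity": sqrt(lr),            # gravity-driven velocity ~ sqrt(L)
        "time": sqrt(lr),                # transit time = length / velocity
        "mass_flow": lr ** 2 * sqrt(lr)  # rho * A * v  ->  L^2.5
    }

ratios = linear_scaling_ratios(1 / 4.93)
for name, value in ratios.items():
    print(f"{name:>9s}: {value:.4f}")
```

At a 1/4.93 length scale this gives a velocity and time ratio of about 0.45 and a mass flow ratio of about 0.019, which shows why the facility steam flow rates are far below plant values.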
Guidelines to achieve seals with minimal leak rates for HWR-NPR coolant system components
International Nuclear Information System (INIS)
Finn, P.A.
1991-03-01
Seal design practices that are acceptable in pressurized-water and boiling-water reactors in the United States are not usable for the Heavy Water Reactor-New Production Reactor (HWR-NPR) because of the stringent requirement on tritium control for the atmosphere within its containment building. To maintain an atmosphere in which workers do not need protective equipment, the components of the coolant system must have a cumulative leak rate less than 0.00026 L/s. Existing technology for seal systems was reviewed with regard to flange, elastomer, valve, and pump design. A technology data base for the designers of the HWR-NPR coolant system was derived from operating experience and seal development work on reactors in the United States, Canada, and Europe. This data base was then used to generate guidelines for the design of seals and/or joints for the HWR-NPR coolant system. Also discussed are needed additional research and development, as well as the necessary component qualification tests for an effective quality control program. 141 refs., 21 figs., 14 tabs
TRAC analysis of design basis events for the accelerator production of tritium target/blanket
International Nuclear Information System (INIS)
Lin, J.C.; Elson, J.
1997-01-01
A two-loop primary cooling system with a residual heat removal system was designed to remove the heat generated in the tungsten neutron source rods inside the rungs of the ladders and the shells of the rungs. The Transient Reactor Analysis Code (TRAC) was used to analyze the thermal-hydraulic behavior of the primary cooling system during a pump coastdown transient; a cold-leg, large-break loss-of-coolant accident (LBLOCA); a hot-leg LBLOCA; and a target downcomer LBLOCA. The TRAC analysis results showed that the heat generated in the tungsten neutron source rods can be removed by the primary cooling system for the pump coastdown transient and all the LBLOCAs except the target downcomer LBLOCA. For the target downcomer LBLOCA, a cavity flood system is required to fill the cavity with water to a level above the large fixed headers
Energy Technology Data Exchange (ETDEWEB)
Kim, Seong O.; Hwang, Young Dong; Kim, Young In; Chang, Moon Hee
1997-03-01
This study was performed to establish the design concepts and to evaluate the performance of the safety features of a large-capacity passive reactor (1000 MWe grade). The design concepts of the large-capacity passive reactor `KP1000` were established to generate 1000 MW of electric power based on the AP600 of Westinghouse, by increasing the number of reactor coolant loops and by increasing the size of the reactor internals/core. To perform the LBLOCA analysis for KP1000, various computer codes were considered, and it was concluded that RELAP5 was the most appropriate one in terms of availability and operation in the present situation. From the analysis with the computer code `RELAP5/Mod3.2.1.2`, the following conclusions were derived. First, by a spectrum analysis of the discharge coefficient at the break, the most conservative discharge coefficient was C{sub D}=1.2 and the PCT value of KP1000 was 1254 deg F, which is slightly higher than the value of AP600 but much less than that of the existing active reactors `Kori 3 and 4`, where the blowdown PCT is 1693.4 deg F and the reflood PCT is 1918.4 deg F. Second, 200 seconds after the initiation of the LBLOCA, IRWST water was supplied in a stable state and the maximum cladding temperature was maintained in a saturated condition. Therefore, it was concluded that the passive safety features of KP1000 keep the reactor core from being damaged for a large break LOCA. (author). 11 refs., 28 tabs., 37 figs.
Evaluation of non-condensable gas effect during LBLOCA in an OPR1000 Plant
Energy Technology Data Exchange (ETDEWEB)
Yoo, Seung Hun; Seul, Kwang-Won; Bang, Young-Seok; Lee, Jun Soo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2015-05-15
Gas accumulation in a nuclear power plant may cause diverse safety issues such as water hammer, pump cavitation and inadvertent valve actuation. The Nuclear Regulatory Commission (NRC) has published twenty Information Notices, two Generic Letters, and one NUREG report related to the issue of gas accumulation. It is considered that gas accumulation has occurred since the beginning of commercial nuclear power plant operation and may occur in currently operating plants. Gas accumulation in the Emergency Core Cooling System (ECCS) is a condition that was not considered in the accident analyses of the Final Safety Analysis Report or the Technical Specifications, and it may finally result in degradation or loss of safety functions. In this paper, the effect of gas accumulation in the ECCS has been analyzed by modeling non-condensable gas injection during the operation of the Safety Injection Tank (SIT) and Low Pressure Safety Injection (LPSI) under LBLOCA conditions. Gas accumulation in the ECCS has been treated as one of the significant safety issues in operating nuclear power plants. In order to identify the effect of the non-condensable gas in Hanul units 3 and 4, sensitivity studies for gas quantity, location and injection time were conducted for high- and low-pressure conditions. At the high-pressure condition, the injected gas induced a reduced SIT flow rate and a reduced period of SIT injection. The reflood PCT at the 5 ft{sup 3} condition was 1150 K, which was 49 K higher than that with no gas. At the low-pressure condition, a reduced flow rate and an increased reflood PCT were also identified. However, the PCT deviation due to different gas quantities was not as large as that at the high-pressure condition. We concluded that it is necessary to evaluate the effect of the accumulated gas with consideration of plant-specific conditions such as system pressure, accumulation location, gas quantity and injection time.
Drop size measurements and entrainment in APR1400 during LBLOCA reflood phase
International Nuclear Information System (INIS)
Lee, Eo Hwak
2010-02-01
A study has been performed to investigate droplet sizes in the APR1400 nuclear reactor during the LBLOCA reflood phase and to develop droplet entrainment and deposition models for SPACE (Safety and Performance CodE), a safety analysis tool for PWRs being developed in Korea. A freezing technique for measuring the size of droplets was developed to obtain the droplet size distribution in horizontal annular flow in a pipe with a 37.1 mm diameter. Droplets are frozen using extremely low temperature nitrogen gas with liquid film extraction. They are then photographed with a microscope and a CCD camera and measured by means of image processing. The results are compared with various experimental data. The droplet sizes measured by the freezing technique are comparable with those measured by other methods at a high superficial air velocity (of 50 m/s). However, because of the film extraction problem, the droplet sizes measured at a low superficial air velocity of less than 40 m/s are higher than those measured by other methods. The method suggested here for predicting the Sauter mean diameter is based on the maximum droplet size correlation for the experimental data, with and without liquid film extraction. The average droplet size is remarkably smaller downstream of the liquid film extractor because large droplets from the liquid film are excluded. In order to properly understand and predict the heat transfer between superheated steam and droplets during the reflood phase of an LBLOCA, it is very important to measure the sizes of droplets broken up by spacer grids. A study, therefore, has been performed to investigate droplet sizes in rod bundles with spacer grids and to develop a spacer grid droplet breakup model for safety analysis codes. Experiments were conducted with liquid droplets (SMD of 300∼700 μm) and various spacer grids at superficial air velocities of 10 m/s and 20 m/s based on FLECHT SEASET. The test channel and the grids were heated to 150 .deg. C to prevent
Role of Fugen-HWR in Japan and design of a 600 MWe demonstration reactor
International Nuclear Information System (INIS)
Sawai, S.
1982-01-01
Fugen, a 165 MWe prototype of a heavy-water-moderated, boiling-light-water-cooled reactor, has been in commercial operation since March 20, 1979. In parallel with the Fugen project, design work on the 600 MWe demonstration plant has been carried out since 1973. Important systems and components, such as the pressure tube assemblies, control rod drive mechanism, etc., are essentially the same as those of Fugen. Some modifications, however, are made, especially from the standpoint of the experiences. In the Fugen-HWR, plutonium and uranium would be effectively used, and plutonium could make the coolant void reactivity more negative, which would give good results in increasing the reactor stability and safety. On the other hand, nuclear power plants in Japan mainly consist of LWRs. Considering the above situations, the Fugen-HWR, coupled with LWRs, is now considered in Japan to contribute to energy security by using plutonium and depleted uranium extracted from spent fuels of LWRs, thereby reducing the demands. On August 4, 1981, the ad hoc committee on the 600 MWe demonstration Fugen-HWR submitted its final report to the Japan AEC, after having had discussions and evaluations. In the report, the ad hoc committee recommended building the 600 MWe demonstration plant with appropriate support from the Government. The Japan AEC is expected to make its decision on the program in the near future. As for reactor safety R and D, development has been stressed on coolant leak detectors and ECCS performance. Since 1965, many development works have been done on mixed oxide fuel assemblies, both for establishing the fabrication technology and for clarifying irradiation performance. 196 mixed oxide fuel assemblies have been manufactured for Fugen; 168 of them were loaded and 92 were withdrawn. No fuel has failed yet. (author)
Experience with, and programme of, FBR and HWR development in Japan
International Nuclear Information System (INIS)
Iida, M.; Sawai, S.; Nomoto, S.
1983-01-01
Nuclear power generation in Japan is moving forward on the long-term development programme of nuclear power from the LWR to the FBR, essentially in the same way as in other advanced nuclear countries. In this development programme the unique HWR is also included; it can use plutonium produced in LWRs together with depleted uranium before the introduction of commercial FBRs. This report describes the status of the FBR and HWR development project being carried out by the Power Reactor and Nuclear Fuel Development Corporation (PNC) based upon the Long-Term Programme on Research, Development and Utilization of Nuclear Energy in Japan. Operational experience and technical results are shown for the experimental fast reactor JOYO (100 MW(th)), which reached initial criticality in 1977. The status of the 280 MW(e) prototype reactor MONJU, under construction as of 1982, is described. The conceptual design of the subsequent 1000 MW(e) demonstration plant is outlined, as is additional future planning. Research and development results, mainly carried out at Oarai Engineering Center of PNC, are shown. The 165 MW(e) prototype FUGEN is a heavy-water-moderated, boiling-light-water-cooled, pressure-tube-type reactor which uses plutonium mixed-oxide fuel. This report describes the relationship of the fuel cycle to the HWR in Japan and also discusses the operational experience of the prototype FUGEN, which has operated since 1979. Also described is the design of the 600 MW(e) demonstration plant and the programme of related research and development. (author)
International Nuclear Information System (INIS)
Mursid Djokolelono.
1976-01-01
Emergency core cooling systems in the PWR, BWR, and HWR-Candu types of nuclear power plant are reviewed. In the PWR and BWR the emergency cooling can be categorized as active high pressure, active low pressure, and passive. The PWR uses components of the shutdown cooling system, whereas the BWR uses components of the pressure suppression containment. The HWR-Candu also uses a shutdown cooling system similar to that of the PWR, except for some details arising from the moderator-coolant separation and the high cost of heavy water. (author)
Probabilistic safety criteria on high burnup HWR fuels
International Nuclear Information System (INIS)
Marino, A.C.
2002-01-01
BACO is a code for the simulation of the thermo-mechanical and fission gas behaviour of a cylindrical fuel rod under operating conditions. Its input parameters, and therefore its output ones, may include statistical dispersion. In this paper, experimental CANDU fuel rods irradiated at the NRX reactor, together with experimental MOX fuel rods and the IAEA-CRP FUMEX cases, are used to determine the sensitivity of BACO code predictions. The techniques for sensitivity analysis defined in BACO are the 'extreme case analysis', the 'parametric analysis' and the 'probabilistic (or statistical) analysis'. We analyse, for the CARA and CAREM fuel rods, the relation between predicted performance and statistical dispersion, in order to enhance their original designs taking into account probabilistic safety criteria and using BACO's sensitivity analysis. (author)
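The "probabilistic analysis" option can be pictured as Monte Carlo propagation of input dispersion through the fuel-performance model. The sketch below uses a toy stand-in response, not the BACO code; the coefficients and input distributions are invented for illustration:

```python
import random

random.seed(1)

def toy_fuel_model(linear_power_w_cm, gap_um):
    # Toy stand-in for a fuel-performance response (NOT the BACO model):
    # an invented linear response of centreline temperature.
    return 600.0 + 3.0 * linear_power_w_cm + 1.5 * gap_um

# Sample the assumed input dispersion and collect the output spread.
samples = sorted(
    toy_fuel_model(random.gauss(250.0, 10.0),  # linear power, W/cm
                   random.gauss(40.0, 5.0))    # pellet-clad gap, um
    for _ in range(10_000)
)
lo = samples[len(samples) // 40]    # ~2.5th percentile
hi = samples[-len(samples) // 40]   # ~97.5th percentile
print(f"output dispersion band (~95%): {lo:.0f} .. {hi:.0f} deg C")
```

The resulting output band is what a probabilistic safety criterion is checked against, instead of a single best-estimate value.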
Energy Technology Data Exchange (ETDEWEB)
Bevilacqua, Arturo M. [Comision Nacional de Energia Atomica, San Carlos de Bariloche (Argentina). Centro Atomico Bariloche
1996-07-01
A conceptual analysis was carried out on the size of a high-level waste (HLW) repository for the waste arising from once-through and closed fuel cycles with HWR and PWR. The mass, activity and thermal loading were calculated with the ORIGEN2.1 computer code for the spent fuel and for the high-level liquid wastes. A minimum burnup of 7,000 MW.d/t U and 33,000 MW.d/t U was considered for the HWR and PWR respectively, with cooling times of 20 and 55 years, reprocessing recovery ratios of 99% and 99.7%, and a total electricity production of 81.6 GW(e).a. It was concluded that the cooling time is the most important repository size reduction parameter for the closed cycles. On the other hand, the spent fuel mass for the once-through cycles does not depend on the cooling time, which prevents further repository size reduction once a cooling time of 55 years is reached. The repository size reduction in the case of the HWR is larger than in the case of the PWR, owing to the larger fuel mass required to produce the specified amount of electricity. (author)
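The dependence of spent-fuel mass on burnup behind this comparison can be reproduced with a simple energy balance. The ~30% net thermal efficiency assumed below is an illustrative value, not one given in the paper:

```python
DAYS_PER_YEAR = 365.25
ETA_THERMAL = 0.30  # assumed net plant efficiency (not from the paper)

def spent_fuel_mass_t(electric_gw_a, burnup_mwd_per_t, eta=ETA_THERMAL):
    """Tonnes of uranium discharged to deliver electric_gw_a GW(e).a."""
    thermal_mwd = electric_gw_a * 1000.0 * DAYS_PER_YEAR / eta  # MW(th).d
    return thermal_mwd / burnup_mwd_per_t

hwr_mass = spent_fuel_mass_t(81.6, 7_000)    # HWR burnup from the abstract
pwr_mass = spent_fuel_mass_t(81.6, 33_000)   # PWR burnup from the abstract
print(f"HWR: {hwr_mass:,.0f} t U, PWR: {pwr_mass:,.0f} t U")
```

Under these assumptions, the HWR cycle discharges about 4.7 times more spent fuel mass than the PWR cycle for the same 81.6 GW(e).a, simply the ratio of the two burnups, which is why reprocessing yields a larger repository-size reduction for the HWR.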
Evaluation of techniques for inspection and diagnostics of HWR pressure tubes
International Nuclear Information System (INIS)
Choi, Jong-Ho
2008-01-01
Efficient and accurate inspection and diagnostic techniques for various reactor components and systems, especially pressure tubes for Heavy Water Reactors (HWRs), are an important factor in assuring reliable and safe plant operation. To foster international collaboration in the efficient and safe use of nuclear power, the IAEA conducted a Coordinated Research Project (CRP) on Inter-comparison of Techniques for HWR Pressure Tube Inspection and Diagnostics. The objective of the CRP was to inter-compare inspection and diagnostic techniques, in use and being developed, for structural integrity assessment of HWR pressure tubes. During the first phase of the CRP, participants investigated the capability of different techniques to detect and characterize flaws. During the second phase, participants collaborated to determine the hydrogen concentration and to detect and characterize hydride blisters in zirconium alloy pressure tubes. Eight organizations from six countries which operate HWRs have participated in this CRP. Most of the techniques examined are well established and many of them are regularly used during in-service inspection of pressure tubes. The inter-comparison of these techniques provides a platform for identifying a particular technique (or a set of techniques) which is more accurate and reliable than others for a specified task. The CRP also witnessed some new methodologies, which can be implemented on in-service inspection tools. These new techniques could complement the existing ones to overcome their limitations, thereby improving the reliability and accuracy of in-service inspection. This CRP also identified future areas of research and development. (author)
International Nuclear Information System (INIS)
Kwon, T.S.; Yun, B.J.; Euh, D.J.; Chu, I.C.; Song, C.H.
2002-01-01
Multi-dimensional thermal-hydraulic behavior in the downcomer annulus of a pressurized water reactor vessel with a Direct Vessel Injection (DVI) mode is presented, based on experimental observation in the MIDAS (Multi-dimensional Investigation in Downcomer Annulus Simulation) steam-water test facility. From the steady-state test results simulating the late reflood phase of a Large Break Loss-of-Coolant Accident (LBLOCA), isothermal lines show very well the multidimensional phenomena of the phasic interaction between steam and water in the downcomer annulus. MIDAS is a steam-water separate effect test facility, linearly scaled down by 1/4.93 from a 1400 MWe PWR-type nuclear reactor, focused on understanding multi-dimensional thermal-hydraulic phenomena in the downcomer annulus with various types of safety injection during the refill or reflood phase of an LBLOCA. The initial and boundary conditions are scaled from the pre-test analysis based on a preliminary calculation using the TRAC code. Superheated steam with a superheating degree of 80 K at a given downcomer pressure of 180 kPa is injected equally through the three intact cold legs into the downcomer. (authors)
Study on entry criteria for severe accident management during hot leg LBLOCAs in a PWR
International Nuclear Information System (INIS)
Zhang, Longfei; Zhang, Dafa; Wang, Shaoming
2007-01-01
The risk of Large Break Loss of Coolant Accidents (LBLOCA) has been considered an important safety issue since the beginning of the nuclear power industry. A rapid depressurization occurs in the primary coolant circuit when a large break appears in a Pressurized Water Reactor (PWR); the coolant temperature then reaches saturation at a very low pressure. The core outlet fluid temperatures may not be reliable indicators of the core damage states at such a low pressure. The problem is how to decide the time for water injection in SAM (Severe Accident Management). An alternative entry criterion is the fluid temperature just above the hot channel, i.e. the channel whose fluid temperature is the maximum among all the channels. For that reason, a systematic study of the SAM entry criterion for different hot leg break sizes in a 3-loop PWR has been carried out using the detailed system thermal-hydraulic and severe accident analysis code package RELAP/SCDAPSIM. Best estimate calculations of large break LOCAs of 15 cm, 20 cm and 25 cm, without accident management and with high-pressure safety injection as the accident management, were performed in this paper. The analysis results showed that the core exit temperatures are not reliable indicators of the peak core temperatures and core damage states once peak core temperatures reach 1500 K, and that the proposed entry criterion for SAM at the time when the core outlet temperature reaches 900 K is not effective in preventing core melt. Other analyses were then performed with the fluid temperature just above the hot channel as a parameter. The latter analyses showed that earlier water injection, when the fluid temperature just above the hot channel reaches 900 K, is effective in preventing further core melt. Since the fuel surface and hot channel have a spatial distribution and depend on the period of cycle operation, a series of thermocouples is required to be installed just above the fuel assemblies. The maximum exit temperature of 900 K that captured by
Transient fuel and target performance testing for the HWR-NPR
International Nuclear Information System (INIS)
Jicha, J.J. Jr.
1990-01-01
This paper describes a five-year program of fuel and target transient performance testing and model development required for the safety assessment of the HWR new production reactor. Technical issues are described, focusing on fuel and target behavior during extremely low probability transients which can lead to fuel melting. Early work on these issues is reviewed, and the program to meet remaining needs is described. Three major transient-testing activities are included: in-cell experiments on small samples of irradiated fuel and target, small-scale phenomenological experiments in the ACRR reactor, and limited-integral experiments in the TREAT reactor. A coordinated development of detailed fuel and target behavior models is also described.
International Nuclear Information System (INIS)
Baccou, J.; Chojnacki, E.
2007-01-01
This work is devoted to recent developments in the uncertainty analysis of computer code responses used for accident management procedures in the nuclear industry. The classical probabilistic approach to evaluating uncertainties is recalled. In this case, the statistical treatment of the code responses is based on the use of order statistics, which provides direct estimates of the statistical measures relevant for safety studies. However, a lack of knowledge about uncertainty sources can degrade decision-making. To respect the real state of knowledge, a second model, based on the Dempster-Shafer theory, is introduced. It allows the probabilistic approach to be combined with possibility theory, which is more appropriate when little information is available. An application of both methodologies to the uncertainty analysis of a LBLOCA transient (LOFT L2-5) is given.
Preliminary regulatory audit calculation for Shinkori Units 3 and 4 LBLOCA
Energy Technology Data Exchange (ETDEWEB)
Woo, S. W.; Kim, B. S.; Kim, J. K. (and others)
2006-12-15
The objective of this study is to perform a preliminary evaluation of Shinkori Units 3 and 4 LBLOCA by applying the KINS Realistic Evaluation Methodology (KINS-REM). The following results were obtained: (1) From the evaluation of Shinkori Units 3 and 4 LBLOCA, the peak cladding temperature was evaluated to meet the regulatory requirement, and the feasibility of the KINS-REM was confirmed. (2) The input decks developed in previous studies were reviewed, and an evaluation model of the fluidic device was developed and applied to the audit calculation. (3) A method for treating the uncertainty of the gap conductance was developed and applied to the audit calculation. (4) Pre- and post-processing programs were developed for this study. (5) For more detailed assessments, the information on the gap conductance, etc., should be improved; the effects of coolant bypass during blowdown, steam binding and so on were not sufficiently evaluated, and KINS-REM should be advanced to evaluate these effects properly. The KINS methodology used in this study can be further applied to independent regulatory audit calculations related to licensing applications involving LOCA best-estimate calculation.
Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0
International Nuclear Information System (INIS)
Perez, M.; Reventos, F.; Wagner, R.; Allison, C.
2009-01-01
The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package consists of the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
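Steps 6-8 of the integrated uncertainty analysis can be sketched as a short sampling loop. The parameter names, distributions, and the stand-in model below are illustrative assumptions, not the actual RELAP/SCDAPSIM/MOD4.0 implementation:

```python
import random

# Step 6 (sketch): associate a PDF with each selected code parameter.
# Parameter names and ranges are hypothetical placeholders.
parameter_pdfs = {
    "fuel_conductivity_multiplier": lambda: random.uniform(0.8, 1.2),
    "chf_multiplier":               lambda: random.gauss(1.0, 0.1),
}

def run_code(params):
    """Stand-in for one thermal-hydraulic code run returning a PCT-like figure of merit."""
    return (1000.0
            + 200.0 * (params["chf_multiplier"] - 1.0)
            - 150.0 * (params["fuel_conductivity_multiplier"] - 1.0))

# Step 7 (sketch): random sampling and multiple runs; 59 runs give a
# 95% probability / 95% confidence one-sided bound by Wilks' formula.
random.seed(42)
results = []
for _ in range(59):
    sample = {name: draw() for name, draw in parameter_pdfs.items()}
    results.append(run_code(sample))

# Step 8 (sketch): the maximum of the 59 runs is the upper uncertainty bound.
upper_bound = max(results)
```

In the real package the `run_code` stand-in is a full RELAP/SCDAPSIM/MOD4.0 transient calculation, so the cost of each sample is what motivates keeping the run count at the Wilks minimum.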
Role of Fugen HWR in Japan and design of a 600 MWe demonstration reactor
International Nuclear Information System (INIS)
Sawai, Sadamu.
1982-03-01
Fugen, a 165 MWe prototype of a heavy water-moderated, boiling light water-cooled reactor, has been in commercial operation since March 20, 1979. In parallel with the Fugen project, design work for a 600 MWe demonstration plant has been carried out since 1973. The important systems and components, such as the pressure tube assemblies and the control rod drive mechanism, are essentially the same as those of Fugen. However, some modifications were made based on the experience obtained in Fugen and LWRs. In the HWR Fugen, plutonium and uranium are used effectively, and plutonium makes the coolant void reactivity more negative, which increases the stability and safety of the reactor. On August 4, 1981, the ad hoc committee submitted its final report to the Japanese Atomic Energy Commission, in which the construction of a 600 MWe demonstration plant was recommended. As for research and development on reactor safety, coolant leak detectors, the performance of the ECCS, and safety design codes are enumerated. Since 1965, mixed oxide fuel has been developed, and 168 fuel assemblies were loaded in Fugen, with no failure occurring. (Kako, I.)
Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA
Energy Technology Data Exchange (ETDEWEB)
Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2015-05-15
The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations, so it is very important to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainties include those of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or taken from the reference documents of the computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. Different data sets with two influential uncertainty parameters for the FEBA tests are chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, lie inside the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experimental tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed
Evaluation of the possibility of plutonium and minor actinides transmutation in HWR
International Nuclear Information System (INIS)
Ghitescu, P.; Ghizdeanu, N. B.
2008-01-01
Partitioning and Transmutation (P and T) techniques could contribute to reducing the radioactive inventory and its associated radiotoxicity. Until now, ADS and/or FBRs have been studied for this purpose, but not HWRs. Several computer codes have been developed to analyze the inventory of radionuclides in spent fuel before and after transmutation. The WIMSD code is a deterministic lattice spectrum code which can analyze reactor neutronic behaviour. It also has the capacity to perform burnup calculations and can calculate the radionuclide inventory of the spent fuel. The advantage of the WIMSD code is the variety of geometries that can be created, together with the large amount of calculated information (k-infinite, macroscopic cross-sections, burnable material radioactive inventory, etc.). Starting from the WIMSD code, the paper presents a model which simulates the possibility of fuel transmutation in PHWRs. The first step was to propose a model which simulates a CANDU reactor lattice and calculates the radionuclide inventory in an irradiated CANDU fuel bundle. The results were compared with the existing experimental data from CANDU reactors, and the calculated parameters were found to be in good agreement with them. After the validation, several simulations were made for PHWRs in order to establish the optimal parameters related to the efficiency of the transmutation process. The code was then used for a new type of fuel, containing plutonium and minor actinides, that could be transmuted, and the new radioactive inventories were calculated. The simulations showed that the Pu content decreases by up to 8% in a CANDU reactor and 25% in an ACR. Thus, the ACR can reduce the plutonium inventory from MOX fuel and could be a transmutation solution. (authors)
Mechanical design and analysis of a low beta squeezed half-wave resonator
He, Shou-Bo; Zhang, Cong; Yue, Wei-Ming; Wang, Ruo-Xu; Xu, Meng-Xin; Wang, Zhi-Jun; Huang, Shi-Chun; Huang, Yu-Lu; Jiang, Tian-Cai; Wang, Feng-Feng; Zhang, Sheng-Xue; He, Yuan; Zhang, Sheng-Hu; Zhao, Hong-Wei
2014-08-01
A superconducting squeezed-type half-wave resonator (HWR) of β=0.09 has been developed at the Institute of Modern Physics, Lanzhou. In this paper, a basic design is presented for the stiffening structure addressing the detuning effect caused by helium pressure and Lorentz force. The mechanical modal analysis has been performed with the finite element method (FEM). Based on these considerations, a new stiffening structure is proposed for the HWR cavity. The computed frequency shifts show that the low-beta HWR cavity with the new stiffening structure has a low frequency sensitivity coefficient df/dp, a low Lorentz force detuning coefficient K_L, and stable mechanical properties.
Development and application of KEPRI realistic evaluation methodology (KREM) for LB-LOCA
International Nuclear Information System (INIS)
Ban, Chang-Hwan; Lee, Sang-Yong; Sung, Chang-Kyung
2004-01-01
A realistic evaluation method for the LB-LOCA of a PWR, KREM, is developed, and its applicability is confirmed for a 3-loop Westinghouse plant in Korea. The method uses a combined code of CONTEMPT4/MOD5 and a modified RELAP5/MOD3.1. The RELAP5 code calculates system thermal hydraulics with the containment backpressure calculated by CONTEMPT4, exchanging the mass/energy release and backpressure at every RELAP5 time step. The method is developed strictly following the philosophy of CSAU, with a few improvements and differences. The elements and steps of KREM are illustrated in a figure in this paper. The three elements of CSAU are maintained, and the first element has no differences. An additional step, 'Check of Experimental Data Covering (EDC)', is embedded in element 2 in order to confirm the validity of code uncertainty parameters before applying them to plant calculations. The main idea behind the EDC is to extrapolate the code accuracy determined in step 8 to the uncertainties of plant calculations. EDC is described in detail elsewhere, and the basic concepts are explained in a later section of this paper. KREM adopts nonparametric statistics to quantify the overall uncertainty of a LB-LOCA at the 95% probability and 95% confidence level from 59 plant calculations, according to Wilks' formula. These 59 calculations are performed in step 12 using code parameters determined in steps 8 and 9 and operation parameters from step 11. Scale biases are also evaluated in this step using the information of step 10. Uncertainties of code models and operation conditions are reflected in the 59 plant calculations as multipliers to relevant parameters in the code or simply as input values. This paper explains the overall structure of KREM and emphasizes its unique features. In addition, its applicability is confirmed for a 3-loop plant in Korea. KREM is developed for the realistic evaluation of LB-LOCA and its applicability is successfully demonstrated for the 3-loop power plants in
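The 59-run requirement quoted above follows from Wilks' first-order formula: with n random code runs, the confidence that the maximum exceeds the 95th percentile is 1 - 0.95^n. A minimal check (an illustrative sketch, not part of KREM itself):

```python
def wilks_confidence(n, gamma=0.95):
    """Confidence that the maximum of n random samples exceeds the gamma-quantile
    (Wilks' first-order, one-sided formula): 1 - gamma**n."""
    return 1.0 - gamma ** n

# 58 runs fall just short of the 95/95 target; 59 runs meet it.
assert wilks_confidence(58) < 0.95
assert wilks_confidence(59) >= 0.95
```

This is why nonparametric (distribution-free) statistics are attractive here: the bound holds regardless of the shape of the output distribution, at the cost of using only the extreme of the sample.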
Energy Technology Data Exchange (ETDEWEB)
Serghiuta, D.; Tholammakkil, J.; Shen, W., E-mail: Dumitru.Serghiuta@cnsc-ccsn.gc.ca [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)
2014-07-01
A stochastic-deterministic approach based on representation of uncertainties by subjective probabilities is proposed for evaluation of bounding values of functional failure probability and assessment of probabilistic safety margins. The approach is designed for screening and limited independent review verification. Its application is illustrated for a postulated generic CANDU LBLOCA and evaluation of the possibility distribution function of maximum bundle enthalpy considering the reactor physics part of LBLOCA power pulse simulation only. The computer codes HELIOS and NESTLE-CANDU were used in a stochastic procedure driven by the computer code DAKOTA to simulate the LBLOCA power pulse using combinations of core neutronic characteristics randomly generated from postulated subjective probability distributions with deterministic constraints and fixed transient bundle-wise thermal hydraulic conditions. With this information, a bounding estimate of functional failure probability using the limit for the maximum fuel bundle enthalpy can be derived for use in evaluation of core damage frequency. (author)
International Nuclear Information System (INIS)
Park, Yusun; Park, Hyun-sik; Kang, Kyoung-ho; Choi, Nam-hyun; Min, Kyoung-ho; Choi, Ki-yong
2014-01-01
Highlights: • Safety improvement by adopting a 4-train emergency core cooling system was validated experimentally. • General thermal hydraulic behaviors of the system during the LBLOCA reflood phase were successfully demonstrated. • Key parameters such as the liquid levels, the PCTs, the quenching time, and the ECC bypass ratios were investigated. • Asymmetric effects of the different combinations of safety injection were negligible during the reflood period. - Abstract: The APR1400 is equipped with four safety injection pumps driven by two emergency diesel generators. However, during the design certification process of the U.S. NRC, the design was changed so that the four safety injection pumps are driven by four emergency diesel generators. Thus, the four safety injection pumps (SIPs) are completely independent electrically and mechanically, and three safety injection pumps are available under a single-failure condition. This design change could have certain effects on the thermal-hydraulic phenomena occurring in the downcomer region during the late reflood phase of a large break loss of coolant accident (LBLOCA). Thus, in this study, a verification experiment for the reflood phase of a LBLOCA was performed to evaluate the core cooling performance of the 4-train emergency core cooling system (ECCS) under the assumption of a single failure. Different combinations of three SIP positions were tested to investigate the asymmetric effects on reactor core cooling performance. The overall experimental results revealed the typical thermal-hydraulic trends expected to occur during the reflood phase of a large-break LOCA scenario for the APR1400. The experiment with injection from three SIPs showed a faster core quenching time and a lower bypass ratio than the case in which two SIPs were injected. The RPV wall temperature distributions showed a similar trend in spite of the different SIP combinations
Prediction of thermal-Hydraulic phenomena in the LBLOCA experiment L2-3 using RELAP5/MOD2
International Nuclear Information System (INIS)
Bang, Young Seok; Chung, Bub Dong; Kim, Hho Jung
1991-01-01
The LOFT LOCE L2-3 was simulated using the RELAP5/MOD2 Cycle 36.04 code to assess its capability in predicting the thermal-hydraulic phenomena of a LBLOCA in a PWR. The reactor vessel was simulated with two core channels and split downcomer modeling for a base case calculation using the frozen code. The result of the base calculation showed that the code predicted the hydraulic behavior and the blowdown thermal response in the high-power region of the core reasonably well, and that the code had deficiencies in the critical flow model during the subcooled/two-phase transition period, in the CHF correlation at high mass flux, and in the blowdown rewet criteria. An overprediction of coolant inventory due to these deficiencies yielded a poor prediction of the reflood thermal response. Improvements to the code, RELAP5/MOD2 Cycle 36.04, based on the sensitivity study increased the accuracy of the prediction of the rewet phenomena. (Author)
International Nuclear Information System (INIS)
Ahn, Kwang Il; Chung, Bub Dong; Lee, John C.
2010-01-01
As pointed out in the OECD BEMUSE Program, when a high computation time is needed to obtain the relevant output values of a complex physical model (or code), the number of statistical samples that must be evaluated through it is a critical factor for sampling-based uncertainty analysis. Two alternative methods have been utilized to avoid the problem associated with the size of these statistical samples: one is Wilks' formula, which relies on simple random sampling, and the other is the conventional nonlinear regression approach. While both approaches provide a useful means for drawing conclusions on the resultant uncertainty with a limited number of code runs, each has its own limitations. For example, a conclusion based on Wilks' formula can be highly affected by the sampled values themselves, while the conventional regression approach requires an a priori estimate of the functional form of the regression model. The main objective of this paper is to assess the feasibility of the ACE-RSM approach as a complementary method to Wilks' formula and the conventional regression-based uncertainty analysis. This feasibility was assessed through a practical application of the ACE-RSM approach to the LOFT L2-5 LBLOCA PCT uncertainty analysis, which was implemented as a part of the OECD BEMUSE Phase III program
Development of Evaluation Technology of the Integrity of HWR Pressure Tubes
International Nuclear Information System (INIS)
Kim, Y. S.; Jeong, Y. M.; Ahn, S. B.
2005-03-01
Major degradation mechanisms of the feeder pipe are thinning due to flow-accelerated corrosion and cracking in the bent region due to stress corrosion cracking. The feeder pipe in a PHWR supplies the coolant to the pressure tube and the heated coolant to the steam generator for power generation. Approximately 380 pipes are installed on each of the inlet and outlet sides, each with two bent regions, in the 600 MW-class PHWR. After a leakage in the bent region of a feeder pipe, it is required to examine all the pipes in order to ensure the integrity of the pressure boundaries. It is not easy, however, to examine all the pipes with the conventional ultrasonic method, because of the high dose of radiation exposure and the limited accessibility to the pipe. To overcome the limited accessibility, the ultrasonic guided wave method was developed for the detection and evaluation of cracks in the feeder pipe. Dispersion mode analysis was performed for the development of long-range guided wave inspection of the feeder pipe. An analytical approach for the straight pipe as well as a numerical approach for the bent pipe using a 2-D FFT were carried out. A computer program for the calculation of the dispersion curves and wave structures was developed. Based on the dispersion curves and wave structures of the feeder pipe, candidates for the optimal frequencies and vibration modes were selected. A time-frequency analysis methodology was developed for mode identification of the received ultrasonic signal. A high-power tone-burst ultrasonic system has been set up for the generation of guided waves. Various artificial notches were fabricated on the bent feeder pipes for the flaw detection experiments. Considering the results of the dispersion analysis and the field conditions, the torsional vibration mode, T(0,1), was selected as the first choice. An array of electromagnetic acoustic transducers (EMAT) was designed and fabricated for the generation of T
Microstructure control of Zr-Nb-Sn alloy with Mo addition for HWR pressure tube application
International Nuclear Information System (INIS)
Hwang, S. K.; Kim, M. H.; Kim, J. H.; Kwon, S. I.; Kim, Y. S.
1997-01-01
As basic research to develop a material for heavy water reactor pressure tube application, the effect of Mo addition to a Zr-Nb-Sn alloy was studied with the purpose of minimizing the amount of cold working while maintaining high strength. To select the target alloy system, we first designed various alloy compositions and chose Zr-Nb-Sn and Zr-Nb-Mo through multi-regression analysis of the relationship between the basic properties and the compositions. Plasma arc melting was used to produce the alloys, and the microstructure changes introduced by the processing steps, including hot forging, beta heat treatment, hot rolling, cold rolling and recrystallization heat treatment, were investigated. Recrystallization of Zr-Nb-Sn was retarded by adding Mo, and this resulted in a fine grain structure in the Zr-Nb-Sn-Mo alloy. Besides the retarding effect on recrystallization, Mo increased the amount of residual beta phase and showed an indication of precipitation hardening, which adds to the possibility of applying the alloy for the desired usage. (author)
Energy Technology Data Exchange (ETDEWEB)
Lee, Youho, E-mail: euo@kaist.ac.kr; Lee, Jeong Ik, E-mail: jeongiklee@kaist.ac.kr; NO, Hee Cheon, E-mail: hcno@kaist.ac.kr
2016-03-15
Highlights: • Use of constant heat transfer coefficient for fracture analysis is not sound. • On-time heat transfer coefficient should be used for thermal fracture prediction. • ∼90% of the actual fracture stresses were predicted with the on-time transient h. • Thermal-hydraulic codes can be used to better predict brittle cladding fracture. • Effects of surface oxides on thermal shock fracture should be accounted by h. - Abstract: This study presents the importance of coherency in modeling thermal-hydraulics and mechanical behavior of a solid for an advanced prediction of cladding thermal shock fracture. In water quenching, a solid experiences dynamic heat transfer rate evolutions with phase changes of the fluid over a short quenching period. Yet, such a dynamic change of heat transfer rates has been overlooked in the analysis of thermal shock fracture. In this study, we are presenting quantitative evidence against the prevailing use of a constant heat transfer coefficient for thermal shock fracture analysis in water. We conclude that no single constant heat transfer could suffice to depict the actual stress evolution subject to dynamic fluid phase changes. Use of the surface temperature dependent heat transfer coefficient will remarkably increase predictability of thermal shock fracture of brittle materials. The presented results show a remarkable stress prediction improvement up to 80–90% of the actual stress with the use of the surface temperature dependent heat transfer coefficient. For thermal shock fracture analysis of brittle fuel cladding such as oxidized zirconium-based alloy or silicon carbide during LWR reflood, transient subchannel heat transfer coefficients obtained from a thermal-hydraulics code should be used as input for stress analysis. Such efforts will lead to a fundamental improvement in thermal shock fracture predictability over the current experimental empiricism for cladding fracture analysis during reflood.
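The contrast drawn above between a constant heat transfer coefficient and a surface-temperature-dependent one can be illustrated with a lumped-capacitance quench sketch. All property values and the two-regime h(T) shape below are assumed round numbers for illustration, not the authors' model:

```python
def quench(h_of_T, T0=1200.0, T_water=373.0, dt=0.01, t_end=20.0):
    """Explicit Euler integration of (rho*c*thickness) * dT/dt = -h(T) * (T - T_water)
    for a thin lumped slab; returns the surface temperature history [K]."""
    rho_c_thickness = 6500.0 * 330.0 * 0.6e-3  # rho [kg/m3] * c [J/kg-K] * thickness [m]
    T, t, history = T0, 0.0, []
    while t < t_end:
        T += -h_of_T(T) * (T - T_water) / rho_c_thickness * dt
        t += dt
        history.append(T)
    return history

# Constant coefficient vs a crude two-regime h(T_surface): low h in film
# boiling at high surface temperature, high h after rewet (nucleate boiling).
h_constant = lambda T: 400.0
def h_boiling(T):
    return 300.0 if T > 600.0 else 5000.0

cooldown_const = quench(h_constant)
cooldown_tdep = quench(h_boiling)
```

Plotting the two histories shows qualitatively different stress-driving temperature gradients: the temperature-dependent case cools slowly through film boiling and then drops sharply at rewet, which is the dynamic behavior the abstract argues a single constant h cannot capture.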
International Nuclear Information System (INIS)
Ahmed, I.; Chow, H.C.; Younis, M.H.
1996-01-01
An investigation aimed at determining the effect of fuel string relocation on the reactivity excursion and power pulse following a hypothetical Large Break Loss of Coolant Accident in the KANUPP reactor is reported. The assessment of reactivity insertion was performed using the global (reactor) core analysis computer code RFSP. The reactor kinetics module CERBERUS of the RFSP code and the SOPHT thermal-hydraulics code were subsequently employed for the neutronic transient analysis. The effect was evaluated in the context of determining the adequacy of the moderator dump shutdown system. Because of the presence of the gap between the inlet shield plug and the fuel string, the fuel bundles may shift in such a manner that low-irradiated fuel is moved towards the core centre. This represents an additional reactivity increase to be accounted for in the analysis. The reactivity excursion, however, is alleviated by an earlier reactor trip. The net impact is that the energy deposited in the maximum-rated fuel pencil is increased from 56% to 63% of the 960 kJ/kg fuel-centre-line melting limit. The result demonstrated the adequacy of the shutdown system against the maximum credible accident event. (author)
Uncertainty Evaluation with Multi-Dimensional Model of LBLOCA in OPR1000 Plant
Energy Technology Data Exchange (ETDEWEB)
Kim, Jieun; Oh, Deog Yeon; Seul, Kwang-Won; Lee, Jin Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2016-10-15
KINS has used KINS-REM (KINS Realistic Evaluation Methodology), which was developed for best-estimate (BE) calculation and uncertainty quantification for regulatory audits. This methodology has been improved continuously by numerous studies, for example of uncertainty parameters and uncertainty ranges. In this study, to evaluate the applicability of the improved KINS-REM to an OPR1000 plant, an uncertainty evaluation with a multi-dimensional model for confirming multi-dimensional phenomena was conducted with the MARS-KS code. The reactor vessel was modeled using the MULTID component of the MARS-KS code, and a total of 29 uncertainty parameters were considered in 124 sampled calculations. Through the 124 calculations, run with the Mosaique program and the MARS-KS code, the peak cladding temperature was calculated, and the final PCT was determined by the 3rd-order Wilks' formula. The uncertainty parameters with strong influence were investigated by Pearson coefficient analysis; they were mostly related to plant operation and fuel material properties. The results of the 124 calculations and the sensitivity analysis show that the improved KINS-REM can reasonably be applied to uncertainty evaluations with multi-dimensional model calculations of OPR1000 plants.
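The 124-run sample size used above comes from the 3rd-order Wilks' formula: with n runs, the confidence that the k-th largest result bounds the 95th percentile is one minus the binomial probability that fewer than k samples exceed that percentile. A generic sketch of this calculation (not KINS-REM code):

```python
from math import comb

def wilks_sample_size(order, gamma=0.95, beta=0.95):
    """Smallest n such that the `order`-th largest of n random samples bounds the
    gamma-quantile with confidence beta (one-sided, k-th order Wilks' formula)."""
    n = order
    while True:
        # Probability that fewer than `order` of n samples exceed the gamma-quantile
        # (number of exceedances is Binomial(n, 1 - gamma)).
        shortfall = sum(comb(n, j) * (1.0 - gamma) ** j * gamma ** (n - j)
                        for j in range(order))
        if 1.0 - shortfall >= beta:
            return n
        n += 1
```

For a 95%/95% one-sided bound this gives 59 runs at 1st order and 124 runs at 3rd order; higher orders cost more runs but make the bound less sensitive to a single extreme sample.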
Radionuclide Release after LBLOCA with Loss of Class IV Power Accident in CANDU-6 Plant
Energy Technology Data Exchange (ETDEWEB)
Choi, Hoon [KHNP Central Research Institute, Daejeon (Korea, Republic of)
2011-10-15
A large break in a pipe train of the primary heat transport system discharges coolant, which has high energy and large mass, into the containment building. Reactor shutdown and emergency core cooling water will limit fuel cladding failure, but cannot prevent it entirely. The containment building is the last barrier against radionuclide release to the environment. Containment isolation and pressure suppression by dousing and local air coolers reduce the amount of radionuclide released to the environment. The objective of the containment behavior analysis for a large break loss of coolant accident with loss of class IV power is to assess the amount of radionuclide released to the ambient atmosphere. Radionuclide release rates in this event, both with all safety systems available (that is, with the containment building intact) and with containment system impairment, are analyzed with the GOTHIC and SMART codes
Energy Technology Data Exchange (ETDEWEB)
Jang, Hyung-wook; Lee, Sang-yong; Oh, Seung-jong; Kim, Woong-bae [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)
2016-10-15
The phenomena of LOCA have been investigated for a long time. The most extensive research project for LOCA was the 2D/3D program of experiments, whose results show that flow conditions in the downcomer during end-of-blowdown were highly multi-dimensional at full scale. In this paper, the authors modified the nodalization of the MARS LBLOCA input deck and performed an LBLOCA analysis with the new deck. The analysis for APR1400 with the new downcomer input deck was conducted using KREM with the MARS-KS 1.4 code, for a 100% break of the cold leg. The authors developed input decks with the new downcomer nodalization and with a multi-dimensional downcomer model, performed the LOCA analysis with them, and compared the results with the existing analysis. The PCT from the new and multi-dimensional input decks follows a trend similar to that of the original deck, although the PCT drops more rapidly. The PCTs from both new decks satisfy the PCT design limit, so no acceptance-criterion issue arises when the new or multi-dimensional decks are applied to LBLOCA analysis. In a future study, a comparative analysis against experimental results will be performed.
Energy Technology Data Exchange (ETDEWEB)
Won-Jae, Lee; Kwi-Seok, Ha; Chul-Hwa, Song [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of)
2001-07-01
The MARS code has been assessed for the downcomer multi-dimensional thermal hydraulics during a large break loss-of-coolant accident (LBLOCA) reflood of the Korean Next Generation Reactor (KNGR), which adopted an upper direct vessel injection (DVI) design. Direct DVI bypass and downcomer level sweep-out tests carried out at a 1/50-scale air-water DVI test facility are simulated to examine the capability of MARS. Test conditions are selected such that they represent typical reflood conditions of the KNGR, that is, DVI injection velocities of 1.0–1.6 m/sec and air injection velocities of 18.0–35.0 m/sec, for single and double DVI configurations. The MARS calculation is first adjusted to the experimental DVI film distribution that largely affects air-water interaction in a scaled-down downcomer; then, the code is assessed for the selected test matrix. With some improvements of the MARS thermal-hydraulic (T/H) models, it has been demonstrated that the MARS code is capable of simulating the direct DVI bypass and downcomer level sweep-out as well as the multi-dimensional thermal hydraulics in the downcomer, where the condensation effect is excluded. (authors)
Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA
International Nuclear Information System (INIS)
Kim, Tae Jin; Park, Goon Cherl
2012-01-01
Probabilistic safety assessment (PSA) is a mathematical tool for producing numerical estimates of risk for nuclear power plants (NPPs). PSA has quality and reliability problems, however, because the uncertainties from thermal-hydraulic (TH) analysis have not been included in the quantification of overall uncertainties in PSA. Previous research showed that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on the same concept, this study quantifies the uncertainties of SBLOCA analysis results; here, however, operator-error parameters are included in addition to the phenomenological parameters considered in the LBLOCA analysis.
Sensitivity Study on Analysis of Reactor Containment Response to LOCA
International Nuclear Information System (INIS)
Chung, Ku Young; Sung, Key Yong
2010-01-01
As the reactor containment vessel is the final barrier against the release of radioactive material during design basis accidents (DBAs), its structural integrity must be maintained by withstanding the high-pressure conditions resulting from DBAs. To verify the structural integrity of the containment, response analyses are performed to obtain the pressure transient inside the containment after DBAs, including loss of coolant accidents (LOCAs). The purpose of this study is to give regulatory insights into the importance of input variables in the analysis of containment responses to a large break LOCA (LBLOCA). For the sensitivity study, an LBLOCA in the Kori 3 and 4 nuclear power plant (NPP) is analyzed with the CONTEMPT-LT computer code.
Sensitivity Study on Analysis of Reactor Containment Response to LOCA
Energy Technology Data Exchange (ETDEWEB)
Chung, Ku Young; Sung, Key Yong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2010-10-15
As the reactor containment vessel is the final barrier against the release of radioactive material during design basis accidents (DBAs), its structural integrity must be maintained by withstanding the high-pressure conditions resulting from DBAs. To verify the structural integrity of the containment, response analyses are performed to obtain the pressure transient inside the containment after DBAs, including loss of coolant accidents (LOCAs). The purpose of this study is to give regulatory insights into the importance of input variables in the analysis of containment responses to a large break LOCA (LBLOCA). For the sensitivity study, an LBLOCA in the Kori 3 and 4 nuclear power plant (NPP) is analyzed with the CONTEMPT-LT computer code.
Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility
Directory of Open Access Journals (Sweden)
A. Del Nevo
2012-01-01
The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility, a scaled-down layout of the Russian-designed pressurized water reactor VVER-1000. Five experiments were executed, covering loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (a natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in this paper concerns the large break loss of coolant accident experiment. Four participants from three institutions took part in the benchmark, applying their own models and set-ups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of these codes in predicting safety-relevant phenomena against fixed criteria.
Transient thermal-hydraulic/neutronic analysis in a VVER-1000 reactor core
International Nuclear Information System (INIS)
Seyed khalil Mousavian; Mohammad Mohsen Ertejaei; Majid Shahabfar
2005-01-01
Coupling thermal-hydraulic and three-dimensional neutronic codes to capture feedback effects is a state-of-the-art subject in nuclear engineering research. In this study, the RELAP5/COBRA and WIMS/CITATION codes are used to investigate VVER-1000 reactor core parameters during a large break loss of coolant accident (LB-LOCA). In an LB-LOCA, the primary-side pressure, coolant density, and fuel temperature decrease strongly, while the cladding temperature exhibits a strong peak. The RELAP5 best-estimate (BE) system code is used to simulate the LB-LOCA in the VVER-1000 thermal-hydraulic loops, and a modified COBRA-IIIc sub-channel analysis code models the VVER-1000 reactor core. WIMS (cross sections) and CITATION (3-D neutron flux) are coupled with the thermal-hydraulic codes to account for spatial effects throughout the core. Software was developed to link and speed up the coupled thermal-hydraulic and three-dimensional neutronic calculations; it uses the external coupling concept to integrate the two sets of calculations. (authors)
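The external coupling concept described in the abstract can be illustrated with an operator-splitting loop: a driver alternately advances a thermal-hydraulics solver and a neutronics solver, exchanging power and feedback fields each time step. The models and coefficients below are hypothetical toy stand-ins, not RELAP5/COBRA or WIMS/CITATION:

```python
def th_solve(state, power_mw, dt_s):
    """Toy lumped fuel-temperature model relaxing toward a power-dependent level."""
    tau_s = 5.0                      # hypothetical fuel time constant, s
    t_eq = 550.0 + 0.6 * power_mw    # hypothetical equilibrium temperature, K
    state["fuel_T"] += (t_eq - state["fuel_T"]) * dt_s / tau_s
    return state

def nk_solve(feedback, nominal_power_mw=3000.0):
    """Toy neutronics stand-in: negative Doppler feedback on fuel temperature."""
    alpha = -2.0e-3                  # hypothetical feedback coefficient, 1/K
    return nominal_power_mw * (1.0 + alpha * (feedback["fuel_T"] - 2000.0))

def run_coupled(n_steps=200, dt_s=0.1):
    state = {"fuel_T": 1800.0}
    power = nk_solve(state)
    for _ in range(n_steps):
        state = th_solve(state, power, dt_s)   # TH advance with current power
        power = nk_solve(state)                # neutronics with updated feedback
    return state["fuel_T"], power

fuel_t, power = run_coupled()  # settles near the coupled fixed point
```

The same structure scales to real codes: the "feedback" dictionary becomes the fuel temperature and coolant density fields passed to the lattice/flux solver, and the returned power distribution drives the next thermal-hydraulic step.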
Analysis of the FIST integral tests 4DBA1, 6SB2C and T1QUV with TRAC-BF1/v2001.2
International Nuclear Information System (INIS)
Analytis, G.Th.
2004-01-01
As part of the assessment of the frozen version of the PSU TRAC-BF1/v2001.2 and its qualification as an LB-LOCA and SB-LOCA code, this work compares measurements with code predictions for three FIST tests: the LB-LOCA test 4DBA1, the SB-LOCA test 6SB2C, and the failure-to-maintain-water-level test T1QUV. We study the effect of the number of axial levels in the active core and, for the SB-LOCA test 6SB2C, the effect of the timing of the activation of the reflooding options/heat transfer package on the code predictions. Furthermore, we show that using the upwinding option for some terms of the three-dimensional momentum equations resolves severe mass-error problems appearing in the analysis of test T1QUV. Overall, although there are some differences between measurements and predictions, TRAC-BF1 captures the behaviour of the LB-LOCA transient quite well (depending on the number of axial nodes in the core) but underpredicts the rod surface temperatures of the SB-LOCA test 6SB2C
PWR systems transient analysis
International Nuclear Information System (INIS)
Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.
1985-01-01
Analysis of transients in pressurized water reactor (PWR) systems involves assessing the response of the total plant, including the primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to ensure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences to be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from normal operational transients, such as minor load changes, reactor trips, and valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe, known as a large break loss of coolant accident (LBLOCA). The focus of this paper is the analysis of all such transients and accidents except loss of coolant accidents
International Nuclear Information System (INIS)
Cardenas V, J.; Mugica R, C. A.; Lopez M, R.
2015-09-01
This paper analyzes a loss-of-coolant scenario with a break at the bottom of a recirculation loop of a BWR-5 with a Mark II containment and a thermal power of 2317 MWt, assuming no coolant injection is available. The aim is to observe the speed of progression of the accident, the phenomenology of the scenario, the time to reach the containment venting pressure limit, and the amount of radionuclides released to the environment. The simulation was performed with MELCOR version 2.1. The scenario postulates a shear break in one of the recirculation loops. The emergency core cooling system (ECCS) and the reactor core isolation cooling (RCIC) system are not credited throughout the event, which maximizes the severity of the scenario. Venting of the primary containment was performed via a 30-inch valve instead of the 24-inch wetwell line, in order to have a larger exhaust area for fission products directly to the reactor building. Venting took place when the primary containment pressure reached 4.5 kg/cm² and remained open for the rest of the scenario to maximize the amount of radionuclides released to the atmosphere. The safety relief valves were considered functional: they present no mechanical failure or limitation of their pressure-relief capability despite the large number of actuations in safety mode. The results of the analysis cover about 48 hours, during which the evolution of the accident was observed; the behavior of the water level, the vessel pressure, and the fuel temperature profile were analyzed. For the progression of the scenario outside the vessel, the pressure and temperature of the primary containment, the level and temperature of the suppression pool, the hydrogen accumulation in the containment, and the mass of radionuclides released to the atmosphere were analyzed. (Author)
Containment Performance Analysis with Large Break LOCA for EU-APR1400
International Nuclear Information System (INIS)
Hwang, Do Hyun; Lee, Keun Sung; Kim, Yong Soo
2013-01-01
In this paper, a containment pressurization analysis and a thermo-hydraulic response analysis of the containment structure are performed to provide a basic understanding of containment transient states under a severe accident sequence. In the EU-APR1400 design, the Severe Accident Containment Spray System (SACSS) is designed to actuate automatically when the core exit temperature (CET) reaches 922 K (649 °C) in order to reduce containment pressure and temperature. The containment performance analysis was carried out for an LBLOCA sequence in EU-APR1400 with SACSS using the MAAP code. If SACSS actuates when the CET reaches 922 K (649 °C), the containment pressure and temperature decrease to a sufficiently low level. The predicted containment atmospheric pressure will not exceed the ultimate pressure capacity (UPC) and retains a sufficient margin to it, even though the UPC of the reference plant (Shin-Kori Units 3 and 4) is used instead, because the UPC calculation for EU-APR1400 has not been completed. The largest load on the containment from the LBLOCA is estimated at 306.1 kPa; the margin to the reference plant's UPC of 1.329 MPa is thus estimated at 330%.
International Nuclear Information System (INIS)
Liang, T.H.; Liang, K.S.; Cheng, C.K.; Pei, B.S.; Patelli, E.
2016-01-01
Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the conditional PCT (CPCT) and sequence probability (SP) of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10⁻³. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10⁻⁵. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies analyze events based on a so-called surrogate or licensing sequence, without considering how low the occurrence probability of this sequence is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond design basis and the PCT margin shall be evaluated in a risk-informed manner. Following the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. With this methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant, and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends as the sequence occurrence probability descends. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected-value estimation method or the sequence-probability coverage method. Compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to reach a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability of the referred sequence is only 5.07 × 10⁻³.
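The two margin estimators named in the abstract can be sketched against a load spectrum of (sequence probability, conditional PCT) pairs. All numbers below are hypothetical placeholders, not the Maanshan results:

```python
# Hypothetical load spectrum: (sequence occurrence probability, conditional PCT in K)
load_spectrum = [
    (5.0e-1, 900.0),
    (3.0e-1, 1000.0),
    (1.5e-1, 1100.0),
    (4.5e-2, 1250.0),
    (5.0e-3, 1350.0),
]

PCT_LIMIT_K = 1477.0  # the 2200 °F acceptance limit of 10 CFR 50.46, in kelvin

def expected_value_margin(spectrum, limit):
    # Expected-value estimation: probability-weighted mean PCT vs. the limit.
    total_p = sum(p for p, _ in spectrum)
    mean_pct = sum(p * t for p, t in spectrum) / total_p
    return limit - mean_pct

def coverage_margin(spectrum, limit, coverage=0.99):
    # Sequence-probability coverage: use the PCT of the sequence at which the
    # cumulative probability (most likely sequences first) reaches the target.
    total_p = sum(p for p, _ in spectrum)
    cum = 0.0
    for p, t in sorted(spectrum, key=lambda s: -s[0]):
        cum += p
        if cum >= coverage * total_p:
            return limit - t
    return limit - max(t for _, t in spectrum)
```

The coverage estimator is the more conservative of the two here, since it keys the margin to a rare, high-PCT sequence rather than the probability-weighted average.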
Energy Technology Data Exchange (ETDEWEB)
Liang, T.H. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Liang, K.S., E-mail: ksliang@alum.mit.edu [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Cheng, C.K.; Pei, B.S. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Patelli, E. [Institute of Risk and Uncertainty, University of Liverpool, Room 610, Brodie Tower, L69 3GQ (United Kingdom)
2016-11-15
Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the conditional PCT (CPCT) and sequence probability (SP) of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10⁻³. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10⁻⁵. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies analyze events based on a so-called surrogate or licensing sequence, without considering how low the occurrence probability of this sequence is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond design basis and the PCT margin shall be evaluated in a risk-informed manner. Following the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. With this methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant, and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends as the sequence occurrence probability descends. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected-value estimation method or the sequence-probability coverage method. Compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to reach a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability of the referred sequence is only 5.07 × 10⁻³.
International Nuclear Information System (INIS)
Shi, Xingwei; Cao, Xinrong; Liu, Zhengzhi
2013-01-01
Highlights: • A new verified cladding oxidation model has been added to the Severe Accident Program (SAP). • A coupled analysis method using the RELAP5 and SAP codes has been developed and applied to analyze a severe accident caused by an LBLOCA. • Cladding oxidation under a severe accident at Qinshan Phase II Nuclear Power Plant (QSP-II NPP) has been analyzed with SAP. • Hydrogen production has been estimated with the coupled codes. - Abstract: Core behavior at high temperature is extremely complicated during the transition from a design basis accident (DBA) to a severe accident (SA) in light water reactors (LWRs). The progression of core damage is strongly affected by the behavior of the fuel cladding (oxidation, embrittlement, and burst). A Severe Accident Program (SAP) has been developed to simulate fuel cladding oxidation, rupture, and relocation of core debris, based on models of cladding oxidation, candling of melted material, and mechanical slumping of core components. Relying on the thermal-hydraulic boundary parameters calculated by the RELAP5 code, an analysis of an SA caused by a large break loss-of-coolant accident (LBLOCA) without mitigating measures was performed with SAP for QSP-II NPP to identify the key accident sequences, estimate the amount of hydrogen generated, and characterize the oxidation behavior of the cladding
Uncertainty and sensitivity analysis of the LOFT L2-5 test: Results of the BEMUSE programme
International Nuclear Information System (INIS)
Crecy, A. de; Bazin, P.; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Fujioka, K.; Chung, B.D.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Batet, L.; Perez, M.; Reventos, F.
2008-01-01
This paper presents the results and the main lessons learnt from phase 3 of BEMUSE, an international benchmark activity sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD/NEA [CSNI, 2007. BEMUSE Phase III Report. NEA/CSNI R(2007) 4, October 2007]. Phase 3 of BEMUSE aimed at performing uncertainty and sensitivity analyses of thermal-hydraulic codes used for the calculation of the LOFT L2-5 experiment, which simulated a large-break loss-of-coolant accident (LB-LOCA). Eleven participants from ten organisations and eight countries took part in this benchmark. The first section of this paper describes the context of BEMUSE and the methods used by the participants. The second section presents the results of the benchmark: most participants find uncertainty bands that envelop the experimental data fairly well, but the widths of these bands vary considerably among participants. A synthesis of the sensitivity analysis results has been made and is expected to provide a useful basis for further uncertainty analysis of LB-LOCA. Finally, recommendations are given for both uncertainty and sensitivity analysis
Energy Technology Data Exchange (ETDEWEB)
Cardenas V, J.; Mugica R, C. A.; Lopez M, R., E-mail: jaime.cardenas@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)
2015-09-15
This paper analyzes a loss-of-coolant scenario with a break at the bottom of a recirculation loop of a BWR-5 with a Mark II containment and a thermal power of 2317 MWt, assuming no coolant injection is available. The aim is to observe the speed of progression of the accident, the phenomenology of the scenario, the time to reach the containment venting pressure limit, and the amount of radionuclides released to the environment. The simulation was performed with MELCOR version 2.1. The scenario postulates a shear break in one of the recirculation loops. The emergency core cooling system (ECCS) and the reactor core isolation cooling (RCIC) system are not credited throughout the event, which maximizes the severity of the scenario. Venting of the primary containment was performed via a 30-inch valve instead of the 24-inch wetwell line, in order to have a larger exhaust area for fission products directly to the reactor building. Venting took place when the primary containment pressure reached 4.5 kg/cm² and remained open for the rest of the scenario to maximize the amount of radionuclides released to the atmosphere. The safety relief valves were considered functional: they present no mechanical failure or limitation of their pressure-relief capability despite the large number of actuations in safety mode. The results of the analysis cover about 48 hours, during which the evolution of the accident was observed; the behavior of the water level, the vessel pressure, and the fuel temperature profile were analyzed. For the progression of the scenario outside the vessel, the pressure and temperature of the primary containment, the level and temperature of the suppression pool, the hydrogen accumulation in the containment, and the mass of radionuclides released to the atmosphere were analyzed. (Author)
Superconducting Accelerating Cavity Pressure Sensitivity Analysis
International Nuclear Information System (INIS)
Rodnizki, J.; Horvits, Z.; Ben Aliz, Y.; Grin, A.; Weissman, L.
2014-01-01
The sensitivity of the cavity was evaluated and is fully consistent with the measured values. The analysis showed that the tuning system (the fog structure) contributes significantly to the cavity sensitivity; using ribs or modifying the rigidity of the fog may reduce the HWR sensitivity. During cool-down and warm-up, the stresses on the HWR must be analyzed to avoid plastic deformation, since the yield strength of niobium is an order of magnitude lower at room temperature
Development of regulatory technology for thermal-hydraulic safety analysis
International Nuclear Information System (INIS)
Bang, Young Seok; Lee, S. H.; Ryu, Y. H.
2001-02-01
The present study aims to develop regulatory capability in thermal-hydraulic safety analysis, as required for reasonable safety regulation of current NPPs, next-generation reactors, and future reactor types. The fourth fiscal year of the first phase of the research focused on the following topics: investigation of the current status of thermal-hydraulic safety analysis technology inside and outside the country; review of the improved features of the regulatory audit code for thermal-hydraulic safety analysis, RELAP5/MOD3; code assessments against the LOFT L9-3 ATWS experiment and the LSTF SB-SG-10 multiple SGTR experiment; application of the RELAP5/CANDU code to SLB and LBLOCA analyses and evaluation of its effect on safety; application of the code to the IAEA PHWR ISP analysis; assessments of RELAP5 and TRAC against the UPTF downcomer injection test, and an LBLOCA analysis with RELAP5 for the performance evaluation of the KNGR DVI; setup of coupled 3-D kinetics and thermal-hydraulics and its application to a reactivity accident analysis; and extension of the database and improvement of plant input decks. To support the resolution of safety issues, a loss-of-RHR event during midloop operation was analyzed for Kori Unit 3, issues on high-burnup fuel were reviewed, and the performance of FRAPCON-3 was assessed. An MSLB was also analyzed to determine the sensitivity of the downcomer temperature, supporting the PTS risk evaluation of Kori Unit 1. Thermal stratification in piping was analyzed using the proposed method, and a method for predicting the thermal-hydraulic performance of the KNGR IRWST was explored. The PWR ECCS performance criteria were issued as MOST Article 200-19, and a regulatory guide on the evaluation methodology was improved to cover concerns raised in the related licensing review process
International Nuclear Information System (INIS)
1989-02-01
Based on the IAEA Standards, essential safety aspects of a three-loop pressurized water reactor (1,000 MWe) and a corresponding heavy water reactor were studied by TUeV Baden e.V. in cooperation with the Gabinete de Proteccao e Seguranca Nuclear, the department of the Ministry responsible for nuclear power plants in Portugal. The study is based on the design data for the light water reactor and the heavy water reactor provided in the safety analysis reports (KWU-SSAR for the 1,000 MWe PWR, KWU-PSAR for the Nuclear Power Plant ATUCHA II). The assessment of the two reactor types against the IAEA Nuclear Safety Standards shows that reactor plants designed according to the data given in the manufacturer's safety analysis reports meet the design requirements laid down in the pertinent IAEA Standards. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Yamaji, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)
1980-07-01
This paper reports a systems analysis of the thorium cycle, a nuclear fuel cycle based on thorium. Following a brief review of the history of thorium cycle development, three functions of the thorium cycle are analyzed: (1) an auxiliary system to the U-Pu cycle that saves uranium consumption, (2) a thermal breeder system that exploits the full capacity of the thorium resource, and (3) a symbiotic system that utilizes the special features of ²³³U and neutron sources. For the auxiliary-system function, the effects of thorium loading in LWRs (light water reactors), HWRs (heavy water reactors), and HTGRs (high temperature gas-cooled reactors) are considered; the analysis determines how much uranium is saved by ²³³U recycling and how the decrease in Pu production influences the introduction of FBRs (fast breeder reactors). The thermal breeder system is studied for the case of the MSBR (molten salt breeder reactor): under a given fissile material supply, the potential system expansion rate of the MSBR, which is determined by the fissile material balance, is superior to that of the FBR because of the smaller specific fissile inventory of the MSBR. For the symbiotic system, three cases are treated: (i) a nuclear heat supply system using HTGRs, (ii) a denatured fuel supply system for nonproliferation purposes, and (iii) a hybrid system utilizing neutron sources other than fission reactors.
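The fissile-balance argument above can be made concrete: under a fixed annual supply of fissile material, the capacity a breeder fleet can add each year is inversely proportional to its specific fissile inventory. The inventory and supply figures below are illustrative assumptions, not values from the paper:

```python
def capacity_addition_gw_per_year(annual_fissile_supply_kg, specific_inventory_kg_per_gwe):
    # New capacity (GWe/year) that a fixed annual fissile supply can start up.
    return annual_fissile_supply_kg / specific_inventory_kg_per_gwe

SUPPLY_KG_PER_YEAR = 10_000.0        # hypothetical fixed fissile supply
MSBR_INVENTORY_KG_PER_GWE = 1_500.0  # illustrative: MSBR's low specific inventory
FBR_INVENTORY_KG_PER_GWE = 4_000.0   # illustrative: an FBR needs several times more

msbr_gw = capacity_addition_gw_per_year(SUPPLY_KG_PER_YEAR, MSBR_INVENTORY_KG_PER_GWE)
fbr_gw = capacity_addition_gw_per_year(SUPPLY_KG_PER_YEAR, FBR_INVENTORY_KG_PER_GWE)
assert msbr_gw > fbr_gw  # the smaller inventory lets the MSBR fleet expand faster
```

The same ratio governs self-sustained growth: a breeder's doubling time scales with the specific inventory divided by the net fissile gain per unit time, which is why a low-inventory thermal breeder can expand faster despite a smaller breeding gain.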
Research on using depleted uranium as nuclear fuel for HWR
International Nuclear Information System (INIS)
Zhang Jiahua; Chen Zhicheng; Bao Borong
1999-01-01
The purpose of this work is to find a way to use depleted uranium in CANDU reactors by substituting MOX fuel made of depleted uranium and plutonium for natural uranium. Preliminary evaluation and calculation showed that MOX fuel consisting of depleted uranium enrichment tailings (0.25% ²³⁵U) and plutonium, in a ratio of 99.5%:0.5%, could replace natural uranium in a CANDU reactor and sustain the chain reaction. The prospects for application of depleted uranium in the nuclear energy field are also discussed
Operating experience of Fugen-HWR in Japan
International Nuclear Information System (INIS)
Yoshino, F.
1991-01-01
Fugen is a 165 MWe prototype heavy water reactor which mainly uses plutonium-uranium mixed oxide (MOX) fuel. Power Reactor and Nuclear Fuel Development Corporation (PNC) has taken responsibility for the advanced thermal reactor (ATR) project, with its name 'FUGEN' taken from the Buddhist God of Mercy. The project started in October 1967, to develop and establish the technology for this new type of reactor and to clarify MOX fuel performance in the reactor. Site construction began in December 1970 at Tsuruga and the plant commenced commercial operation on March 20, 1979. Since then, Fugen has been operated successfully for more than twelve years. The plant performance and reliability of this type of reactor has been demonstrated through the operation. All these operational experiences have contributed to the establishment of the ATR technology
Promotion of good safety culture at a Canadian HWR
Energy Technology Data Exchange (ETDEWEB)
Curle, B [Darlington NPP (Canada)
1997-12-31
People work at a nuclear plant within a structured environment. It is the programs, procedures and other "tools" that are used in the workplace that actually guide behaviours, and behaviours then guide performance. The safety culture in the workplace is "the way we do things around here". This culture is created by structures, behaviours and performance, and is characterized by three attributes: a questioning attitude; a rigorous and prudent approach; open 2-way communication. This paper discusses a model of safety culture which puts the operating experience or learning cycle program at the heart of the endeavor. This cycle also consists of three elements: observation; reporting; learning. It is more of a theoretical paper than a report of success at this stage. However, the ideas presented are being used to design and implement a strong safety culture at Ontario Hydro's Darlington Nuclear Station (4 x 932 MWe CANDU units), and are beginning to show results. Programs designed for excellence in human performance have to provide clear simple structure, and draw people into alignment with the required behaviours. By making the structural elements and the alignment activities explicit an attempt can be made to "design" programs that will create and reinforce the required safety culture. The paper places the learning cycle at the heart of safety culture because the cycle aligns with the three attributes of safety culture. If an organization cannot learn from experience (i.e. change behaviours based on experience) it is doubtful whether it can build a strong safety culture. 2 refs.
Promotion of good safety culture at a Canadian HWR
International Nuclear Information System (INIS)
Curle, B.
1996-01-01
People work at a nuclear plant within a structured environment. It is the programs, procedures and other 'tools' that are used in the workplace that actually guide behaviours, and behaviours then guide performance. The safety culture in the workplace is 'the way we do things around here'. This culture is created by structures, behaviours and performance, and is characterized by three attributes: A questioning attitude; a rigorous and prudent approach; open 2-way communication. This paper discusses a model of safety culture which puts the operating experience or learning cycle program at the heart of the endeavor. This cycle also consists of three elements: Observation; reporting; learning. It is more of a theoretical paper than a report of success at this stage. However, the ideas presented are being used to design and implement a strong safety culture at Ontario Hydro's Darlington Nuclear Station (4 x 932 MWe CANDU units), and are beginning to show results. Programs designed for excellence in human performance have to provide clear simple structure, and draw people into alignment with the required behaviours. By making the structural elements and the alignment activities explicit an attempt can be made to 'design' programs that will create and reinforce the required safety culture. The paper places the learning cycle at the heart of safety culture because the cycle aligns with the three attributes of safety culture. If an organization cannot learn from experience (i.e. change behaviours based on experience) it is doubtful whether it can build a strong safety culture. 2 refs
Operating experience of Fugen-HWR in Japan
Energy Technology Data Exchange (ETDEWEB)
Yoshino, F [Reactor Regulation Division, Nuclear Safety Bureau, Science and Technology Agency, Tokyo (Japan)
1991-04-01
Fugen is a 165 MWe prototype heavy water reactor which mainly uses plutonium-uranium mixed oxide (MOX) fuel. Power Reactor and Nuclear Fuel Development Corporation (PNC) has taken responsibility for the advanced thermal reactor (ATR) project, with its name 'FUGEN' taken from the Buddhist God of Mercy. The project started in October 1967, to develop and establish the technology for this new type of reactor and to clarify MOX fuel performance in the reactor. Site construction began in December 1970 at Tsuruga and the plant commenced commercial operation on March 20, 1979. Since then, Fugen has been operated successfully for more than twelve years. The plant performance and reliability of this type of reactor has been demonstrated through the operation. All these operational experiences have contributed to the establishment of the ATR technology.
WELWING, Material Buckling for HWR with Annular Fuel Elements
International Nuclear Information System (INIS)
Grosskopf, O.G.P.
1973-01-01
1 - Nature of the physical problem solved: WELWING was developed to calculate the material buckling of reactor systems consisting of annular fuel elements in heavy water as moderator for various moderator to fuel ratios. The moderator to fuel ratio for the maximum material buckling for the particular system is selected automatically and the corresponding material buckling is calculated. 2 - Method of solution: The method used is an analytical solution of the one-group diffusion equations with various corrections and approximations. 3 - Restrictions on the complexity of the problem: Up to 32 different materials in the fuel element may be used
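The scan that WELWING automates can be sketched in a few lines: evaluate k-infinity and the migration area for each moderator-to-fuel ratio, form the one-group material buckling B2 = (k_inf - 1)/M2, and keep the ratio that maximizes it. All lattice parameters below are illustrative placeholders (not WELWING library data), chosen only so that the optimum falls inside the scanned range.

```python
import math

def material_buckling(r):
    """One-group material buckling B^2 = (k_inf - 1) / M^2 for a
    heavy-water lattice with moderator-to-fuel volume ratio r.
    All lattice parameters are illustrative placeholders, not
    WELWING library data."""
    eta_eps = 2.0 * 1.03                 # assumed eta * epsilon
    p = math.exp(-0.9 / r)               # resonance escape rises with moderation
    f = 1.0 / (1.0 + 0.05 * r)           # thermal utilisation falls with it
    k_inf = eta_eps * p * f
    M2 = 120.0 + 11.0 * r                # migration area in cm^2 (toy model)
    return (k_inf - 1.0) / M2            # cm^-2

# Scan the ratio and keep the value giving maximum buckling,
# mirroring WELWING's automatic selection of the optimum ratio.
ratios = [0.5 + 0.25 * i for i in range(60)]
best = max(ratios, key=material_buckling)
print(best, material_buckling(best))
```

With these toy parameters the buckling peaks at an intermediate ratio: under-moderated lattices lose resonance escape, over-moderated ones lose thermal utilisation while the migration area grows.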
Time-step selection considerations in the analysis of reactor transients with DIF3D-K
International Nuclear Information System (INIS)
Taiwo, T.A.; Khalil, H.S.; Cahalan, J.E.; Morris, E.E.
1993-01-01
The DIF3D-K code solves the three-dimensional, time-dependent multigroup neutron diffusion equations by using a nodal approach for spatial discretization and either the theta method or one of three space-time factorization approaches for temporal integration of the nodal equations. The three space-time factorization options (namely, improved quasistatic, adiabatic, and conventional point kinetics) were implemented because of their potential efficiency advantage for the analysis of transients in which the flux shape changes more slowly than its amplitude. In this paper, we describe the implementation of DIF3D-K as the neutronics module within the SAS-HWR accident analysis code. We also describe the neutronics-related time-step selection algorithms and their influence on the accuracy and efficiency of the various solution options
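The accuracy-versus-cost trade-off that such time-step selection algorithms manage can be illustrated with a toy point-kinetics solver that picks its own step by step-doubling. This is a minimal sketch, not the DIF3D-K algorithm: the kinetics data (beta, Lambda, lambda) and the +10 pcm reactivity step are generic illustrative values.

```python
def pk_rhs(n, C, rho, beta=0.0065, lam=0.08, Lam=1e-4):
    """One-delayed-group point kinetics (generic illustrative data)."""
    return ((rho - beta) / Lam * n + lam * C,
            beta / Lam * n - lam * C)

def euler(n, C, rho, dt):
    dn, dC = pk_rhs(n, C, rho)
    return n + dt * dn, C + dt * dC

def adaptive_step(n, C, rho, dt, tol=1e-4, dt_max=5e-3):
    """Step-doubling error control: compare one full Euler step with two
    half steps; halve dt until the relative difference is below tol, and
    suggest a larger step (capped below the explicit stability limit)
    when the solution is smooth."""
    while True:
        n1, _ = euler(n, C, rho, dt)
        nh, Ch = euler(n, C, rho, dt / 2)
        n2, C2 = euler(nh, Ch, rho, dt / 2)
        err = abs(n2 - n1) / max(abs(n2), 1e-30)
        if err <= tol:
            next_dt = min(dt * 2, dt_max) if err < tol / 4 else dt
            return n2, C2, dt, next_dt
        dt /= 2

# Start critical with the precursors in equilibrium (dC/dt = 0),
# then apply a +10 pcm reactivity step and march for 0.1 s.
n, C = 1.0, 0.0065 / (0.08 * 1e-4)
t, dt = 0.0, 1e-3
while t < 0.1:
    n, C, used, dt = adaptive_step(n, C, 1e-4, dt)
    t += used
print(n)   # amplitude rises toward the prompt-jump level
```

The controller takes small steps through the fast prompt-jump transient and grows the step once only the slow delayed-neutron rise remains, which is exactly the regime where the factorization options in DIF3D-K pay off.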
Time-step selection considerations in the analysis of reactor transients with DIF3D-K
International Nuclear Information System (INIS)
Taiwo, T.A.; Khalil, H.S.; Cahalan, J.E.; Morris, E.E.
1993-01-01
The DIF3D-K code solves the three-dimensional, time-dependent multigroup neutron diffusion equations by using a nodal approach for spatial discretization and either the theta method or one of three space-time factorization approaches for temporal integration of the nodal equations. The three space-time factorization options (namely, improved quasistatic, adiabatic and conventional point kinetics) were implemented because of their potential efficiency advantage for the analysis of transients in which the flux shape changes more slowly than its amplitude. Here we describe the implementation of DIF3D-K as the neutronics module within the SAS-HWR accident analysis code. We also describe the neutronics-related time step selection algorithms and their influence on the accuracy and efficiency of the various solution options
Safety analysis procedures for PHWR
International Nuclear Information System (INIS)
Min, Byung Joo; Kim, Hyoung Tae; Yoo, Kun Joong
2004-03-01
The methodology of safety analyses for CANDU reactors in Canada, the vendor country, uses a combination of best-estimate physical models and conservative input parameters so as to minimize the uncertainty of the plant behavior predictions. By using conservative input parameters, the results of the safety analyses are assured to meet the regulatory requirements such as the public dose, the integrity of fuel and fuel channel, the integrity of containment and reactor structures, etc. However, there are no comprehensive and systematic safety analysis procedures for CANDU reactors in Korea. In this regard, the development of safety analysis procedures for CANDU reactors is being conducted not only to establish the safety analysis system, but also to enhance the quality assurance of the safety assessment. In the first phase of this study, the general procedures of the deterministic safety analyses were developed. They cover the specification of the initiating event, selection of the methodology and accident sequences, computer codes, safety analysis procedures, verification of errors and uncertainties, etc. Finally, these general procedures are applied to the Large Break Loss Of Coolant Accident (LBLOCA) in the Final Safety Analysis Report (FSAR) for Wolsong units 2, 3 and 4
Tritium target performance during an LBLOCA in a PWR
International Nuclear Information System (INIS)
Reid, B.D.
1996-01-01
In December 1995, the U.S. Department of Energy (DOE) announced a preferred strategy for acquiring a new supply of tritium. That strategy is based on pursuing the two most promising production alternatives. These alternatives include either constructing an accelerator-produced tritium system for tritium production or procuring an existing commercial light water reactor or irradiation services from such a reactor to irradiate tritium targets. This paper discusses the safety performance of a tritium target in a commercial pressurized water reactor (PWR). The current conceptual design for the light water tritium targets is quite similar, in terms of external dimensions and materials, to early designs for stainless steel clad discrete burnable absorbers used in PWRs. The tritium targets nominally consist of an annular lithium aluminate pellet wrapped in a Zircaloy-4 getter and clad with Type 316 stainless steel
Blanket Module Boil-Off Times during a Loss-of-Coolant Accident - Case 0: with Beam Shutdown only
International Nuclear Information System (INIS)
Hamm, L.L.
1998-01-01
This report is one of a series of reports that document LBLOCA analyses for the Accelerator Production of Tritium primary blanket Heat Removal system. It documents the analysis results of an LBLOCA in which the accelerator beam is shut off but the primary pumps do not trip and neither the RHR nor the cavity flood system operates
Development and assessment of best estimate integrated safety analysis code
Energy Technology Data Exchange (ETDEWEB)
Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)
2007-03-15
Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run on PC clusters. MARS variables and subroutines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) has continued and, through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and correction reports for the code errors reported during MARS development have been published.
Development and assessment of best estimate integrated safety analysis code
International Nuclear Information System (INIS)
Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu
2007-03-01
Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run on PC clusters. MARS variables and subroutines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) has continued and, through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and correction reports for the code errors reported during MARS development have been published
Pre-Test Analysis of Major Scenarios for ATLAS
Energy Technology Data Exchange (ETDEWEB)
Euh, Dong-Jin; Choi, Ki-Yong; Park, Hyun-Sik; Kwon, Tae-Soon
2007-02-15
A thermal-hydraulic integral effect test facility, ATLAS, was constructed at the Korea Atomic Energy Research Institute (KAERI). The ATLAS is a 1/2 reduced-height and 1/288 volume-scaled test facility based on the design features of the APR1400. The simulation capability of the ATLAS for major design basis accidents (DBAs), including large-break loss-of-coolant accident (LBLOCA), DVI line break and main steam line break (MSLB) accidents, is evaluated with the best-estimate system code MARS, using the same control logics, transient scenarios and nodalization scheme. The validity of the applied scaling law and the thermal-hydraulic similarity between the ATLAS and the APR1400 for the major design basis accidents are assessed. It is confirmed that the ATLAS is capable of maintaining an overall similarity with the reference plant APR1400 for the major design basis accidents considered in the present study. However, depending on the accident scenario, there are some inconsistencies in certain thermal-hydraulic parameters. It is found that the inconsistencies are mainly due to the reduced power effect and the increased stored energy in the structures. The present similarity analysis was successful in obtaining greater insight into the unique design features of the ATLAS and will be used for developing optimized experimental procedures and control logics.
Pre-Test Analysis of Major Scenarios for ATLAS
International Nuclear Information System (INIS)
Euh, Dong-Jin; Choi, Ki-Yong; Park, Hyun-Sik; Kwon, Tae-Soon
2007-02-01
A thermal-hydraulic integral effect test facility, ATLAS, was constructed at the Korea Atomic Energy Research Institute (KAERI). The ATLAS is a 1/2 reduced-height and 1/288 volume-scaled test facility based on the design features of the APR1400. The simulation capability of the ATLAS for major design basis accidents (DBAs), including large-break loss-of-coolant accident (LBLOCA), DVI line break and main steam line break (MSLB) accidents, is evaluated with the best-estimate system code MARS, using the same control logics, transient scenarios and nodalization scheme. The validity of the applied scaling law and the thermal-hydraulic similarity between the ATLAS and the APR1400 for the major design basis accidents are assessed. It is confirmed that the ATLAS is capable of maintaining an overall similarity with the reference plant APR1400 for the major design basis accidents considered in the present study. However, depending on the accident scenario, there are some inconsistencies in certain thermal-hydraulic parameters. It is found that the inconsistencies are mainly due to the reduced power effect and the increased stored energy in the structures. The present similarity analysis was successful in obtaining greater insight into the unique design features of the ATLAS and will be used for developing optimized experimental procedures and control logics
Numerical Analysis of a Passive Containment Filtered Venting System
International Nuclear Information System (INIS)
Kim, Taejoon; Ha, Huiun; Heo, Sun
2014-01-01
The passive Containment Filtered Venting System (CFVS) has, in principle, no isolation valves or filtering devices that need periodic maintenance. In this study, a hydro-thermal analysis is presented to investigate the existence of flow instability in the passive CFVS and its performance under the pressure change of the APR+ containment building with LB-LOCA M/E data. The passive CFVS was suggested as part of the i-Power development project, and its operation mechanism was investigated by numerical modeling and simulation using the GOTHIC 8.0 system code. Four phases are considered: pressurization of the containment building, loss of hydrostatic head in the CFVS pipe line, opening of the pipe line with gas ejection to the coolant tank, and head recovery inside the pipe as the containment gas is exhausted. The simulation results show that the gas generation rate determines the timing of head recovery in the CFVS pipe line, and that equipping the pipe with various pressure-loss-inducing devices can provide control over the operating phases of the passive CFVS
Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004
Energy Technology Data Exchange (ETDEWEB)
Torralba, B.; Martinez-Arias, R.
2007-07-01
Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSFs). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in the report on Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 (HWR-810). This paper describes the analysis of the comments about PSFs provided by the operators on the PSF Questionnaire. The comments provided for each PSF in the scenarios have been summarised using a content analysis technique. (Author)
Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004
International Nuclear Information System (INIS)
Torralba, B.; Martinez-Arias, R.
2007-01-01
Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSFs). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in the report on Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 (HWR-810). This paper describes the analysis of the comments about PSFs provided by the operators on the PSF Questionnaire. The comments provided for each PSF in the scenarios have been summarised using a content analysis technique. (Author)
Multi-dimensional analysis of the ECC behavior in the UPI plant Kori Unit 1
International Nuclear Information System (INIS)
Bae, Sungwon; Chung, Bub-Dong; Bang, Young Seok
2008-01-01
A multi-dimensional transient analysis of the LBLOCA of Kori Unit 1 has been performed using the MARS code. Starting from the 1-D nodalization of Kori Unit 1, the reactor vessel nodalization has been replaced by a multi-dimensional component. The multi-dimensional component for the reactor vessel is designed with 5 radial, 8 azimuthal, and 21 vertical grids. It is assumed that the fuel assemblies are homogeneously distributed over the inner 3 radial grids. The next radial grid region is modeled as the core bypass, and the outermost radial grid is used for the downcomer region. The corresponding heat structures and fuels are modified to fit the multi-dimensional reactor vessel model. The form drag coefficients for the upper plenum and the core have been set to 0.6 and 9.39, respectively. The form drag coefficients for the radial and azimuthal directions are assigned the same values on the assumption of a homogeneous distribution of the flow obstacles. After obtaining the 102% power steady operation condition, a cold leg LOCA simulation is performed over a 400-second period. The multi-dimensional steady-state results show no severe differences compared to the traditional 1-D nodalization results. After the ECC injection starts, a liquid pool is maintained in the upper plenum because the ECCS water cannot overcome the upward gas flow that comes from the reactor core through the upper tie plate. The depth of the ECCS water pool is predicted to be about 20% of the total height between the upper tie plate and the center line of the hot leg pipe. The region in the vicinity of the active ECCS shows a deeper liquid pool. The accumulated water flow rate passing the upper tie plate is calculated from the transient results. Much downward water flow is obtained at the outermost region of the upper plenum space; the downward-flow-dominant region is about 32.3% of the total upper tie plate area. The accumulated ECCS bypass ratio is predicted as 27.64% at 300 seconds. It is calculated
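An accumulated bypass ratio of the kind quoted above is just the time integral of the bypassed ECCS flow divided by the time integral of the injected flow. The sketch below shows the bookkeeping with trapezoidal integration; the flow-rate histories are made up for illustration and are not Kori Unit 1 data.

```python
def accumulated_bypass_ratio(times, m_dot_inj, m_dot_byp):
    """Accumulated ECCS bypass ratio: the trapezoidal integral of the
    bypassed mass flow divided by that of the injected mass flow."""
    def trapz(y):
        return sum(0.5 * (y[i] + y[i + 1]) * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    return trapz(m_dot_byp) / trapz(m_dot_inj)

# Made-up flow-rate histories for illustration (not Kori Unit 1 data).
t   = [0.0, 100.0, 200.0, 300.0]        # s
inj = [50.0, 48.0, 46.0, 45.0]          # ECCS injection rate, kg/s
byp = [20.0, 13.0, 12.0, 11.0]          # flow bypassing the core, kg/s
print(round(accumulated_bypass_ratio(t, inj, byp), 4))
```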
Evaluation of Gap Conductance Approach for Mid-Burnup Fuel LOCA Analysis
Energy Technology Data Exchange (ETDEWEB)
Lee, Joosuk; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2013-10-15
In this study, therefore, the applicability of the gap conductance approach to mid-burnup fuel in LOCA analysis was estimated by comparing PCT distributions. In the reference method, the fuel rod uncertainty is taken into account by combining all of the fuel rod uncertainty parameters through a simple random sampling (SRS) technique. There are many fuel rod uncertainty parameters that can change the PCT during a LOCA analysis, and these have already been identified in the authors' previous work. For 'best-estimate' LOCA safety analysis, however, a methodology has been developed that does not use all of the uncertainty parameters together but uses the gap conductance uncertainty alone to simulate the overall fuel rod uncertainty, because it can represent many uncertainty parameters. Based on this approach, the uncertainty range of the gap conductance was prescribed as 0.67∼1.5 in the audit calculation methodology for LBLOCA analysis. This uncertainty range was derived from experimental data on fresh or low-burnup fuel. Meanwhile, recent research finds that the currently used uncertainty range does not seem to be wide enough to encompass the uncertainty of mid-burnup fuel; instead, it has to be changed to 0.5∼2.4 for mid-burnup fuel (30 MWd/kgU)
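The effect of widening the multiplier range can be explored with the SRS technique the abstract names. The PCT response below is a made-up monotone surrogate (a real study would run the system code for every sample); only the two multiplier ranges, 0.67∼1.5 and 0.5∼2.4, come from the text.

```python
import random

def pct_response(h_mult):
    """Toy monotone PCT surrogate: lower gap conductance -> hotter clad.
    The 150 K sensitivity is made up for illustration."""
    return 1100.0 - 150.0 * (h_mult - 1.0)       # K

def srs_pct(lo, hi, n=200, seed=1):
    """Simple random sampling of the gap-conductance multiplier."""
    rng = random.Random(seed)
    return [pct_response(rng.uniform(lo, hi)) for _ in range(n)]

fresh = srs_pct(0.67, 1.5)   # range derived from fresh/low-burnup data
mid   = srs_pct(0.5, 2.4)    # widened range proposed for ~30 MWd/kgU fuel
print(max(fresh), max(mid))  # the widened range stretches the PCT spread
```

Because the surrogate is monotone, widening the sampled multiplier range directly widens the resulting PCT distribution, which is the point the abstract makes about mid-burnup fuel.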
Evaluation of Gap Conductance Approach for Mid-Burnup Fuel LOCA Analysis
International Nuclear Information System (INIS)
Lee, Joosuk; Woo, Swengwoong
2013-01-01
In this study, therefore, the applicability of the gap conductance approach to mid-burnup fuel in LOCA analysis was estimated by comparing PCT distributions. In the reference method, the fuel rod uncertainty is taken into account by combining all of the fuel rod uncertainty parameters through a simple random sampling (SRS) technique. There are many fuel rod uncertainty parameters that can change the PCT during a LOCA analysis, and these have already been identified in the authors' previous work. For 'best-estimate' LOCA safety analysis, however, a methodology has been developed that does not use all of the uncertainty parameters together but uses the gap conductance uncertainty alone to simulate the overall fuel rod uncertainty, because it can represent many uncertainty parameters. Based on this approach, the uncertainty range of the gap conductance was prescribed as 0.67∼1.5 in the audit calculation methodology for LBLOCA analysis. This uncertainty range was derived from experimental data on fresh or low-burnup fuel. Meanwhile, recent research finds that the currently used uncertainty range does not seem to be wide enough to encompass the uncertainty of mid-burnup fuel; instead, it has to be changed to 0.5∼2.4 for mid-burnup fuel (30 MWd/kgU)
International Nuclear Information System (INIS)
Petruzzi, A.; D'Auria, F.; Crecy, Agnes de; Bazin, P.; Borisov, S.; Skorek, T.; Glaeser, H.; Benoit, J. P.; Chojnacki, E.; Fujioka, K.; Inoue, S.; Chung, B.D.; Trosztel, I.; Toth, I.; Oh, D. Y.; Pernica, R.; Kyncl, M.; Macek, J.; Macian, R.; Tanker, E.; Soyer, A. E.; Ozdere, O.; Perez, M.; Reventos, F.
2005-11-01
The BEMUSE (Best Estimate Methods - Uncertainty and Sensitivity Evaluation) Programme is focused on applications of uncertainty methodologies to Large Break LOCA scenarios. The main goals of the Programme are: - To evaluate the practicability, quality and reliability of best-estimate methods including uncertainty evaluations in applications relevant to nuclear reactor safety; - To develop common understanding; - To promote / facilitate their use by regulatory bodies and industry. The scope of Phase II of BEMUSE is to perform a Large Break LOCA analysis making reference to the experimental data of LOFT L2-5 in order to address the issue of 'the capabilities of computational tools', including the scaling / uncertainty analysis. The operational objective of the activity is the quality demonstration of the system code calculations in performing LBLOCA analysis through the fulfilment of a comprehensive set of common criteria established at different steps of the code assessment process. In particular, criteria and threshold values for selected parameters have been adopted for: a) The development of the nodalization; b) The evaluation of the steady state results; c) The qualitative and quantitative comparison between measured and calculated time trends. The main achievements of Phase II, to be considered in the following phases of BEMUSE, are summarized as follows: - Almost all performed calculations appear qualified against the fixed criteria; - Dispersion bands of reference results appear substantially smaller than in ISP-13; - The sensitivity study shall be used as guidance for deriving the uncertainty bands in the following Phase III of the Programme
Pressurized Thermal Shock Analysis for OPR1000 Pressure Vessel
Energy Technology Data Exchange (ETDEWEB)
Bhowmik, P. K.; Shamim, J. A.; Gairola, A.; Suh, Kune Y. [Seoul National Univ., Seoul (Korea, Republic of)
2014-10-15
The study provides a brief understanding of the analysis procedure and techniques using ANSYS, such as the acceptance criteria, selection and categorization of events, thermal analysis, structural analysis including fracture mechanics assessment, crack propagation, and evaluation of material properties. PTS may result from instrumentation and control malfunction, inadvertent steam dump, and postulated accidents such as small-break (SB) LOCA, large-break (LB) LOCA, main steam line break (MSLB), feedwater line breaks and steam generator overfill. In this study our main focus is on the LB LOCA due to a cold leg break of the Optimized Power Reactor 1000 MWe (OPR1000). Consideration is also given to the emergency core cooling system (ECCS) specific sequence with operating parameters such as pressure, temperature and time sequences. The static structural and thermal analysis to investigate the effects of PTS on the RPV is the main motivation of this study. Specific surface crack effects and their propagation are also considered to assess the integrity of the RPV. This study describes the procedure for pressurized thermal shock analysis under a loss-of-coolant accident condition with emergency core cooling system operation for the reactor pressure vessel. Different accident events that cause pressurized thermal shock to a nuclear RPV can also be analyzed in the same way. Considering the limitations of a low-speed computer, only the static analysis is conducted. The modified LBLOCA phases and simplified geometry are utilized to analyze the effect of PTS on the RPV for general understanding, not for a specific specialized purpose. However, by integrating the disciplines of thermal and structural analysis and fracture mechanics analysis, a clearer understanding of the total aspect of the PTS problem has resulted. By adopting CFD, thermal hydraulics, uncertainties and risk analysis for different types of accident conditions, events and sequences with proper
Steady-State and Transient Analysis for Design Validation of SMART-ITL Secondary System
Energy Technology Data Exchange (ETDEWEB)
Yun, Eunkoo; Bae, Hwang; Ryu, Sung Uk; Jeon, Byong-Guk; Yang, Jin-Hwa; Yi, Sung-Jae; Park, Hyun-Sik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2016-10-15
SMART can inherently prevent a large-break loss-of-coolant accident (LBLOCA). SMART-ITL is an experimental simulation facility designed to perform integral effect tests for the SMART plant. In terms of the secondary system of SMART-ITL, the design has been simplified from that of the reference plant by replacing several components, such as the expansion device and condenser, with appropriate devices that function as alternatives. In this paper, in order to understand the operational characteristics as well as the design concept, the secondary system of SMART-ITL is analyzed in steady-state and transient aspects, and the results are compared with relevant experimental results. This study focuses on understanding the thermal-hydraulic behavior of the SMART-ITL secondary system, which is simplified from that of the reference plant. To identify the behavior of the secondary system, steady-state and transient analyses were conducted based on experimental results. The steady-state analysis clearly showed that the system pressure is related to the temperature of the condensation tank, which varies depending on the mixture enthalpy. The transient analysis investigated the dynamic behavior during the heat-up process. The results reveal that the fluid filling TK-CD-01 can reasonably be assumed to be in a saturated condition. The results showed that the design of the SMART-ITL secondary system is appropriate, and the system is properly operated to match the design intent.
International Nuclear Information System (INIS)
O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.
1992-01-01
A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained
Fracture mechanics analysis and evaluation for the RPV of the Chinese Qinshan 300 MW NPP and PTS
International Nuclear Information System (INIS)
He Yinbiao; Isozaki, Toshikuni
2000-03-01
One of the most severe accident conditions for a reactor pressure vessel (RPV) in service is the loss of coolant accident (LOCA). Cold safety injection water is pumped into the downcomer of the RPV through the inlet nozzles, while the internal pressure may remain at a high level. Such an accident is called a pressurized thermal shock (PTS) transient according to the 10 CFR 50.61 definition. This paper illustrates the fracture mechanics analysis for the existing RPV of the Chinese Qinshan 300 MW nuclear power plant (NPP) under postulated PTS transients that include the SB-LOCA and LB-LOCA of the Qinshan NPP and the Rancho Seco transients. 3-D models with the flaw depth range a/w=0.05∼0.9 (a: flaw depth; w: wall thickness) were used to probe what kind of flaw and what kind of transient are most dangerous for the RPV at the end of life (EOL). Both elastic and elastic-plastic material models were used in the stress analysis and fracture mechanics analysis. The different types of flaw and the influence of the stainless steel cladding on the fracture analysis were investigated for different PTS transients. By comparing with the material crack initiation toughness K IC , the fracture evaluation for the RPV in question under PTS transients is performed in this paper. (author)
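The elastic part of such an evaluation reduces to comparing the applied stress intensity K_I = F(a/w) * sigma * sqrt(pi * a) against the initiation toughness K_IC over the flaw-depth range. The sketch below uses a textbook single-edge-crack geometry factor and made-up stress, toughness and wall-thickness values; it is not the Qinshan RPV analysis, which also accounts for thermal-gradient stresses and the cladding.

```python
import math

def k_applied(sigma_mpa, a_mm, w_mm):
    """Mode-I stress intensity K_I = F(a/w) * sigma * sqrt(pi * a) for a
    single-edge crack under membrane stress (polynomial geometry factor,
    valid for a/w up to about 0.6). Inputs are illustrative, not
    Qinshan RPV data."""
    a = a_mm / 1000.0                     # crack depth in metres
    r = a_mm / w_mm
    F = 1.12 - 0.231 * r + 10.55 * r**2 - 21.72 * r**3 + 30.39 * r**4
    return F * sigma_mpa * math.sqrt(math.pi * a)   # MPa*sqrt(m)

K_IC  = 110.0    # assumed crack initiation toughness, MPa*sqrt(m)
sigma = 300.0    # assumed PTS membrane stress, MPa
wall  = 200.0    # assumed wall thickness, mm
for a_over_w in (0.05, 0.1, 0.2, 0.3):
    KI = k_applied(sigma, a_over_w * wall, wall)
    print(a_over_w, round(KI, 1), "initiates" if KI >= K_IC else "stable")
```

Since K_I grows with sqrt(a) and with the geometry factor, shallow flaws remain stable while deeper ones cross K_IC, which is why the paper scans the whole a/w range.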
International Nuclear Information System (INIS)
Wu, Xiaoli; Li, Wei; Wang, Yang; Zhang, Yapei; Tian, Wenxi; Su, Guanghui; Qiu, Suizheng; Liu, Tong; Deng, Yongjun; Huang, Heng
2015-01-01
Highlights: • Analysis of severe accident scenarios for a PWR fueled with an ATF system is performed. • A large-break LOCA without ECCS is analyzed for the PWR fueled with an ATF system. • Extended SBO cases are discussed for the PWR fueled with an ATF system. • The accident tolerance of the ATF system for application in a PWR is illustrated. - Abstract: Experience gained in decades of nuclear safety research and from previous nuclear accidents points to the investigation of passive safety system designs and the accident-tolerant fuel (ATF) system, now an active research focus in the nuclear energy field. The ATF system is aimed at upgrading the safety characteristics of the nuclear fuel and cladding in a reactor core where active cooling has been lost, while remaining preferable or comparable to the current UO2–Zr system when the reactor is in normal operation. By virtue of advanced materials with improved properties, the ATF system will appreciably slow down the progression of accidents, allowing a wider margin of time for mitigation measures to work. Specifically, the simulation and analysis of a large break loss of coolant accident (LBLOCA) without ECCS and an extended station blackout (SBO) severe accident are performed for a pressurized water reactor (PWR) loaded with ATF candidates, to reflect the accident tolerance of ATF
THYDE-P2 code: RCS (reactor-coolant system) analysis code
International Nuclear Information System (INIS)
Asahi, Yoshiro; Hirano, Masashi; Sato, Kazuo
1986-12-01
THYDE-P2, characterized by a new thermal-hydraulic network model, is applicable to the analysis of RCS behavior in response to various disturbances, including the LB (large break) LOCA (loss-of-coolant accident). In LB-LOCA analysis, THYDE-P2 is capable of a continuous calculation from accident initiation to complete reflooding of the core without an artificial change in the methods and models. The first half of the report describes the methods and models used in the THYDE-P2 code, i.e., (1) the thermal-hydraulic network model, (2) the various RCS component models, (3) the heat sources in the fuel, (4) the heat transfer correlations, (5) the mechanical behavior of cladding and fuel, and (6) the steady state adjustment. The second half of the report is the user's manual for the THYDE-P2 code (version SV04L08A), covering: (1) the program control, (2) the input requirements, (3) the execution of a THYDE-P2 job, (4) the output specifications and (5) a sample problem to demonstrate, among other things, the capability of the thermal-hydraulic network model. (author)
Preliminary Performance Analysis Program Development for Safety System with Safeguard Vessel
International Nuclear Information System (INIS)
Kang, Han-Ok; Lee, Jun; Park, Cheon-Tae; Yoon, Ju-Hyeon; Park, Keun-Bae
2007-01-01
SMART is an advanced modular integral-type pressurized water reactor for seawater desalination and electricity production. Major components of the reactor coolant system, such as the pressurizer, Reactor Coolant Pump (RCP), and steam generators, are located inside the reactor vessel. SMART can fundamentally eliminate the possibility of large break loss of coolant accidents (LBLOCAs), improve the natural circulation capability, and better accommodate, and thus enhance resistance to, a wide range of transients and accidents. The safety goals of SMART are met through highly reliable safety systems such as the passive residual heat removal system (PRHRS) and the safeguard vessel coupled with the passive safety injection feature. The safeguard vessel is a leak-tight steel pressure vessel housing the RPV, the SIT, and the associated valves and pipelines. Its primary function is to confine any radioactive release from the primary circuit within the vessel under DBAs involving loss of the integrity of the primary system. A preliminary performance analysis program for a safety system using the safeguard vessel is developed in this study. The developed program is composed of several subroutines for the reactor coolant system, the passive safety injection system, the safeguard vessel including the pressure suppression pool, and the PRHRS. A small break loss of coolant accident at the upper part of the reactor is analyzed and the results are discussed.
Energy Technology Data Exchange (ETDEWEB)
Jang, Won Pyo; Jung, Yung Jong; Kim, Kyung Doo; Jung, Jae Joon; Kim, Won Suk; Han, Doh Heui; Hah, Kooi Suk; Jung, Bub Dong; Lee, Yung Jin; Hwang, Tae Suk; Lee, Sang Yong; Park, Chan Uk; Choi, Han Rim; Lee, Sang Jong; Choi, Jong Hoh; Ban, Chang Hwan; Bae, Kyoo Hwan [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
1995-07-01
The present research aims at the development of both a best estimate methodology for LOCA analysis and, as an application, performance analyses of safety systems. SBLOCA analyses have continued since the second project year to examine the effect of a capacity reduction of the ECCS. As a result, core uncovery, whose prevention is a URD requirement, did not occur for a 6-inch cold leg break. Although core uncovery was predicted when the DVI line was broken for the DVI+4-Train HPIS, the calculated PCT lay well within the criterion. The effects of the safety injection position and SIT characteristics were also analyzed for the LBLOCA. The results show that cold leg injection is the most effective approach and that adoption of the advanced SIT could allow elimination of the LPSI pump from the safety system. On the other hand, the quantified uncertainties obtained from THTF and FLECHT/SEASET, which represent blowdown and reflood phenomena, respectively, have been confirmed using an IET (LOFT test). The application uncertainty for Kori unit 3 has been analyzed. Finally, application of the best estimate methodology, using the uncertainties associated with the code, the bias, and the application, leads to an overall uncertainty of about 200 K for Kori unit 3. 244 figs, 22 tabs, 92 refs. (Author).
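The way separate code, bias and application uncertainties can roll up into a single overall figure of the ~200 K scale quoted above can be sketched with a root-sum-square combination. Both the combination rule and the component values below are illustrative assumptions, not the report's actual procedure or numbers:

```python
import math

def combine_rss(components_k):
    """Combine independent uncertainty contributions (in K) by
    root-sum-square -- an assumed combination rule for illustration,
    not necessarily the report's actual procedure."""
    return math.sqrt(sum(c * c for c in components_k))

# Hypothetical code, bias and application contributions, chosen only
# so the total lands near the ~200 K scale quoted in the abstract.
total = combine_rss([150.0, 90.0, 95.0])
```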
Development of the advanced CANDU technology -Development of basic technology for HWR design
International Nuclear Information System (INIS)
Seok, Ho Cheon; Seok, Soo Dong; Lee, Sang Yong
1996-07-01
It is believed that it is easier for Korea to become self-reliant in PHWR technology than in PWR technology, mainly because of the lower design pressure and temperature and because of the simplicity, economy and flexibility of the fuel cycle in comparison with PWR systems. Even though one has no doubt about the safety and the economics of the PHWRs that are now being operated or constructed in Korea, it is necessary to develop advanced design technology for even safer and more economical PHWR systems, to overcome the ever-growing international resistance to the sharing of nuclear technology and to meet the ever more stringent requirements for the future public acceptance of nuclear power. This study develops more advanced design technology than the existing one, especially in the fields of reactor physics, safety systems and safety evaluation, to realize the above requirements. 71 tabs., 147 figs., 143 refs. (Author)
Development of the advanced CANDU technology -Development of basic technology for HWR design-
Energy Technology Data Exchange (ETDEWEB)
Suk, Hoh Chun; Lee, Sang Yong; Suk, Soo Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
1995-07-01
It is believed that it is easier for Korea to become self-reliant in PHWR technology than in PWR technology, mainly because of the lower design pressure and temperature and because of the simplicity, economy and flexibility of the fuel cycle in comparison with PWR systems. Even though one has no doubt about the safety and the economics of the PHWRs that are now being operated or constructed in Korea, it is necessary to develop advanced design technology for even safer and more economical PHWR systems, to overcome the ever-growing international resistance to the sharing of nuclear technology and to meet the ever more stringent requirements for the future public acceptance of nuclear power. This study develops more advanced design technology than the existing one, by performing in-depth studies especially in the fields of reactor physics, safety systems and safety evaluation, to realize the above requirements. 90 figs, 50 tabs, 38 refs. (Author).
Development of Evaluation Technology of the Integrity of HWR Pressure Tubes
Energy Technology Data Exchange (ETDEWEB)
Jeong, Y. M.; Kim, Y. S.; Im, K. S.; Kim, K. S.; Ahn, S. B
2007-06-15
Zr-2.5Nb pressure tubes are among the most critical structural components governing the lifetime of heavy water reactors, carrying the fuel bundles and the heavy-water coolant inside. Since they degrade during operation in the reactor through dimensional changes caused by creep and irradiation growth, neutron irradiation, and delayed hydride cracking, it is necessary to evaluate their degradation by conducting material tests and examinations on highly irradiated pressure tubes in hot cells and to keep track of their degradation behavior over operating time; these are the aims of this project.
International Nuclear Information System (INIS)
2013-01-01
Pressure tube deformation is a critical aging issue in operating Heavy Water Reactors (HWRs). As service years accumulate, horizontal pressure tubes exhibit three kinds of deformation: diametral creep, leading to flow bypass and a penalty on the critical heat flux for the fuel rods; longitudinal creep, leading to interference with the feeder pipes and/or the fuelling machine; and sag, leading to interference with in-core components and potential contact between the pressure tube and the calandria tube. The CRP scope includes the establishment of a database for pressure tube deformation, microstructure characterization of pressure tube materials collected from HWRs currently operating in Member States, and the development of a prediction model for pressure tube deformation
Development of the advanced CANDU technology -Development of basic technology for HWR design-
International Nuclear Information System (INIS)
Suk, Hoh Chun; Lee, Sang Yong; Suk, Soo Dong
1995-07-01
It is believed that it is easier for Korea to become self-reliant in PHWR technology than in PWR technology, mainly because of the lower design pressure and temperature and because of the simplicity, economy, flexibility of the fuel cycle in comparison with PWR systems. Even though one has no doubt about the safety and the economics of the PHWR's that are now being operated or constructed in Korea, it is necessary to develop the advanced design technology for even safer and more economical PHWR systems to overcome the ever growing international resistance to sharing of nuclear technology and to meet the even more stringent requirements for the future public acceptance of nuclear power. This study is to develop the more advance design technology compared to the existing one, by performing in-depth studies especially in the field of reactor physics, safety systems and safety evaluation to realize the above requirements. 90 figs, 50 tabs, 38 refs. (Author)
REFLOS, Fuel Loading and Cost from Burnup and Heavy Atomic Mass Flow Calculation in HWR
International Nuclear Information System (INIS)
Boettcher, W.; Schmidt, E.
1969-01-01
1 - Nature of physical problem solved: REFLOS is a programme for the evaluation of fuel-loading schemes in heavy water moderated reactors. The problems involved in this study are: a) Burn-up calculation for the reactor cell. b) Determination of reactivity behaviour, power distribution, attainable burn-up for both the running-in period and the equilibrium of a 3-dimensional heterogeneous reactor model; investigation of radial fuel movement schemes. c) Evaluation of mass flows of heavy atoms through the reactor and fuel cycle costs for the running-in, the equilibrium, and the shut down of a power reactor. If the subroutine for treating the reactor cell were replaced by a suitable routine, other reactors with weakly absorbing moderators could be analyzed. 2 - Method of solution: Nuclear constants and isotopic compositions of the different fuels in the reactor are calculated by the cell-burn-up programme and tabulated as functions of the burn-up rate (MWD/T). Starting from a known state of the reactor, the 3-dimensional heterogeneous reactor programme (applying an extension of the technique of Feinberg and Galanin) calculates reactivity and neutron flux distribution using one thermal and one or two fast neutron groups. After a given irradiation time, the new state of the reactor is determined, and new nuclear constants are assigned to the various defined locations in the reactor. Reloading of fuel may occur if the prescribed life of the reactor is reached or if the effective multiplication factor or the power form factor falls below a specified level. The scheme of reloading to be carried out is specified by a load vector, giving the number of channels to be discharged, the kind of movement from one to another channel and the type of fresh fuel to be charged for each single reloading event. 
After having determined the core states characterizing the equilibrium period, and having decided the fuel reloading scheme for the running-in period of the reactor life, the fuel cycle costs are evaluated following proposals of the EURATOM Economic Handbook. 3 - Restrictions on the complexity of the problem: Maximum number of groups of channels having rotation symmetry is 60. Maximum number of groups of channels having specular symmetry is 120. Maximum number of harmonics for the approximation of the axial flux distribution is 19. Highest order of Bessel functions for the approximation of the radial flux distribution is 12. Maximum number of axial pieces of a channel with possibly different neutronic properties is 20. Maximum number of neutron groups: two fast, one thermal. Maximum number of different types of channels in the reactor is 10. Maximum number of burn-up steps characterizing one type of channel is 50
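The tabulation-and-lookup scheme described in the method of solution, in which nuclear constants are tabulated by the cell burn-up programme as functions of burn-up and then reused in the 3-D heterogeneous reactor calculation, amounts to simple interpolation in burn-up. A minimal sketch, using a hypothetical k-infinity table:

```python
def interp_constant(burnup, table):
    """Linear interpolation of a lattice constant tabulated against
    burnup (MWD/T), mimicking how cell-burnup results are reused in
    the 3-D heterogeneous reactor calculation."""
    pts = sorted(table)
    if burnup <= pts[0]:
        return table[pts[0]]
    for lo, hi in zip(pts, pts[1:]):
        if burnup <= hi:
            f = (burnup - lo) / (hi - lo)
            return table[lo] + f * (table[hi] - table[lo])
    return table[pts[-1]]           # beyond the last tabulated point

# Hypothetical k-infinity vs burnup table for one fuel type.
kinf = {0: 1.32, 4000: 1.18, 8000: 1.05, 12000: 0.96}
```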
International Nuclear Information System (INIS)
2004-08-01
Activities within the frame of the IAEA's Technical Working Group on Advanced Technologies for HWRs (TWG-HWR) are conducted in a project within the IAEA's subprogramme on nuclear power reactor technology development. The objective of the activities on HWRs is to foster, within the frame of the TWG-HWR, information exchange and co-operative research on technology development for current and future HWRs, with an emphasis on safety, economics and fuel resource sustainability. One of the activities recommended by the TWG-HWR was an international standard problem exercise entitled: Intercomparison and validation of computer codes for thermalhydraulics safety analyses. Intercomparison and validation of the computer codes used in different countries for thermalhydraulics safety analyses will enhance confidence in the predictions made by these codes. However, the intercomparison and validation exercise needs a set of reliable experimental data. The RD-14M large loss-of-coolant accident (LOCA) test B9401, simulating HWR LOCA behaviour and conducted by Atomic Energy of Canada Ltd (AECL), was selected for this validation project. This report provides a comparison of the results obtained from six participating countries, utilizing four different computer codes. General conclusions are reached and recommendations made
International Nuclear Information System (INIS)
Shen, W.
2012-01-01
Recent assessment results indicate that the coarse-mesh finite-difference method (FDM) gives consistently smaller percent differences in channel powers than the fine-mesh FDM when compared with the reference MCNP solution for CANDU-type reactors. However, there is an impression that the fine-mesh FDM should, in theory, always give more accurate results than the coarse-mesh FDM. To answer the question of whether the better performance of the coarse-mesh FDM for CANDU-type reactors was just a coincidence (cancellation of errors), or was caused by the use of heavy water or by the use of lattice-homogenized cross sections for the cluster fuel geometry in the diffusion calculation, three benchmark problems were set up with three different fuel lattices: CANDU, HWR and PWR. These benchmark problems were then used to analyze the root cause of the better performance of the coarse-mesh FDM for CANDU-type reactors. The analyses confirm that the better performance of the coarse-mesh FDM for CANDU-type reactors is mainly caused by the use of lattice-homogenized cross sections for the sub-meshes of the cluster fuel geometry in the diffusion calculation. Based on the analyses, it is recommended to use a 2 x 2 coarse-mesh FDM to analyze CANDU-type reactors when lattice-homogenized cross sections are used in the core analysis. (authors)
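The "in theory" expectation contrasted above, that refining a finite-difference mesh reduces discretization error, does hold for a homogeneous problem, as a one-group fixed-source slab sketch shows. All cross sections and dimensions below are illustrative, and the problem deliberately excludes the lattice-homogenization effect the abstract identifies:

```python
import math

def solve_slab(n, width=100.0, d=1.0, sig_a=0.01, src=1.0):
    """Solve the one-group diffusion equation -D u'' + Sig_a u = S on
    (0, width) with u = 0 at both ends, on an n-cell finite-difference
    mesh (Thomas algorithm).  Returns the midpoint flux (n even)."""
    h = width / n
    m = n - 1                       # interior unknowns u_1 .. u_{n-1}
    sub = [-d / h ** 2] * m
    diag = [2 * d / h ** 2 + sig_a] * m
    sup = [-d / h ** 2] * m
    rhs = [src] * m
    for i in range(1, m):           # forward elimination
        w = sub[i] / diag[i - 1]
        diag[i] -= w * sup[i - 1]
        rhs[i] -= w * rhs[i - 1]
    u = [0.0] * m                   # back substitution
    u[-1] = rhs[-1] / diag[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (rhs[i] - sup[i] * u[i + 1]) / diag[i]
    return u[m // 2]

def analytic_mid(width=100.0, d=1.0, sig_a=0.01, src=1.0):
    """Exact midpoint flux for the same slab problem."""
    ell = math.sqrt(d / sig_a)      # diffusion length
    return src / sig_a * (1 - 1 / math.cosh(width / (2 * ell)))

coarse = solve_slab(10)    # 10-cell mesh
fine = solve_slab(160)     # 160-cell mesh
exact = analytic_mid()
```

The abstract's point is that this ordering can reverse once lattice-homogenized cross sections are mapped onto sub-meshes of a cluster geometry, which this homogeneous sketch intentionally leaves out.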
BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test
International Nuclear Information System (INIS)
Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.
2007-02-01
This report summarises the contributions of the ten participants to phase 3 of BEMUSE: uncertainty and sensitivity analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined: 1. List and uncertainties of the uncertain input parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods, assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for 6 output parameters, which are of two kinds: 1. Scalar output parameters (first Peak Cladding Temperature (PCT), second Peak Cladding Temperature, time of accumulator injection, time of complete quenching); 2. Time-trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all participants use a probabilistic method associated with the use of Wilks' formula, except for UNIPI with its CIAU method (Code with the Capability of Internal Assessment of Uncertainty). Use of both methods has been successfully mastered. - Compared with the experiment, the results of the uncertainty analysis are good on the whole. For example, for the cladding-temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all participants using the probabilistic method. All the influence measures used take into account the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been compiled, and participants will be able to use them in the continuation of the BEMUSE programme
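Wilks' formula, which the probabilistic participants used, fixes the number of code runs needed for a given percentile and confidence level. A minimal sketch via the binomial tail, reproducing the familiar 59-run result for a one-sided first-order 95%/95% statement:

```python
import math

def wilks_n(beta=0.95, gamma=0.95, order=1):
    """Smallest number of code runs n such that the `order`-th largest
    result is a one-sided upper tolerance limit for the beta-quantile
    at confidence gamma (Wilks' formula, via the binomial tail)."""
    n = order
    while True:
        # P(fewer than `order` of the n runs exceed the beta-quantile)
        shortfall = sum(math.comb(n, k) * (1 - beta) ** k * beta ** (n - k)
                        for k in range(order))
        if 1 - shortfall >= gamma:
            return n
        n += 1

# Classic results: 59 runs for the first-order 95%/95% statement,
# 93 runs for the second-order statement.
n1 = wilks_n()
n2 = wilks_n(order=2)
```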
Reactor Core Failure Analysis for Feasibility Study of IVR-ERVC Strategy
International Nuclear Information System (INIS)
Lim, Kukhee; Cho, Yongjin; Hwang, Taesuk
2014-01-01
The complicated physical phenomena in a reactor vessel under severe accident conditions must be evaluated together with effective cooling methods in order to satisfy the thermal failure margin of the strategy. Reactor integrity under this margin criterion is guaranteed if the heat flux obtained from the analysis of the heat balance equations between heat structures in the reactor vessel does not exceed the critical heat flux (CHF) limit for nucleate boiling on the vessel outer surface. This method assumes a layer configuration of the molten pools (number of layers, thickness of layers, heat generation rate of the molten corium, etc.) and evaluates representative states by a 1-dimensional heat transfer analysis. The boundary conditions of the model should be well defined to increase the accuracy of the assessed heat flux, and they depend on the accident scenario. Therefore, conservative assumptions, or results from analyses using system codes for accident analysis, should be considered when determining the boundary conditions. In this paper, the in-vessel molten corium behavior during a LBLOCA, which is considered the most conservative accident scenario for IVR-ERVC design concerns, is examined using MELCOR 1.8.6 to check the feasibility of the APR1400 IVR-ERVC strategy. The relocated debris mass in the lower head of the APR1400 reactor is analyzed using MELCOR 1.8.6. This analysis determines the boundary conditions of the heat balance equations of the lumped parameter method used to calculate the heat flux at the external vessel wall surface. As a result, lower head vessel failure occurred at about 7e3, which is very short compared with the total period of the ERVC process. Even when the effect of external vessel cooling is well modeled, however, the differences in debris mass are relatively small. Therefore, the physical feasibility of the creep rupture model in the MELCOR COR package should be verified for an adequate debris mass assessment in a reactor lower head under the
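The margin criterion described above (wall heat flux below the nucleate-boiling CHF limit) can be sketched as a lumped check. The decay power, vessel radius and CHF value below are illustrative placeholders, not APR1400 design data, and a real evaluation resolves the angular heat flux distribution around the lower head rather than a uniform average:

```python
import math

def wall_heat_flux(decay_power_w, vessel_radius_m):
    """Average outer-wall heat flux (W/m^2) if the molten pool heat
    leaves uniformly through a hemispherical lower head."""
    area = 2 * math.pi * vessel_radius_m ** 2
    return decay_power_w / area

def ivr_margin(decay_power_w, vessel_radius_m, chf_w_m2):
    """Ratio of wall heat flux to the CHF limit; < 1 means the
    nucleate-boiling criterion of the IVR-ERVC strategy is met."""
    return wall_heat_flux(decay_power_w, vessel_radius_m) / chf_w_m2

# Illustrative values only (not APR1400 data): 25 MW decay power,
# 2.5 m lower-head radius, 1.5 MW/m^2 CHF limit.
margin = ivr_margin(25e6, 2.5, 1.5e6)
```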
Energy Technology Data Exchange (ETDEWEB)
Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)
1998-06-01
A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. MARS is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. The developmental requirements for MARS were chosen not only to best utilize the existing capabilities of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To retain the existing capabilities of the codes and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves the code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other
CFX-10 Analysis of the High Temperature Thermal- Chemical Experiment (CS28-2)
International Nuclear Information System (INIS)
Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook
2008-02-01
A Computational Fluid Dynamics (CFD) model for post-blowdown fuel channel analysis of aged CANDU reactors with a crept pressure tube has been developed and validated against a high temperature thermal-chemical experiment, CS28-2. The CS28-2 experiment is one of three series of experiments simulating the thermal-chemical behavior of a 28-element fuel channel at a high temperature and a low steam flow rate, conditions which may occur in severe accidents such as a LBLOCA (Large Break Loss of Coolant Accident) of CANDU reactors. Pursuant to this objective, the current study has focused on understanding the phenomena involved, such as the thermal radiation and convection heat transfer and the high temperature zirconium-steam reaction in a multi-ring geometry. Therefore, a zirconium-steam oxidation model based on a parabolic rate law was implemented into the CFX-10 code, a commercial CFD code offered by ANSYS Inc., and the other heat transfer mechanisms in the 28-element fuel channel were modeled by the original CFX-10 heat transfer packages. To assess the capability of the CFX-10 code to model the thermal-chemical behavior of the 28-element fuel channel, the measured temperatures of the Fuel Element Simulators (FES) of the three fuel rings in the test bundle and of the pressure tube, and the hydrogen production in the CS28-2 experiment, were compared with the CFX-10 predictions. In spite of some discrepancy between the measured data and the CFX results, it was found that the CFX-10 predictions based on the Urbanic-Heidrick correlation for the zirconium-steam reaction, together with the Discrete Transfer Model for radiation heat transfer among the FES of the three rings and the pressure tube, are quite accurate and sound even for the offset cluster fuel bundle of an aged fuel channel
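A parabolic rate law of the kind implemented for the zirconium-steam reaction has the generic form below. The Arrhenius coefficients here are illustrative placeholders, not the Urbanic-Heidrick values used in the study:

```python
import math

def oxide_thickness(t_s, temp_k, a=1.0e-6, q_over_r=16800.0):
    """Oxide growth under a parabolic rate law, x^2 = 2 * kp * t, with
    an Arrhenius rate constant kp = A * exp(-Q/(R*T)).  A (m^2/s) and
    Q/R (K) are illustrative placeholders, not the Urbanic-Heidrick
    coefficients."""
    kp = a * math.exp(-q_over_r / temp_k)
    return math.sqrt(2.0 * kp * t_s)

# Parabolic kinetics: quadrupling the exposure time doubles the
# thickness, and the reaction accelerates strongly with temperature.
ratio = oxide_thickness(400.0, 1500.0) / oxide_thickness(100.0, 1500.0)
```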
International Nuclear Information System (INIS)
Guelfi, A.; Boucker, M.; Mimouni, S.; Bestion, D.; Boudier, P.
2005-01-01
The NEPTUNE project aims at building a new two-phase flow thermal-hydraulics platform for nuclear reactor simulation. EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique), with the co-sponsorship of IRSN (Institut de Radioprotection et Surete Nucleaire) and FRAMATOME-ANP, are jointly developing the NEPTUNE multi-scale platform, which includes new physical models and numerical methods for each of the computing scales. One usually distinguishes three different scales for industrial simulations: the 'system' scale, the 'component' scale (subchannel analysis) and CFD (Computational Fluid Dynamics). In addition, DNS (Direct Numerical Simulation) can provide information at a smaller scale that can be useful for the development of the averaged scales. The NEPTUNE project also includes work on software architecture and research on new numerical methods for coupling codes, since both are required to improve industrial calculations. All these R and D challenges have been defined in order to meet industrial needs and the underlying stakes (mainly the competitiveness and the safety of Nuclear Power Plants). This paper focuses on three high priority needs: DNB (Departure from Nucleate Boiling) prediction, directly linked to fuel performance; PTS (Pressurized Thermal Shock), a key issue when studying the lifespan of critical components; and LBLOCA (Large Break Loss of Coolant Accident), a reference accident for safety studies. For each of these industrial applications, we provide a review of the latest developments within the NEPTUNE platform and we present the first results. Particular attention is also given to physical validation and to the need for further experimental data. (authors)
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang
2014-01-01
three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...
1st RCM of IAEA CRP on Prediction of Axial and Radial Creep in HWR Pressure Tubes
International Nuclear Information System (INIS)
Choi, Jong-Ho
2013-01-01
Expected outcome: • Improved understanding of the pressure tube creep mechanism by studying the effect of intrinsic (material response) as well as extrinsic (operating conditions) parameters. • Improvement of material characterization technology: many laboratories participating in this CRP will conduct microstructure characterization for the first time. • Recommendations for manufacturing to achieve optimal PT performance: the database will enable the identification of the best pressure tube performance by comparison of data. • Improvement in aging management procedures (channel selection for PT deformation management, etc.). • Safety enhancement for operating HWRs by reducing the uncertainty in the prediction of PT deformation
AP600 large-break loss-of-collant-accident developmental assessment plan for TRAC-PF1/MOD2
International Nuclear Information System (INIS)
Knight, T.D.
1996-07-01
The Westinghouse AP600 reactor is an advanced pressurized water reactor with passive safety systems to protect the plant against possible accidents and transients. The design has been submitted to the U.S. NRC for design certification. The NRC has selected the Transient Reactor Analysis Code (TRAC)-PF1/MOD2 for performing large-break loss-of-coolant-accident (LBLOCA) analysis to support the certification effort. This document defines the tests to be used in the current phase of developmental assessment related to the AP600 LBLOCA.
Energy Technology Data Exchange (ETDEWEB)
Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania); Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain); Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)
2011-07-01
The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks' methods, are as follows: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the
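Step 7 above hinges on how many code runs are required. For a one-sided tolerance limit, Wilks' formula gives the smallest N such that the order-m largest output of N random runs bounds the γ-quantile with confidence β. A minimal sketch (the helper name is ours):

```python
from math import comb

def wilks_runs(beta=0.95, gamma=0.95, order=1):
    """Smallest number of code runs N such that the `order`-th largest
    output bounds the gamma-quantile with confidence beta (one-sided
    Wilks tolerance limit)."""
    n = order
    while True:
        n += 1
        # 1 - P(fewer than `order` outputs exceed the gamma-quantile)
        conf = 1.0 - sum(comb(n, k) * gamma**k * (1.0 - gamma)**(n - k)
                         for k in range(n - order + 1, n + 1))
        if conf >= beta:
            return n
```

For the classic 95th-percentile / 95%-confidence case this yields 59 runs at first order and 93 runs at second order, the figures commonly quoted in such BEPU studies.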
International Nuclear Information System (INIS)
Lee, Sunghan; Kim, Jinhyuck; Suh, Nam Duk; Cho, Songwon
2014-01-01
The containment filtered venting system (CFVS) is designed so that its isolation valves open when the containment pressure exceeds the design pressure (225 kPa(a)) and close when the containment pressure decreases below 151 kPa(a). The aim of this study is to analyze the depressurization performance of Wolsong unit 1 through the CFVS during an SBO. The thermal-hydraulic behavior in the containment of Wolsong unit 1 was evaluated using the MELCOR 1.8.6 code developed at Sandia National Laboratories (SNL) for the U.S. Nuclear Regulatory Commission (NRC). In addition, to evaluate the effect of the CFVS venting area, a sensitivity study on different venting areas was conducted. An analysis of the filtering and scrubbing of radioactive material by the CFVS is important but is not treated in this paper. The SBO accident was chosen to analyze the thermal-hydraulic behavior of Wolsong unit 1. During the SBO, the effect of the CFVS on the depressurization of the containment was analyzed using the MELCOR 1.8.6 code, and a sensitivity study was carried out to evaluate the depressurization performance as a function of the CFVS venting area. The results show that the containment pressure is considerably decreased and the integrity of the containment can be maintained when the CFVS operates. Therefore, the CFVS has the capacity to keep the containment pressure below the design pressure during an SBO. There are, however, large differences in the containment pressure depending on the venting area: the rate of pressure decrease in the containment and the water level in the CFVS both depend on it. In the future, proper CFVS sizing criteria for accident scenarios such as LBLOCA, SBLOCA, SGTR, etc. should be evaluated in order to review the licensing of the CFVS. Finally, analyses of aerosols, fission product, and radioactive material
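The open/close setpoints quoted above (open at 225 kPa(a), close below 151 kPa(a)) form a simple hysteresis band. A toy sketch of that valve logic (the function name and the pure open/closed abstraction are ours, not part of the MELCOR model):

```python
def cfvs_state(pressure, is_open, open_sp=225.0, close_sp=151.0):
    """Hysteresis logic for the CFVS isolation valves: open above the
    design-pressure setpoint, close below the lower setpoint, and hold
    the current state in between (setpoints in kPa(a), from the text)."""
    if pressure >= open_sp:
        return True
    if pressure <= close_sp:
        return False
    return is_open
```

The dead band between the two setpoints prevents the valves from chattering as the containment pressure hovers near a single threshold.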
SPACE Code Assessment for FLECHT Test
Energy Technology Data Exchange (ETDEWEB)
Ahn, Hyoung Kyoun; Min, Ji Hong; Park, Chan Eok; Park, Seok Jeong; Kim, Shin Whan [KEPCO E and C, Daejeon (Korea, Republic of)
2015-10-15
According to 10 CFR 50 Appendix K, the Emergency Core Cooling System (ECCS) performance evaluation model for a LBLOCA should be based on the data of the FLECHT tests, and the heat transfer coefficient (HTC) and Carryout Rate Fraction (CRF) during the reflood period of a LBLOCA should be conservative. To develop a Mass and Energy Release (MER) methodology using the Safety and Performance Analysis CodE (SPACE), the FLECHT test results were compared with the results calculated by SPACE. The FLECHT test facility was modeled to compare the reflood HTC and CRF using SPACE, and a sensitivity analysis was performed with various options for the HTC correlation. Based on these results, it is concluded that the reflood HTC and CRF calculated with the COBRA-TF correlation during a LBLOCA meet the requirements of 10 CFR 50 Appendix K. In this study, the analysis results using SPACE predict the heat transfer phenomena of the FLECHT tests reasonably and conservatively. The reflood HTCs for test numbers 0690, 3541 and 4225 are conservative in the reference case; for test 6948, the HTC calculated using COBRA-TF is conservative in the film boiling region. All of the analysis results for the CRF have sufficient conservatism. Based on these results, it is possible to apply the COBRA-TF correlation in developing a MER methodology to analyze LBLOCA using SPACE.
International Nuclear Information System (INIS)
Rhee, Bo Wook; Kim, Hyoung Tae; Park, Joo Hwan
2008-01-01
To form a licensing basis for the new methodology of the fuel channel safety analysis code system for CANDU-6, a CATHENA model for post-blowdown fuel channel analysis of a Large Break LOCA has been developed and tested against the steady state of a high-temperature thermal-chemical experiment, CS28-1. Since the major concerns of the post-blowdown fuel channel analysis of the current CANDU-6 design are how much of the decay heat can be discharged to the moderator via radiation and convective heat transfer under the expected accident conditions, and how much zirconium sheath would be oxidized to generate H2 at how high a fuel temperature, this study has focused on understanding these phenomena and their interrelations, and on maintaining good accuracy in the prediction of the fuel and pressure tube temperatures without losing the important physics of the phenomena involved throughout the post-blowdown phase of a LBLOCA. For a better prediction, the factors that significantly contribute to the prediction accuracy for the steady state of the test bundles were sought. The results show that once the pressure tube temperature is predicted correctly by the CATHENA heat transfer model between the pressure tube and the calandria tube, through a gap thermal resistance adjustment, all the remaining inner-ring, middle-ring and outer-ring FES temperatures can be predicted quite satisfactorily, to within an accuracy range of 20-25 deg. C, which is comparable to the reported accuracy of the temperature measurement, ±2%. The analysis also shows that the choice of the emissivities of the solid structures (typically 0.80, 0.34 and 0.34 for the FES, PT and CT) and the thermal resistance across the CO2 annulus are factors that significantly affect the steady-state prediction accuracy. A question on the legitimacy of using the 'transparent' assumption for the CO2 gas annulus for the radiation heat transfer between the pressure tube and the calandria tube in CATHENA
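The sensitivity to the emissivities quoted above (0.80 for the FES, 0.34 for the PT and CT) can be illustrated with the textbook gray-body exchange formula for long concentric cylinders, q'' = σ(T1⁴ − T2⁴) / (1/ε1 + (A1/A2)(1/ε2 − 1)). The formula itself is standard; its use here as a stand-in for the CATHENA radiation model is our simplification.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_flux(t1, t2, eps1, eps2, area_ratio=1.0):
    """Net radiant flux (W/m^2, based on the inner surface) between two
    long concentric gray cylinders, e.g. pressure tube -> calandria tube.
    area_ratio is A1/A2, the inner-to-outer surface area ratio."""
    return SIGMA * (t1**4 - t2**4) / (1.0 / eps1 + area_ratio * (1.0 / eps2 - 1.0))
```

Lowering either emissivity lowers the exchanged flux, which is why the steady-state temperature predictions are sensitive to the emissivity values chosen for the PT and CT surfaces.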
International Nuclear Information System (INIS)
Velkov, K.
2002-01-01
Appendix 1: The Safety Analysis Report (S.A.R.) is presented in three handbooks: the ECC Handbook (LOCA), the Plant Dynamics Handbook (transients incl. ATWS), and the Core Design Handbook. The first, conceived as a living handbook, serves as a basis for design, a catalogue of transients, specifications and licensing; it covers LOCA in the primary system, core damage analysis, descriptions of the codes, and descriptions of the essential plant data and code input data. The second covers the basis for design, commissioning and operation; a catalogue of transients, specifications and licensing; and specified operation, disturbed operation, incidents, non-LOCA events, SS procedures and code descriptions. The third covers the reactivity balance and reactivity coefficients, the efficiency of the shutdown systems, calculation of the burnup cycle, power density distribution and critical boron concentration, as well as the codes used, i.e. the SAV79A standard analysis methodology including FASER for nuclear data generation and MEDIUM and PANBOX for static and transient core calculations. Appendix 2: The three TUEV (Technical Inspection Agencies) responsible for the three individual plants of the KONVOI type - TUEV Bayern for ISAR-2, TUEV Hannover for KKE, TUEV Stuttgart for GKN-2 - and GRS performed the safety assessment. TUEV Bayern covered disturbance and failure of the secondary heat sink without loss of coolant (failure of the main heat sink, erroneous operation of valves in the MS and FW systems, failure of the MFW supply), long-term LONOP, and performance of selected SBLOCA analyses; TUEV Hannover covered disturbances due to failure of the MCPs, short-term LONOP, damage of SG tubes incl. SGTR, and performance of selected LOCA analyses (blowdown phase of a LBLOCA); TUEV Stuttgart covered breaks and leaks in the MS and FW systems with and without leaks in SG tubes; and GRS covered ATWS, sub-cooling transients due to disturbances on the secondary side, initial and boundary conditions for transients with opening of the pressurizer valves with and without sticking open, and most of the
Regression analysis of nuclear plant capacity factors
International Nuclear Information System (INIS)
Stocks, K.J.; Faulkner, J.I.
1980-07-01
Operating data on all commercial nuclear power plants of the PWR, HWR, BWR and GCR types in the Western World are analysed statistically to determine whether the explanatory variables size, year of operation, vintage and reactor supplier are significant in accounting for the variation in capacity factor. The results are compared with a number of previous studies which analysed only United States reactors. The possibility of specification errors affecting the results is also examined. Although, in general, the variables considered are statistically significant, they explain only a small portion of the variation in the capacity factor. The equations thus obtained should certainly not be used to predict the lifetime performance of future large reactors
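The kind of fit used in such a study can be illustrated with a minimal ordinary-least-squares regression of capacity factor against a single explanatory variable (a toy univariate sketch; the paper's analysis is multivariate, with size, vintage and supplier as regressors):

```python
def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x (single regressor).
    Returns the intercept a and slope b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b
```

The study's finding that the regressors "explain only a small portion of the variation" corresponds to a low R², i.e. large residual scatter around the fitted line.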
Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis
International Nuclear Information System (INIS)
Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali
2006-01-01
This paper describes a methodology for characterizing important phenomena, which is part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking process based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytical Hierarchical Process (AHP) is used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena; this part uses subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large-break loss-of-coolant accident (LBLOCA) for the LOFT integral facility with the highest core power (test LB-1). (authors)
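The AHP step mentioned above turns pairwise expert comparisons into ranking weights. A minimal sketch using the geometric-mean approximation to the principal eigenvector (the 3x3 matrix is an invented example on Saaty's scale, not the paper's PIRT data):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority vector: normalized geometric means of
    the rows of a pairwise-comparison matrix."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Invented example: phenomenon A judged twice as important as B,
# four times as important as C (a perfectly consistent matrix).
pairwise = [[1.0,  2.0, 4.0],
            [0.5,  1.0, 2.0],
            [0.25, 0.5, 1.0]]
```

For a perfectly consistent matrix like this one, the geometric-mean weights coincide with the exact eigenvector solution, here 4/7, 2/7 and 1/7.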
Improvement of severe accident analysis method for KSNP
Energy Technology Data Exchange (ETDEWEB)
Park, Jae Hong [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Cho, Song Won; Cho, Youn Soo [Korea Radiation Technology Institute Co., Taejon (Korea, Republic of)
2002-03-15
The objective of this study is the preparation of a MELCOR 1.8.5 input deck for the KSNP and the simulation of some major severe accidents. The contents of this project are: preparation of a MELCOR 1.8.5 base input deck for the KSNP to understand severe accident phenomena and to assess severe accident strategies; preparation of a 20-cell containment input deck to simulate the distribution of hydrogen and fission products in the containment; and simulation of some major severe accident scenarios such as TLOFW, SBO, SBLOCA, MBLOCA, and LBLOCA. The method used for the MELCOR 1.8.5 input deck preparation can be used to prepare input decks for domestic PWRs and to simulate severe accident experiments such as ISP-46. Information gained from the analyses of severe accidents may be helpful in setting up severe accident management strategies and in developing regulatory guidance.
International Nuclear Information System (INIS)
Park, Ju Hwan; Kang, Hyung Seok; Rhee, Bo Wook
2006-05-01
The establishment of a safety analysis system and technology for CANDU reactors has been performed at KAERI. As part of this research, a single CANDU fuel bundle has been simulated with CATHENA for the post-blowdown event, to consider the complicated geometry and heat transfer in the fuel channel. In the previous LBLOCA analysis methodology adopted for Wolsong 2, 3, 4 licensing, the fuel channel blowdown phase was analyzed by the CANDU system analysis code CATHENA and the post-blowdown phase of the fuel channel was analyzed by the CHAN-IIA code. The use of one computer code in the consecutive analyses appears desirable for consistency and simplicity in the safety analysis process; however, the high-temperature post-blowdown fuel channel model in CATHENA must be validated before being used in accident analysis. Experiments for the 37-element fuel bundle that fuels CANDU-6 have not been performed. The benchmark problems for the 37-element fuel bundle using a CFD code will therefore be compared with the test results of the 28-element fuel bundle in the CS28-1 experiment. A full grid model from the FES to the calandria tube, simulating the test section, was generated; the grid model contains 4,324,340 cells. The boundary and heat source conditions and the property data in the CFD analysis were set according to the test results and reference data. The thermal-hydraulic phenomena in the fuel channel were simulated as a compressible, highly turbulent flow with convection/conduction/radiation heat transfer. The natural convection flow of CO2 due to the large temperature difference in the gap between the pressure and calandria tubes was treated by Boussinesq's buoyancy model. The CFD results showed good agreement with the test results as a whole. The inner/middle/outer FES temperature distributions of the CFD results were overestimated by a small amount, about 30 deg. C, at the entrance region, but showed good agreement at the outlet region. The
Updated TRAC analysis of an 80% double-ended cold-leg break for the AP600 design
International Nuclear Information System (INIS)
Lime, J.F.; Boyack, B.E.
1995-01-01
An updated TRAC 80% large-break loss-of-coolant accident (LBLOCA) has been calculated for the Westinghouse AP600 advanced reactor design. The updated calculation incorporates major code error corrections, model corrections, and plant design changes. The 80% break size was calculated by Westinghouse to be the most severe large-break size for the AP600 design. The LBLOCA transient was calculated to 144 s. Peak cladding temperatures (PCTs) were well below the Appendix K limit of 1,478 K (2,200 F), but very near the cladding oxidation temperature of 1,200 K (1,700 F). Transient event times and PCTs for the TRAC calculation were in reasonable agreement with those calculated by Westinghouse using their WCOBRA/TRAC code. However, there were significant differences in the detailed phenomena calculated by the two codes, particularly during the blowdown phase. The reasons for these differences are still being investigated. Additional break sizes and break locations need to be analyzed to confirm the most severe break postulated by Westinghouse.
DEFF Research Database (Denmark)
Zhang, Xinxin; Thunem, Harald P - J; Lind, Morten
2014-01-01
and linking the MFM model to its process components. The purpose of this report is to make a comprehensive demonstration of how to use the MFM Suite to develop MFM models and run causal reasoning for abnormal situations. This report will explain the capability of representing process and operational knowledge...... by using the MFM methodology, and demonstrate how the model combined with the MFM reasoning can be used to evaluate the plant state, identify the current situation and support operational decisions. The report will provide a detailed explanation of MFM concepts by modelling the primary side system...... systems. Two of the modelling examples can be found in HWR-990 and HWR-1059. The inherent causal reasoning capability enabled the developed MFM models to be used for diagnostic and prognostic analysis. These MFM models have been used to develop the basis for implementing operator support tools...
International Nuclear Information System (INIS)
Chung, Bub Dong; Jeong, Jae Jun; Lee, Won Jae
1998-01-01
The two independent codes MARS 1.3 and CONTEMPT4/MOD5/PCCS have been coupled using the dynamic-link-library (DLL) technique. The overall configuration of the code system is designed so that MARS is the main driver program, which uses CONTEMPT as a set of associated routines. Using the Digital Visual Fortran compiler, a DLL was generated from the CONTEMPT source code with the interfacing routine names and arguments. The coupling of MARS with CONTEMPT was realized by calling the DLL routines at the appropriate steps in the MARS code. Verification of the coupling was carried out for a LBLOCA transient of a typical plant design. It was found that the DLL technique is much more convenient than UNIX process control techniques and is effective on the Windows operating system. Since a DLL can be used by more than one application and an application program can use many DLLs simultaneously, this technique would enable existing codes to be used more broadly by linking them with others.
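The driver/associated-routine arrangement described above can be caricatured as an explicit coupling loop, with a stub standing in for the CONTEMPT routine (all names and the toy physics are ours; the real codes exchange far more variables through the DLL interface):

```python
def contempt_step(p_containment, m_dot, dt):
    """Stand-in for a CONTEMPT DLL routine: update the containment
    pressure (kPa) from the break mass flow delivered by the driver."""
    # toy model: pressure rises in proportion to the injected mass
    return p_containment + 0.5 * m_dot * dt

def mars_driver(steps, dt=0.1):
    """Stand-in for the MARS main driver: advance the system-code
    calculation and call the 'DLL' routine once per time step."""
    p_cont, t = 101.3, 0.0              # kPa, s
    for _ in range(steps):
        m_dot = 50.0 * 2.718 ** (-t)    # decaying break flow (kg/s)
        p_cont = contempt_step(p_cont, m_dot, dt)
        t += dt
    return p_cont
```

The essential design point is the same as in the paper: one code owns the time loop and treats the other as a callable library, so the two solutions stay synchronized step by step without UNIX-style interprocess communication.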
International Nuclear Information System (INIS)
Lime, J.F.; Boyack, B.E.
1996-01-01
An updated TRAC 80% pump-side, cold-leg, large-break (LB) loss-of-coolant accident (LOCA) has been calculated for the Westinghouse AP600 advanced reactor design. The updated calculation incorporates major code error corrections, model corrections, and plant design changes. The break size and location were calculated by Westinghouse to be the most severe LBLOCA for the AP600 design. The LBLOCA transient was calculated to 280 s, which is the time of in-containment refueling water-storage-tank injection. All fuel rods were quenched completely by 240 s. Peak cladding temperatures (PCTs) were well below the licensing limit of 1,478 K (2,200 F) but were very near the cladding oxidation temperature of 1,200 K (1,700 F). Transient event times and PCTs for the TRAC calculation were in reasonable agreement with those calculated by Westinghouse using their WCOBRA/TRAC code. However, there were significant differences in the detailed phenomena calculated by the two codes, particularly during the blowdown and refill periods. The reasons for these differences are still being investigated
International Nuclear Information System (INIS)
2003-01-01
-informing technical requirement. In the third paper, the European reactor vendor gave its view on the LB-LOCA definition for new reactors (EPR). The proposed concept takes into account the LB-LOCA (of the main coolant line) for designing the ECCS and the containment, but not for the mechanical design of the main coolant lines themselves. An important prerequisite for LBB and break exclusion in the EPR is also reliable monitoring and inspection. 2. Does an adequate technical basis exist to support a redefinition of the LB-LOCA? None of the participants suggested that the probability of a LB-LOCA could be so high that it represents a significant contribution to the overall risk. There was general confidence that the probability of a fast-occurring large leak from the main reactor coolant circuit can be made insignificant with the right corrective measures. This is true at least in the new plants, where lessons learned during the last 30 years have been implemented. The session had a well-coordinated set of four complementary presentations aimed at assessing the technical basis to support a redefinition of the LB-LOCA, through the potential development of a spectrum of break sizes, their expected frequencies and the corresponding consequences. With that aim, the four papers presented, in a sequential manner: the critical issues and technical approaches to the subject from the risk-requirements point of view, the known and potential aging mechanisms in primary pipes, the technical and administrative developments to prevent pressure-boundary fractures through in-service inspections, and the new developments to detect such fractures through advanced leak detection technologies. The NRC presentation identified issues related to materials engineering, risk considerations, and plant response analysis, and discussed NRC's ongoing technical approaches to address these issues and develop a technical basis for the risk-informed revision of the rule.
The EDF presentation was very insightful as it reflected
Advanced CANDU Design With Negative Power Feedback
International Nuclear Information System (INIS)
Andang-Widi-Harto; Muslim
2004-01-01
The problem of positive power feedback in the recent PHWR-CANDU design, especially that related to a coolant void increase, can be overcome by the use of a dual moderator concept, in which two moderator systems are used: a main moderator outside the calandria tube and an annular moderator inside the annular space. The annular moderator is allowed to boil in the case of overheating. Numerical calculations have been performed for two core designs, HWR-DM-ST and HWR-DM-XI, which can reach burnups of 16,000 and 17,500 MWd/ton U respectively. For the two designs, the values of k at the fully filled annular moderator condition are 1.0054 (HWR-DM-ST) and 1.0019 (HWR-DM-XI), while at the completely empty annular moderator condition they are 0.9634 (HWR-DM-ST) and 0.9143 (HWR-DM-XI). Decreasing the coolant flow rate from 3,043 kg/s to 853 kg/s decreases the k values by 0.0109 (HWR-DM-ST) and 0.0232 (HWR-DM-XI), while increasing the inlet coolant enthalpy from 2,950 kJ/kg to 3,175 kJ/kg decreases the k values by 0.0074 (HWR-DM-ST) and 0.0239 (HWR-DM-XI). Thus, it can be concluded that the HWR-DM design has negative power reactivity feedback. (author)
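The k values quoted above translate directly into reactivity via the standard definition ρ = (k − 1)/k. A one-line helper, expressed in milli-k (mk), shows for instance that the HWR-DM-ST full-to-empty annular moderator swing is strongly negative:

```python
def reactivity_mk(k):
    """Reactivity rho = (k - 1) / k, expressed in milli-k (mk)."""
    return 1000.0 * (k - 1.0) / k

# Annular-moderator void swing for HWR-DM-ST, using the k values
# from the abstract: full (k = 1.0054) -> empty (k = 0.9634)
swing = reactivity_mk(0.9634) - reactivity_mk(1.0054)
```

A large negative swing of this kind is exactly the negative void feedback the dual moderator concept is designed to provide.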
Decision analysis multicriteria analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally enough and the use of a decision-aiding technique is therefore not necessary. For some decisions, the comparison of the available protection options may be performed on two or a few criteria (or attributes) (protection cost, collective dose, ...), and the use of rather simple decision-aiding techniques, like Cost Effectiveness Analysis or Cost Benefit Analysis, is quite sufficient. For the more complex decisions, involving numerous criteria, or for decisions involving large uncertainties or qualitative judgement, the use of these techniques, even extended cost benefit analysis, is not recommended, and appropriate techniques like multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them; therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis
Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results
International Nuclear Information System (INIS)
Baccou, J.; Chojnacki, E.; Destercke, S.
2013-01-01
mathematical framework, the more time consuming the propagation should be. Therefore, the key point here is to construct a numerical treatment for uncertainty propagation which reduces the computational cost and can be applied to the complex models used in practice. In nuclear safety studies, different uncertainty analyses using different codes and involving different experts are generally performed. Deriving benefits from these analyses is a problem of information synthesis, which is the third key issue. Indeed, each uncertainty study can be viewed as an information source on the quantities of interest. It is then useful to define formal methods to combine all these information sources in order to improve the reliability of the results and to detect possible conflicts (if any) between the sources. The efficiency of an uncertainty analysis requires a reliable quantification of the information associated with the uncertainty sources. This quantification is addressed in the fourth key issue. It consists of exploiting the information related to the available experiments and to the code/experiment comparison to infer the uncertainty attached to the code input parameters. The crucial points therefore lie in the choice of an experimental database sufficiently representative and exhaustive of the considered phenomenon, and in the construction of an efficient treatment to perform this inference. The first two points have been studied in depth in the framework of the OECD BEMUSE Program. In particular, it was found that statistical approaches, based on Monte-Carlo techniques, are now sufficiently robust for the evaluation of uncertainty on a LB-LOCA transient. In this paper, we focus on the third issue and present some recent developments proposed by IRSN to derive formal tools to improve the reliability of an analysis involving different information sources. It is applied to draw some important conclusions from the two BEMUSE benchmarks. 
For sake of completeness, we recall that the last
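The statistical (Monte-Carlo) approach referred to above propagates input uncertainty by random sampling. A minimal generic sketch (all names are ours; a real BEMUSE-style study would wrap a thermal-hydraulic code, not a toy function):

```python
import random

def propagate(model, samplers, n_runs, percentile=0.95, seed=42):
    """Monte-Carlo propagation of input uncertainty: sample each
    uncertain input from its distribution, run the model, and return
    the empirical percentile of the output population."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_runs):
        inputs = {name: draw(rng) for name, draw in samplers.items()}
        outputs.append(model(**inputs))
    outputs.sort()
    index = min(int(percentile * n_runs), n_runs - 1)
    return outputs[index]

# Toy example: two uncertain inputs feeding a toy peak-temperature model
samplers = {
    "power": lambda rng: rng.uniform(0.98, 1.02),
    "gap":   lambda rng: rng.gauss(1.0, 0.05),
}
peak = propagate(lambda power, gap: 800.0 * power * gap, samplers, 200)
```

Each such study produces one uncertainty band; the information-synthesis methods discussed in the paper address how several bands, from different codes and experts, can be combined and checked for conflict.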
International Nuclear Information System (INIS)
2008-05-01
This book introduces the energy and resource technology development program together with a performance analysis, covering the program's divisions and definition, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the results of the performance analysis by index, the results of the performance investigation, and the analysis and appraisal of the energy and resource technology development program in 2007.
Production of concentrates for supplying a national nuclear electric program
International Nuclear Information System (INIS)
Delgado, C.M.
An analysis was made of the yearly requirements of natural U to satisfy a nuclear power program in Mexico. On the basis of these requirements, an evaluation was made of the concentrate production necessary to supply those needs. The lack of known commercial reserves to satisfy the concentrate needs is shown. An estimate is made of the costs of discovering and exploiting sufficient reserves. In evaluating the U needs and associated costs, three cases were considered: all-BWR, all-PWR and all-HWR.
Best estimate LB LOCA approach based on advanced thermal-hydraulic codes
International Nuclear Information System (INIS)
Sauvage, J.Y.; Gandrille, J.L.; Gaurrand, M.; Rochwerger, D.; Thibaudeau, J.; Viloteau, E.
2004-01-01
Improvements achieved in thermal-hydraulics with the development of best-estimate computer codes have led a number of safety authorities to recommend realistic analyses instead of conservative calculations. The potential of a best-estimate approach for the analysis of LOCAs prompted FRAMATOME to enter early, with CEA and EDF, into the development of the second-generation code CATHARE, and then of a LBLOCA BE methodology with BWNT following the Code Scaling, Applicability and Uncertainty (CSAU) procedure. CATHARE and TRAC are the basic tools for the LOCA studies that FRAMATOME will perform according to either a deterministic better-estimate (dbe) methodology or a Statistical Best Estimate (SBE) methodology. (author)
International Nuclear Information System (INIS)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-01
This book describes electronic engineering, including circuit elements and devices, circuit analysis, and digital logic circuits; electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis, atomic absorption spectrometry and references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation, automated analysis systems and references.
Energy Technology Data Exchange (ETDEWEB)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-15
This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry, covering the analytical process and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, including flame emission spectrometry and plasma emission spectrometry; and further chapters on infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs).
McShane, Edward James
2013-01-01
This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.
Cerebrospinal fluid analysis. Analysis of CSF can help detect certain conditions and diseases. An abnormal CSF analysis result may be due to many different causes, including encephalitis (such as West Nile and Eastern equine) and hepatic ...
Semen analysis measures the amount and quality of a man's semen and sperm.
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
Geometrically and material non-linear analysis of bubble condenser steel structure
International Nuclear Information System (INIS)
Gyoergyi, J.; Lenkei, P.
2003-01-01
Within the framework of a project funded by the European Commission (EC) through the Phare and Tacis Programmes, the behaviour of the bubble condenser system (BCS) during phenomena induced by postulated design basis accidents (DBA) was investigated experimentally. The bubble condenser steel structure consists of 12 trays. To make the Bubble Condenser Test Prototype representative of the majority of trays and sections, it was decided to model a typical tray. The test results demonstrate the integrity of the standard tray pressure-retaining boundary (side wall, face wall, ceiling and bottom) against a differential pressure of 30 kPa. The stability of the side wall and the face wall of tray level 12 was not assured at this differential pressure. The thermal-hydraulic tests demonstrate that the maximum differential pressure across the tray walls in the case of a Large Break Loss of Coolant Accident (LBLOCA) is 20 kPa. The experiments provided the differential pressure as a function of time. The results of the approximate calculations showed the effect of the nonlinearity. In the FEM model calculations we performed elastic, linear analyses as well as analyses accounting for geometric and material nonlinearity. (author)
International Nuclear Information System (INIS)
Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.
1994-01-01
This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
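The two-step pipeline described above — segment first, then measure — can be illustrated with a toy threshold-plus-connected-components sketch. This is a minimal pure-Python stand-in for the segmentation tools found in commercial packages; the 4-connectivity choice and the pixel-count "size" measure are simplifying assumptions.

```python
def segment_and_measure(image, threshold):
    """Step 1: segment by thresholding; step 2: measure each component.

    `image` is a list of rows of grey values; returns the sorted pixel
    counts of the 4-connected foreground components (toy "grains").
    """
    h, w = len(image), len(image[0])
    mask = [[image[r][c] >= threshold for c in range(w)] for r in range(h)]
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one component (stack-based labelling).
                stack, count = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(count)
    return sorted(sizes)
```

On a 4x4 test image with three bright blobs this returns their sizes, e.g. `[1, 2, 3]`, which is exactly the "measurement" half of the pipeline once segmentation has done the hard part.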
Benchmarking Severe Accident Computer Codes for Heavy Water Reactor Applications
International Nuclear Information System (INIS)
2013-12-01
Requests for severe accident investigations and assurance of mitigation measures have increased for operating nuclear power plants and the design of advanced nuclear power plants. Severe accident analysis investigations necessitate the analysis of the very complex physical phenomena that occur sequentially during various stages of accident progression. Computer codes are essential tools for understanding how the reactor and its containment might respond under severe accident conditions. The IAEA organizes coordinated research projects (CRPs) to facilitate technology development through international collaboration among Member States. The CRP on Benchmarking Severe Accident Computer Codes for HWR Applications was planned on the advice and with the support of the IAEA Nuclear Energy Department's Technical Working Group on Advanced Technologies for HWRs (the TWG-HWR). This publication summarizes the results from the CRP participants. The CRP promoted international collaboration among Member States to improve the phenomenological understanding of severe core damage accidents and the capability to analyse them. The CRP scope included the identification and selection of a severe accident sequence, selection of appropriate geometrical and boundary conditions, conduct of benchmark analyses, comparison of the results of all code outputs, evaluation of the capabilities of computer codes to predict important severe accident phenomena, and the proposal of necessary code improvements and/or new experiments to reduce uncertainties. Seven institutes from five countries with HWRs participated in this CRP
Uncertainties in modelling and scaling of critical flows and pump model in TRAC-PF1/MOD1
International Nuclear Information System (INIS)
Rohatgi, U.S.; Yu, Wen-Shi.
1987-01-01
The USNRC has established the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology to quantify the uncertainty in the prediction of safety parameters by best-estimate codes. These codes can then be applied to evaluate the Emergency Core Cooling System (ECCS). The TRAC-PF1/MOD1 version was selected as the first code to undergo the CSAU analysis for LBLOCA applications. It was established through this methodology that the break flow and pump models are among the top-ranked models in the code affecting the peak clad temperature (PCT) prediction for a LBLOCA. The break flow model bias, or discrepancy, and the uncertainty were determined by modelling the test section near the break for 12 Marviken tests. It was observed that the TRAC-PF1/MOD1 code consistently underpredicts the break flow rate and that the prediction improves with increasing pipe length (larger L/D). This is true for both subcooled and two-phase critical flows. A pump model was developed from Westinghouse (1/3 scale) data. The data represent the largest available test pump relevant to Westinghouse PWRs. It was then shown through the analysis of CE and CREARE pump data that larger pumps degrade less and that pumps degrade less at higher pressures. Since the model developed here is based on the 1/3 scale pump and on low-pressure data, it is conservative and will overpredict the degradation when applied to PWRs.
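The bias-and-uncertainty quantification from code/experiment comparison described above amounts to statistics over measured-to-predicted ratios. The flow-rate pairs below are purely illustrative (they are not Marviken data), chosen only so that the code underpredicts, as the study reports.

```python
import statistics

# Hypothetical measured vs. code-predicted break flow rates (kg/s) for a
# set of critical-flow tests; illustrative numbers, not Marviken data.
measured = [1500.0, 1320.0, 980.0, 750.0, 610.0]
predicted = [1380.0, 1240.0, 905.0, 700.0, 580.0]

# Multiplicative bias: mean measured/predicted ratio. A ratio above 1.0
# means the code underpredicts the break flow.
ratios = [m / p for m, p in zip(measured, predicted)]
bias = statistics.fmean(ratios)
spread = statistics.stdev(ratios)  # sample std. dev. = model uncertainty
print(f"bias = {bias:.3f}, std = {spread:.3f}")
```

In a CSAU-style study these two numbers (bias and spread) are what get propagated into the PCT uncertainty statement.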
Thiemann, Francis C.
Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…
Indian Academy of Sciences (India)
Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a.
Bravená, Helena
2009-01-01
This bachelor's thesis deals with the importance of job analysis for personnel activities in a company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise and then to create descriptions and specifications for each job.
International Nuclear Information System (INIS)
Francois, P.
1996-01-01
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group considered both aspects of operating feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transitional phase will set up the methods and an organizational structure. 7 figs
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
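The finite-difference methods and ODE solvers the book covers can be illustrated with the simplest of them, forward Euler, whose first-order error behaviour is also the book's recurring theme of errors in computation. The test problem below (y' = y) is a standard illustration, not taken from the book.

```python
import math

def euler(f, y0, t0, t1, n):
    """Forward-Euler solution of y' = f(t, y): the simplest finite-difference
    method. First order: halving the step roughly halves the error."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# y' = y, y(0) = 1  =>  y(1) = e; compare errors at two step sizes.
err_coarse = abs(euler(lambda t, y: y, 1.0, 0.0, 1.0, 100) - math.e)
err_fine = abs(euler(lambda t, y: y, 1.0, 0.0, 1.0, 200) - math.e)
```

Doubling `n` should cut the error roughly in half, which is how one empirically confirms the method's order.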
International Nuclear Information System (INIS)
Warner, M.
1987-01-01
What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
International Nuclear Information System (INIS)
1988-01-01
Basic studies in nuclear analytical techniques include the examination of underlying assumptions and the development and extension of techniques involving the use of ion beams for elemental and mass analysis. 1 ref., 1 tab
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Gasinski, Leszek
2005-01-01
Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.
DEFF Research Database (Denmark)
Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel
2017-01-01
Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies...
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
International Nuclear Information System (INIS)
Kalkahoran, Omid Noori; Ahangari, Rohollah; Shirani, Amir Saied
2016-01-01
Since the inception of nuclear power as a commercial energy source, safety has been recognized as a prime consideration in the design, construction, operation, maintenance, and decommissioning of nuclear power plants. The release of radioactivity to the environment requires the failure of multiple safety systems and the breach of three physical barriers: fuel cladding, the reactor cooling system, and containment. In this study, nuclear reactor containment pressurization has been modeled in a large break-loss of coolant accident (LB-LOCA) by programming single-cell and multicell models in MATLAB. First, containment has been considered as a control volume (single-cell model). In addition, spray operation has been added to this model. In the second step, the single-cell model has been developed into a multicell model to consider the effects of the nodalization and spatial location of cells in the containment pressurization in comparison with the single-cell model. In the third step, the accident has been simulated using the CONTAIN 2.0 code. Finally, Bushehr nuclear power plant (BNPP) containment has been considered as a case study. The results of BNPP containment pressurization due to LB-LOCA have been compared between models, final safety analysis report, and CONTAIN code's results
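The single-cell idea described — the whole containment treated as one control volume — can be sketched with an explicit mass/energy balance and an ideal-gas closure. This is a minimal sketch only: the study's MATLAB models add spray operation and multicell nodalization, and every constant and parameter name below is illustrative, not BNPP data.

```python
R_GAS = 462.0   # J/(kg*K); contents treated as one ideal gas (assumption)
CV = 1400.0     # J/(kg*K); constant-volume specific heat (illustrative)

def pressurize(volume, m0, t0, m_dot, h_in, duration, dt=0.01):
    """Single-cell containment pressurization: explicit Euler integration
    of the mass and energy balances; returns the final pressure in Pa."""
    m = m0                       # total gas mass in the cell (kg)
    u = m0 * CV * t0             # total internal energy (J)
    for _ in range(round(duration / dt)):
        m += m_dot * dt          # blowdown adds mass...
        u += m_dot * h_in * dt   # ...and energy at the break enthalpy
    temp = u / (m * CV)
    return m * R_GAS * temp / volume   # ideal-gas law: P = m R T / V

# Illustrative blowdown: 50,000 m^3 containment, 500 kg/s for 10 s.
p_final = pressurize(50000.0, 60000.0, 300.0, 500.0, 2.8e6, 10.0)
```

A multicell version would carry one such (m, u) pair per node plus inter-node flows, which is exactly the refinement step the abstract describes.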
International Nuclear Information System (INIS)
Sabunddjian, Gaiane; Andrade, Delvonei Alves de
2003-01-01
This work presents the simulation, with the RELAP5/MOD.3.2.2G code, of the postulated large-break loss-of-coolant accident (LBLOCA) in the primary circuit, which is described in Chapter 15 of the Final Safety Analysis Report (FSAR) of Angra 2. The accident consists basically of the total break of the cold leg (Loop 20) of the Angra 2 plant. The rupture area considered is 4418 cm², which represents 100% of the primary circuit pipe flow area. The efficiency of the Emergency Core Cooling System (ECCS) is also verified for this accident. In this simulation, failure and repair criteria are adopted for the ECCS components in order to verify that the system operates and carries out its function as required by the design: to preserve the integrity of the reactor core and to guarantee its cooling. LBLOCA accidents are characterized by a fast blowdown of the primary circuit to pressures at which the low-pressure injection system is activated, followed by water injection by the accumulators. The thermal-hydraulic processes inherent to the accident, such as hot leg vaporization and consequently core vaporization causing an inappropriate flow distribution in the reactor core, can lead to a reduction in the core liquid level until the ECCS is capable of reflooding it. It is important to point out that the results do not represent an independent calculation for the licensing process, but a calculation to support the qualification of the Angra 2 transient basic nodalization (author)
International Nuclear Information System (INIS)
Akimoto, Hajime; Ohnuki, Akira; Murao, Yoshio
1994-03-01
The REFLA/TRAC code is a best-estimate code developed at the Japan Atomic Energy Research Institute (JAERI) to provide advanced predictions of thermal-hydraulic transients in light water reactors (LWRs). The REFLA/TRAC code uses the TRAC-PF1/MOD1 code as its framework. The REFLA/TRAC code is expected to be used for the calibration of licensing codes, accident analysis, accident simulation of LWRs, and the design of advanced LWRs. Several models have been implemented in the TRAC-PF1/MOD1 code at JAERI, including a reflood model, a condensation model, and interfacial and wall friction models. These models have been verified using data from various separate-effect tests. This report describes an assessment of the REFLA/TRAC code, performed to evaluate its predictive capability for integral system behavior under a large break loss of coolant accident (LBLOCA) using data from the LOFT L2-5 test. The assessment calculation confirmed that the REFLA/TRAC code predicts break mass flow rate, emergency core cooling water bypass and clad temperature very well for the LOFT L2-5 test. The CPU time of the REFLA/TRAC code was about one third that of the TRAC-PF1/MOD1 code. The REFLA/TRAC code can perform stable and fast simulation of thermal-hydraulic behavior in a PWR LBLOCA with sufficient accuracy for practical use. (author)
International Nuclear Information System (INIS)
1959-01-01
Radioactivation analysis is a technique for determining the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and ways to overcome them were elaborated. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use has been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
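The activation step described above follows the standard saturation-activity relation A(t) = Nσφ(1 − e^(−λt)): induced activity grows toward the saturation value Nσφ as irradiation continues. The sketch below uses illustrative numbers only, not values from the symposium papers.

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Induced activity (decays/s) after irradiating n_atoms target atoms
    with cross-section sigma_cm2 (cm^2) in a neutron flux (n/cm^2/s)."""
    lam = math.log(2.0) / half_life_s          # decay constant (1/s)
    saturation = n_atoms * sigma_cm2 * flux    # activity as t -> infinity
    return saturation * (1.0 - math.exp(-lam * t_irr_s))

# Irradiating for exactly one half-life yields half the saturation activity.
a_one_half_life = induced_activity(1e18, 1e-24, 1e13, 3600.0, 3600.0)
```

This is why irradiation times much longer than a few half-lives bring diminishing returns in activation analysis.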
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present Panel Analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys...
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the Universities of Andhra Pradesh and also the syllabus prescribed in most Indian universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations; Interpolation of Functions; Numerical Differentiation and Integration; and Numerical Solution of Ordinary Differential Equations. The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix, and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...
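The first topic named above, numerical solution of algebraic and transcendental equations, can be sketched with a Newton-Raphson iteration. This is an illustrative example, not code from the book:

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration for f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Transcendental equation cos(x) = x
root = newton(lambda x: math.cos(x) - x, lambda x: -math.sin(x) - 1.0, 1.0)
print(round(root, 6))  # → 0.739085
```

The same skeleton handles any equation for which the derivative is available; bisection or the secant method would replace `df` when it is not.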
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
International Nuclear Information System (INIS)
Romli
1997-01-01
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities on the basis of the characteristics they possess. Several algorithms can be used in this analysis, and this topic focuses on discussing them: similarity measures and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods, as well as the non-hierarchical clustering method popularly known as the K-means method. Finally, the paper describes the advantages and disadvantages of each method
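The K-means method mentioned above can be sketched in a few lines. This is a generic illustration on one-dimensional data, not code from the paper:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means on 1-D data: assign each point to the nearest
    centre, then recompute each centre as the mean of its cluster."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: abs(p - centres[j]))].append(p)
        # keep the old centre if a cluster ends up empty
        centres = [sum(c) / len(c) if c else centres[j]
                   for j, c in enumerate(clusters)]
    return sorted(centres)

print([round(c, 3) for c in kmeans([1.0, 1.1, 0.9, 10.0, 10.2, 9.8], 2)])
# → [1.0, 10.0]
```

Hierarchical methods (single, complete, average linkage) differ in that they merge the closest pair of clusters step by step instead of iterating assignments against fixed centres.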
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume 'Numerical Analysis 2000' project of the Journal of Computational and Applied Mathematics
International Nuclear Information System (INIS)
Biehl, F.A.
1984-05-01
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and...
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real-life examples are used throughout to demonstrate...
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
International Nuclear Information System (INIS)
Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.
1997-01-01
This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want to gain insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field
Alan Gallegos
2002-01-01
Watershed analyses and assessments for the Kings River Sustainable Forest Ecosystems Project were done on about 33,000 acres of the 45,500-acre Big Creek watershed and 32,000 acres of the 85,100-acre Dinkey Creek watershed. Following procedures developed for analysis of cumulative watershed effects (CWE) in the Pacific Northwest Region of the USDA Forest Service, the...
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow them to determine, at least to some degree, the correct type of statistical analysis to be performed in a given situation, and to give them some appreciation of what constitutes good experimental design
International Nuclear Information System (INIS)
Unterberger, A.
1987-01-01
We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R^n
International Nuclear Information System (INIS)
Woodard, K.
1985-01-01
The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies....... Varieties of domain analysis as well as criticism and controversies are presented and discussed....
International Nuclear Information System (INIS)
Rhoades, W.A.; Dray, B.J.
1970-01-01
The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 °F. A thorough discussion of the data and analysis is included. (U.S.)
International Nuclear Information System (INIS)
Saadi, Radouan; Marah, Hamid
2014-01-01
This report presents results related to tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including a general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU). Lower limit of detection: 0.02 TU
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...
Adrian Ioana; Tiberiu Socaciu
2013-01-01
The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, and psychological analysis. We also present the main objects of the analysis: the technological activity analysis of a company, the analysis of production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...
International Nuclear Information System (INIS)
Smith, M.; Jones, D.R.
1991-01-01
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data so that these strategic exploration decisions can be made objectively, in conformance with the explorer's goals and risk attitudes. Trend analysis differs from resource estimation in its purpose: it seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) an information data base - requirements and sources; (2) a data conditioning program - assignment to trends, correction of errors, and conversion into usable form; (3) a statistical processing program - calculation of the probability of success and the discovery size probability distribution; and (4) analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
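The analytical processing step described above, Monte Carlo simulation of a trend's economic outcome, might be sketched as follows. The parameters, the lognormal size distribution, and the break-even criterion are hypothetical simplifications, not the authors' model:

```python
import random

def trend_mc(p_success, size_draw, well_cost, n_wells, trials=10_000, seed=1):
    """Monte Carlo over one exploration program in a trend: each well
    succeeds with probability p_success; each success earns a value
    drawn from size_draw.  Returns the estimated probability that the
    program at least pays back the total drilling cost."""
    rng = random.Random(seed)
    breakeven = 0
    for _ in range(trials):
        value = sum(size_draw(rng)
                    for _ in range(n_wells) if rng.random() < p_success)
        if value >= well_cost * n_wells:
            breakeven += 1
    return breakeven / trials

# Hypothetical trend: 20% success rate, lognormal discovery sizes
p = trend_mc(0.2, lambda r: r.lognormvariate(2.0, 1.0),
             well_cost=5.0, n_wells=10)
```

A Gambler's Ruin variant would additionally track cumulative capital inside each trial and count programs that exhaust the budget before the first success.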
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development...... area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year’s SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted....... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...
Helson, Henry
2010-01-01
This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.
Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa
2015-01-01
This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
International Nuclear Information System (INIS)
Quinn, C.A.
1983-01-01
The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed, as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy; thus the design of infrared spectrometers is always directed toward maximising energy throughput. The article also considers atomic absorption - flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers form a part of the development of spectrographic instrumentation
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book could be divided into two parts i.e. fundamental wavelet transform theory and method and some important applications of wavelet transform. In the first part, as preliminary knowledge, the Fourier analysis, inner product space, the characteristics of Haar functions, and concepts of multi-resolution analysis, are introduced followed by a description on how to construct wavelet functions both multi-band and multi wavelets, and finally introduces the design of integer wavelets via lifting schemes and its application to integer transform algorithm. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz system. The book is intended for senior undergraduate stude...
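The Haar functions and lifting scheme mentioned above can be illustrated with a one-level Haar transform: pairwise averages form the approximation and pairwise half-differences form the detail. This is a generic sketch, not code from the book:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform via lifting:
    pairwise averages (approximation) and half-differences (detail).
    Assumes an even-length signal."""
    evens, odds = signal[::2], signal[1::2]
    s = [(a + b) / 2 for a, b in zip(evens, odds)]
    d = [(a - b) / 2 for a, b in zip(evens, odds)]
    return s, d

def haar_inverse(s, d):
    """Exact inverse of haar_step: perfect reconstruction."""
    out = []
    for a, b in zip(s, d):
        out += [a + b, a - b]
    return out

s, d = haar_step([4.0, 2.0, 5.0, 7.0])
print(s, d)                # → [3.0, 6.0] [1.0, -1.0]
print(haar_inverse(s, d))  # → [4.0, 2.0, 5.0, 7.0]
```

Multi-resolution analysis repeats `haar_step` on the approximation coefficients; integer-to-integer variants replace the divisions with rounded lifting updates.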
International Nuclear Information System (INIS)
Hwang, Hun
2007-02-01
This book explains potentiometry, voltammetry, amperometry and the basic concepts of conductometry in eleven chapters. It gives specific descriptions of the electrochemical cell and its modes, the basic concepts of electrochemical analysis of oxidation-reduction reactions, standard electrode potential, formal potential, faradaic current and faradaic processes, mass transfer and overvoltage, potentiometry and indirect potentiometry, polarography with TAST, normal pulse and differential pulse modes, voltammetry, conductometry and conductometric titration.
International Nuclear Information System (INIS)
Badwe, R.A.
1999-01-01
The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison for survival has been carried out with Cox's proportional hazards model
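A minimal sketch of the Kaplan-Meier estimator described above: the survival probability is reduced at each event time by the fraction of at-risk patients who experience the endpoint, while censored patients simply leave the risk set. The function name and data are illustrative:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  times: observation times;
    events: 1 = endpoint observed, 0 = censored.
    Returns (t, S(t)) at each distinct event time."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s, curve, i = 1.0, [], 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        while i < len(pairs) and pairs[i][0] == t:  # group tied times
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
        at_risk -= removed
    return curve

print(kaplan_meier([1, 2, 3], [1, 0, 1]))  # patient censored at t=2
# → [(1, 0.6666666666666667), (3, 0.0)]
```

Note how censoring at t=2 removes that patient from the risk set without dropping the survival curve, which is exactly why the method is needed when not everyone reaches the endpoint.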
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different....... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads....
Mucha, Hans-Joachim; Sofyan, Hizir
2000-01-01
As an explorative technique, cluster analysis provides a description or a reduction in the dimension of the data. It classifies a set of observations into two or more mutually exclusive unknown groups based on combinations of many variables. Its aim is to construct groups in such a way that the profiles of objects in the same group are relatively homogeneous whereas the profiles of objects in different groups are relatively heterogeneous. Clustering is distinct from classification techniques, ...
International Nuclear Information System (INIS)
Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.
1985-01-01
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Quantifying phenomenological importance in best-estimate plus uncertainty analyses
International Nuclear Information System (INIS)
Martin, Robert P.
2009-01-01
This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)
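One common way to reveal "those uncertainty contributors having the greatest influence" in a nonparametric setting is a rank-correlation importance measure between each sampled input and the analysis measure. This is an illustrative sketch, not the methodology of the paper, and it assumes no tied values:

```python
def ranks(xs):
    """0-based rank positions; assumes no tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(x, y):
    """Spearman rank correlation between a sampled input x and the
    analysis output y -- a simple nonparametric importance measure."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n - 1) / 2.0            # mean of the ranks 0..n-1
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)
    return cov / var

print(spearman([1, 2, 3, 4], [10, 20, 30, 40]))  # → 1.0
print(spearman([1, 2, 3, 4], [40, 30, 20, 10]))  # → -1.0
```

Ranking the inputs by |correlation| gives a first-order importance table of the kind a Phenomena Identification and Ranking Table exercise would seek to validate.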
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
PCT Uncertainty Analysis Using Unscented Transform with Random Orthogonal Matrix
Energy Technology Data Exchange (ETDEWEB)
Fynana, Douglas A.; Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of); Lee, John C. [Univ. of Michigan, Michigan (United States)
2015-05-15
Most Best Estimate Plus Uncertainty (BEPU) methods employ nonparametric order statistics through Wilks' formula to quantify uncertainties of best estimate simulations of nuclear power plant (NPP) transients. 95%/95% limits, the 95th percentile at a 95% confidence level, are obtained by randomly sampling all uncertainty contributors through conventional Monte Carlo (MC). Advantages are simple implementation of MC sampling of input probability density functions (pdfs) and the limited computational expense of 1st, 2nd, and 3rd order Wilks' formula, requiring only 59, 93, or 124 simulations, respectively. A disadvantage of the small sample size is large sample-to-sample variation of statistical estimators. This paper presents a new efficient sampling-based algorithm for accurate estimation of the mean and variance of the output parameter pdf. The algorithm combines a deterministic sampling method, the unscented transform (UT), with random sampling through the generation of a random orthogonal matrix (ROM). The UT guarantees that the mean, covariance, and 3rd order moments of the multivariate input parameter distributions are exactly preserved by the sampled input points, and the orthogonal transformation of the points by a ROM guarantees that the sample error of all 4th order and higher moments is unbiased. The UT with ROM algorithm is applied to the uncertainty quantification of the peak clad temperature (PCT) during a large break loss-of-coolant accident (LBLOCA) in an OPR1000 NPP to demonstrate the applicability of the new algorithm to BEPU. This paper presented a new algorithm combining the UT with ROM for efficient multivariate parameter sampling that ensures sample input covariance and 3rd order moments are exactly preserved and 4th moment errors are small and unbiased. The advantageous sample properties guarantee higher order accuracy and
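The Wilks' formula sample sizes quoted above (59, 93, and 124 runs for 1st, 2nd, and 3rd order 95%/95% limits) can be reproduced with a short calculation. The following sketch is illustrative only; the function names are ours, not the paper's:

```python
from math import comb

def wilks_confidence(n, gamma=0.95, order=1):
    """Confidence that the order-th largest of n runs bounds the
    gamma quantile (one-sided nonparametric tolerance limit):
    beta = 1 - sum_{j<order} C(n,j) (1-gamma)^j gamma^(n-j)."""
    return 1.0 - sum(comb(n, j) * (1 - gamma) ** j * gamma ** (n - j)
                     for j in range(order))

def min_runs(gamma=0.95, beta=0.95, order=1):
    """Smallest n whose confidence is at least beta."""
    n = order
    while wilks_confidence(n, gamma, order) < beta:
        n += 1
    return n

sizes = [min_runs(order=r) for r in (1, 2, 3)]  # [59, 93, 124]
```

First order uses the sample maximum as the bound; going to 2nd or 3rd order trades extra runs (93 or 124) for a tighter, less conservative estimator, which is exactly the sample-size/variance trade-off the abstract discusses.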
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
Analysis of different containment models for IRIS small break LOCA, using GOTHIC and RELAP5 codes
International Nuclear Information System (INIS)
Papini, Davide; Grgic, Davor; Cammi, Antonio; Ricotti, Marco E.
2011-01-01
Advanced nuclear water reactors rely on containment behaviour in the realization of some of their passive safety functions. Steam condensation on containment walls, where non-condensable gas effects are significant, is an important feature of the new passive containment concepts, like the AP600/1000 ones. In this work the international reactor innovative and secure (IRIS) was taken as reference, and the relevant condensation phenomena involved within its containment were investigated with different computational tools. In particular, the IRIS containment response to a small break LOCA (SBLOCA) was calculated with the GOTHIC and RELAP5 codes. A simplified model of the IRIS containment drywell was implemented with RELAP5 according to a sliced approach, based on the two-pipe-with-junction concept, while it was addressed with GOTHIC using several modelling options, regarding both heat transfer correlations and volume and thermal structure nodalization. The influence on containment behaviour prediction was investigated in terms of drywell temperature and pressure response, heat transfer coefficient (HTC) and steam volume fraction distribution, and internal recirculating mass flow rate. The objective of the paper is to preliminarily compare the capability of the two codes in modelling the same postulated accident, and thus to check the results obtained with RELAP5 when applied in a situation not covered by its validation matrix (comprising SBLOCA and to some extent LBLOCA transients, but not explicitly the modelling of large dry containment volumes). The option of whether to include droplets in the fluid mass flow discharged to the containment was the most influential parameter for the GOTHIC simulations. Despite some drawbacks, due for example to a marked overestimation of internal natural recirculation, RELAP5 confirmed its capability to satisfactorily model the basic processes in the IRIS containment following a SBLOCA.
DEFF Research Database (Denmark)
Moore, R; Brødsgaard, I; Miller, ML
1997-01-01
A quantitative method for validating qualitative interview results and checking sample parameters is described and illustrated using common pain descriptions among a sample of Anglo-American and Mandarin Chinese patients and dentists matched by age and gender. Assumptions were that subjects were ... of consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Ethnic and professional differences within and across ... of covalidating questionnaires that reflect results of qualitative interviews are recommended in order to estimate sample parameters such as intersubject agreement, individual subject accuracy, and minimum required sample sizes.
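Of the statistics named in this abstract, Cronbach's alpha is the most mechanical to compute. A minimal sketch (our own code, with made-up toy scores, not the study's data):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (n_subjects, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of subject totals
    return k / (k - 1) * (1.0 - item_var / total_var)

# Toy data: two perfectly consistent items yield alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

In a validation context like the one above, alpha near 1 indicates the descriptor items hang together within a group; low alpha would undercut the claim of within-group consistency.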
International Nuclear Information System (INIS)
Iorio, A.F.; Crespi, J.C.
1987-01-01
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted through a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)
International Nuclear Information System (INIS)
1988-01-01
In a search for correlations between the elemental composition of trace elements in human stones and the stone types with relation to their growth pattern, a combined PIXE and x-ray diffraction spectrometry approach was implemented. The combination of scanning PIXE and XRD has proved to be an advance in the methodology of stone analysis and may point to the growth pattern in the body. The exact role of trace elements in the formation and growth of urinary stones is not fully understood. Efforts are thus continuing firstly to solve the analytical problems concerned and secondly to design suitable experiments that would provide information about the occurrence and distribution of trace elements in urine. 1 fig., 1 ref
International Nuclear Information System (INIS)
Straub, W.A.
1987-01-01
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of the USS Technical Center, covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents has also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles expand on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that describes the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and gradu ate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathe matical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
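The closing claim, that for linear models Ax = b a complete adjoint sensitivity analysis can be made with one extra linear solve, is easy to illustrate: for an output g = c^T x, the gradient of g with respect to b is the solution of the transposed system. A small numerical sketch with our own example values (not the report's Cork and Bottle problem):

```python
import numpy as np

# Linear model A x = b with scalar output g = c^T x.
# Since g = c^T A^{-1} b, the full gradient dg/db is the adjoint
# solution lambda = A^{-T} c: one transposed solve gives every
# sensitivity coefficient at once.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 0.0])          # output: first component of x

x = np.linalg.solve(A, b)         # primal solve
lam = np.linalg.solve(A.T, c)     # adjoint solve; lam[i] = dg/db[i]

# Cross-check one coefficient by a finite difference on b[0]
eps = 1e-6
b_pert = b.copy()
b_pert[0] += eps
fd = (c @ np.linalg.solve(A, b_pert) - c @ x) / eps
```

For this A and c the adjoint solution is lam = (0.3, -0.1), and the finite-difference check reproduces lam[0]; the advantage over sampling grows with the number of input parameters, since the adjoint cost is independent of their count.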
Energy Technology Data Exchange (ETDEWEB)
Hashim, Muhammad, E-mail: hashimsajid@yahoo.com; Hidekazu, Yoshikawa, E-mail: yosikawa@kib.biglobe.ne.jp; Takeshi, Matsuoka, E-mail: mats@cc.utsunomiya-u.ac.jp; Ming, Yang, E-mail: myang.heu@gmail.com
2014-10-15
Highlights: • Discussion of the reasons why AP1000 is equipped with the ADS, in comparison with a conventional PWR. • Clarification of full and partial depressurization of the reactor coolant system by the ADS. • Application case study of the four-stage ADS for reliability evaluation in a LBLOCA. • The GO-FLOW tool is capable of evaluating the dynamic reliability of passive safety systems. • The calculated ADS reliability significantly increased the dynamic reliability of the PXS. - Abstract: The AP1000 nuclear power plant (NPP) utilizes passive means for its safety systems to ensure safety in events of transients or severe accidents. One of the safety systems unique to AP1000 compared with a conventional PWR is the four-stage Automatic Depressurization System (ADS); the ADS originally works as an active safety system. In the present study, the authors first discuss, from the standpoint of reliability, the reasons why the four-stage ADS is added in the AP1000 plant compared with a conventional PWR, and then explain the full and partial depressurization of the RCS by the four-stage ADS in events of transients and loss of coolant accidents (LOCAs). Lastly, an application case study of the four-stage ADS of AP1000 is conducted for reliability evaluation of the ADS under postulated conditions of full RCS depressurization during a large break loss of coolant accident (LBLOCA) in one of the RCS cold legs. In this case study, the reliability evaluation is made by the GO-FLOW methodology to determine the influence of the ADS on the dynamic reliability of the passive core cooling system (PXS) of AP1000, i.e., what happens if the ADS fails or successfully actuates. GO-FLOW is a success-oriented reliability analysis tool capable of evaluating system reliability/unavailability as an alternative to Fault Tree Analysis (FTA) and Event Tree Analysis (ETA). Under these specific conditions of LBLOCA, the GO-FLOW calculated reliability results indicated
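GO-FLOW itself is a specialized, time-dependent success-oriented tool, but the flavor of a static success calculation for a multi-stage depressurization system can be sketched with elementary probability. The per-stage reliability and the 3-of-4 success rule below are made-up placeholders for illustration, not values or logic from the study:

```python
from math import comb

def k_of_n_success(n, k, p_stage):
    """Probability that at least k of n independent, identical stages
    actuate, each with success probability p_stage (binomial sum)."""
    return sum(comb(n, j) * p_stage ** j * (1 - p_stage) ** (n - j)
               for j in range(k, n + 1))

# Hypothetical: depressurization succeeds if 3 of the 4 ADS stages
# open, each stage assumed 99% reliable (placeholder number).
p_success = k_of_n_success(4, 3, 0.99)  # about 0.9994
```

A dynamic tool like GO-FLOW goes well beyond this static picture, handling actuation signals, phased missions, and time dependence, which is why it is used in the study instead of a simple FTA/ETA combination.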
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
This technical report concerns the basic theory and principles of experimental modal analysis. The sections within the report are: output-only modal analysis software, general digital analysis, basics of structural dynamics, and modal analysis and system identification. (au)
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
International Nuclear Information System (INIS)
2003-08-01
This book deals with the analysis of heat transfer, including nonlinear analysis examples, radiation heat transfer, analysis of heat transfer in ANSYS, verification of analysis results, analysis of transient heat transfer with automatic time stepping and open control, analysis of heat transfer using the arrangement of ANSYS, thermal contact resistance, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
International Nuclear Information System (INIS)
Son, Seung Hui
2004-02-01
This book deals with information technology and business processes, information system architecture, methods of system development, planning of system development (including problem analysis and feasibility analysis), cases of system development, comprehension of the analysis of user demands, analysis of user demands using traditional methods, analysis of user demands using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Study on fracture of fuel element cladding for naval reactor during typical accidents
International Nuclear Information System (INIS)
Zhang Fan; Shang Xueli; Zheng Zhongliang; Yu Lei
2011-01-01
To define the grade of nuclear emergency response, a best-estimate model has been adopted; the simulation of a large break loss of coolant accident (LBLOCA) has been carried out with radioactive-release analysis software coupling RELAP5/MOD3.2 and a core physics model. First, the peak clad temperature of the critical failure channel is calculated with the RELAP5 code, and its power factor is obtained simultaneously. Second, the pin power distribution of the fuel assemblies is calculated by a coarse-mesh nodal method. From the pin power distribution in the whole core and the results obtained above, the fraction of fuel element fracture is calculated. Finally, the radioactive-release analysis is carried out and a reasonable source term is obtained, which offers a reference for nuclear emergency decision making. (authors)
Analysis of Project Finance | Energy Analysis | NREL
NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable
International Nuclear Information System (INIS)
Wright, A.C.D.
2002-01-01
This paper discusses safety analysis fundamentals in reactor design. The study includes safety analysis done to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.
EID - prototype design and user test 2004
International Nuclear Information System (INIS)
Welch, Robin; Friberg, Maarten; Nystad, Espen; Teigen, Arild; Veland, Oeystein
2005-08-01
programme is to gain insight into how this methodology can contribute to the design of operator displays in the nuclear industry. To do this, it was decided to design a limited number of displays on the FRESH simulator and conduct a user test to examine whether operators would be able to use and accept this type of design. The FRESH EID displays intend to show information and relationships in a graphical form that would require substantially more mental resources to utilize if using the conventional displays. This HWR presents the background for EID, the analysis process, the displays that have been designed, the user test and the outcome of the user test. This first attempt at developing and evaluating an EID has provided both valuable practical lessons learned and promising results for further work. (Author)
An example of multidimensional analysis: Discriminant analysis
International Nuclear Information System (INIS)
Lutz, P.
1990-01-01
Among the approaches to multidimensional data analysis, lectures on discriminant analysis, covering theoretical and practical aspects, are presented. The discrimination problem, the analysis steps, and the discrimination categories are stressed. Examples are given on descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, the discriminant variable, geometrical interpretation, the case g=2, the classification method, and separation of top events. Criteria for obtaining relevant results are included [fr]
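For the g=2 case mentioned above, Fisher's linear discriminant fits in a few lines: project onto w = Sw^-1 (mu1 - mu2), where Sw is the pooled within-class scatter, and threshold at the midpoint of the projected class means. The two-class toy data below are hypothetical stand-ins for event features, not physics data:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher discriminant direction w = Sw^{-1} (mu1 - mu2) for two
    classes, with Sw the pooled within-class scatter matrix."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - mu1).T @ (X1 - mu1)
    S2 = (X2 - mu2).T @ (X2 - mu2)
    return np.linalg.solve(S1 + S2, mu1 - mu2)

def classify(x, X1, X2):
    """Assign x to class 1 or 2 by projecting onto w and comparing
    with the midpoint of the projected class means."""
    w = fisher_direction(X1, X2)
    midpoint = (X1.mean(axis=0) + X2.mean(axis=0)) / 2.0
    return 1 if w @ x > w @ midpoint else 2

# Hypothetical two-class toy data in two dimensions
X1 = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0]])
X2 = np.array([[6.0, 5.0], [7.0, 8.0], [8.0, 7.0]])
```

The geometrical interpretation from the lectures is visible here: w is the projection axis that maximizes between-class separation relative to within-class spread, and classification reduces to a one-dimensional cut on the discriminant variable w @ x.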
Development of thermal hydraulic models for the reliable regulatory auditing code
Energy Technology Data Exchange (ETDEWEB)
Chung, B. D.; Song, C. H.; Lee, Y. J.; Kwon, T. S.; Lee, S. W. [Korea Automic Energy Research Institute, Taejon (Korea, Republic of)
2004-02-15
The objective of this project is to develop thermal hydraulic models for use in improving the reliability of the regulatory auditing codes. The current year falls under the second step of the 3-year project, and the main research focused on the development of a downcomer boiling model. During the current year, the bubble stream model of the downcomer has been developed and installed in the auditing code. A model sensitivity analysis has been performed for the APR1400 LBLOCA scenario using the modified code. A preliminary calculation has been performed for the experimental test facility using the FLUENT and MARS codes. The facility for the air bubble experiment has been installed. The thermal hydraulic phenomena for VHTR and supercritical reactors have been identified for future application and model development.
International Nuclear Information System (INIS)
Andrade, Delvonei Alves de; Sabundjian, Gaiane
2004-01-01
The objective of this work is to present the simulation of a large break loss of coolant accident (LBLOCA) in the hot leg of the primary loop of Angra 2 with the RELAP5/MOD3.2.2g code. This accident is described in the Final Safety Analysis Report of Angra 2 (FSAR) and consists basically of the total break of the hot leg in loop 20 of the plant. The area considered for the rupture is 4480 cm², which corresponds to 100% of the pipe flow area. This work also aims to verify the efficiency of the emergency core cooling system (ECCS) in case of accidents and transients. The thermal-hydraulic processes inherent to the accident phenomenology, such as hot leg vaporization and consequently core vaporization causing an inappropriate flow distribution in the reactor core, can lead to a reduction in the liquid level until the ECCS is capable of reflooding the core.
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
A generic approach for steel containment vessel success criteria for severe accident loads
International Nuclear Information System (INIS)
Sammataro, R.F.; Solonick, W.R.; Edwards, N.W.
1993-01-01
Safety has been defined as the foremost design criterion for the Heavy Water New Production Reactor (NPR-HWR) by the U.S. DOE, Office of New Production Reactors (NP). The DOE-NP issued the Deterministic Severe Accident Criteria (DSAC) concept to guide the design of the NPR-HWR containment for resistance to severe accidents. The DSAC concept provides a generic approach for containment vessel success criteria to predict the threshold of containment failure under severe accident loads. This concept consists of two parts: (1) Problem Statements and (2) Success Criteria. This paper is limited to a discussion of the success criteria. These criteria define acceptable containment response measures and limits for each problem statement. The criteria are based on the 'best estimate' of failure with no conservatism; conservatism, if required, is to be provided in the problem statements prepared by the designer and/or the regulatory authorities. The success criteria are presented on a multi-tiered basis for static pressure and temperature loadings, dynamic loadings, and missiles that may impact the containment. Within the static pressure and temperature loadings and the dynamic loadings, the criteria are separated into elastic analysis success criteria and inelastic analysis success criteria. Each of these areas, in turn, defines limits on either the stress or strain measures as well as on measures for buckling and displacements. The rationale upon which these criteria are based is contained in referenced documents. Rigorous validation of the criteria by comparison with results from analytical or experimental programs and application of the criteria to a containment design remain as future tasks. (orig./HP)
A generic approach for containment success criteria under severe accident loads
International Nuclear Information System (INIS)
Sammataro, R.F.; Solonick, W.R.; Edwards, N.W.
1992-01-01
The U.S. Department of Energy (DOE), Office of New Production Reactors (NP), has identified safety as the foremost design criterion for the Heavy Water New Production Reactor (NPR-HWR). The DOE-NP has issued the Deterministic Severe Accident Criteria (DSACs) to guide the design of the NPR-HWR containment for resistance to severe accidents. The DSAC concept provides for a generic approach for success criteria to predict the threshold of containment failure under severe accident loads. This concept consists of two parts: (1) Problem Statements that are qualitative and quantitative bases for calculating associated loadings and containment response to those loadings, and (2) Success Criteria that specify acceptable containment response measures and limits for each problem statement. This paper is limited to a discussion of a generic approach for containment success criteria. The main elements of these success criteria are expressed in terms of elastic stresses and inelastic strains. Containment performance is based on the best estimate of failure as predicted by either stress or strain, buckling, displacements, or ability to withstand missile perforation. Since these limits are best estimates of failure, no conservatism exists in these success criteria. Rather, conservatism is to be provided in the problem statements, i.e., the quantified severe accident loads. These success criteria are presented on a multi-tiered basis for static pressure and temperature loadings, dynamic loadings, and missiles. Within the static pressure and temperature loadings and the dynamic loadings, the criteria are separated into elastic analysis success criteria and inelastic analysis success criteria. Each of these areas, in turn, defines limits on either the stress or strain measures as well as on measures for buckling and displacements
Energy Technology Data Exchange (ETDEWEB)
Chun, Ji-Han, E-mail: chunjh@kaeri.re.kr; Lim, Sung-Won; Chung, Bub-Dong; Lee, Won-Jae
2015-08-15
Highlights: • A thermal conductivity model of the FCM fuel was developed and adopted in MARS. • Scoping analysis of candidate FCM FAs was performed to select a feasible FA. • Preliminary safety criteria for FCM fuel and SiC/Zr cladding were set up. • Enhanced safety margin and accident tolerance of the FCM-SiC/Zr core were demonstrated. - Abstract: The FCM fueled core proposed as an accident tolerant concept is assessed against design-basis-accident (DBA) and beyond-DBA (BDBA) scenarios using the MARS code. A thermal conductivity model of FCM fuel is incorporated in the MARS code to take into account the effects of irradiation and temperature recently measured by ORNL. Preliminary analyses regarding the initial stored energy and accident tolerant performance were carried out for the scoping of various cladding material candidates. A 16 × 16 FA with SiC-coated Zircaloy cladding was selected as the feasible conceptual design through a preliminary scoping analysis. For the selected design, safety analyses for DBA and BDBA scenarios were performed to demonstrate the accident tolerance of the FCM fueled core. A loss of flow accident (LOFA) scenario was selected for departure-from-nucleate-boiling (DNB) evaluation, and a large-break loss of coolant accident (LBLOCA) scenario for peak cladding temperature (PCT) margin evaluation. A control element assembly (CEA) ejection accident scenario was selected for peak fuel enthalpy and temperature. Moreover, station blackout (SBO) and LBLOCA without safety injection (SI) scenarios were selected as BDBAs. It was demonstrated that the DBA safety margin of the FCM core is satisfied, and the time for operator actions for BDBAs is evaluated.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
International Nuclear Information System (INIS)
PECH, S.H.
2000-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
International Nuclear Information System (INIS)
Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung
1995-02-01
This book is about quantitative analytical chemistry. It is divided into ten chapters, which cover the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and example experiments, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
International Nuclear Information System (INIS)
WEBB, R.H.
1999-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Philipp Mayring
2000-01-01
The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...
RELIABILITY ANALYSIS OF BENDING ...
African Journals Online (AJOL)
eobe
Reliability analysis of the safety levels of the criteria slabs, have been .... was also noted [2] that if the risk level or β < 3.1), the ... reliability analysis. A study [6] has shown that all geometric variables, ..... Germany, 1988. 12. Hasofer, A. M and ...
DTI analysis methods : Voxel-based analysis
Van Hecke, Wim; Leemans, Alexander; Emsell, Louise
2016-01-01
Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does
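As a hedged sketch of what a voxel-based analysis computes after spatial normalisation (the data, group sizes, and metric values below are invented for illustration, not taken from the book), a voxel-wise two-sample t statistic over a registered DTI metric such as fractional anisotropy might look like:

```python
import numpy as np

def voxelwise_t(group_a, group_b):
    """Two-sample t statistic per voxel (pooled variance).
    group_a, group_b: arrays of shape (n_subjects, x, y, z) holding a DTI
    metric (e.g. fractional anisotropy) already registered to a template."""
    na, nb = len(group_a), len(group_b)
    ma, mb = group_a.mean(axis=0), group_b.mean(axis=0)
    va, vb = group_a.var(axis=0, ddof=1), group_b.var(axis=0, ddof=1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / np.sqrt(pooled * (1 / na + 1 / nb) + 1e-12)

rng = np.random.default_rng(0)
a = rng.normal(0.45, 0.05, size=(12, 4, 4, 4))  # hypothetical patient group
b = rng.normal(0.50, 0.05, size=(12, 4, 4, 4))  # hypothetical control group
t = voxelwise_t(a, b)
print(t.shape)  # one t value per voxel: (4, 4, 4)
```

A real pipeline would follow this with multiple-comparison correction across voxels, which is not shown here.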
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will ""make the cut"" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in electric circuit analysis courses to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis. Comprised of 10 chapters, this book begins with an introduction to the subject o
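A minimal illustration of one non-hierarchical method of the kind the book covers, Lloyd's k-means, sketched here with a naive deterministic initialisation on synthetic two-cluster data (all names and values are ours, not the book's):

```python
import numpy as np

def kmeans(X, centers, iters=50):
    """Lloyd's k-means: alternate nearest-centre assignment and centre update."""
    centers = centers.astype(float).copy()
    for _ in range(iters):
        # Squared distance of every data unit to every centre.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated synthetic clusters of 20 points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
labels, centers = kmeans(X, X[[0, -1]])  # naive init: first and last point
```

With one initial centre in each blob, the assignment converges to the obvious two-group partition; real applications need a more careful initialisation and a choice of k.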
Activation analysis in food analysis. Pt. 9
International Nuclear Information System (INIS)
Szabo, S.A.
1992-01-01
An overview is presented on the application of activation analysis (AA) techniques for food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, for the elemental composition of drinking water in Iraq, for the Na, K, Mn contents of various Indian rices, for As, Hg, Sb and Se determination in various seafoods, for daily microelement uptake in China, and for the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs
Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...
International Nuclear Information System (INIS)
Burgess, R.L.
1978-01-01
Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models
Confirmatory Composite Analysis
Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.
2018-01-01
We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are
Introductory numerical analysis
Pettofrezzo, Anthony J
2006-01-01
Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.
Gap Analysis: Application to Earned Value Analysis
Langford, Gary O.; Franck, Raymond (Chip)
2008-01-01
Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...
Importance-performance analysis based SWOT analysis
Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.
2016-01-01
SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session, and that SWOT factors are not prioritized by their significance, which may result in improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Full Text Available Discourse analysis is a method which up to now was less recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical backgrounds. In this article, I reconstruct Foucault’s writings in his “Archaeology of Knowledge” to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
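For orientation, the quantity at the heart of any mediation analysis, the indirect effect estimated as the product of the X→M and M→Y (given X) regression coefficients, can be sketched with ordinary least squares on simulated data. This is a frequentist point estimate only; the article's Bayesian machinery (priors, posterior inference) is not shown, and all numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=n)                                  # predictor
M = 0.5 * X + rng.normal(scale=0.5, size=n)             # mediator, true a = 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(scale=0.5, size=n)   # outcome,  true b = 0.4

# a: slope of M on X; b: slope of Y on M, controlling for X.
a = np.linalg.lstsq(np.column_stack([np.ones(n), X]), M, rcond=None)[0][1]
b = np.linalg.lstsq(np.column_stack([np.ones(n), X, M]), Y, rcond=None)[0][2]
indirect = a * b  # point estimate of the mediated effect, near 0.5 * 0.4 = 0.2
```

A Bayesian version would place priors on a and b and summarise the posterior of their product, which is what gives the exact small-sample inference the article describes.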
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition "". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis.""-Journal of the American Statistical Association Features newly developed topics and applications of the analysis of longitudinal data Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: ""This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable."" -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
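In the spirit of the book's example-driven approach, a least-squares fit on a small invented data set (the numbers are ours, chosen so the points lie near a line of slope 2) shows the basic mechanics of investigating a relationship between two variables:

```python
import numpy as np

# Hypothetical data: does advertising spend (x) relate to sales (y)?
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit y = b0 + b1*x by ordinary least squares.
A = np.column_stack([np.ones_like(x), x])
(b0, b1), *_ = np.linalg.lstsq(A, y, rcond=None)

# R-squared: fraction of the variation in y explained by the fit.
resid = y - (b0 + b1 * x)
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
```

The fitted slope lands near 2 and R-squared near 1; the book's point is that the judgment calls (diagnostics, outliers, model choice) come after this mechanical step.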
2014-01-01
M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller-based design in conjunction with sensors, WEAS measures, calcu...
Slice hyperholomorphic Schur analysis
Alpay, Daniel; Sabadini, Irene
2016-01-01
This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Trend Analysis Using Microcomputers.
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Automation of activation analysis
International Nuclear Information System (INIS)
Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.
1985-01-01
The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, and especially the technique envisaging the use of short-lived isotopes, are given. The equipment possibilities to increase dataway carrying capacity, using modern computers for the automation of the analysis and data processing procedure, are shown
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from elementary level and moving to the advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for such courses as real analysis, functional analysis, harmonic analysis etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability evaluations.
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and, for example, serve as a means of supporting the interpretation of interview data...
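As one concrete, hypothetical step in the quantitative side of http log analysis (the log lines, user names, and fields below are invented; real WIS logs differ), counting per-user document requests from Common Log Format lines might look like:

```python
import re
from collections import Counter

# Three invented Common Log Format lines with authenticated users.
LOG = """\
10.0.0.1 - alice [12/Mar/2004:10:01:00 +0100] "GET /report.doc HTTP/1.1" 200 5120
10.0.0.2 - bob [12/Mar/2004:10:02:30 +0100] "POST /upload HTTP/1.1" 302 -
10.0.0.1 - alice [12/Mar/2004:10:05:12 +0100] "GET /report.doc HTTP/1.1" 200 5120
"""

# host, ident(skipped), user, [timestamp], "method path ...", status
pat = re.compile(r'^(\S+) \S+ (\S+) \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

hits = Counter()
for line in LOG.splitlines():
    m = pat.match(line)
    if m:
        host, user, ts, method, path, status = m.groups()
        hits[(user, path)] += 1

print(hits[("alice", "/report.doc")])  # 2
```

Counts like these would then be triangulated with interviews and observation, as the article argues, rather than interpreted on their own.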
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
International Nuclear Information System (INIS)
Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana
2010-01-01
VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.
Cost benefit analysis cost effectiveness analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The comparison of various protection options in order to determine which is the best compromise between cost of protection and residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant for the greater part of ALARA decisions, which require the use of a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data.
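A toy sketch of how the two techniques rank protection options (the option names, costs, averted doses, and the monetary value per person-sievert below are all invented for illustration; real ALARA studies use site-specific figures):

```python
# Hypothetical protection options: (name, annual cost in k$, dose averted in person-Sv).
options = [
    ("O1", 10.0, 0.05),
    ("O2", 25.0, 0.20),
    ("O3", 60.0, 0.30),
]

ALPHA = 200.0  # assumed monetary value of one averted person-Sv, in k$

# Cost-effectiveness: cost per unit of dose averted (lower is better).
ce = {name: cost / dose for name, cost, dose in options}

# Cost-benefit: net benefit = monetised dose averted minus protection cost.
cb = {name: ALPHA * dose - cost for name, cost, dose in options}

best_ce = min(ce, key=ce.get)
best_cb = max(cb, key=cb.get)
```

With these invented figures both criteria pick the same option, but in general the two techniques can disagree, which is exactly why a study comparing them on a common set of options is informative.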
International Nuclear Information System (INIS)
Sommer, S; Tinh Tran, T.
2008-01-01
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process
Functional analysis and applications
Siddiqi, Abul Hasan
2018-01-01
This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....
Analysis apparatus and method of analysis
International Nuclear Information System (INIS)
1976-01-01
A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique
International Nuclear Information System (INIS)
Dougherty, E.M.; Fragola, J.R.
1988-01-01
The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis, and includes examples of the application of the systems approach
Emission spectrochemical analysis
International Nuclear Information System (INIS)
Rives, R.D.; Bruks, R.R.
1983-01-01
The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials, and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy and X-ray fluorescence analysis are generalized
International Nuclear Information System (INIS)
Crawford, H.J.; Lindstrom, P.J.
1983-06-01
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book takes a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Fast neutron activation analysis
International Nuclear Information System (INIS)
Pepelnik, R.
1986-01-01
Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with those of other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and materials physics, as well as sources of systematic errors in trace analysis are described. (orig.) [de
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The ATLAS Analysis Architecture
International Nuclear Information System (INIS)
Cranmer, K.S.
2008-01-01
We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem: it looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Geospatial Data Analysis Facility
Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...
National Research Council Canada - National Science Library
Gilbert, John
1984-01-01
... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.
2004-01-01
The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
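The summary-statistics idea described above can be sketched with a minimal fixed-effect meta-analysis: per-study effect estimates are pooled by inverse-variance weighting. The effect sizes and standard errors below are invented for illustration:

```python
# Minimal fixed-effect meta-analysis sketch: pool per-study effect
# estimates using inverse-variance weights (illustrative values only).
import math

# (effect estimate, standard error) reported by each hypothetical study
studies = [(0.30, 0.10), (0.45, 0.15), (0.25, 0.08)]

weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f}")
```

Only the summary statistics enter the computation, mirroring the point made in the abstract that the original data are generally not needed.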
International Nuclear Information System (INIS)
Hahn, A.A.
1994-11-01
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Activation analysis. Detection limits
International Nuclear Information System (INIS)
Revel, G.
1999-01-01
Numerical data and limits of detection related to the four irradiation modes, often used in activation analysis (reactor neutrons, 14 MeV neutrons, photon gamma and charged particles) are presented here. The technical presentation of the activation analysis is detailed in the paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)
SMART performance analysis methodology
International Nuclear Information System (INIS)
Lim, H. S.; Kim, H. C.; Lee, D. J.
2001-04-01
To ensure the required and desired operation over the plant lifetime, performance analysis for the SMART NSSS design is carried out by means of the analysis methodologies specified for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequences would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures for analyzing the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details of SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis
Contrast analysis : A tutorial
Haans, A.
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
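A contrast analysis of the kind described above can be sketched in a few lines: a set of weights summing to zero is applied to the group means and tested against the pooled within-group variance. The group data and weights below are invented for illustration:

```python
# Contrast analysis sketch: test a predicted difference between group
# means with contrast weights that sum to zero (toy data).
import math

groups = {
    "control": [4.1, 3.9, 4.0, 4.2],
    "low":     [4.8, 5.1, 5.0, 4.9],
    "high":    [6.0, 5.8, 6.1, 5.9],
}
weights = {"control": -1.0, "low": 0.0, "high": 1.0}  # high vs control
assert abs(sum(weights.values())) < 1e-12  # a valid contrast sums to zero

means = {g: sum(xs) / len(xs) for g, xs in groups.items()}
# pooled within-group variance (mean squared error)
ss = sum(sum((x - means[g]) ** 2 for x in xs) for g, xs in groups.items())
df = sum(len(xs) - 1 for xs in groups.values())
mse = ss / df

L = sum(weights[g] * means[g] for g in groups)            # contrast estimate
se = math.sqrt(mse * sum(w**2 / len(groups[g]) for g, w in weights.items()))
print(f"contrast = {L:.2f}, t = {L / se:.2f} on {df} df")
```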
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Marketing research cluster analysis
Directory of Open Access Journals (Sweden)
Marić Nebojša
2002-01-01
One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
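The clustering of cities by demographic profile described above can be sketched with a small hand-rolled k-means, independent of Minitab. The city names and demographic figures below are invented for illustration:

```python
# Toy k-means sketch (pure Python) grouping cities by a two-number
# demographic profile, e.g. (% under 30, median income index).
# All names and numbers are invented for illustration.

profiles = {
    "A": (20.0, 65.0), "B": (22.0, 63.0), "C": (21.0, 66.0),
    "D": (35.0, 40.0), "E": (33.0, 42.0), "F": (36.0, 41.0),
}

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, centers, iters=10):
    groups = {}
    for _ in range(iters):
        # assign each city to its nearest center
        groups = {i: [] for i in range(len(centers))}
        for name, p in points.items():
            i = min(range(len(centers)), key=lambda c: dist2(p, centers[c]))
            groups[i].append(name)
        # recompute centers as the mean profile of each group
        centers = [
            tuple(sum(points[n][d] for n in g) / len(g) for d in range(2))
            if g else centers[i]
            for i, g in groups.items()
        ]
    return groups, centers

groups, centers = kmeans(profiles, centers=[(20.0, 60.0), (35.0, 45.0)])
print(groups)  # two demographic clusters
```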
SWOT ANALYSIS - CHINESE PETROLEUM
Directory of Open Access Journals (Sweden)
Chunlan Wang
2014-01-01
This article was written in early December 2013 and, combining the historical development with the latest data on Chinese petroleum, carries out a SWOT analysis. The paper discusses corporate resources, cost, management, and external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual
F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient
Directory of Open Access Journals (Sweden)
Satu Elo
2014-02-01
Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective descriptions of the data collection method and/or the analysis.
Schraagen, J.M.C.
2000-01-01
Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems, which give a very comprehensive and elegant formulation of complicated physical problems. In the pre-computer age, Limit State analysis also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics.
Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.
2011-01-01
Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
International Nuclear Information System (INIS)
Arien, B.
2000-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Factorial Analysis of Profitability
Georgeta VINTILA; Ilie GHEORGHE; Ioana Mihaela POCAN; Madalina Gabriela ANGHEL
2012-01-01
The DuPont analysis system is based on decomposing the profitability ratio into factors of influence. This paper describes the factorial analysis of profitability based on the DuPont system. Significant importance is given to the impact of various indicators on share value and profitability.
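The DuPont decomposition described above can be sketched directly: return on equity is factored into profit margin, asset turnover and an equity multiplier. The figures below are invented for illustration:

```python
# DuPont decomposition sketch: ROE split into profit margin,
# asset turnover and equity multiplier (illustrative figures).
net_income, sales, assets, equity = 120.0, 1500.0, 1000.0, 400.0

margin = net_income / sales          # profitability of sales
turnover = sales / assets            # efficiency of asset use
leverage = assets / equity           # financial leverage (equity multiplier)

roe = margin * turnover * leverage   # equals net_income / equity
print(f"ROE = {roe:.1%}")
```

The product of the three factors reproduces net income over equity exactly, which is what makes the decomposition useful for tracing which factor drives a change in profitability.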
Spool assembly support analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC and AISC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
International Nuclear Information System (INIS)
Hansen, J.D.
1976-01-01
This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)
Enabling interdisciplinary analysis
L. M. Reid
1996-01-01
New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...
Shot loading platform analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing...
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Activation analysis. Chapter 4
International Nuclear Information System (INIS)
1976-01-01
The principle, sample and calibration standard preparation, activation by neutrons, charged particles and gamma radiation, sample transport after activation, activity measurement, and chemical sample processing are described for activation analysis. Possible applications are shown of nondestructive activation analysis. (J.P.)
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
Ian M. Franks; Mike Hughes
2004-01-01
This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
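The kind of quantitative video analysis described above can be sketched as follows: digitize an object's position frame by frame and convert pixel displacements to speed using an assumed calibration and frame rate. All numbers below are invented for illustration:

```python
# Minimal video-analysis sketch: estimate speed from object positions
# digitized frame by frame (made-up pixel data, assumed scale and fps).
fps = 30.0                    # frames per second of the hypothetical clip
m_per_px = 0.01               # assumed calibration: metres per pixel

xs_px = [100, 106, 112, 118, 124]   # x position in successive frames
dt = 1.0 / fps
speeds = [(b - a) * m_per_px / dt for a, b in zip(xs_px, xs_px[1:])]
mean_speed = sum(speeds) / len(speeds)
print(f"mean speed ≈ {mean_speed:.2f} m/s")
```

The same frame-difference arithmetic underlies checks of whether an online video is genuine: implausible accelerations show up immediately in the frame-to-frame speeds.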
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
International Nuclear Information System (INIS)
Ishii, Keizo
1997-01-01
Elemental analysis based on particle induced x-ray emission (PIXE) is a novel technique to analyze trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be simultaneously analyzed, and a few tens of micrograms of sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets etc.). Fundamentals of the PIXE analysis are described here: the production of characteristic x-rays and inner shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of the PIXE analysis, the detection limit of PIXE analysis, etc. (author)
International Nuclear Information System (INIS)
Porten, D.R.; Crowe, R.D.
1994-01-01
The purpose of this accident safety analysis is to document in detail the analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of chlorine; Power availability and reliability; and Ashfall
International Nuclear Information System (INIS)
Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.
2012-01-01
Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis on the beamlines and for offline data analysis that users can run during experiments or take home. The tool includes support for data visualization and workflows. Workflows allow algorithms that exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the workflow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows workflows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. The workbench is currently being tested on a selected number of beamlines at the ESRF. (authors)
J Olive, David
2017-01-01
This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...
Field, Michael
2017-01-01
This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry. Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...
Real analysis and applications
Botelho, Fabio Silva
2018-01-01
This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.
Nonactivation interaction analysis. Chapter 5
International Nuclear Information System (INIS)
1976-01-01
Analyses are described including the alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, Moessbauer effect application and an analysis based on the application of radiation ionizing effects. (J.P.)
Is activation analysis still active?
International Nuclear Information System (INIS)
Chai Zhifang
2001-01-01
This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). Comparisons of neutron activation analysis with other analytical techniques are also made. (author)
International Nuclear Information System (INIS)
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GAULT, G.W.
1999-10-13
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.
Containment vessel stability analysis
International Nuclear Information System (INIS)
Harstead, G.A.; Morris, N.F.; Unsal, A.I.
1983-01-01
The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)
International Nuclear Information System (INIS)
Thompson, W.A. Jr.
1979-11-01
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent, whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.
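The competing-risks point above can be illustrated with a standard constant-hazard model (an assumption of this sketch, not something the paper specifies): if the competing causes have independent exponential occurrence times, the probability that cause i strikes first is its rate divided by the sum of all rates.

```python
import random

# Competing risks with constant hazards (illustrative assumption):
# independent exponential times T_i ~ Exp(rate_i); cause i "wins" if T_i is smallest.
# Analytically, P(cause i first) = rate_i / sum(rates).

def first_cause_probabilities(rates):
    total = sum(rates)
    return [r / total for r in rates]

def simulate_first_cause(rates, n=200_000, seed=1):
    random.seed(seed)
    wins = [0] * len(rates)
    for _ in range(n):
        times = [random.expovariate(r) for r in rates]
        wins[times.index(min(times))] += 1
    return [w / n for w in wins]

rates = [0.5, 1.0, 2.5]                    # hypothetical hazard rates per unit time
print(first_cause_probabilities(rates))    # [0.125, 0.25, 0.625]
print(simulate_first_cause(rates))         # close to the analytic values
```

The simulation confirms the closed form; in a time-dependent risk analysis the rates themselves would vary with time, which is precisely the refinement the paper argues for over a per-demand approach.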
International Nuclear Information System (INIS)
Kartiwa Sumadi; Yayah Rohayati
1996-01-01
The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared to grain-counting microscopic analysis.
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals who understand how to best approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and
International Nuclear Information System (INIS)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-01
This textbook describes instrument analysis in an accessible way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement practice, and example experiments), centrifugation, absorptiometry, fluorescence methods, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high-performance liquid chromatography and liquid chromatography-mass spectrometry, electrophoresis (practical cases, analysis of results, and examples), PCR (principle, devices, applications, and examples), and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA, and ELISA readers).
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Factor analysis and scintigraphy
International Nuclear Information System (INIS)
Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.
1976-01-01
The goal of factor analysis is usually to achieve reduction of a large set of data, extracting essential features without a previous hypothesis. Owing to the development of computerized systems, the use of larger samplings, the possibility of sequential data acquisition, and the increase in dynamic studies, the problem of data compression can now be encountered in routine work. Thus, results obtained for compression of scintigraphic images were presented first. Then the possibilities offered by factor analysis for scan processing were discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, was considered for compression and processing [fr]
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
International Nuclear Information System (INIS)
Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.
2001-01-01
Elemental, metallographic, and phase analyses were performed in order to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate, and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which allowed the type of alloy or alloys formed to be identified. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching-it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
Fundamentals of mathematical analysis
Paul J Sally, Jr
2013-01-01
This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Plasma data analysis using statistical analysis system
International Nuclear Information System (INIS)
Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.
1987-01-01
Multivariate factor analysis has been applied to a plasma database of REPUTE-1. The characteristics of the reverse-field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters, which are described in the report. The well-known scaling laws F_χ ∝ I_p, T_e ∝ I_p, and τ_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab
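The core idea of the abstract above, that many measured plasma quantities are explained by a few independent parameters, can be sketched numerically. Note this sketch uses PCA via singular value decomposition as a simple stand-in for true factor analysis, and all dimensions and variable names are illustrative, not taken from the report.

```python
import numpy as np

# Sketch: many observed quantities driven by a few latent parameters.
# PCA via SVD is a stand-in for the factor analysis in the report;
# the shot counts and dimensions below are invented for illustration.
rng = np.random.default_rng(0)
n_shots, n_latent, n_observed = 500, 4, 12

latent = rng.normal(size=(n_shots, n_latent))        # 4 independent parameters
mixing = rng.normal(size=(n_latent, n_observed))     # how they appear in the data
observed = latent @ mixing + 0.01 * rng.normal(size=(n_shots, n_observed))

centered = observed - observed.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
explained = (s**2) / (s**2).sum()
print(explained[:5])  # the first 4 components carry essentially all the variance
```

The sharp drop in explained variance after the fourth component is the kind of evidence that supports a "four independent parameters" conclusion.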
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in the memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012), provided with the aim of ensuring that subsequent modeling is consistent with the EIS.
He, Jingrui
2012-01-01
This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
International Nuclear Information System (INIS)
1981-09-01
Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de]
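The quantitative step described above, turning basic-event failure probabilities into the probability of the undesirable top event through AND/OR gate logic, can be sketched for a small hypothetical tree (the gate structure and probabilities below are invented for illustration, not from the document):

```python
# Hypothetical fault tree: TOP = OR(AND(A, B), C)
# Basic events are assumed independent, with failure probabilities p.

def p_and(*ps):
    # AND gate: all inputs must fail
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    # OR gate: at least one input fails = 1 - (all inputs survive)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p = {"A": 0.01, "B": 0.02, "C": 0.001}
top = p_or(p_and(p["A"], p["B"]), p["C"])
print(top)  # 1 - (1 - 2e-4)(1 - 1e-3) = 0.0011998
```

The same two gate functions compose to evaluate arbitrarily nested trees, which is exactly the "systematic identification of all possible failure combinations" the abstract refers to, in its simplest independent-events form.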
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of possible analytical techniques available with ion beams, we will restrict ourselves to ion beam analysis with ion beams in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS – Rutherford Back Scattering) or the recoiled atoms themselves (ERDA – Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic of the element in question. Particle Induced X-ray Emission (PIXE) has been shown to be a fast technique for the analysis of elements with an atomic number above 11.
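The element specificity of RBS mentioned above comes from the kinematic factor K = E_scattered/E_incident of the elastic collision, which depends only on the projectile/target mass ratio and the scattering angle. A sketch (masses as approximate integer values in amu; the 170° angle is a typical backscattering geometry, chosen here for illustration):

```python
import math

def kinematic_factor(m1, m2, theta_deg):
    """RBS kinematic factor K = E_scattered / E_incident for a projectile of
    mass m1 elastically scattered from a target atom of mass m2 (m2 >= m1)
    at laboratory scattering angle theta."""
    t = math.radians(theta_deg)
    root = math.sqrt(m2**2 - (m1 * math.sin(t))**2)
    return ((m1 * math.cos(t) + root) / (m1 + m2)) ** 2

# A 4He projectile backscattered at 170 degrees from a few target elements:
for m2, name in [(28, "Si"), (63, "Cu"), (197, "Au")]:
    print(name, round(kinematic_factor(4, m2, 170.0), 3))
# K grows toward 1 for heavier targets, which is what makes the
# backscattered energy spectrum element-specific.
```

Because each element returns the projectile at a distinct fraction of its incident energy, peaks in the backscattered energy spectrum map directly to target masses.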
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid
2016-01-01
However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this, conceptual and formal models of social data are presented, together with an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.
Full closure strategic analysis.
2014-07-01
The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...
Electrical Subsurface Grounding Analysis
International Nuclear Information System (INIS)
J.M. Calle
2000-01-01
The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present the particular stochastic calculus that we have chosen for our modeling: the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself, together with several theoretical results that we have proved for it.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
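The classical starting point of the abstract above, canonical correlation analysis before correlation is swapped for mutual information, can be sketched in a few lines of plain NumPy: the canonical correlations are the singular values of the whitened cross-covariance matrix. The data below are synthetic and purely illustrative.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Classical CCA: singular values of Cxx^(-1/2) Cxy Cyy^(-1/2)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx, Cyy, Cxy = X.T @ X / (n - 1), Y.T @ Y / (n - 1), X.T @ Y / (n - 1)

    def inv_sqrt(C):
        # symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    return np.linalg.svd(inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy), compute_uv=False)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
Y = X @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(1000, 2))
print(canonical_correlations(X, Y))  # leading correlations close to 1
```

Canonical information analysis as described in the abstract replaces the correlation objective with mutual information estimated by kernel density methods; the linear version above is only the baseline it generalizes.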
Qualitative Data Analysis Strategies
Greaves, Kristoffer
2014-01-01
A set of concept maps for qualitative data analysis strategies, inspired by Corbin, JM & Strauss, AL 2008, Basics of qualitative research: Techniques and procedures for developing grounded theory, 3rd edn, Sage Publications, Inc, Thousand Oaks, California.
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
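The data flow the patent-style abstract above describes (component library, model, baseline energy model, energy conservation measures, recommendation) can be sketched as a toy program. All class names, component names, and the conduction-only energy estimate below are invented for illustration; they are not from the described system.

```python
# Hypothetical sketch of the described architecture: a component library feeds
# a building model, a baseline energy estimate is generated, an energy
# conservation measure (ECM) is applied, and the better variant is recommended.

class ComponentLibrary:
    def __init__(self):
        # Illustrative U-values (W/m^2.K) for two window components
        self.components = {"window_double_glazed": {"u_value": 2.8},
                           "window_triple_glazed": {"u_value": 1.6}}

def annual_energy_kwh(u_value, area_m2=100.0, degree_hours=80_000.0):
    # Simplistic conduction-only estimate: Q = U * A * degree-hours
    return u_value * area_m2 * degree_hours / 1000.0

lib = ComponentLibrary()
baseline = annual_energy_kwh(lib.components["window_double_glazed"]["u_value"])
ecm = annual_energy_kwh(lib.components["window_triple_glazed"]["u_value"])
recommendation = ("window_triple_glazed" if ecm < baseline
                  else "window_double_glazed")
print(baseline, ecm, recommendation)  # 22400.0 12800.0 window_triple_glazed
```

A real implementation would replace the one-line energy estimate with a full simulation engine, but the assess-against-baseline-and-recommend loop is the same shape.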
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
What is being tested? Synovial fluid is a thick liquid that acts as a lubricant for the body's ...
Hytönen, Tuomas; Veraar, Mark; Weis, Lutz
The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...
Analysis Streamlining in ATLAS
Heinrich, Lukas; The ATLAS collaboration
2018-01-01
We present recent work within the ATLAS collaboration to centrally provide tools to facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
Wolff, Thomas H; Shubin, Carol
2003-01-01
This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
CSIR Research Space (South Africa)
Khuluse, S
2009-04-01
... (ii) determination of the distribution of the damage, and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
Ziemer, William P
2017-01-01
This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...
International Nuclear Information System (INIS)
Holland, W.E.
1980-02-01
A method was developed to determine if boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility.
Stakeholder Analysis Worksheet
A worksheet that can be used to document potential stakeholder groups, the information or expertise they hold, the role that they can play, and their interests or concerns about the HIA.
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Energy Technology Data Exchange (ETDEWEB)
Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-23
This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.
Biodiesel Emissions Analysis Program
Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.
Introduction to global analysis
Kahn, Donald W
2007-01-01
This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.
Biorefinery Sustainability Analysis
DEFF Research Database (Denmark)
J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist
2017-01-01
This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use, and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability assessment frameworks are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
Samir, E.; Fonseca, E.; Baldyga, N.; Acosta, A.; Gonzalez, F.; Felicita, F.; Tomasso, M.; Esquivel, D.; Parada, A.; Enriquez, P.; Amilibia, M.
2012-01-01
This workshop concerned the evaluation of the impact of pesticides on vegetable matrices, with the purpose of determining them by GC/MS. The working materials were a lettuce matrix, chard, and a mix of green leaves with pesticides.
Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
National Research Council Canada - National Science Library
1998-01-01
Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...
Main: Nucleotide Analysis [KOME]
Lifescience Database Archive (English)
Full Text Available Nucleotide Analysis: Japonica genome blast search result. Result of blastn search against japonica genome sequence. kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, and to enhancement and robustness, are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is briefly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for that task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today's results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Scientific stream pollution analysis
National Research Council Canada - National Science Library
Nemerow, Nelson Leonard
1974-01-01
A comprehensive description of the analysis of water pollution that presents a careful balance of the biological,hydrological, chemical and mathematical concepts involved in the evaluation of stream...
International Nuclear Information System (INIS)
Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S
2010-01-01
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Invitation to classical analysis
Duren, Peter
2012-01-01
This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ
Analysis of irradiated materials
International Nuclear Information System (INIS)
Bellamy, B.A.
1988-01-01
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These varied from reactor component materials, materials associated with the Authority's radwaste disposal programme, fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format. Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.
International Nuclear Information System (INIS)
Niehaus, F.
1988-01-01
In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis, probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers per year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety-related information. 19 tabs. (author)
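The fault tree computation described in this abstract can be sketched in a few lines (a generic illustration under the usual independence assumption, not the program the paper describes): an AND gate multiplies the failure probabilities of its inputs, while an OR gate combines them through the complement of the joint survival probability.

```python
def and_gate(probs):
    """P(top event) when ALL independent inputs must fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(top event) when ANY independent input failing suffices."""
    survive = 1.0
    for q in probs:
        survive *= (1.0 - q)
    return 1.0 - survive

# Hypothetical example: a system fails if its power supply fails (1e-3)
# OR both redundant valves fail (1e-2 each).
p_top = or_gate([1e-3, and_gate([1e-2, 1e-2])])
```

With these hypothetical numbers the redundant valve pair contributes only 1e-4, so the single power supply dominates the top-event probability, which is the kind of insight fault tree analysis is used to extract.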
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems; participation in the international PHEBUS-FP programme for severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported.
Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A
2016-10-01
Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Tibari, Elghali; Taous, Fouad; Marah, Hamid
2014-01-01
This report presents results of stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium analyses of water were performed by infrared laser spectroscopy using an LGR DLT-100 with an autosampler. The results are expressed as δ values (‰) relative to V-SMOW, to ±0.3‰ for oxygen-18 and ±1‰ for deuterium.
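The δ notation used in this record follows the standard definition (stated here as background, not as part of the report): the per-mil deviation of the sample's isotope abundance ratio from that of the V-SMOW standard,

```latex
\delta = \left(\frac{R_{\text{sample}}}{R_{\text{V-SMOW}}} - 1\right)\times 10^{3}\ \text{\textperthousand}
```

where $R$ is the heavy-to-light isotope abundance ratio ($^{18}\mathrm{O}/^{16}\mathrm{O}$ for δ¹⁸O, $^{2}\mathrm{H}/^{1}\mathrm{H}$ for δ²H), so the quoted ±0.3‰ and ±1‰ are uncertainties on these relative deviations.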
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies operate. The authors argue that the elements in the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...
Forensic neutron activation analysis
International Nuclear Information System (INIS)
Kishi, T.
1987-01-01
The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
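The probability-of-failure computation at the heart of such a program can be sketched with a crude Monte Carlo estimator (a minimal illustration of the general idea, not NESSUS code; NESSUS itself uses more efficient algorithms such as the advanced mean value method and adaptive importance sampling). Failure is the event that a limit-state function g falls at or below zero.

```python
import random

def probability_of_failure(limit_state, sample, n=100_000, seed=1):
    """Crude Monte Carlo estimate of P(g <= 0).
    limit_state: callable mapping one sampled input to g; failure when g <= 0.
    sample: callable taking an RNG and returning one random input realization."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0)
    return failures / n

# Hypothetical example: capacity R ~ N(10, 1) versus load S ~ N(7, 1); g = R - S.
pf = probability_of_failure(
    lambda x: x[0] - x[1],
    lambda rng: (rng.gauss(10, 1), rng.gauss(7, 1)),
)
```

For rare events (small pf), plain sampling needs very many realizations, which is exactly why variance-reduction schemes like importance sampling matter for large structural models.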
Directory of Open Access Journals (Sweden)
Iulian N. BUJOREANU
2011-01-01
Full Text Available Sensitivity analysis represents such a well-known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of the possibility of generating the future and analyzing it before it unfolds so that, when it happens, it brings less uncertainty.
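The simplest way sensitivity analysis is implemented in decision-making models is one-at-a-time perturbation: vary each input around a base point and measure the change in the output. A minimal sketch (a generic illustration with a hypothetical model, not from the paper):

```python
def one_at_a_time_sensitivity(model, base, delta=0.01):
    """Finite-difference sensitivity of the model output to each input,
    perturbing one parameter at a time around the base point."""
    y0 = model(base)
    sens = {}
    for name, value in base.items():
        step = delta * (abs(value) if value else 1.0)  # relative step
        perturbed = dict(base)
        perturbed[name] = value + step
        sens[name] = (model(perturbed) - y0) / step
    return sens

# Hypothetical allocation model: linear in cost, quadratic in demand.
model = lambda p: 3 * p["cost"] + p["demand"] ** 2
s = one_at_a_time_sensitivity(model, {"cost": 10.0, "demand": 2.0})
```

The resulting per-parameter derivatives show which inputs most strongly drive the decision variable; variance-based methods (e.g. Sobol' indices) extend this to interactions, at higher computational cost.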
International Nuclear Information System (INIS)
Grimanis, A.P.
1985-01-01
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center "Demokritos". Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extended charged-particle activation analysis programme has been carried out over the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions and proton activation analysis. A special neutron activation method, delayed fission neutron counting, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples from the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples
Integrated genetic analysis microsystems
International Nuclear Information System (INIS)
Lagally, Eric T; Mathies, Richard A
2004-01-01
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The special literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such
Professionalizing Intelligence Analysis
Directory of Open Access Journals (Sweden)
James B. Bruce
2015-09-01
Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander
2015-01-01
Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
International Nuclear Information System (INIS)
Sitek, J.; Degmova, J.; Dekan, J.
2011-01-01
The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find: the last observed meteorite fall in Slovakia was in 1895. For this kind of meteorite it is presumed that the orbit in cosmic space can be calculated, which is one of the most important aspects, because until now only 13 meteorite finds worldwide have had their cosmic orbits determined. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Kosice meteorite will also address long-lived and short-lived nuclides; the results should contribute to the determination of its radiation and formation ages. Structural analysis of the meteorite will make it possible to compare it with similar types of meteorites. In this work, Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying the magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with a hyperfine magnetic field of 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite), and the second, with a field of 31.5 T, to FeS (troilite). Meteorites with this composition belong to the mineral group of chondrites. Comparing our parameters with measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components; according to all Moessbauer parameters, it can likewise be classified in the mineral group of chondrites. (authors)
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
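The rotating-vector view mentioned above can be sketched numerically. The following minimal illustration (frequency and sample count are arbitrary choices, not taken from the book) verifies that the difference of a counter-clockwise and a clockwise rotating complex vector reproduces a real sine signal:

```python
import numpy as np

# A real sine signal as the difference of two counter-rotating complex
# vectors: sin(2*pi*f*t) = (e^{+i 2 pi f t} - e^{-i 2 pi f t}) / (2i).
f = 3.0                              # frequency in Hz (arbitrary choice)
t = np.linspace(0.0, 1.0, 1000)

ccw = np.exp(+2j * np.pi * f * t)    # counter-clockwise rotating vector
cw = np.exp(-2j * np.pi * f * t)     # clockwise rotating vector
reconstructed = (ccw - cw) / 2j      # purely real: equals sin(2*pi*f*t)

assert np.allclose(reconstructed.imag, 0.0)
assert np.allclose(reconstructed.real, np.sin(2.0 * np.pi * f * t))
```

Varying the amplitude or frequency of the two vectors, as the book's applets allow interactively, changes the reconstructed sine in the corresponding way.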
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method, and (4) to participate in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays, there are a lot of surfaces analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and in investment, these methods have to be used deliberately. This article appeals to non specialists. It gives some elements of choice according to the studied information, the sensitivity, the use constraints or the answer to a precise question. After having recalled the fundamental principles which govern these analysis methods, based on the interaction between radiations (ultraviolet, X) or particles (ions, electrons) with matter, two methods will be more particularly described: the Auger electron spectroscopy (AES) and x-rays photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easier for use and probably the most productive for the analysis of surface of industrial materials or samples submitted to treatments in aggressive media. (O.M.)
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
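The fault-tree roll-up described above (deriving system reliability from component reliability) can be sketched as follows. The gate formulas are standard; the converter structure and all failure probabilities are hypothetical, not taken from the report:

```python
from math import prod

# Fault-tree gates rolling component failure probabilities up to a system
# failure probability (system reliability = 1 - F_sys).
def or_gate(failure_probs):
    """Top event occurs if ANY input fails (series components)."""
    return 1.0 - prod(1.0 - f for f in failure_probs)

def and_gate(failure_probs):
    """Top event occurs only if ALL inputs fail (redundant components)."""
    return prod(failure_probs)

# Hypothetical converter: two redundant gate drivers (AND gate), in series
# with a DC-link capacitor and a controller (OR gate). Probabilities are
# made-up per-mission values for illustration, not field data.
driver_pair = and_gate([0.02, 0.02])
system_failure = or_gate([driver_pair, 0.01, 0.005])
system_reliability = 1.0 - system_failure
```

Field maintenance data, where available, would replace the assumed component probabilities and let the same roll-up identify which cause dominates downtime and cost.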
International Nuclear Information System (INIS)
Gregg, H.R.; Meltzer, M.P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig
Jorgensen, Palle
2017-01-01
The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C*-algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift invariant spaces, group action in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there exists also an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domain. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra, which also...
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
... it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data ... of the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis ... The application of Gabor expansions to image representation is considered in Sect. 6.
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
Software safety hazard analysis
International Nuclear Information System (INIS)
Lawrence, J.D.
1996-02-01
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper
Vágner, Petr; Pavelka, Michal; Maršík, František
2017-04-01
The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
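The Gouy-Stodola statement summarized above can be written compactly. The notation below ($T_0$ for the ambient/dead-state temperature, $\dot{S}_{\mathrm{gen}}$ for the entropy production rate) is a conventional choice, not taken from the paper:

```latex
% Gouy-Stodola: the useful power lost to irreversibility equals the
% ambient temperature times the entropy production rate inside the device,
% and coincides with the rate of exergy destruction.
\[
  \dot{W}_{\mathrm{lost}}
  \;=\; \dot{W}_{\mathrm{rev}} - \dot{W}_{\mathrm{actual}}
  \;=\; T_0\,\dot{S}_{\mathrm{gen}}
  \;=\; \dot{E}x_{\mathrm{destroyed}}.
\]
% Maximum useful power thus corresponds to S_gen = 0 (reversible operation),
% which is the optimization premise the paper examines critically.
```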
Directory of Open Access Journals (Sweden)
Sutawanir Darwis
2012-05-01
Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations due to the effect of treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. For a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
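The idea of down-weighting unusual observations in a decline fit can be sketched with iteratively reweighted least squares (IRLS) using Huber weights. This is a minimal illustration on a synthetic exponential decline, not the paper's method (which uses the lmRobMM MM-estimator in R on hyperbolic type curves); the data, decline model, and tuning constant are illustrative assumptions:

```python
import numpy as np

def robust_decline_fit(t, q, c=1.345, n_iter=20):
    """Fit q(t) = qi * exp(-D*t) via IRLS with Huber weights on log-rate."""
    y = np.log(q)
    X = np.column_stack([np.ones_like(t), t])
    w = np.ones_like(y)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale via MAD
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)            # Huber weights
    qi, D = np.exp(beta[0]), -beta[1]
    return qi, D

# Synthetic daily-rate data: true qi = 100, true D = 0.15, small noise,
# plus a short burst of unusually high rates mimicking a well treatment.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 60)
q = 100.0 * np.exp(-0.15 * t) * np.exp(rng.normal(0.0, 0.02, t.size))
q[25:28] *= 1.8
qi, D = robust_decline_fit(t, q)
```

Because the treated-period observations receive small Huber weights, the recovered decline rate stays close to the true value despite the outliers; an ordinary least-squares fit on the same data would be pulled toward them.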
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Trajectory Based Traffic Analysis
DEFF Research Database (Denmark)
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...
International Nuclear Information System (INIS)
Johnstad, H.
1989-06-01
The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high level graphics package, based on basic underlying graphics. 3-D graphics capabilities are being implemented. The major objects in PAW are 1 or 2 dimensional binned event data with fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands, or in a tree structure menu driven mode. 6 refs., 1 fig
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
Banks, David L; Rios Insua, David
2015-01-01
Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...
COMPUTER METHODS OF GENETIC ANALYSIS.
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
The paper presents the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis, and analysis of allelic associations. Software supporting the implementation of these methods was developed.
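As a minimal illustration of segregation analysis, the sketch below tests observed offspring phenotype counts against an expected Mendelian 3:1 ratio with a one-degree-of-freedom chi-square test. The counts and the choice of test are illustrative assumptions, not taken from the paper:

```python
from math import erfc, sqrt

def segregation_test(n_dominant, n_recessive, ratio=(3, 1)):
    """Chi-square goodness-of-fit of two phenotype counts to a given ratio."""
    n = n_dominant + n_recessive
    exp_dom = n * ratio[0] / sum(ratio)
    exp_rec = n * ratio[1] / sum(ratio)
    chi2 = ((n_dominant - exp_dom) ** 2 / exp_dom
            + (n_recessive - exp_rec) ** 2 / exp_rec)
    p = erfc(sqrt(chi2 / 2.0))   # survival function of chi-square, 1 d.f.
    return chi2, p

# Illustrative counts in the classic 3:1 neighborhood (dominant : recessive).
chi2, p = segregation_test(705, 224)
```

A large p-value here means the counts are consistent with simple Mendelian segregation of the trait; real segregation analysis additionally handles ascertainment bias and larger pedigrees.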
International Nuclear Information System (INIS)
Strait, R.S.
1996-01-01
The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to the analysis of cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies.
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Fourier analysis an introduction
Stein, Elias M
2003-01-01
This first volume, a three-part introduction to the subject, is intended for students with a beginning knowledge of mathematical analysis who are motivated to discover the ideas that shape Fourier analysis. It begins with the simple conviction that Fourier arrived at in the early nineteenth century when studying problems in the physical sciences--that an arbitrary function can be written as an infinite sum of the most basic trigonometric functions.The first part implements this idea in terms of notions of convergence and summability of Fourier series, while highlighting applications such as th
DEFF Research Database (Denmark)
Reinau, Kristian Hegner
Traditionally, focus in the transport field, both politically and scientifically, has been on private cars and public transport, while freight transport has been a neglected topic. Recent years have seen an increased focus upon congestion as a core issue across Europe, resulting in a great need for knowledge ... speed data for freight. Secondly, the analytical methods used, space-time cubes and emerging hot spot analysis, are also new in the freight transport field. The analysis thus estimates precisely how fast freight moves on the roads in Northern Jutland and how this has evolved over time.
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther