WorldWideScience

Sample records for development analysis method

  1. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analysis was performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using these results, we developed a static prototype of the advanced HSI, together with human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, as well as analyses for the design of information structures and interaction structures, will be necessary.

  2. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    Yoo, Bong; Lee, Jae-Han; Koo, Gyeng-Hoi

    2002-01-01

    This report briefly describes KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out during 1996-1999 under the IAEA Coordinated Research Programme (CRP) on the intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures. The aim was to develop numerical analysis methods and to compare their results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by the participating countries. Considerable progress in the analysis procedures for isolation bearings and isolated nuclear structures was made throughout the IAEA CRP, and the analysis methods developed can be further improved for future nuclear facility applications. (author)

  3. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
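
    The contrast drawn above between the two approaches can be illustrated with a toy design problem. The sketch below is illustrative only: the capacity grid, the cost and price coefficients, the lognormal reserve distribution, and the scenario weights in the stochastic-programming branch are all invented stand-ins for the paper's reservoir models.

```python
import numpy as np

rng = np.random.default_rng(0)
capacities = np.linspace(50, 200, 16)      # candidate designs (arbitrary units)
capex_per_unit = 2.0                       # invented cost coefficient
price = 5.0                                # invented gas price

def npv(capacity, reserve):
    """Toy NPV: revenue limited by both plant capacity and realized reserve."""
    production = np.minimum(capacity, reserve)
    return price * production - capex_per_unit * capacity

# Monte Carlo: simulate many outcomes for every candidate design
reserves = rng.lognormal(mean=np.log(120), sigma=0.4, size=10_000)
mc_expected = [npv(c, reserves).mean() for c in capacities]
mc_best = capacities[int(np.argmax(mc_expected))]

# Stochastic programming (scenario form): optimize the expectation over a
# small set of weighted scenarios in a single problem
scenarios = np.quantile(reserves, [0.1, 0.5, 0.9])   # low / mid / high
weights = np.array([0.3, 0.4, 0.3])
sp_expected = [(weights * npv(c, scenarios)).sum() for c in capacities]
sp_best = capacities[int(np.argmax(sp_expected))]

print(f"MC optimum: {mc_best:.0f}, SP optimum: {sp_best:.0f}")
# the MC run also yields the full NPV distribution at the optimum "for free"
print("P10/P90 NPV at MC optimum:",
      np.percentile(npv(mc_best, reserves), [10, 90]).round(1))
```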

  4. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  5. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It begins by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. First, learning and instructional technologies are discussed as visions of the future. Second, the importance of task analysis methods in rapid e-learning is considered, together with asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, the steps for effective task analysis and rapid training development based on learning and instructional design approaches, such as m-learning and other delivery systems, are discussed. The paper concludes with the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design, alongside learner interface design features for learners and designers.

  6. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impact of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.

  7. Development of motion image prediction method using principal component analysis

    International Nuclear Information System (INIS)

    Chhatkuli, Ritu Bhusal; Demachi, Kazuyuki; Kawai, Masaki; Sakakibara, Hiroshi; Kamiaka, Kazuma

    2012-01-01

    Respiratory motion limits the accuracy of the irradiated area during lung cancer radiation therapy. Many methods have been introduced to minimize the irradiation of healthy tissue caused by lung tumor motion. The purpose of this research is to develop an algorithm that improves image-guided radiation therapy by predicting motion images. We predict the motion images using principal component analysis (PCA) and the multi-channel singular spectral analysis (MSSA) method. The images/movies were successfully predicted and verified using the developed algorithm. With the proposed prediction method it is possible to forecast the tumor images over the next breathing period. Implementing this method in real time is believed to be significant for higher-level tumor tracking, including the detection of sudden abdominal changes during radiation therapy. (author)
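
    As a rough illustration of the PCA half of the approach, the sketch below projects a frame sequence onto a few principal components and extrapolates the component scores to synthesize the next frame. The linear extrapolation stands in for the paper's MSSA-based temporal prediction, and the data are synthetic.

```python
import numpy as np

def predict_next_frame(frames, n_components=3):
    """frames: array (T, H, W) of a breathing sequence; returns predicted frame T+1."""
    T = frames.shape[0]
    X = frames.reshape(T, -1).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # PCA via thin SVD: rows of Vt are the principal components (eigen-images)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T            # (T, n_components)
    # stand-in temporal model: linearly extrapolate each score series
    next_scores = 2 * scores[-1] - scores[-2]
    return (mean + next_scores @ Vt[:n_components]).reshape(frames.shape[1:])

# usage on a synthetic "breathing" sequence
t = np.arange(40)
frames = np.sin(0.3 * t)[:, None, None] * np.ones((40, 8, 8))
pred = predict_next_frame(frames)
```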

  8. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated against the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  9. Development and analysis of finite volume methods

    International Nuclear Information System (INIS)

    Omnes, P.

    2010-05-01

    This document is a synthesis of a set of works concerning the development and analysis of finite volume methods used for the numerical approximation of partial differential equations (PDEs) stemming from physics. The first part deals with co-localized Godunov-type schemes for the Maxwell and wave equations, with a study of the loss of precision of this scheme at low Mach number. In the second part, discrete differential operators are built on fairly general, in particular very distorted or nonconforming, two-dimensional meshes. These operators are used to approximate the solutions of PDEs modelling diffusion, electrostatics, magnetostatics and electromagnetism by the discrete duality finite volume (DDFV) method on staggered meshes. The third part presents the numerical analysis and some a priori and a posteriori error estimates for the discretization of the Laplace equation by the DDFV scheme. The last part is devoted to the order of convergence, in the L2 norm, of the finite volume approximation of the solution of the Laplace equation in one dimension, and on meshes with orthogonality properties in two dimensions. Necessary and sufficient conditions on the mesh geometry and on the regularity of the data are provided that ensure second-order convergence of the method. (author)
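
    The final result quoted above (second-order L2 convergence for the Laplace equation) can be checked numerically in the 1D case. The sketch below assumes a uniform cell-centered mesh with Dirichlet boundaries closed by half-cell fluxes; it is a generic finite volume discretization, not the DDFV scheme itself.

```python
import numpy as np

def fv_poisson_error(n):
    """L2 error of a 1D finite volume solve of -u'' = f, u(0) = u(1) = 0."""
    h = 1.0 / n
    x = (np.arange(n) + 0.5) * h                    # cell centers
    f = np.pi**2 * np.sin(np.pi * x)                # chosen so u = sin(pi x)
    # three-point flux balance per cell
    A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    A[0, 0] = A[-1, -1] = 3 / h**2                  # half-cell Dirichlet closure
    u = np.linalg.solve(A, f)
    return np.sqrt(h * np.sum((u - np.sin(np.pi * x))**2))   # discrete L2 norm

for n in (20, 40, 80):
    print(n, fv_poisson_error(n))
# halving h should divide the error by ~4, i.e. second-order convergence
```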

  10. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has long puzzled engineers and designers. At present, several computational methods of design by analysis have been developed and applied for calculating and categorizing the stress field of pressure vessel components, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, and the GLOSS R-Node method. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results calculated with the different methods mentioned above; this is the main reason that has limited the wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by directly analyzing the various failure mechanisms of the pressure vessel structure based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and the nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized as a guide to selecting the proper computational method when designing a pressure vessel component by analysis. (authors)
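
    One concrete ingredient of the stress classification task described above is stress linearization along a through-thickness line: the membrane component is the average of the stress profile and the bending component is its moment-equivalent linear part. A minimal sketch for one stress component, with an invented profile:

```python
import numpy as np

def linearize(x, sigma, t):
    """Membrane and surface bending stress from a through-wall profile.
    x: positions across the wall (0..t); sigma: stress samples at x."""
    sigma_m = np.trapz(sigma, x) / t                     # membrane = average
    # bending: equivalent linear distribution with the same moment about mid-wall
    sigma_b = 6.0 / t**2 * np.trapz(sigma * (t / 2.0 - x), x)
    return sigma_m, sigma_b

# example: a quadratic profile across a 20 mm wall (values invented, MPa)
t = 20.0
x = np.linspace(0.0, t, 101)
sigma = 100.0 + 5.0 * x - 0.2 * x**2
print(linearize(x, sigma, t))
```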

  11. Development of a general method for photovoltaic system analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nolay, P

    1987-01-01

    Photovoltaic conversion for energy applications is now widely used, but its development still requires solving many problems related to the sizing and the actual operation of installations. Precise analysis of the behaviour of the components and of the whole system has led to the development of accurate models for simulating such systems. From this modelling phase, a simulation code was built, and the software was validated against experimental test measurements. Since the quality of the software depends on the precision of the input data, an original method for determining component characteristics by means of model identification was developed. These tools permit the prediction of system behaviour and the dynamic simulation of systems under real conditions. Applied to the study of photovoltaic system sizing, this software has allowed the definition of new concepts which will serve as a basis for the development of a sizing method.

  12. Method development for trace and ultratrace analysis

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Method development, that is, selection of a mode of chromatography and the right column and mobile-phase combination for trace and ultratrace analysis, requires several main considerations. The method should be useful for resolving various trace and ultratrace components present in the sample. If the nature of these components is known, the choice of method may be straightforward, that is, a selection can be made from the following modes of HPLC: (1) adsorption chromatography; (2) normal-phase chromatography; (3) reversed-phase chromatography; (4) ion-pair chromatography; (5) ion-exchange chromatography; (6) ion chromatography. Unfortunately, the nature of all of the components is frequently unknown. However, several intelligent judgments can be made on the nature of impurities. This chapter deals with some basic approaches to mobile-phase selection and optimization. More detailed information may be found in basic texts. Techniques for separation of high-molecular-weight compounds (macromolecules) and chiral compounds may be found elsewhere. Mainly compounds with molecular weight lower than 2,000 are discussed here. 123 refs

  13. Development of three-dimensional ENRICHED FREE MESH METHOD and its application to crack analysis

    International Nuclear Information System (INIS)

    Suzuki, Hayato; Matsubara, Hitoshi; Ezawa, Yoshitaka; Yagawa, Genki

    2010-01-01

    In this paper, we describe a method for highly accurate three-dimensional analysis of a crack in a large-scale structure. The Enriched Free Mesh Method (EFMM) improves the accuracy of the Free Mesh Method (FMM), a kind of meshless method. First, we developed an algorithm for the three-dimensional EFMM. An elastic problem was analyzed using the EFMM, and we found that its accuracy compares favourably with the FMM and that the number of CG iterations is smaller. Next, we developed a method for calculating the stress intensity factor employing the EFMM. A structure with a crack was analyzed using the EFMM, and the stress intensity factor was calculated by the developed method. The analysis results agreed very well with the reference solution. It was shown that the proposed method is very effective for the analysis of cracks in large-scale structures. (author)

  14. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of Human Reliability Analysis (HRA), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.
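
    As background to the THERP basis mentioned above, the sketch below evaluates the standard THERP conditional-HEP dependence equations (NUREG/CR-1278). It illustrates the kind of quantification rule being standardized; the Korean standard method itself specifies a much fuller procedure.

```python
def conditional_hep(hep, dependence):
    """Conditional human error probability given failure of the preceding task,
    per the THERP dependence model (NUREG/CR-1278)."""
    rules = {
        "zero":     lambda n: n,
        "low":      lambda n: (1 + 19 * n) / 20,
        "moderate": lambda n: (1 + 6 * n) / 7,
        "high":     lambda n: (1 + n) / 2,
        "complete": lambda n: 1.0,
    }
    return rules[dependence](hep)

# example: a nominal HEP of 0.003 under each dependence level
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s} -> {conditional_hep(0.003, level):.4f}")
```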

  15. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of Human Reliability Analysis (HRA), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  16. Development of TRU waste mobile analysis methods for RCRA-regulated metals

    International Nuclear Information System (INIS)

    Mahan, C.A.; Villarreal, R.; Drake, L.; Figg, D.; Wayne, D.; Goldstein, S.

    1998-01-01

    This is the final report of a one-year Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Glow-discharge mass spectrometry (GD-MS), laser-induced breakdown spectroscopy (LIBS), dc-arc atomic emission spectroscopy (DC-ARC-AES), laser-ablation inductively-coupled-plasma mass spectrometry (LA-ICP-MS), and energy-dispersive x-ray fluorescence (EDXRF) were identified as potential solid-sample analytical techniques for mobile characterization of TRU waste. Each technology developer was provided with surrogate TRU waste samples in order to develop an analytical method. Following successful development of the analytical methods, five performance evaluation samples were distributed to each of the researchers in a blind round-robin format. Results of the round robin were compared to known values and to Transuranic Waste Characterization Program (TWCP) data quality objectives. Only two techniques, DC-ARC-AES and EDXRF, were able to complete the entire project; method development for GD-MS and LA-ICP-MS was halted due to the stand-down at the CMR facility. Results of the round-robin analysis are given for the EDXRF and DC-ARC-AES techniques. While DC-ARC-AES met several of the data quality objectives, the performance of the EDXRF technique far surpassed it. EDXRF is a simple, rugged, field-portable instrument that appears to hold great promise for mobile characterization of TRU waste. The performance of this technique still needs to be tested on real TRU samples in order to assess interferences from actinide constituents. In addition, mercury and beryllium analysis will require another analytical technique, because the EDXRF method failed to meet the TWCP data quality objectives for these elements; mercury is easily analyzed in solid samples by cold vapor atomic fluorescence (CVAFS), and beryllium can be analyzed by any of a variety of emission techniques.

  17. Development of rapid urine analysis method for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kuwabara, J.; Noguchi, H. [Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan)

    2000-05-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires more time than conventional analyses, such as fluorescence analysis, because the matrix must be sufficiently removed from a urine sample. To shorten the time required for urine bioassay by ICP-MS, a rapid uranium analysis method using an ICP-MS connected to a flow injection system was developed. Since this method involves no chemical separation steps, the time required is equivalent to that of conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample; the required volume of urine solution is 5 ml. The main chemical treatment is digestion with 5 ml of nitric acid in a microwave oven, to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of the digested sample solution was adjusted to 10 ml, and the prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system minimizes the matrix effects caused by salts dissolved in high-matrix solutions, such as urine samples without chemical separation, because it introduces only a micro volume of sample solution into the ICP-MS. The ICP-MS measured uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation; the detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples, and no memory effect was observed in the series of measurements. The present analysis method, using ICP-MS equipped with a flow injection system, demonstrated that the analysis time for high-matrix samples such as urine can be substantially shortened.

  18. Development of rapid urine analysis method for uranium

    International Nuclear Information System (INIS)

    Kuwabara, J.; Noguchi, H.

    2000-01-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires more time than conventional analyses, such as fluorescence analysis, because the matrix must be sufficiently removed from a urine sample. To shorten the time required for urine bioassay by ICP-MS, a rapid uranium analysis method using an ICP-MS connected to a flow injection system was developed. Since this method involves no chemical separation steps, the time required is equivalent to that of conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample; the required volume of urine solution is 5 ml. The main chemical treatment is digestion with 5 ml of nitric acid in a microwave oven, to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of the digested sample solution was adjusted to 10 ml, and the prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system minimizes the matrix effects caused by salts dissolved in high-matrix solutions, such as urine samples without chemical separation, because it introduces only a micro volume of sample solution into the ICP-MS. The ICP-MS measured uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation; the detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples, and no memory effect was observed in the series of measurements. The present analysis method, using ICP-MS equipped with a flow injection system, demonstrated that the analysis time for high-matrix samples such as urine can be substantially shortened.

  19. Development and simulation of various methods for neutron activation analysis

    International Nuclear Information System (INIS)

    Otgooloi, B.

    1993-01-01

    Simple methods for neutron activation analysis have been developed. Results are presented from studies of an installation for the determination of fluorine in fluorite ores, directly on the lorry, by fast neutron activation analysis. Nitrogen in organic materials was determined via 14N and 15N activation. The new equipment 'FLUORITE' for a fluorite factory is briefly described. Pu and Be isotopes in organic materials, including wheat, were measured. 25 figs, 19 tabs. (Author, Translated by J.U)

  20. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of Human Reliability Analysis (HRA), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  1. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of Human Reliability Analysis (HRA), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  2. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves are based on ray theory. These methods therefore have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases, such as the PP reflection phase; (2) impossibility of computing the so-called "W phase", the long-period phase arriving before the S wave; (3) implausibility of the hypothesis that the distance from the observation points to the hypocenter is large compared with the fault length. To solve the above problems, we have developed a new method which uses synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green's functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) to compute the Green's functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green's functions computed by DSM were accurate at frequencies up to 1 Hz. Next we applied the new method to the rupture process analysis of the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Sinbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that of the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and the reflection phases near the source, while also accurately computing later phases such as the PP wave.
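
    At its core, the waveform inversion step described above is a linear inverse problem: observed seismograms are modeled as Green's-function responses weighted by subfault slips. The sketch below uses a random stand-in matrix in place of DSM Green's functions, invented dimensions, and a non-negativity constraint in place of the full parametrization of Kikuchi et al. (2003).

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_samples, n_subfaults = 2000, 36
G = rng.standard_normal((n_samples, n_subfaults))   # stand-in Green's functions
m_true = np.where(rng.random(n_subfaults) < 0.3,    # sparse "asperity" slip
                  rng.uniform(1, 3, n_subfaults), 0.0)
d = G @ m_true + 0.05 * rng.standard_normal(n_samples)   # noisy "seismograms"

# non-negative least squares keeps the slip physically one-signed
m_est, residual = nnls(G, d)
print("max slip error:", np.abs(m_est - m_true).max())
```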

  3. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis methods.

  4. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecturek Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of the solid residue left after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy-dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Because of the different fuels used, the different compositions, and the different locations where solid residue forms, it was necessary to develop two methods. The first method is used for identifying the composition of solid residue after fuel oil combustion (Method 1), while the second is used for identifying the composition of solid residue after the combustion of solid fuels, i.e., coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). The CRMs and the analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Development of sample preparation method for honey analysis using PIXE

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Chiba, Keiko; Sera, Koichiro

    2008-01-01

    We developed an original preparation method for honey samples (samples in a paste-like state) specifically designed for PIXE analysis. The results of PIXE analysis of thin targets, prepared by adding a standard containing nine elements to honey samples, demonstrated that the preparation method yields quantitative values of sufficient accuracy. PIXE analysis of 13 kinds of honey was performed, and eight mineral components (Si, P, S, K, Ca, Mn, Cu and Zn) were detected in all honey samples. The principal mineral components were K and Ca, with K accounting for the majority of the total mineral content. The K content of honey varies greatly depending on the plant source: chestnut honey had the highest K content, 2-3 times that of Manuka, which is known as a high-quality honey, while the K content of false-acacia honey, which is produced in the greatest abundance, was 1/20 that of chestnut. (author)

  6. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Lattice composite fuselage structures are being developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The layout of lattice structures exploits the advantages of current composite materials to the maximal extent while minimizing their main shortcomings, which provides higher weight efficiency in comparison with conventional analogues. The development and creation of lattice composite structures requires novel methods of strength analysis, as conventional methods are, as a rule, aimed at the strength analysis of thin-walled elements and cannot give a confident estimation of the local strength of highly loaded unidirectional composite ribs. In the present work, a method for rapid strength analysis of lattice composite structures is presented, based on specialized FE models of unidirectional composite ribs and their intersections. Within the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition that the stiffness characteristics of the real rib and of the model be equal. This approach allows local strength analysis of the highly loaded ribs of a lattice structure without the use of three-dimensional finite elements, which shortens the calculation time and considerably simplifies the analysis of the calculation results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype shell of a lattice composite fuselage section were used. The prototype of the lattice section was manufactured at CRISM and tested at TsAGI within a number of Russian and international scientific projects. The validation results show that the suggested method provides efficient strength analysis while maintaining acceptable accuracy.

  7. Development of high performance liquid chromatography method for miconazole analysis in powder sample

    Science.gov (United States)

    Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.

    2017-02-01

    A simple high performance liquid chromatography (HPLC) method was developed in this study for the analysis of miconazole, an antifungal drug, in a powder sample. The optimized HPLC system, using a C8 column, employed a mobile phase of methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range of 10 to 50 mg/L with an r² of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable to the determination of miconazole in powder samples, with a recovery of 101.28% (RSD = 0.96%, n = 3). The developed method provides short analysis time, high reproducibility and high sensitivity.
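
    The calibration figures of merit quoted above (r², LOD, LOQ) are conventionally derived from a least-squares fit of the calibration line. Assuming the common ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S (the paper does not state its exact formulas), a sketch with invented peak areas:

```python
import numpy as np

conc = np.array([10., 20., 30., 40., 50.])          # standards, mg/L
area = np.array([152., 310., 455., 612., 758.])     # invented peak areas

slope, intercept = np.polyfit(conc, area, 1)        # calibration line
pred = slope * conc + intercept
r2 = 1 - ((area - pred)**2).sum() / ((area - area.mean())**2).sum()

sigma = np.sqrt(((area - pred)**2).sum() / (len(conc) - 2))  # residual SD
print(f"r2 = {r2:.4f}")
print(f"LOD = {3.3 * sigma / slope:.2f} mg/L, "
      f"LOQ = {10 * sigma / slope:.2f} mg/L")
```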

  8. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  9. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique.

  10. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

    Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly dependent on the selection of a reference group of sites (i.e., roadway segments or intersections) similar to the target site, from which the safety performance functions (SPF) used to predict crash frequency are developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select "similar" sites for building SPFs are developed. The performance of the clustering-based EB methods is then compared using real crash data. The HSID results, computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferable to the conventional statistical methods. Thus, properly classifying road segments for heterogeneous crash data can further improve HSID accuracy.
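
    A minimal sketch of the pipeline described above: cluster sites on their traits, fit a simple safety performance function per cluster, then shrink each site's observed count toward the SPF prediction with an EB weight. The negative-binomial weight form and the overdispersion value are assumptions, and the crash data are synthetic; the paper's exact estimators may differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(2)
aadt = rng.uniform(1e3, 2e4, 300)                 # traffic volume per segment
length = rng.uniform(0.2, 2.0, 300)               # segment length (miles)
crashes = rng.poisson(1e-4 * aadt * length)       # synthetic crash counts
X = np.column_stack([np.log(aadt), np.log(length)])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

phi = 2.0                                         # assumed overdispersion
eb = np.empty(len(crashes))
for c in np.unique(labels):
    idx = labels == c
    spf = PoissonRegressor().fit(X[idx], crashes[idx])   # per-cluster SPF
    mu = spf.predict(X[idx])                             # predicted frequency
    w = 1.0 / (1.0 + mu / phi)                           # EB shrinkage weight
    eb[idx] = w * mu + (1 - w) * crashes[idx]

hotspots = np.argsort(eb)[::-1][:10]              # top-ranked candidate sites
print(hotspots)
```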

  11. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued there, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a 'standard-free method' for quantitative analysis made it possible to analyze infinitesimal samples, powdered samples and untreated bio samples, which could not be analyzed quantitatively in the past. The 'standard-free method' and a 'powdered internal standard method' made the target preparation process much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, free of any ambiguity arising from complicated target preparation processes. (author)

  12. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Mitsuyasu, T.; Ishii, K.; Hino, T.; Aoyama, M.

    2009-01-01

    Spectral history methods have been developed for a pin-by-pin core analysis method using the three-dimensional direct response matrix. The direct response matrix is formalized by four sub-response matrices in order to respond to the core eigenvalue k and can thus be recomposed at each outer iteration of the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One method is to evaluate the nodal burn-up spectrum using the outgoing neutron current. The other is to correct the fuel rod neutron production rates with a pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be halved during burn-up and that the errors in the nodal neutron production rates can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. A core analysis with the DRM method was carried out for an ABWR quarter core, and both the thermal power and coolant-flow distributions converged smoothly. (authors)

  13. Development of Ultraviolet Spectrophotometric Method for Analysis of Lornoxicam in Solid Dosage Forms

    African Journals Online (AJOL)

    Sunit Kumar Sahoo

    … testing. Mean recovery was 100.82% for tablets. Low values of %RSD indicate …

  14. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    International Nuclear Information System (INIS)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. Light elements in this context are defined as those with an atomic number less than approximately 13. The work reported is to a large extent based on multiparameter methods: several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that cannot be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine, together with traditional PIXE analysis and Scanning Transmission Ion Microscopy, as well as photon-tagged Nuclear Reaction Analysis. 48 refs

  15. Adjoint-based Mesh Optimization Method: The Development and Application for Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    Son, Seongmin; Lee, Jeong Ik

    2016-01-01

    In this research, a method for optimizing the mesh distribution is proposed. The proposed method uses an adjoint-based optimization method (adjoint method). The optimized result is obtained by applying this meshing technique to the existing code input deck and is compared to the results produced with a uniform meshing method. Numerical solutions are calculated with an in-house 1D Finite Difference Method code, neglecting axial conduction. The fuel radial nodes were optimized first, to best match the Fuel Centerline Temperature (FCT); this was followed by optimizing the axial nodes to best match the Peak Cladding Temperature (PCT). After the optimized radial and axial nodes were obtained, the nodalization was implemented into the system analysis code and transient analyses were performed to observe the performance of the optimized nodalization. The adjoint-based mesh optimization method developed in this study is applied to MARS-KS, a nuclear system analysis code. Results show that the newly established method yields better results than the uniform meshing method from a numerical point of view. It is stressed again that a mesh optimized for the steady state can also give better numerical results during a transient analysis.
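
    The adjoint machinery referred to above can be illustrated on a deliberately simplified problem. The sketch below differentiates a centerline-temperature objective of a 1D conduction solve with respect to face conductivities rather than node positions (the paper's actual design variables); the adjoint algebra is the same: one extra linear solve yields the gradient with respect to all parameters at once.

```python
import numpy as np

n = 21
q = np.ones(n)                                 # uniform heat source

def assemble(p):
    """Tridiagonal conduction matrix from face conductivities p (length n+1),
    with zero-temperature boundaries."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = p[i] + p[i + 1]
        if i > 0:
            A[i, i - 1] = -p[i]
        if i < n - 1:
            A[i, i + 1] = -p[i + 1]
    return A * n**2                            # 1/h^2 factor, h = 1/n

p = np.ones(n + 1)
A = assemble(p)
T = np.linalg.solve(A, q)
J = T[n // 2]                                  # objective: centerline temperature

# adjoint solve: A^T lam = dJ/dT, then dJ/dp_k = -lam^T (dA/dp_k) T;
# assemble() is linear in p, so dA/dp_k = assemble(e_k)
dJdT = np.zeros(n); dJdT[n // 2] = 1.0
lam = np.linalg.solve(A.T, dJdT)
grad = np.array([-lam @ (assemble(np.eye(n + 1)[k]) @ T) for k in range(n + 1)])

# finite-difference check of one component
eps = 1e-6
p5 = p.copy(); p5[5] += eps
J5 = np.linalg.solve(assemble(p5), q)[n // 2]
print(grad[5], (J5 - J) / eps)                 # should agree closely
```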

  16. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    The basic statistical methods used in the genetic analysis of human traits are considered, including segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods was developed.

  17. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. Online digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several online-measured system parameters, while SCOMS calculates the variables of the limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology that statistically combines the uncertainty components of the SMART core protection and monitoring system. By applying the overall uncertainty factors in the online SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative at a 95/95 probability/confidence level. In this paper, the uncertainty analysis method for the SMART core protection and monitoring system is described.
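
    As an illustration of statistically combining uncertainty components, the sketch below uses a root-sum-square combination of assumed independent relative uncertainties followed by a one-sided 95/95 tolerance allowance. The component values are invented and the large-sample tolerance factor is an assumption; the SMART methodology's actual factors and combination rules are those in the paper.

```python
import numpy as np

# relative (1-sigma) uncertainties of individual components, invented
components = {"power measurement": 0.020, "flow measurement": 0.035,
              "inlet temperature": 0.015, "code/model": 0.040}

# root-sum-square combination, assuming independent components
sigma_total = np.sqrt(sum(s**2 for s in components.values()))

# one-sided 95/95 tolerance factor: 1.645 is the large-sample normal value;
# for a finite number of reference cases the tabulated (Owen) factor is larger
k_9595 = 1.645
allowance = k_9595 * sigma_total
print(f"combined 1-sigma = {sigma_total:.4f}, 95/95 allowance = {allowance:.4f}")
# an on-line DNBR/LPD limit would then be offset by this allowance
```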

  18. Development of the complex of nuclear-physical methods of analysis for geology and technology tasks in Kazakhstan

    International Nuclear Information System (INIS)

    Solodukhin, V.; Silachyov, I.; Poznyak, V.; Gorlachev, I.

    2016-01-01

    The paper describes the development of nuclear-physical methods of analysis and their application in Kazakhstan to tasks in geology and technology. The basic methods of this complex are instrumental neutron activation analysis, X-ray fluorescence analysis and instrumental gamma-spectrometry. The following aspects are discussed: application of the developed and adopted analytical techniques to the assessment and calculation of rare-earth metal reserves at various deposits in Kazakhstan, to technology development for mining and extraction from uranium-phosphorus ore and wastes, to radioactive coal gasification technology, and to studies of rare metal contents in chromite, bauxites, black shales and their processing products. (author)

  19. Development of calculation method for one-dimensional kinetic analysis in fission reactors, including feedback effects

    International Nuclear Information System (INIS)

    Paixao, S.B.; Marzo, M.A.S.; Alvim, A.C.M.

    1986-01-01

    The calculation method used in the WIGLE code is studied. Because no detailed exposition of this solution was available, the method is expounded minutely here. The method has been applied to the solution of the one-dimensional, two-group diffusion equations in slab geometry for axial analysis, including non-boiling heat transfer and accounting for feedback. A steady-state program (CITER-1D), written in FORTRAN 4, has been implemented, providing excellent results that confirm the quality of the work developed. (Author) [pt
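
    A compact sketch of the steady-state core of such a solver: one-dimensional two-group diffusion in a slab, discretized by finite differences and solved for k-eff by power iteration. The cross sections are invented single-region values, the boundary closure is approximate, and kinetics and feedback (WIGLE's actual subject) are omitted.

```python
import numpy as np

n, L = 100, 200.0                       # mesh cells, slab width (cm)
h = L / n
D = np.array([1.4, 0.4])                # diffusion coefficients, fast/thermal
sig_r = np.array([0.025, 0.10])         # removal (absorption + out-scatter)
sig_s12 = 0.017                         # group 1 -> 2 scattering
nu_sig_f = np.array([0.006, 0.15])      # nu * fission cross sections

def leakage(d):
    """Finite-difference -D d2/dx2 with approximate zero-flux boundaries."""
    main = (2 * d / h**2) * np.ones(n)
    off = (-d / h**2) * np.ones(n - 1)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    A[0, 0] += d / h**2
    A[-1, -1] += d / h**2
    return A

A1 = leakage(D[0]) + sig_r[0] * np.eye(n)
A2 = leakage(D[1]) + sig_r[1] * np.eye(n)

phi1, phi2, k = np.ones(n), np.ones(n), 1.0
for _ in range(500):                    # power iteration on the fission source
    S = nu_sig_f[0] * phi1 + nu_sig_f[1] * phi2
    phi1 = np.linalg.solve(A1, S / k)            # all fission neutrons born fast
    phi2 = np.linalg.solve(A2, sig_s12 * phi1)   # slowing-down source
    k_new = k * (nu_sig_f[0] * phi1 + nu_sig_f[1] * phi2).sum() / S.sum()
    if abs(k_new - k) < 1e-8:
        break
    k = k_new
print("k-eff ~", round(k, 5))
```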

  20. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called self-report-based sequential analysis, is described herein. The method aims to extend current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  1. RESULTS OF ANALYSIS OF BENCHMARKING METHODS OF INNOVATION SYSTEMS ASSESSMENT IN ACCORDANCE WITH AIMS OF SUSTAINABLE DEVELOPMENT OF SOCIETY

    Directory of Open Access Journals (Sweden)

    A. Vylegzhanina

    2016-01-01

    In this work, we introduce the results of a comparative analysis of international innovation system rating indexes with respect to their compliance with the purposes of sustainable development. The purpose of this research is to define requirements for benchmarking methods of assessing national or regional innovation systems, and to compare them on the assumption that an innovation system should be aligned with the concept of sustainable development. Analysis of the goal sets and concepts underlying the observed international composite innovation indexes, together with a comparison of their metrics and calculation techniques, allowed us to reveal the opportunities and limitations of using these methods within the framework of the sustainable development concept. We formulated targets of innovation development on the basis of the innovation priorities of sustainable socio-economic development. By comparing the indexes against these targets, we identified two methods of assessing innovation systems that are maximally connected with the goals of sustainable development. Nevertheless, no existing benchmarking method yet meets, to a sufficient extent, the need to assess innovation systems in compliance with the sustainable development concept. We suggest practical directions for developing methods that assess innovation systems in compliance with the goals of the sustainable development of society.

  2. Development of Compressive Failure Strength for Composite Laminate Using Regression Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Keon [Agency for Defense Development, Daejeon (Korea, Republic of); Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2016-10-15

    This paper provides compressive failure strength values for composite laminates, developed using a regression analysis method. The composite material in this document is a carbon/epoxy unidirectional (UD) tape prepreg (Cycom G40-800/5276-1) cured at 350°F (177°C). The operating temperature range is -60°F to +200°F (-55°C to +95°C). A total of 56 compression tests were conducted on specimens from eight distinct laminates that were laid up with standard angle layers (0°, +45°, -45° and 90°). The ASTM D6484 standard was used as the test method. The regression analysis was performed with the laminate ultimate fracture strength as the response variable and the two ply orientations (0° and ±45°) as the regressor variables.
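
    A sketch of the regression setup described above, with the laminate strength as the response and the two ply-orientation fractions as regressors. The strength values below are invented placeholders, not the paper's 56 test results.

```python
import numpy as np

# per-laminate data: fraction of 0° plies, fraction of ±45° plies, strength
p0   = np.array([0.50, 0.40, 0.30, 0.25, 0.20, 0.50, 0.10, 0.60])
p45  = np.array([0.40, 0.40, 0.50, 0.50, 0.60, 0.30, 0.80, 0.20])
f_cu = np.array([620., 560., 515., 490., 455., 600., 380., 660.])  # invented, MPa

X = np.column_stack([np.ones_like(p0), p0, p45])    # intercept + 2 regressors
beta, *_ = np.linalg.lstsq(X, f_cu, rcond=None)     # least-squares fit
pred = X @ beta
r2 = 1 - ((f_cu - pred)**2).sum() / ((f_cu - f_cu.mean())**2).sum()
print("coefficients:", beta.round(1), " r2 =", round(r2, 3))

# predict an unseen layup, e.g. 45% 0-deg / 45% ±45-deg plies
print("predicted strength:", float(np.array([1, 0.45, 0.45]) @ beta))
```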

  3. Development of Compressive Failure Strength for Composite Laminate Using Regression Analysis Method

    International Nuclear Information System (INIS)

    Lee, Myoung Keon; Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon

    2016-01-01

    This paper provides compressive failure strength values for composite laminates, developed using a regression analysis method. The composite material in this document is a carbon/epoxy unidirectional (UD) tape prepreg (Cycom G40-800/5276-1) cured at 350°F (177°C). The operating temperature range is -60°F to +200°F (-55°C to +95°C). A total of 56 compression tests were conducted on specimens from eight distinct laminates that were laid up with standard angle layers (0°, +45°, -45° and 90°). The ASTM D6484 standard was used as the test method. The regression analysis was performed with the laminate ultimate fracture strength as the response variable and the two ply orientations (0° and ±45°) as the regressor variables.

  4. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek

    2017-12-01

    The paper evaluates a 3D scanning method elaborated by the authors by applying it to the analysis of the wear of forging tools. The method consists primarily in applying 3D scanning to the analysis of changes in the geometry of a forging tool, by comparing images of a worn tool with a CAD model or an image of a new tool. The method was evaluated in the context of the significant measurement problems resulting from the extreme conditions present in industrial hot forging processes. It was used to evaluate the wear of tools of increasing wear degree, which made it possible to determine the wear characteristics as a function of the number of forgings produced. The following stage was its use for direct control of the quality and geometry changes of forging tools (without their disassembly) by direct measurement of the geometry of periodically collected forgings (an indirect method based on forgings). The final part of the study points out the advantages and disadvantages of the elaborated method as well as potential directions for its further development.
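
    The geometry-comparison step described above reduces, in the simplest setting, to measuring the deviation of each scanned point from a reference surface. The sketch below uses nearest-neighbour distances between synthetic point clouds; a real pipeline would first register the scan to the CAD model (e.g. by ICP), a step omitted here.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
# reference surface: a paraboloid patch standing in for the CAD die surface
xy = rng.uniform(-1, 1, size=(5000, 2))
ref = np.column_stack([xy, 0.5 * (xy**2).sum(axis=1)])

# "worn" scan: the same surface locally lowered in a wear zone, plus noise
scan = ref.copy()
zone = np.linalg.norm(scan[:, :2] - [0.3, 0.0], axis=1) < 0.25
scan[zone, 2] -= 0.05                       # 50 um of simulated wear (mm)
scan[:, 2] += rng.normal(0, 0.002, len(scan))

dist, _ = cKDTree(ref).query(scan)          # nearest-neighbour deviation map
print(f"max deviation {dist.max():.3f} mm, "
      f"mean in wear zone {dist[zone].mean():.3f} mm")
```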

  5. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    Jaklevic, J.M.

    1976-01-01

    The rapid development in recent years of energy-dispersive X-ray fluorescence analysis has been based primarily on improvements in semiconductor detector X-ray spectrometers. However, overall analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic X rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods, including charged particles, monochromatic photons, and broad-energy-band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  6. Coupled Electro-Magneto-Mechanical-Acoustic Analysis Method Developed by Using 2D Finite Element Method for Flat Panel Speaker Driven by Magnetostrictive-Material-Based Actuator

    Science.gov (United States)

    Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou

    In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by a flat panel speaker driven by a GMM actuator depends on the vibration of the flat panel; this vibration is a result of the magnetostriction property of the GMM. In this case, to predict the sound pressure level (SPL) in the audio-frequency range, it is necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method is presented; this method was developed using the finite element method (FEM). The analysis method is used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with the measurement results of a prototype speaker.

  7. Development of a reliability-analysis method for category I structures

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Reich, M.

    1983-01-01

    The present paper develops a reliability analysis method for category I nuclear structures, particularly for reinforced concrete containment structures subjected to various load combinations. The loads considered here include dead loads, accidental internal pressure and earthquake ground acceleration. For mathematical tractability, earthquake occurrence is assumed to be governed by the Poisson arrival law, while the acceleration history is idealized as a Gaussian vector process of finite duration. The vector process consists of three component processes, each with zero mean. The second-order statistics of this process are specified by a three-by-three spectral density matrix with a multiplying factor representing the overall intensity of the ground acceleration. With respect to accidental internal pressure, the following assumptions are made: (a) it occurs in accordance with the Poisson law; (b) its intensity and duration are random; and (c) its temporal rise and fall behaviors are such that a quasi-static structural analysis applies. The dead load is considered to be a deterministic constant
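
    A minimal Monte Carlo sketch of the load-combination idea, with illustrative rates and a normalized capacity (not the paper's analytical solution): earthquakes arrive as Poisson events over the plant life, and failure is checked against a deterministic dead-load effect plus a random peak seismic response.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim, T = 100_000, 40.0      # Monte Carlo trials, plant life [yr]
lam_eq = 0.05                 # earthquake arrival rate [1/yr], illustrative
capacity = 1.0                # normalized structural capacity
dead = 0.3                    # deterministic dead-load effect

failures = 0
for _ in range(n_sim):
    n_eq = rng.poisson(lam_eq * T)          # Poisson arrival law
    if n_eq == 0:
        continue
    # lognormal peak response per event (illustrative parameters)
    demand = dead + rng.lognormal(mean=-1.5, sigma=0.6, size=n_eq)
    if demand.max() > capacity:
        failures += 1

print("lifetime failure probability ~", failures / n_sim)
```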

  8. Development of 3D CFD simulation method in nuclear reactor safety analysis

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mariah Adam

    2012-01-01

    One of the most important issues in the operation of a nuclear reactor is the safety of the system. Worldwide publicity surrounding a few nuclear accidents, as well as the notorious Hiroshima and Nagasaki bombings, has always fuelled public fear of anything related to nuclear technology. Most findings on nuclear reactor accidents are closely related to the reactor cooling system. Thus, understanding the behaviour of the reactor cooling system is very important to ensure that safety can be continuously developed and improved. Throughout the development of nuclear reactor technology, investigation and analysis of reactor safety have gone through several phases. In the early days, analytical and experimental methods were employed. For the last three decades, 1D system-level codes have been widely used. The continuous development of nuclear reactor technology has brought about more complex systems and processes of nuclear reactor operation, and more detailed multi-dimensional simulation codes are needed to assess these new reactors. This paper discusses the development of 3D CFD usage in nuclear reactor safety analysis worldwide. A brief review of the usage of CFD at Malaysia's TRIGA PUSPATI reactor is also presented. (author)

  9. Analysis and development of adjoint-based h-adaptive direct discontinuous Galerkin method for the compressible Navier-Stokes equations

    Science.gov (United States)

    Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang

    2018-06-01

    In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)) and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) and SDDG methods can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of the three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in applications of adjoint-based adaptation for simulating compressible flows.
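
    A minimal sketch of the adjoint-weighted-residual idea behind such adaptation, on a 1D Poisson model problem with finite differences (not the DDG discretization itself); a coarse solution is injected into a fine grid, and cells are flagged where the fine-grid residual weighted by the adjoint for an output functional J(u) = u(1/2) is largest.

```python
import numpy as np

def poisson_matrix(n):
    """1D Poisson finite-difference matrix on n interior nodes of a uniform grid."""
    h = 1.0 / (n + 1)
    return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

# coarse primal solve: -u'' = sin(pi x), u(0) = u(1) = 0
nc = 15
xc = np.linspace(0.0, 1.0, nc + 2)[1:-1]
uc = np.linalg.solve(poisson_matrix(nc), np.sin(np.pi * xc))

# inject into a finer grid and form the fine-grid residual
nf = 31
Af = poisson_matrix(nf)
xf = np.linspace(0.0, 1.0, nf + 2)[1:-1]
uf = np.interp(xf, np.r_[0.0, xc, 1.0], np.r_[0.0, uc, 0.0])  # linear prolongation
rf = np.sin(np.pi * xf) - Af @ uf

# fine-grid adjoint for the output J(u) = u(1/2), then weighted indicators
g = np.zeros(nf); g[nf // 2] = 1.0
zf = np.linalg.solve(Af.T, g)
eta = np.abs(zf * rf)                      # adjoint-weighted residual indicators
print("cells flagged for refinement:", np.sort(np.argsort(eta)[-5:]))
```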

  10. Development of quantitative methods for spill response planning: a trajectory analysis planner

    International Nuclear Information System (INIS)

    Galt, J.A.; Payton, D.L.

    1999-01-01

    In planning for response to oil spills, a great deal of information must be assimilated. Typically, geophysical flow patterns, ocean turbulence, complex chemical processes, the ecological setting, fisheries activities, the economics of land use, and engineering constraints on response equipment all need to be considered. This presents a formidable analysis problem. It can be shown, however, that if an appropriate set of evaluation data is available, an objective function and appropriate constraints can be formulated. From these equations, the response problem can be cast in terms of game theory and decision analysis, and an optimal solution can be obtained using common scarce-resource allocation methods. The optimal solution obtained by this procedure maximises the expected return over all possible implementations of a given set of response options. While considering the development of an optimal spill response, it is useful to ask whether, in the absence of complete data, some subset of these methods can be implemented to provide relevant and useful information for the spill planning process, even though it may fall short of a statistically optimal solution. In this work we introduce a trajectory analysis planning (TAP) methodology that provides a cohesive framework for integrating physical transport processes, the environmental sensitivity of regional sites, and potential response options. This methodology can be shown to implement a significant part of the game theory analysis and to provide 'minimum regret' strategy advice, without actually carrying out the optimisation procedures. (Author)
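
    A minimal sketch of the scarce-resource allocation framing, with invented numbers (not the TAP tool itself): limited equipment-hours are assigned to sensitive sites so as to maximize the expected protected value, weighted by trajectory-derived oil-arrival probabilities.

```python
import numpy as np
from scipy.optimize import linprog

# three sensitive sites; hypothetical trajectory-derived arrival probabilities
p_arrival = np.array([0.6, 0.3, 0.8])
value = np.array([10.0, 40.0, 25.0])    # environmental value of each site
expected_return = p_arrival * value     # expected benefit of full protection

cost = np.array([3.0, 5.0, 4.0])        # equipment-hours to fully protect a site
budget = 8.0                            # total equipment-hours available

# decision variables: fraction of each site protected (0..1)
res = linprog(c=-expected_return,       # linprog minimizes, so negate
              A_ub=[cost], b_ub=[budget],
              bounds=[(0.0, 1.0)] * 3, method="highs")
print("protection fractions:", res.x, "expected return:", -res.fun)
```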

  11. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working-memory relief points of the procedures, and working-memory load. The proposed method is relatively simple and, possibly being incorporated into a full task analysis scheme, is directly applicable to the design and evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system was developed to support the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  13. k_0-neutron activation analysis based method at CDTN: history, development and main achievements

    International Nuclear Information System (INIS)

    Menezes, Maria Ângela de B.C.; Jacimovic, Radojko; Dalmazio, Ilza

    2017-01-01

    Neutron activation analysis (NAA) is an analytical technique for assaying the elemental chemical composition of samples of several matrices. It has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear (Nuclear Technology Development Centre)/Comissao Nacional de Energia Nuclear (Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the start-up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of this technique, the k_0-standardization method, established at CDTN in 1995, is the most efficient; in 2003 it was re-established and optimized. This paper describes the history and the main achievements since then. (author)

  14. Development and validation of a multiresidue method for pesticide analysis in honey by UFLC-MS

    Directory of Open Access Journals (Sweden)

    Adriana M. Zamudio S.

    2017-05-01

    A method for the determination of pesticide residues in honey by ultra-fast liquid chromatography coupled with mass spectrometry (UFLC-MS) was developed. For this purpose, different variations of the QuEChERS method were examined: (i) amount of sample, (ii) type of salt used to control pH, (iii) buffer pH, and (iv) different mixtures for clean-up. In addition, to demonstrate that the method is reliable, different validation parameters were studied: accuracy, limits of detection and quantification, linearity and selectivity. The results showed that the changes introduced made it possible to obtain a more selective method that improves the accuracy for about 19 pesticides selected from the original method. The method was found to be suitable for the analysis of 50 of the 56 pesticides. Furthermore, with the developed method, recoveries between 70 and 120% and relative standard deviations below 15% were obtained.

  15. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis also have applications in the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed-effects meta-analysis, but conceptual…

  16. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh

    2012-05-01

    Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution using a Knauer C18 column (5 µm, 4.6 × 150 mm) with a mobile phase consisting of acetonitrile and ammonium acetate buffer at a flow rate of 1.5 ml/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r² > 0.99) over the analytical range of 125–2000 ng/ml. For all quality control (QC) standards in intra-day and inter-day assays, accuracy and precision ranged from −0.96 to 6.30 and from 0.86 to 13.74, respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedures. Conclusion: The rapid and sensitive method developed can be used for the routine analysis of sirolimus, such as in dissolution and stability assays of pre- and post-marketing dosage forms.
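
    A minimal sketch of the linearity check and QC back-calculation described above, with made-up detector responses (not the study's data):

```python
import numpy as np

conc = np.array([125.0, 250.0, 500.0, 1000.0, 2000.0])   # standards [ng/mL]
area = np.array([10.2, 20.8, 41.5, 82.0, 165.3])         # peak areas, illustrative

slope, intercept = np.polyfit(conc, area, 1)             # linear calibration fit
pred = slope * conc + intercept
r2 = 1.0 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
print(f"r^2 = {r2:.4f}")                                 # acceptance: r^2 > 0.99

qc_area = 62.0                                           # measured QC response
qc_conc = (qc_area - intercept) / slope                  # back-calculated conc.
accuracy = (qc_conc - 750.0) / 750.0 * 100.0             # vs nominal 750 ng/mL
print(f"QC: {qc_conc:.1f} ng/mL, accuracy {accuracy:+.1f}%")
```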

  17. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the frame of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing as realistic as possible plant-specific fire risk analyses (fire PSA). The present technical report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the course of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, was performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi-type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering the effects of stochastics (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided for use within the PSA. (orig.)
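
    A minimal two-loop Monte Carlo sketch of the epistemic/aleatory split used in such analyses (illustrative distributions, not the project's fire simulation model): the outer loop samples uncertain knowledge, the inner loop samples the stochastic course of each fire, yielding a distribution over the probability of cable damage before detection.

```python
import numpy as np

rng = np.random.default_rng(42)
n_epistemic, n_aleatory = 200, 2000

p_damage = np.empty(n_epistemic)
for i in range(n_epistemic):
    # epistemic: uncertain fire growth rate and cable damage threshold
    growth = rng.lognormal(-6.0, 0.5)           # growth parameter, illustrative
    threshold = rng.normal(200.0, 20.0)         # damage temperature [deg C]
    # aleatory: detection time varies stochastically from fire to fire
    t_detect = rng.exponential(120.0, n_aleatory)    # [s]
    temp = 20.0 + 300.0 * growth * t_detect          # crude temperature proxy
    p_damage[i] = np.mean(temp > threshold)

print("mean P(damage):", p_damage.mean())
print("5th/95th percentiles:", np.percentile(p_damage, [5, 95]))
```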

  18. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    Kozlowski, T.; Carey, T.A.; Maguire, C.F.

    1995-01-01

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented Analysis and Recursive Design method as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method

  19. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Cluckie, Alice Jane

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)
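
    A minimal sketch of the voxel-by-voxel comparison step on synthetic volumes (not the author's pipeline): a patient scan is z-scored against a control group, thresholded, cleaned with a morphological operator, and the surviving clusters of abnormal uptake are labelled and sized.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
controls = rng.normal(100.0, 10.0, size=(20, 32, 32, 32))  # 20 control scans
patient = rng.normal(100.0, 10.0, size=(32, 32, 32))
patient[10:14, 10:14, 10:14] -= 40.0                       # synthetic perfusion defect

mu = controls.mean(axis=0)
sd = controls.std(axis=0, ddof=1)
z = (patient - mu) / sd

mask = z < -3.0                                  # voxels with abnormally low uptake
mask = ndimage.binary_opening(mask)              # morphological noise reduction
labels, n_clusters = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, range(1, n_clusters + 1))
print("clusters:", n_clusters,
      "largest cluster [voxels]:", int(sizes.max()) if n_clusters else 0)
```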

  20. The development of trend and pattern analysis methods for incident data by CEC'S joint research at Ispra

    International Nuclear Information System (INIS)

    Amesz, J.; Kalfsbeek, H.W.

    1990-01-01

    The Abnormal Occurrences Reporting System (AORS) of the Commission of the European Communities was developed by the Joint Research Centre at Ispra in the period 1982 through 1985. It collects, in a unique format, all safety-relevant events from NPPs as recorded in the participating countries. The system was set up with the specific objective of providing an advanced tool for the synoptic analysis of a large number of events, identifying patterns in sequences, trends, multiple dependencies between incident descriptors, precursors to severe incidents, performance indicators, etc. This paper gives an overview of the development of trend and pattern analysis techniques of two different types: event sequence analysis and statistical methods. Though these methods have been developed and applied in relation to the AORS data, they can be regarded as generic in the sense that they may be applied to any incident reporting system satisfying the necessary criteria of homogeneity and completeness for rendering valid results

  1. Current status of methods for shielding analysis

    International Nuclear Information System (INIS)

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed

  2. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  3. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The purposes of this study are to develop the safety evaluation methods and analysis codes needed in the design and construction stage of fast breeder reactors (FBR). In JFY 2012, the following results were obtained. As for the safety evaluation methods needed for the safety examination conducted for the reactor establishment permission, development of the analysis codes, such as the core damage analysis code, was carried out following the planned schedule. As for the safety evaluation method needed for risk-informed safety regulation, the quantification technique for the event tree using the continuous Markov chain Monte Carlo method (CMMC method) was studied. (author)
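
    A minimal sketch of quantifying one event-tree sequence by Monte Carlo sampling (a generic two-branch tree with illustrative probabilities; the CMMC method itself additionally simulates the continuous plant dynamics along each history):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
p_init = 1e-2        # initiating-event probability per trial, illustrative
p_fail_A = 0.05      # safety system A fails on demand
p_fail_B = 0.10      # safety system B fails on demand

init = rng.random(n) < p_init
a_fail = rng.random(n) < p_fail_A
b_fail = rng.random(n) < p_fail_B
damage = init & a_fail & b_fail

print("sequence probability ~", damage.mean())   # compare: p_init*p_fail_A*p_fail_B
```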

  4. Development of a Probabilistic Tsunami Hazard Analysis Method and Application to an NPP in Korea

    International Nuclear Information System (INIS)

    Kim, M. K.; Choi, Ik

    2012-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, evaluation of the tsunami return period is the major task; here it was evaluated with an empirical method using historical tsunami records and tidal gauge records. For the tsunami fragility analysis, a procedure was established and target equipment and structures for the fragility assessment were selected; a sample fragility calculation was performed for equipment in a nuclear power plant. For the system analysis, the accident sequence of the tsunami event was developed according to the tsunami run-up and draw-down, and the tsunami-induced core damage frequency (CDF) was determined. For application to a real nuclear power plant, the Ulchin 5 and 6 NPPs, located on the east coast of the Korean peninsula, were selected. Through this study, the whole tsunami PSA (probabilistic safety assessment) working procedure was established, and an example calculation was performed for one nuclear power plant in Korea
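
    A minimal sketch of the empirical return-period step, with a hypothetical catalogue (not the paper's historical record): the annual exceedance frequency of tsunami heights is read off the record, and its reciprocal gives the return period.

```python
import numpy as np

years = 100.0                                   # catalogue length [yr]
heights = np.array([0.4, 0.7, 1.1, 0.3, 2.0,
                    0.9, 1.5, 0.5, 3.2, 0.6])   # recorded run-ups [m], illustrative

for h in np.arange(0.5, 3.51, 0.5):             # candidate design heights
    f = np.sum(heights >= h) / years            # exceedance frequency [1/yr]
    rp = np.inf if f == 0 else 1.0 / f
    print(f"h = {h:4.2f} m  freq = {f:5.2f}/yr  return period = {rp:6.1f} yr")
```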

  5. Developing strategies to reduce the risk of hazardous materials transportation in iran using the method of fuzzy SWOT analysis

    Directory of Open Access Journals (Sweden)

    A. S. Kheirkhah

    2009-12-01

    An increase in hazardous materials transportation in Iran, along with industrial development and a rise in the resulting deadly accidents, necessitates the development and implementation of strategies to reduce these incidents. SWOT analysis is an efficient method for developing strategies; however, it has structural problems, including a lack of prioritization of internal and external factors and an inability to consider two-sided factors. These problems reduce its performance in situations where the number of internal and external factors affecting the risk of hazardous materials transportation is relatively high and some factors are two-sided in nature. Fuzzy SWOT analysis is a method that helps solve these problems, and employing it effectively is the subject of this article. The article also compares the strategies resulting from the fuzzy method with the strategies developed following SWOT, in order to show the relative superiority of the new method.

  6. APPLICATION OF THE SPECTRUM ANALYSIS WITH USING BERG METHOD TO DEVELOPED SPECIAL SOFTWARE TOOLS FOR OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    The objective of this paper is the development and experimental verification of special spectral analysis software for vibration-monitored objects. The spectral analysis of vibration is based on the maximum-entropy autoregressive method of spectral analysis using the Burg algorithm. A preliminary analysis of the measured signals, based on regression analysis, makes it possible to eliminate uninformative components such as noise and trend; special software tools were developed for this preliminary analysis. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely reflecting surfaces is used in circumstances where the use of contact sensors is difficult or impossible for a number of reasons, including lack of access to the object, the small size of the controlled area, high temperature of the controlled portion, or the presence of strong electromagnetic fields. For such measurements, a laser measuring system is proposed. This measuring system overcomes the shortcomings of interferometric or Doppler optical measuring systems, such as the difficulty of measuring large-amplitude and inharmonic vibration. On the basis of the proposed methods, special software tools were developed for the laser measuring system using LabVIEW. Experimental research on the proposed method of vibration signal processing was carried out in the analysis of the diagnostic information obtained by measuring the vibration of a system grinding a diamond wheel on the cold solid tungsten-containing alloy TK8. The result of the special software tools was a complex spectrum 'purified' of non-informative components and corresponding to the vibration process of the observed object.
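
    A minimal sketch of the maximum-entropy (Burg) autoregressive spectral estimate named in the record, implemented directly from the textbook recursion (not the authors' LabVIEW tools):

```python
import numpy as np

def burg_psd(x, order, nfft=1024):
    """AR power spectrum via Burg's maximum-entropy recursion (textbook form)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    a = np.zeros(order + 1)
    a[0] = 1.0
    f, b = x.copy(), x.copy()            # forward / backward prediction errors
    E = np.dot(x, x) / N                 # prediction error power
    for k in range(order):
        ef, eb = f[k + 1:], b[:N - k - 1]
        mu = -2.0 * np.dot(ef, eb) / (np.dot(ef, ef) + np.dot(eb, eb))
        a[:k + 2] = a[:k + 2] + mu * a[:k + 2][::-1]      # Levinson-type update
        f[k + 1:], b[:N - k - 1] = ef + mu * eb, eb + mu * ef
        E *= 1.0 - mu ** 2
    A = np.fft.rfft(a, nfft)             # A(e^{-jw}) on a frequency grid
    return E / np.abs(A) ** 2            # S(w) = E / |A|^2

# toy test: a sinusoid pair in noise should give sharp spectral peaks
rng = np.random.default_rng(0)
t = np.arange(1000)
sig = (np.sin(2 * np.pi * 0.12 * t) + 0.5 * np.sin(2 * np.pi * 0.30 * t)
       + 0.2 * rng.standard_normal(t.size))
psd = burg_psd(sig, order=20)
freqs = np.linspace(0.0, 0.5, len(psd))  # cycles per sample
print("strongest peak near f =", freqs[np.argmax(psd)])  # expect ~0.12
```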

  7. k_0-neutron activation analysis based method at CDTN: history, development and main achievements

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Ângela de B.C.; Jacimovic, Radojko; Dalmazio, Ilza, E-mail: menezes@cdtn.br, E-mail: id@cdtn.br, E-mail: radojko.jacimovic@ijs.si [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte - MG (Brazil); Jožef Stefan Institute, Department of Environmental Sciences, Ljubljana (Slovenia)

    2017-11-01

    Neutron activation analysis (NAA) is an analytical technique for assaying the elemental chemical composition of samples of several matrices. It has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear (Nuclear Technology Development Centre)/Comissao Nacional de Energia Nuclear (Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the start-up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of this technique, the k_0-standardization method, established at CDTN in 1995, is the most efficient; in 2003 it was re-established and optimized. This paper describes the history and the main achievements since then. (author)

  8. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Following the massive adoption of digital instrumentation and control (I and C) systems for nuclear power plants (NPPs), various software safety analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the applicable software safety analysis techniques available today, such as preliminary hazard analysis (PHA), failure modes and effects analysis (FMEA), fault tree analysis (FTA), Markov chain modeling, dynamic flowgraph methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal-to-noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With the proposed method, analysts can evaluate various SSA combinations for specific purposes. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal-to-noise ratio; their disadvantages are completeness and complexity.
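
    A minimal sketch of ranking technique combinations by weighted scores on the listed indexes (weights and scores below are invented for illustration, not taken from the paper):

```python
import numpy as np

# columns: dynamic capability, completeness, achievability, detail,
# signal-to-noise, (low) complexity, (low) cost -- scored 1-5, higher is better
weights = np.array([0.20, 0.20, 0.15, 0.15, 0.10, 0.10, 0.10])
combos = {
    "PHA+FMEA+FTA+Markov": [2, 4, 3, 3, 3, 4, 4],
    "DFM":                 [5, 3, 4, 5, 4, 2, 3],
    "Simulation-based":    [5, 3, 4, 5, 4, 2, 2],
}
for name, scores in combos.items():
    print(f"{name:20s} weighted score = {np.dot(weights, scores):.2f}")
```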

  9. Development of a quantitative method for the analysis of cocaine analogue impregnated into textiles by Raman spectroscopy.

    Science.gov (United States)

    Xiao, Linda; Alder, Rhiannon; Mehta, Megha; Krayem, Nadine; Cavasinni, Bianca; Laracy, Sean; Cameron, Shane; Fu, Shanlin

    2018-04-01

    Cocaine trafficking in the form of textile impregnation is routinely encountered as a concealment method. Raman spectroscopy has been a popular and successful testing method for the in situ screening of cocaine in textiles and other matrices. Quantitative analysis of cocaine in these matrices using Raman spectroscopy has not been reported to date. This study aimed to develop a simple Raman method for quantifying cocaine, using atropine as the model analogue, in various types of textiles. Textiles were impregnated with solutions of atropine in methanol. The impregnated atropine was extracted using less hazardous acidified water with the addition of potassium thiocyanate (KSCN) as an internal standard for Raman analysis. Despite the presence of background matrix signals arising from the textiles, the cocaine analogue could easily be identified by its characteristic Raman bands. The use of KSCN normalised the analyte signal response across different textile matrix background interferences and thus removed the need for a matrix-matched calibration. The method was linear over a concentration range of 6.25-37.5 mg/cm² with a coefficient of determination (R²) of 0.975 and acceptable precision and accuracy. A simple and accurate Raman spectroscopy method for the analysis and quantification of a cocaine analogue impregnated in textiles has been developed and validated for the first time. This proof-of-concept study has demonstrated that atropine can act as an ideal model compound for studying the problem of cocaine impregnation in textiles. The method has the potential to be further developed and implemented in real-world forensic cases. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Analysis within the systems development life-cycle

    CERN Document Server

    Rock-Evans, Rosemary

    1987-01-01

    Analysis within the Systems Development Life-Cycle: Book 2, Data Analysis-The Methods describes the methods for carrying out data analysis within the systems development life-cycle and demonstrates how the results of fact gathering can be used to produce and verify the analysis deliverables. A number of alternative methods of analysis other than normalization are suggested. Comprised of seven chapters, this book shows the tasks to be carried out in the logical order of progression-preparation, collection, analysis of the existing system (which comprises the tasks of synthesis, verification, an…

  11. Development of a micropulverized extraction method for rapid toxicological analysis of methamphetamine in hair.

    Science.gov (United States)

    Miyaguchi, Hajime; Kakuta, Masaya; Iwata, Yuko T; Matsuda, Hideaki; Tazawa, Hidekatsu; Kimura, Hiroko; Inoue, Hiroyuki

    2007-09-07

    We developed a rapid sample preparation method for the toxicological analysis of methamphetamine and amphetamine (the major metabolite of methamphetamine) in human hair by high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS), to facilitate fast screening and quantitation. Two milligrams of hair were mechanically micropulverized for 5 min in a 2-ml plastic tube together with 100 µl of an aqueous solvent containing 10% acetonitrile, 100 mM trifluoroacetic acid and the corresponding deuterium analogues as internal standards. The pulverizing thoroughly disintegrated the hair components while simultaneously allowing the extraction of any drugs present in the hair. After filtering the suspension with a membrane-filter unit, the clear filtrate was directly analyzed by HPLC-MS/MS. No evaporation steps were required for sample preparation. Method optimization and a validation study were carried out using real-case specimens and fortified samples in which the drugs had been artificially absorbed, respectively. Concentration ranges for quantitation were 0.040-125 and 0.040-25 ng/mg for methamphetamine and amphetamine, respectively. Real-case specimens were analyzed by the method presented here and by conventional methods to verify the applicability of our method to real-world analysis. Our method took less than 30 min to obtain a set of chromatograms from a washed hair sample.

  12. Development of an environment-insensitive PWR radial reflector model applicable to modern nodal reactor analysis method

    International Nuclear Information System (INIS)

    Mueller, E.M.

    1989-05-01

    This research is concerned with the development and analysis of methods for generating equivalent nodal diffusion parameters for the radial reflector of a PWR. The requirement that the equivalent reflector data be insensitive to changing core conditions is set as a principal objective. Hence, the environment dependence of the currently most reputable nodal reflector models, almost all of which are based on the nodal equivalence theory homogenization methods of Koebke and Smith, is investigated in detail. For this purpose, a special 1-D nodal equivalence theory reflector model, called the NGET model, is developed and used in 1-D and 2-D numerical experiments. The results demonstrate that these modern radial reflector models exhibit sufficient sensitivity to core conditions to warrant the development of alternative models. A new 1-D nodal reflector model, based on a novel combination of the nodal equivalence theory and response matrix homogenization methods, is developed. Numerical results verify that this homogenized baffle/reflector model, called the NGET-RM model, is highly insensitive to changing core conditions. It is also shown that the NGET-RM model is not inferior to any of the existing 1-D nodal reflector models and that it has features which make it an attractive alternative for multi-dimensional reactor analysis. 61 refs., 40 figs., 36 tabs

  13. Chemical sensors and the development of potentiometric methods for liquid media analysis

    International Nuclear Information System (INIS)

    Vlasov, Yu.G.; Kolodnikov, V.V.; Ermolenko, Yu.E.; Mikhajlova, S.S.

    1996-01-01

    Aspects of applying indirect potentiometric determination to chemical analysis are considered. Among them are the standard and modified addition and subtraction methods, the multiple addition method, and potentiometric titration using ion-selective electrodes as indicators. These methods significantly extend the capabilities of ion-selective potentiometric analysis. Conditions for the applicability of the above-mentioned methods to various samples (Cd, REE, Th, iodides and others) are discussed using all available ion-selective electrodes as examples. 162 refs., 2 figs., 5 tabs
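
    A minimal sketch of the single known-addition calculation for an ion-selective electrode, one of the methods listed above (illustrative numbers; S is the Nernstian slope):

```python
# C_x = C_s * V_s / ((V_x + V_s) * 10**(dE/S) - V_x)  -- single standard addition
V_x, V_s = 50.0, 5.0      # sample and standard-addition volumes [mL]
C_s = 1.0e-3              # standard concentration [mol/L]
S = 59.16                 # electrode slope [mV/decade] at 25 deg C, z = 1
dE = 12.5                 # measured potential change on addition [mV]

C_x = C_s * V_s / ((V_x + V_s) * 10 ** (dE / S) - V_x)
print(f"unknown concentration = {C_x:.3e} mol/L")
```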

  14. Development of Quality Control Method for Glucofarmaka Antidiabetic Jamu by HPLC Fingerprint Analysis

    Directory of Open Access Journals (Sweden)

    Hanifullah Habibie

    2017-04-01

    Herbal medicines have become increasingly popular all over the world for preventive and therapeutic purposes. Quality control of herbal medicines is important to ensure their safety and efficacy. Chromatographic fingerprinting has been accepted by the World Health Organization as a reliable strategy for the quality control of herbal medicines. In this study, a high-performance liquid chromatography fingerprint analysis was developed as a quality control method for glucofarmaka antidiabetic jamu. The optimum fingerprint chromatograms were obtained using C18 as the stationary phase, linear gradient elution with 10–95% acetonitrile:water as the mobile phase over 60 minutes, and detection at 210 nm. About 20 peaks were detected and could be used as the fingerprint of glucofarmaka jamu. To evaluate the analytical performance of the method, we determined its precision, reproducibility, and stability; the results were reliable. The proposed method could be used as a quality control method for glucofarmaka antidiabetic jamu and also for its raw materials.

  15. Analysis within the systems development life-cycle

    CERN Document Server

    Rock-Evans, Rosemary

    1987-01-01

    Analysis within the Systems Development Life-Cycle: Book 4, Activity Analysis-The Methods describes the techniques and concepts for carrying out activity analysis within the systems development life-cycle. Reference is made to the deliverables of data analysis, and more than one method of analysis, each a viable alternative to the others, is discussed. The "bottom-up" and "top-down" methods are highlighted. Comprised of seven chapters, this book illustrates how dependent data and activities are on each other. This point is especially brought home when the task of inventing new busin…

  16. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  17. Development of an evaluation method for the quality of NPP MCR operators' communication using Work Domain Analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, Inseok; Park, Jinkyun; Seong, Poonghyun

    2011-01-01

    Research highlights: → No evaluation method is available for operators' communication quality in NPPs. → To build this evaluation method, the Work Domain Analysis (WDA) method was adopted. → The proposed method was applied to NPP MCR operators. → The quality of operators' communication can be evaluated with the proposed method. - Abstract: The evolution of work demands has seen work itself evolve toward computerization, making systems more complex; this field is now known as complex socio-technical systems. Communication failures, which are problems associated with complex socio-technical systems, have been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the many studies on the severity of communication failures, there is no evaluation method for operators' communication quality in nuclear power plants (NPPs). Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP main control room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. To develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, including the Abstraction Decomposition Space (ADS) and its diagonal, are the important points in developing the evaluation method. In addition, to apply the proposed method, nine teams working in NPPs participated in a field simulation. The results of this evaluation reveal that operators' communication quality improved as a greater proportion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating communication quality in any complex system.

  18. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  19. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  20. Applied research and development of neutron activation analysis - Development of the precise analysis method for plastic materials by the use of NAA

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kil Yong; Sim, Sang Kwan; Yoon, Yoon Yeol; Chun, Sang Ki [Korea Institute of Geology, Mining and Materials, Taejon (Korea)

    2000-04-01

    The demand for inorganic analysis of plastics has significantly increased in the fields of microelectronics, the environment, nuclear technology and resource recycling. The difficulties of chemical analysis methods have led to the application of NAA, which has the great advantages of non-destructiveness, freedom from blanks, and high sensitivity. The goal of the present work is to optimize and develop NAA procedures for the inorganic analysis of plastics. Even though NAA has unique advantages, it poses two problems for plastics. One is contamination by metallic utensils during sample treatment, and the other is destruction of the sample ampoule due to pressure build-up by hydrogen and methane gas formed from oxyhydrogenation reactions with neutrons. For the first problem, large plastic pieces were cut up after immersion in liquid nitrogen. The second problem was solved by making an aperture in the top side of the sample ampoule. These results have been applied to the analysis of various plastic materials used in food and drug containers and in children's toys. Moreover, a Korean irradiation rabbit could be produced by applying these results, and plastic standard reference materials for use in XRF and ICP analysis could be produced. 36 refs., 6 figs., 37 tabs (Author)

  1. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    Science.gov (United States)

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

    Most analytical methods for butyltins are based on high-resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using high-performance liquid chromatography (HPLC) with UV detection. The developed method was used to determine tributyltin (TBT), dibutyltin (DBT) and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed using standard GC/MS techniques and verified by the statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) for TBT and DBT were 0.70 and 0.50 µg/mL, respectively. The optimised extraction method for butyltins in water and sediment samples used hexane containing 0.05-0.5% tropolone and 0.2% sodium chloride in water at pH 1.7. The quantitative extraction of butyltin compounds in a certified reference material (BCR-646) and naturally contaminated samples was achieved with recoveries ranging from 95 to 108% and %RSD of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the contamination level of butyltins in environmental samples collected from the Forth and Clyde Canal, Scotland, UK. The values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high-resolution methods are used extensively for this type of research, the developed method is cheaper in terms of both equipment and running costs, faster in analysis time, and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.

  2. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  3. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity concentrated on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate-methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  4. Development of seismic analysis method considered FSI effect on a neutron reflector for APWR reactor internals

    Energy Technology Data Exchange (ETDEWEB)

    Hideyuki, Morika; Tomomichi, Nakamura [Mitsubishi Heavy Industries Ltd., Takasago R and D Center, Hyogo (Japan); Toshio, Ichikawa; Kazuo, Hirota; Hiroyuki, Murakiso [Mitsubishi Heavy Industries Ltd., Kobe Shipyard and Machinery Works, Hyogo, Kobe (Japan); Minoru, Murota [Japan Atomic Power Co., Tokyo (Japan)

    2004-07-01

    A neutron reflector (NR) is a new structure designed to improve the structural reliability of Advanced Pressurized Water Reactors (APWR). The NR is separated from the core barrel (CB) by a narrow gap. In the case of a structure surrounded by liquid in a narrow gap, the added fluid mass and damping increase compared with those in air. This is known as the fluid-structure interaction (FSI) effect in a narrow gap, and it depends on the vibration displacement of the structure. A new method to estimate the added fluid damping for this case was introduced by some of the authors in 2001, based on a narrow-passage flow theory (Morita et al., 2001). Following this theory, a vibration test was performed to assess the appropriateness of the analysis method employed to predict the response of the NR during an earthquake (Nakamura et al., 2002). In this paper, the results of a model test are shown, comparing the data with values calculated using the new analysis method, which combines the above theory with the ANSYS computer code. As a result, a new seismic analysis method using the above theory was developed. The analytical results are in good agreement with the test results. (authors)

  5. Development of advanced methods for analysis of experimental data in diffusion

    Science.gov (United States)

    Jaques, Alonso V.

    There are numerous experimental configurations and data analysis techniques for the characterization of diffusion phenomena. However, the mathematical methods for estimating diffusivities traditionally do not take into account the effects of experimental errors in the data, and often require smooth, noiseless data sets to perform the necessary analysis steps. The current methods used for data smoothing require strong assumptions which can introduce numerical "artifacts" into the data, affecting confidence in the estimated parameters. The Boltzmann-Matano method is used extensively in the determination of concentration-dependent diffusivities, D(C), in alloys. In the course of analyzing experimental data, numerical integrations and differentiations of the concentration profile are performed; these steps require smoothing of the data prior to analysis. We present here an approach to the Boltzmann-Matano method that is based on a regularization method for estimating the differentiation operation on the data, i.e., estimating the concentration gradient term, which is important in the analysis process for determining the diffusivity. This approach, therefore, has the potential to be less subjective, and in numerical simulations it shows increased accuracy in the estimated diffusion coefficients. We also present a regression approach for estimating linear multicomponent diffusion coefficients that eliminates the need to pre-treat or pre-condition the concentration profile. This approach fits the data to a functional form of the mathematical expression for the concentration profile, and allows us to determine the diffusivity matrix directly from the fitted parameters. Reformulation of the equation for the analytical solution is done in order to reduce the size of the problem and accelerate convergence. The objective function for the regression can incorporate point estimates of the error in the concentration, improving the statistical confidence in the estimated diffusivity matrix
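
    A minimal sketch of the classical Boltzmann-Matano evaluation on a synthetic error-function profile (the thesis's contribution is a regularized gradient estimate; the naive np.gradient below is exactly the step it improves):

```python
import numpy as np
from scipy.special import erfc

t = 3600.0                                    # diffusion time [s]
D_true = 1e-14                                # [m^2/s]
x = np.linspace(-50e-6, 50e-6, 801)           # position about the Matano plane [m]
C = 0.5 * erfc(x / (2.0 * np.sqrt(D_true * t)))   # C goes 1 (left) -> 0 (right)

dCdx = np.gradient(C, x)                      # the noise-sensitive step
i = len(x) // 2                               # evaluate at the Matano plane, C = 0.5
integral = -np.trapz(x[i:], C[i:])            # int x dC from C(+inf)=0 up to C=0.5
D_est = -(1.0 / (2.0 * t)) * integral / dCdx[i]
print(f"D estimated {D_est:.2e} vs true {D_true:.2e} m^2/s")
```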

  6. Method Development of Cadmium Investigation in Rice by Radiochemical Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Promsawad, Arunee; Pareepart, Ratirot; Laoharojanaphand, Sirinart; Arunee, Kongsakpaisal

    2007-08-01

    A radiochemical neutron activation analysis method for the determination of cadmium was investigated. Chemical separation of cadmium utilized ion-exchange chromatography on a strongly basic anion-exchange resin, BIO-RAD 1X8 (chloride form). An adsorbing medium of 2M HCl was found to be the most suitable among the concentrations attempted (2, 4, 6, 8 and 10M HCl), and the eluent for desorption of cadmium from the column was 8M NH3 solution. A chemical yield of 95% was found. The method was evaluated by analyzing certified reference materials containing 0.5 µg/g (SRM 1577b, Bovine Liver) and 2.48 µg/g (SRM 1566b, Oyster Tissue) cadmium. The agreement of the results with the certified values is within 92% for Bovine Liver and 96% for Oyster Tissue. The method developed was applied to determine cadmium concentrations in contaminated Thai rice; the concentrations found ranged from 7.4 to 578.9 ppb.

  7. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  8. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  9. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to the calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions or concentration fluctuations; this is quite different from the real situation in a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can remove the previous limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but could also be extended to aerial, submarine, or space risk analyses in the near future.
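    The post-processing the abstract describes amounts, in essence, to frequency-weighting per-scenario consequence fields on the 3D grid. A schematic sketch under assumed inputs (the fatality-probability fields would come from CFD consequence runs; here they are random placeholders):

```python
import numpy as np

def individual_risk_field(frequencies, fatality_fields):
    """IR(x) = sum_i f_i * P_fatality_i(x) on a 3-D grid: scenario frequency
    (1/yr) times the conditional fatality probability from each CFD run."""
    risk = np.zeros_like(fatality_fields[0])
    for f, p in zip(frequencies, fatality_fields):
        risk += f * p
    return risk

# two placeholder scenarios on a small grid; iso-surfaces (e.g. the 1e-6 /yr
# surface) could then be extracted with a marching-cubes routine
rng = np.random.default_rng(1)
fields = [rng.random((20, 20, 10)) for _ in range(2)]
print(individual_risk_field([1e-4, 5e-5], fields).max())
```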

  10. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  11. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
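    The eigenvalue criterion is easy to state concretely. Below is a plain Monte Carlo sketch of the probability-of-instability calculation for a single uncertain mode; the paper's fast probability integration and adaptive importance sampling are far more efficient, and the distributions here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def unstable(m, c, k):
    """A mode m*q'' + c*q' + k*q = 0 is unstable when any eigenvalue of its
    first-order state matrix has a non-negative real part."""
    A = np.array([[0.0, 1.0], [-k / m, -c / m]])
    return np.linalg.eigvals(A).real.max() >= 0.0

n = 20_000
hits = sum(unstable(1.0, rng.normal(0.02, 0.05), rng.normal(1.0, 0.1))
           for _ in range(n))
print("P(instability) ~", hits / n)   # dominated here by the chance of c <= 0
```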

  12. Development of a micrometre-scale radiographic measuring method for residual stress analysis

    International Nuclear Information System (INIS)

    Moeller, D.

    1999-01-01

    The radiographic method described uses micrometre X-ray diffraction for high-resolution residual stress analysis in single crystals. The focus is on the application of two X-ray optics (glass capillaries) for shaping a sufficiently fine and intense primary beam. Owing to the use of a suitable single-grain measuring and analysis method, the resolution achieved is applicable to the characteristic grain sizes of many materials. (orig.) [de

  13. Development of multi-dimensional analysis method for porous blockage in fuel subassembly. Numerical simulation for 4 subchannel geometry water test

    International Nuclear Information System (INIS)

    Tanaka, Masa-aki; Kamide, Hideki

    2001-02-01

    This investigation deals with porous blockage in a wire-spacer-type fuel subassembly of Fast Breeder Reactors (FBRs). A multi-dimensional analysis method for a porous blockage in a fuel subassembly is developed using the standard k-ε turbulence model with typical correlations from handbooks. The purpose of this analysis method is to evaluate the position and magnitude of the maximum temperature, and to investigate the thermo-hydraulic phenomena in the porous blockage. Verification of the method was conducted against the results of a 4-subchannel-geometry water test. It was revealed that the evaluation of the porosity distribution and the particle diameter in a porous blockage is important for predicting the temperature distribution. The method could simulate the spatial characteristics of the velocity and temperature distributions in the blockage and evaluate the pin surface temperature inside the porous blockage. Through this verification, it is shown that the multi-dimensional analysis method is useful for predicting the thermo-hydraulic field and the highest temperature in a porous blockage. (author)

  14. Developing the UIC 406 Method for Capacity Analysis

    DEFF Research Database (Denmark)

    Khadem Sameni, Melody; Landex, Alex; Preston, John

    2011-01-01

    This paper applies an improvement cycle for analysing and enhancing capacity utilisation of an existing timetable. Macro and micro capacity utilisation are defined based on the discrete nature of capacity utilisation, and different capacity metrics are analysed. In the category of macro asset utilisation, two methods, CUI and the UIC 406, are compared with each other. A British and a Danish case study are explored for a periodic and a non-periodic timetable: 1. Freeing up capacity by omitting the train that has the highest capacity consumption (British case study). 2. Adding trains to use the spare capacity (Danish case study). Some suggestions are made to develop meso indices by using the UIC 406 method to decide between the alternatives for adding or removing trains.
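    The UIC 406 notion of macro capacity utilisation — compress the timetable to minimum headways and compare the occupied time with the period — can be illustrated in a few lines. This is a deliberate simplification (one line section, homogeneous traffic) of the blocking-time compression the standard actually prescribes:

```python
def uic406_occupancy(n_trains, min_headway_min, period_min):
    """Capacity consumption of one section after compressing a homogeneous
    timetable so consecutive trains follow at the minimum headway."""
    occupation = (n_trains - 1) * min_headway_min if n_trains > 1 else 0
    return occupation / period_min

# 8 trains/h at a 5-minute minimum headway -> 35/60, about 58 % consumption;
# omitting the most consuming train or adding one moves this figure directly
print(uic406_occupancy(8, 5, 60))
```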

  15. Viscous wing theory development. Volume 1: Analysis, method and results

    Science.gov (United States)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  16. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory's contributions to the early development effort, and the testing of two candidate methods, are the subjects of this paper

  17. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this medical department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the calculation of teleroentgenograms easier because the methodological points will be placed automatically.

  18. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  19. Analysis and development of stochastic multigrid methods in lattice field theory

    International Nuclear Information System (INIS)

    Grabenstein, M.

    1994-01-01

    We study the relation between the dynamical critical behavior and the kinematics of stochastic multigrid algorithms. The scale dependence of acceptance rates for nonlocal Metropolis updates is analyzed with the help of an approximation formula. A quantitative study of the kinematics of multigrid algorithms in several interacting models is performed. We find that for a critical model with Hamiltonian H(Φ), absence of critical slowing down can only be expected if the expansion of ⟨H(Φ+ψ)⟩ in terms of the shift ψ contains no relevant term (mass term). The prediction of this rule was verified in a multigrid Monte Carlo simulation of the Sine Gordon model in two dimensions. Our analysis can serve as a guideline for the development of new algorithms: we propose a new multigrid method for nonabelian lattice gauge theory, the time slice blocking. For SU(2) gauge fields in two dimensions, critical slowing down is almost completely eliminated by this method, in accordance with the theoretical prediction. The generalization of the time slice blocking to SU(2) in four dimensions is investigated analytically and by numerical simulations. Compared to two dimensions, the local disorder in the four-dimensional gauge field leads to kinematical problems. (orig.)

  20. Mass spectrometric methods for trace analysis of metals

    International Nuclear Information System (INIS)

    Bahr, U.; Schulten, H.R.

    1981-01-01

    A brief outline is given of the principles of mass spectrometry (MS) and the fundamentals of qualitative and quantitative mass spectrometric analysis emphasizing recent developments and results. Classical methods of the analysis of solids, i.e. spark-source MS and thermal ionization MS, as well as recent methods of metal analysis are described. Focal points in this survey of recently developed techniques include secondary ion MS, laser probe MS, plasma ion source MS, gas discharge MS and field desorption MS. Here, a more detailed description is given and the merits of these emerging methods are discussed more explicitly. In particular, the results of the field desorption techniques in elemental analyses are reviewed and critically evaluated

  1. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of VTT in reliability and safety design and analysis techniques has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects

  2. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...
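    The paired t-test used to compare the two quantification methods is a one-liner in common statistics libraries. A sketch with invented γ-oryzanol contents for the same six oil samples measured by both methods:

```python
from scipy import stats

densitometric  = [2.81, 2.95, 3.10, 2.77, 3.02, 2.90]   # mg/g, illustrative
image_analysis = [2.84, 2.92, 3.15, 2.80, 2.98, 2.93]   # mg/g, illustrative

t, p = stats.ttest_rel(densitometric, image_analysis)
print(f"paired t = {t:.3f}, p = {p:.3f}")   # p > 0.05: no significant difference
```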

  3. Development of Reverse-Phase HPLC Method for Simultaneous ...

    African Journals Online (AJOL)

    Erah

    Purpose: To develop a simple, sensitive and rapid reverse phase HPLC method for the simultaneous analysis of metoprolol succinate and hydrochlorothiazide in a solid dosage form. Methods: ... Extraction was carried out three times with ...

  4. Development of analytical methods for the determination of some radiologically important elements in biological materials using neutron activation analysis

    International Nuclear Information System (INIS)

    Dang, H.S.; Jaiswal, D.D.; Pullat, V.R.; Krishnamony, S.

    1998-01-01

    This paper describes the analytical methods developed for the estimation of Cs, I, Sr, Th and U in biological materials such as food and human tissues. The methods employ both instrumental neutron activation analysis (INAA) and radiochemical neutron activation analysis (RNAA). The adequacy of these methods for determining the concentrations of the above elements in dietary and tissue materials was also studied. The study showed that the analytical methods described in this paper are adequate for the determination of Cs, Sr, Th and U in all kinds of biological samples. In the case of I, however, the method is adequate only for determining its concentration in thyroid, and needs to be modified to improve its sensitivity for the determination of I in diet samples. (author)

  5. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, mainly reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  6. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.
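    The heart of the integrated force method — equilibrium and compatibility solved simultaneously for the member forces — fits in a toy example. For three parallel springs sharing a load (stiffness values invented), one equilibrium row plus two compatibility rows give a square system in the three forces:

```python
import numpy as np

k = np.array([2.0, 1.0, 3.0])   # spring stiffnesses, illustrative
P = 12.0                        # applied load

S = np.array([
    [1.0,        1.0,         1.0       ],   # equilibrium: F1 + F2 + F3 = P
    [1.0 / k[0], -1.0 / k[1], 0.0       ],   # compatibility: e1 - e2 = 0
    [0.0,        1.0 / k[1],  -1.0 / k[2]],  # compatibility: e2 - e3 = 0
])
F = np.linalg.solve(S, np.array([P, 0.0, 0.0]))
print(F)   # [4. 2. 6.]: member forces obtained directly, in stiffness proportion
```

    A stiffness-method solution would first compute the single displacement and then back out the forces; the force method delivers them as the primary unknowns.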

  7. Development of achiral and chiral 2D HPLC methods for analysis of albendazole metabolites in microsomal fractions using multivariate analysis for the in vitro metabolism.

    Science.gov (United States)

    Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V

    2013-08-01

    In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for the direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied to the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. The development of two fully automated achiral-chiral and chiral-chiral high performance liquid chromatography (HPLC) methods is described for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2) and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted access media bovine serum albumin column for sample clean-up, followed by an achiral phenyl column (15.0 × 0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0 × 0.46 cm I.D.). The chiral 2D HPLC method was applied to the development of a compromise condition for the in vitro metabolism of ABZ by means of experimental design involving multivariate analysis.

  8. CZECHOSLOVAK FOOTPRINTS IN THE DEVELOPMENT OF METHODS OF THERMOMETRY, CALORIMETRY AND THERMAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pavel Holba

    2012-07-01

    Full Text Available A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in establishing the special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as that of Vladimir Šatava in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) which were published by noteworthy postwar Czechoslovak scholars and scientists and by their disciples in 1950-1980. The 227 itemized references with titles show a rich scientific productivity, revealing that many of these works were ahead of their time even in an international context.

  9. Analysis and development of methods of correcting for heterogeneities to cobalt-60: computing application

    International Nuclear Information System (INIS)

    Kappas, K.

    1982-11-01

    The purpose of this work is the analysis of the influence of inhomogeneities of the human body on the determination of the dose in Cobalt-60 radiation therapy. The first part is dedicated to the physical characteristics of inhomogeneities and to the conventional methods of correction. New methods of correction are proposed based on the analysis of the scatter. This analysis allows the physical characteristics of the inhomogeneities, and the corresponding modifications of the dose, to be taken into account with greater accuracy: "the differential TAR method" and "the beam subtraction method". The second part is dedicated to the computer implementation of the second correction method for routine application in hospital [fr
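    For context, the baseline such scatter-based methods refine is the classic ratio-of-TAR correction, in which the dose is scaled by tissue-air ratios evaluated at the geometric and density-scaled depths. A sketch with a crude stand-in for measured Co-60 TAR tables (the exponential falloff and layer data are assumptions):

```python
import math

def water_equivalent_depth(layers):
    """layers: (thickness_cm, relative_electron_density), surface to point."""
    return sum(t * rho for t, rho in layers)

def ratio_of_tar_cf(tar, depth_cm, layers, field_size_cm):
    """Inhomogeneity correction factor CF = TAR(d_eff) / TAR(d)."""
    return tar(water_equivalent_depth(layers), field_size_cm) / tar(depth_cm, field_size_cm)

toy_tar = lambda d, A: 1.05 * math.exp(-0.06 * d)  # crude Co-60-like falloff, assumed
layers = [(5.0, 1.0), (4.0, 0.3), (3.0, 1.0)]      # tissue / lung / tissue, 12 cm deep
print(ratio_of_tar_cf(toy_tar, 12.0, layers, 10.0))  # >1: lung raises the dose beyond it
```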

  10. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Chung Bang; Lee, S R; Kim, J M; Park, K L; Oh, S B; Choi, J S; Kim, Y S [Korea Advanced Institute of Science Technology, Daejeon (Korea, Republic of)

    1994-07-15

    In this study, methods for 3-D soil-structure interaction analysis have been studied. They are 3-D axisymmetric analysis method, 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code, named as 'KIESSI - PF', has been developed which is based on the 3-D axisymmetric finite element method coupled with infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'.

  11. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    International Nuclear Information System (INIS)

    Yun, Chung Bang; Lee, S. R.; Kim, J. M.; Park, K. L.; Oh, S. B.; Choi, J. S.; Kim, Y. S.

    1994-07-01

    In this study, methods for 3-D soil-structure interaction analysis have been studied. They are 3-D axisymmetric analysis method, 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code, named as 'KIESSI - PF', has been developed which is based on the 3-D axisymmetric finite element method coupled with infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'

  12. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette

    2008-01-01

    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call 'cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach, focusing on shop floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development...

  13. Compare the user interface of digital libraries\\' websites between the developing and developed countries in content analysis method

    Directory of Open Access Journals (Sweden)

    Gholam Abbas Mousavi

    2017-03-01

    Full Text Available Purpose: This study was performed with the goals of determining the items in the design and development of the user interfaces of digital libraries' websites, determining the best digital libraries' websites and discussing their advantages and disadvantages, and analyzing and comparing digital libraries' websites in developing countries with those in developed countries. Methodology: To do so, 50 digital libraries' websites were selected by purposive sampling. By analyzing the level of development of the countries in the sample, 12 websites were classified as belonging to developing and 38 to developed countries. Their content was then studied using qualitative content analysis. The study was conducted using a researcher-constructed checklist containing 12 main categories and 44 items, whose validity was established by the content validity method. The data were analyzed in SPSS (version 16). Findings: The results showed that in terms of "online resources", "library collection" and "navigation", there is a significant relationship between the digital library user interface designs in the two types of countries. Results: The items "online public access catalogue (OPAC)" and "visit statistics" were observed in more of the developing countries' digital libraries' websites, whereas the item "menu and submenus to introduce library sections" was present in more of the developed countries' websites. Moreover, by analyzing the number of items in the selected websites, "American Memory" with 44 items, "International Children's Digital Library" with 40 items, and "California" with 39 items were the best, and "Berkeley Sun Site" with 10 items was the worst website. Despite there being more and better quality digital libraries in developed countries, the quality of digital libraries' websites in developing countries is considerable. In general, some of the newly established

  14. Development of evaluation method for the quality of NPP MCR operators' communication using work domain analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, In Seok

    2010-02-01

    Evolving work demands have driven industry toward computerization, which makes systems complex and complicated; this field is called Complex Socio-Technical Systems. Communication failure is one problem of Complex Socio-Technical Systems, and it has been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the many studies on the severity of communication failure, there is no evaluation method for operators' communication quality in NPPs. Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. In order to develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, such as the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing the evaluation method. In order to apply the proposed method, nine teams working in NPPs participated in the field simulation. Evaluation results reveal that operators' communication quality was higher when a larger portion of the components in the developed evaluation criteria was mentioned. Therefore, the proposed method could be useful for evaluating communication quality in any complex system. In order to verify that the proposed method is meaningful for evaluating communication quality, the evaluation results were further investigated against objective performance measures. This further investigation also supports the idea that the proposed method can be used in evaluating communication quality

  15. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.

  16. DEVELOPMENT OF METHODS FOR STABILITY ANALYSIS OF TOWER CRANES

    Directory of Open Access Journals (Sweden)

    Sinel'shchikov Aleksey Vladimirovich

    2018-01-01

    Full Text Available Tower cranes are one of the main tools for load-handling operations during construction. The design of tower cranes is carried out in accordance with RD 22-166-86 "Construction of tower cranes. Rules of analysis", according to which stability is ensured by not exceeding the upper limit of the overturning moment. The calculation of these moments is carried out with the use of empirical coefficients and is quite time-consuming. Moreover, the normative methodology considers only the static position of the crane: it does not take into account the dynamic transients due to crane operation (lifting and swinging of the load, boom slewing) or the presence of dynamic external loads (e.g. from wind) for different orientations of the crane. This paper proposes a method of determining the stability coefficient of the crane based on the reaction forces acting at the support points - the points of contact of the wheels with the crane rail track - which allows us, at the design stage, to investigate the stability of a tower crane under variable external loads and operating conditions. Subject: the safety of tower crane operation with regard to compliance with the regulatory requirements for ensuring stability, both at the design stage and at the operational stage. Research objectives: increasing the safety of operation of tower cranes by improving the design methodology to ensure static and dynamic stability. Materials and methods: analysis and synthesis of the regulatory framework and modern research on the safe operation of tower cranes, and numerical simulation. Results: we propose a formula for the stability analysis of tower cranes using the resulting reaction forces at the crane supports at the points of contact of the wheels with the rail track.
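    The reaction-force view of stability suggests a simple dimensionless margin: the crane approaches overturning as the smallest wheel reaction approaches zero. The sketch below only illustrates that idea; it is not the formula proposed in the paper, and all numbers are invented:

```python
def reaction_margin(reactions_kN):
    """Margin from the wheel/rail contact reactions: 1.0 when the load is
    shared evenly, approaching 0 as one wheel is about to lift off."""
    mean = sum(reactions_kN) / len(reactions_kN)
    return min(reactions_kN) / mean

print(reaction_margin([120.0, 118.0, 122.0, 119.0]))  # ~0.99, calm static case
print(reaction_margin([185.0, 180.0, 60.0, 55.0]))    # ~0.46 under a moment load
```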

  17. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    Science.gov (United States)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    References cited: "...Calculation Method Based on a Hierarchical Simulation", Journal of Disaster Research, Vol. 11, No. 4; T. Arikawa, K. Hamaguchi, K. Kitagawa and T. Suzuki (2009): "Development of Numerical Wave Tank Coupled with Structure Analysis Based on FEM", Journal of J.S.C.E., Ser. B2 (Coastal Engineering), Vol. 65, No. 1; T. Arikawa et al. (2012): "Failure Mechanism of Kamaishi Breakwaters due to the Great East Japan Earthquake Tsunami", 33rd International Conference on Coastal Engineering, No. 1191

  18. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples in support of the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tanks 241-C-103 (Tank C-103) and T-111, and the transfer of the documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work

  19. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    Full Text Available A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples using solid phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to achieve the enrichment of trace hydrocarbons of low abundance and coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature and fiber type, were systematically optimized. The results not only demonstrated a high extraction yield but also showed that no hydrogen isotopic fractionation was observed during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The applications of the SPME-GC/IRMS method were evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for repeated measurements, and hydrogen isotope values from C1 to C9 could be obtained with satisfying repeatability. The SPME-GC/IRMS method fitted with the PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons, and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.
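    The isotope ratios such a GC/IRMS measurement reports are conventionally expressed in delta notation against the VSMOW reference; the per-mil figures quoted above come from this arithmetic (the sample ratio below is illustrative):

```python
R_VSMOW = 155.76e-6                 # accepted 2H/1H ratio of the VSMOW standard

def delta2H_permil(r_sample):
    """delta 2H = (R_sample / R_VSMOW - 1) * 1000, in permil."""
    return (r_sample / R_VSMOW - 1.0) * 1000.0

print(delta2H_permil(131.0e-6))     # ~ -159 permil for this assumed sample ratio
```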

  20. Analytical Methods Development in Support of the Caustic Side Solvent Extraction System

    International Nuclear Information System (INIS)

    Maskarinec, M.P.

    2001-01-01

    The goal of the project reported herein was to develop and apply methods for the analysis of the major components of the solvent system used in the Caustic-Side Solvent Extraction (CSSX) process. These include the calix(4)arene, the modifier 1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, and tri-n-octylamine. In addition, it was an objective to develop methods that would allow visualization of other components under process conditions. These analyses would include quantitative laboratory methods for each of the components, quantitative analysis of expected breakdown products (4-sec-butylphenol and di-n-octylamine), and qualitative investigations of possible additional breakdown products under a variety of process extremes. These methods would also provide a framework for process analysis should a pilot facility be developed. Two methods were implemented for sample preparation of aqueous phases. The first involves solid-phase extraction and produces quantitative recovery of the solvent components and degradation products from the various aqueous streams. This method can be automated and is suitable for use in radiation-shielded facilities. The second is a variation of an established EPA liquid-liquid extraction procedure. This method is also quantitative and results in a final extract amenable to virtually any instrumental analysis. Two HPLC methods were developed for quantitative analysis. The first is a reverse-phase system with variable-wavelength UV detection. This method is excellent from a quantitative point of view. The second method is a size-exclusion method coupled with dual UV and evaporative light scattering detectors. This method is much faster than the reverse-phase method and allows for qualitative analysis of other components of the waste. For tri-n-octylamine and other degradation products, a GC method was developed and subsequently extended to GC/MS. All methods have precision better than 5%. The combination of these methods

  1. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high-performance liquid chromatography ...

  2. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. P. Kalinov

    2012-01-01

    Full Text Available The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of each method's application, the following criteria have been proposed: the volume of input data required for establishing a diagnosis, data content, the software and hardware level, and the execution time of the vibration diagnostics. According to these criteria, a classification of vibration diagnostics methods is presented with a view to determining their advantages and disadvantages and identifying directions for their development and improvement. The paper contains a comparative assessment of the methods in accordance with the proposed criteria. According to this assessment, the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
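    Envelope spectral analysis, rated most efficient here, demodulates the signal with the Hilbert transform before taking the spectrum, so that low-frequency fault modulation buried under a high-frequency carrier becomes visible. A self-contained sketch on a synthetic amplitude-modulated signal (all frequencies are invented):

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(x, fs):
    """Amplitude spectrum of the analytic-signal envelope of a vibration record."""
    env = np.abs(hilbert(x - np.mean(x)))       # demodulated envelope
    env -= env.mean()
    spec = np.abs(np.fft.rfft(env)) / len(env)
    return np.fft.rfftfreq(len(env), 1.0 / fs), spec

# 3 kHz carrier amplitude-modulated at a 120 Hz "fault" rate, 1 s at 20 kHz
fs = 20_000
t = np.arange(0, 1.0, 1.0 / fs)
x = (1 + 0.5 * np.sin(2 * np.pi * 120 * t)) * np.sin(2 * np.pi * 3000 * t)
freqs, spec = envelope_spectrum(x, fs)
print(freqs[np.argmax(spec[1:]) + 1])           # ~120 Hz modulation line recovered
```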

  3. Applicability of soil-structure interaction analysis methods for earthquake loadings (V)

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Kim, J. K.; Yoon, J. Y.; Chin, B. M.; Yang, T. S.; Park, J. Y.; Cho, J. R.; Ryu, H.

    1997-07-01

    The ultimate goals of this research are to cultivate the capability of accurate SSI analysis and to develop an effective soil-structure interaction analysis method and computer program by comparing analysis results obtained in the Lotung/Hualien LSST projects. The scope of this study is to establish the method of soil-structure interaction analysis using hyperelements and to develop a computer program for SSI analysis, to carry out parametric studies for understanding the characteristics and the applicability of hyperelements, and to verify the validity and the applicability of this method (or program) through the analysis of the seismic response in the Hualien LSST project. In this study, we verified the validity and the efficiency of the soil-structure interaction analysis method using hyperelements and developed computer programs using hyperelements. Based on the 1-dimensional wave propagation theory, we developed a computer program for free-field analysis considering the primary non-linearity of seismic responses, and using this program we computed the effective ground earthquake motions of soil regions. The computer programs using hyperelements can treat the non-homogeneity of soil regions very easily and perform the analysis quickly through the use of analytical solutions in the horizontal direction. So this method would be a very efficient and practical method

  4. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Chung Bang; Lee, S. R.; Kim, J. M.; Park, K. L.; Oh, S. B.; Choi, J. S.; Kim, Y. S. [Korea Advanced Institute of Science Technology, Daejeon (Korea, Republic of)

    1994-07-15

    In this study, methods for 3-D soil-structure interaction analysis have been studied. They are 3-D axisymmetric analysis method, 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code, named as 'KIESSI - PF', has been developed which is based on the 3-D axisymmetric finite element method coupled with infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'.

  5. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  6. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model Originating

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  7. Development of mechanical analysis module for simulation of SFR fuel rod behavior using finite element method

    International Nuclear Information System (INIS)

    Shin, Andong; Jeong, Hyedong; Suh, Namduk; Kim, Hyochan; Yang, Yongsik

    2014-01-01

    As the Korean SFR developer decided to adopt metal fuel, the current study focused on metal fuel instead of oxide fuel. The SFR metal fuel has been developed by the Korea Atomic Energy Research Institute (KAERI), and many efforts have focused on designing and manufacturing the fuel. Since the nuclear fuel is the first barrier against the release of radioactive isotopes, the fuel's integrity must be secured within an acceptable range during steady-state operation and accident conditions. Whereas the design and evaluation methodologies, code systems and test procedures for light water reactor fuel are well established, those for SFR fuel need further technical advances. From the regulatory point of view, there are still many challenging issues that must be addressed to secure the safety of the fuel and reactors. For this reason, the Korea Institute of Nuclear Safety (KINS) has launched a new project to develop regulatory technology for the SFR system, including the fuel area. The ALFUS code was developed by CRIEPI and employs mechanistic models for fission gas release and swelling of the fuel slug; in this code system, a finite element method was introduced to analyze the mechanical behavior of the fuel and cladding. The FEAST code is a more advanced code system for SFRs, which adopted mechanistic FGR and swelling models but still uses an analytical model to simulate fuel and cladding mechanical behavior. Based on the survey of previous studies, the fuel and cladding mechanical models should be improved. Analysis of the mechanical behavior of the fuel rod is crucial for evaluating the rod's overall integrity; in addition, contact between the fuel slug and cladding, or rod internal over-pressure, can cause rod failure during steady-state and other operating conditions. Most of the reference codes have a simplified mechanical analysis model, the so-called 'analytical model', because a detailed mechanical analysis requires a large amount of calculation time and computing power. Even

  8. Development of mechanical analysis module for simulation of SFR fuel rod behavior using finite element method

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Andong; Jeong, Hyedong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, Hyochan; Yang, Yongsik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    As the Korean SFR developer decided to adopt metal fuel, the current study focused on metal fuel instead of oxide fuel. The SFR metal fuel has been developed by the Korea Atomic Energy Research Institute (KAERI), and many efforts have focused on designing and manufacturing the fuel. Since the nuclear fuel is the first barrier against the release of radioactive isotopes, the fuel's integrity must be secured within an acceptable range during steady-state operation and accident conditions. Whereas the design and evaluation methodologies, code systems and test procedures for light water reactor fuel are well established, those for SFR fuel need further technical advances. From the regulatory point of view, there are still many challenging issues that must be addressed to secure the safety of the fuel and reactors. For this reason, the Korea Institute of Nuclear Safety (KINS) has launched a new project to develop regulatory technology for the SFR system, including the fuel area. The ALFUS code was developed by CRIEPI and employs mechanistic models for fission gas release and swelling of the fuel slug; in this code system, a finite element method was introduced to analyze the mechanical behavior of the fuel and cladding. The FEAST code is a more advanced code system for SFRs, which adopted mechanistic FGR and swelling models but still uses an analytical model to simulate fuel and cladding mechanical behavior. Based on the survey of previous studies, the fuel and cladding mechanical models should be improved. Analysis of the mechanical behavior of the fuel rod is crucial for evaluating the rod's overall integrity; in addition, contact between the fuel slug and cladding, or rod internal over-pressure, can cause rod failure during steady-state and other operating conditions. Most of the reference codes have a simplified mechanical analysis model, the so-called 'analytical model', because a detailed mechanical analysis requires a large amount of calculation time and computing power. Even

  9. Microlocal methods in the analysis of the boundary element method

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1993-01-01

    The application of the boundary element method in numerical analysis is based upon the use of boundary integral operators stemming from multiple layer potentials. The regularity properties of these operators are vital in the development of boundary integral equations and error estimates. We show...

  10. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H

    2006-03-01

    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully eludicated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.

  11. Developing Methods of praxeology to Perform Document-analysis

    DEFF Research Database (Denmark)

    Frederiksen, Jesper

    2016-01-01

    This paper provides a contribution to the methodological development of praxeologic document analysis of neoliberal welfare state policies. Different institutions related to the Danish healthcare area transform international health policies, and these institutions produce a range of strategies.... The different works are unique but at the same time part of a common neoliberal welfare state practice. They have a structural similarity as homologous strategies related to an institutional production field of health and social care services. From the construction of these strategies, it is thus possible to discuss the more general consequences of the neoliberal policies and their impact on nurses and their position as a health profession....

  12. Validation and further development of a novel thermal analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, E.H.; Shuttleworth, A.G.; Rousseau, P.G. [Pretoria Univ. (South Africa). Dept. of Mechanical Engineering

    1994-12-31

    The design of thermal and energy efficient buildings requires inter alia the investigation of the passive performance, natural ventilation, mechanical ventilation as well as structural and evaporative cooling of the building. Only when these fail to achieve the desired thermal comfort should mechanical cooling systems be considered. Few computer programs have the ability to investigate all these comfort regulating methods at the design stage. The QUICK design program can simulate these options with the exception of mechanical cooling. In this paper, QUICK's applicability is extended to include the analysis of basic air-conditioning systems. Since the design of these systems is based on indoor loads, it was necessary to validate QUICK's load predictions before extending it. This article addresses validation in general and proposes a procedure to establish the efficiency of a program's load predictions. This proposed procedure is used to compare load predictions by the ASHRAE, CIBSE, CARRIER, CHEETAH, BSIMAC and QUICK methods for 46 case studies involving 36 buildings in various climatic conditions. Although significant differences in the results of the various methods were observed, it is concluded that QUICK can be used with the same confidence as the other methods. It was further shown that load prediction programs usually under-estimate the effect of building mass and therefore over-estimate the peak loads. The details for the 46 case studies are available to other researchers for further verification purposes. With the confidence gained in its load predictions, QUICK was extended to include air-conditioning system analysis. The program was then applied to different case studies. It is shown that system size and energy usage can be reduced by more than 60% by using a combination of passive and mechanical cooling systems as well as different control strategies. (author)

  13. Nodal method for fast reactor analysis

    International Nuclear Information System (INIS)

    Shober, R.A.

    1979-01-01

    In this paper, a nodal method applicable to fast reactor diffusion theory analysis has been developed. This method has been shown to be accurate and efficient in comparison to highly optimized finite difference techniques. The use of an analytic solution to the diffusion equation as a means of determining accurate coupling relationships between nodes has been shown to be highly accurate and efficient in specific two-group applications, as well as in the current multigroup method

  14. Development and application of RP-HPLC methods for the analysis of transition metals and their radioactive isotopes in radioactive waste

    International Nuclear Information System (INIS)

    Seekamp, S.

    1999-07-01

    A major criterion in the final disposal of nuclear waste is to keep changes in the geosphere caused by the introduction of radioactive waste as small as possible and to prevent any escape into the biosphere in the long term. The Federal Office for Radiation Protection (BfS) has therefore established limit values for a number of nuclides. Verifying these limits has to date involved laborious wet chemical analysis. In order to accelerate quantification, there is a need to develop rapid multielement methods. HPLC methods represent a starting point for this development. Chemical separation is necessary to quantify β-emitters via their radiation, since they are characterized by a continuous energy spectrum. A method for quantifying transition metals and their radioactive isotopes in radioactive waste has been created by using a chelating agent to select the analytes and RP-HPLC to separate the complexes formed. In addition to separating the matrix, complexation on a precolumn has the advantage of enriching the analytes. The subject of this thesis is the development and application of the method, including studies of the mobile and stationary phases as well as the optimization of all parameters, such as pH value and sample volume, which influence separation, enrichment or detection. The method developed was successfully tested using cement samples. It was also used for investigations of ion exchange resins and for trace analysis in calcium fluoride. Furthermore, the transferability of the method to actinides was examined by using a different complexing agent. (orig.)

  15. Development of Tsunami PSA method for Korean NPP site

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In Kil; Park, Jin Hee

    2010-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, the evaluation of the tsunami return period is the major task. For the evaluation of the tsunami return period, numerical analysis and empirical methods can be applied. The method was applied to a nuclear power plant, the Ulchin 5 and 6 NPP, which is located on the east coast of the Korean peninsula. Through this study, the whole tsunami PSA working procedure was established and an example calculation was performed for a real nuclear power plant in Korea.
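
    For the empirical branch of the hazard analysis, a first estimate of the return period can be read directly from an annual-maximum series with a plotting-position formula; the sketch below applies the Weibull position T = (n + 1)/m to invented run-up data and only illustrates the idea, not the study's actual procedure.

        # Hypothetical annual-maximum tsunami run-up heights (m) at a coastal site.
        heights = sorted([0.1, 0.0, 0.3, 1.2, 0.2, 0.0, 2.5, 0.4, 0.1, 0.8,
                          0.0, 0.2, 3.1, 0.1, 0.5, 0.0, 0.3, 1.8, 0.2, 0.6],
                         reverse=True)
        n = len(heights)

        # The m-th largest annual maximum gets an empirical return period of
        # T = (n + 1) / m years (Weibull plotting position).
        for m, h in enumerate(heights[:5], start=1):
            print(f"run-up >= {h:.1f} m: return period ~ {(n + 1) / m:.1f} yr")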

  16. Developments of an Interactive Sail Design Method

    OpenAIRE

    S. M. Malpede; M. Vezza

    2000-01-01

    This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape, using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure dis...

  17. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.
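
    A minimal sketch of this kind of quantitative scoring: each candidate task receives weighted ratings, and the composite scores rank where automation pays off. The task names, ratings and weights below are hypothetical, not those of the AES analysis.

        # Hypothetical 1-5 ratings for candidate engineer tasks.
        tasks = {
            "terrain analysis":      {"benefit": 5, "feasibility": 4, "cost": 2},
            "minefield planning":    {"benefit": 4, "feasibility": 3, "cost": 3},
            "bridge reconnaissance": {"benefit": 3, "feasibility": 5, "cost": 1},
        }
        weights = {"benefit": 0.5, "feasibility": 0.3, "cost": -0.2}  # cost penalizes

        scores = {name: sum(weights[k] * v for k, v in ratings.items())
                  for name, ratings in tasks.items()}
        for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name:22s} score {s:+.2f}")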

  18. A method for studying decision-making by guideline development groups

    Directory of Open Access Journals (Sweden)

    Michie Susan

    2009-08-01

    Background: Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best suited to capturing influences on GDG decision-making. Methods: A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Results: Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. Conclusion: This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.

  19. Development of conjugate methods with gas chromatography for inorganic compounds analysis

    International Nuclear Information System (INIS)

    Baccan, N.

    1975-01-01

    The application of gas chromatography combined with mass spectrometry or with nuclear methods to the analysis of inorganic compounds is studied. The advantages of coupling a gas chromatograph with a quadrupole mass spectrometer or with a high resolution radiation detector are discussed. We also studied the formation and solvent extraction of metal chelates; an aliquot of the organic phase was directly injected into the gas chromatograph and the eluted compounds were detected by mass spectrometry or, when radioactive, by nuclear methods. (author)

  20. Pathways to lean software development: An analysis of effective methods of change

    Science.gov (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  1. Development of a sensitive and rapid method for rifampicin impurity analysis using supercritical fluid chromatography.

    Science.gov (United States)

    Li, Wei; Wang, Jun; Yan, Zheng-Yu

    2015-10-10

    A novel, simple, fast and efficient supercritical fluid chromatography (SFC) method was developed and compared with an RPLC method for the separation and determination of impurities in rifampicin. The separation was performed using a packed diol column and a mobile phase B (modifier) consisting of methanol with 0.1% ammonium formate (w/v) and 2% water (v/v). Overall satisfactory resolutions and peak shapes for rifampicin quinone (RQ), rifampicin (RF), rifamycin SV (RSV), rifampicin N-oxide (RNO) and 3-formylrifamycin SV (3-FR) were obtained by optimization of the chromatography system. With gradient elution of the mobile phase, all of the impurities and the active substance were separated within 4 min. Taking full advantage of the features of SFC (such as its particular selectivity, non-sloping baseline in gradient elution, and freedom from injection solvent effects), the method was successfully used for the determination of impurities in rifampicin, with more impurity peaks detected, better resolution achieved and much less analysis time needed compared with conventional reversed-phase liquid chromatography (RPLC) methods.

  2. Gap analysis: a method to assess core competency development in the curriculum.

    Science.gov (United States)

    Fater, Kerry H

    2013-01-01

    To determine the extent to which safety and quality improvement core competency development occurs in an undergraduate nursing program. The rapid change and increased complexity of health care environments demand that health care professionals be adequately prepared to provide high quality, safe care. A gap analysis compared the present state of competency development to a desirable (ideal) state. The core competencies, the Nurse of the Future Nursing Core Competencies, reflect the ideal state and represent minimal expectations for entry into practice from pre-licensure programs. Findings from the gap analysis suggest significant strengths in numerous competency domains, deficiencies in two competency domains, and areas of redundancy in the curriculum. Gap analysis provides valuable data to direct curriculum revision. Opportunities for competency development were identified, and strategies were created jointly with the practice partner, thereby enhancing the relevant knowledge, attitudes, and skills nurses need for clinical practice currently and in the future.

  3. SUBSURFACE CONSTRUCTION AND DEVELOPMENT ANALYSIS

    International Nuclear Information System (INIS)

    N.E. Kramer

    1998-01-01

    The purpose of this analysis is to identify appropriate construction methods and develop a feasible approach for construction and development of the repository subsurface facilities. The objective of this analysis is to support development of the subsurface repository layout for License Application (LA) design. The scope of the analysis for construction and development of the subsurface Repository facilities covers: (1) Excavation methods, including application of knowledge gained from construction of the Exploratory Studies Facility (ESF). (2) Muck removal from excavation headings to the surface. This task will examine ways of preventing interference with other subsurface construction activities. (3) The logistics and equipment for the construction and development rail haulage systems. (4) Impact of ground support installation on excavation and other construction activities. (5) Examination of how drift mapping will be accomplished. (6) Men and materials handling. (7) Installation and removal of construction utilities and ventilation systems. (8) Equipping and finishing of the emplacement drift mains and access ramps to fulfill waste emplacement operational needs. (9) Emplacement drift and access mains and ramps commissioning prior to handover for emplacement operations. (10) Examination of ways to structure the contracts for construction of the repository. (11) Discussion of different construction schemes and how to minimize the schedule risks implicit in those schemes. (12) Surface facilities needed for subsurface construction activities.

  4. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    Foutes, C.E.

    1987-01-01

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were developed and used as input to the analysis. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. The total cost of each level of a standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio was calculated for each alternative standard. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis.
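
    The cost-effectiveness ratio itself is simple arithmetic: the incremental cost of each successively stricter alternative divided by the incremental health effects averted. A sketch with invented numbers:

        # Hypothetical alternatives: total cost ($M) and health effects averted
        # (expected fatal cancers plus genetic effects over 10,000 yr) vs base case.
        alternatives = [
            ("shallow trench (base)",  0.0, 0.0),
            ("engineered barrier",    12.0, 3.5),
            ("deeper burial",         30.0, 4.2),
            ("concrete vault",        55.0, 4.6),
        ]

        prev_cost, prev_benefit = alternatives[0][1:]
        for name, cost, benefit in alternatives[1:]:
            icer = (cost - prev_cost) / (benefit - prev_benefit)
            print(f"{name:20s} ${icer:6.1f}M per additional health effect averted")
            prev_cost, prev_benefit = cost, benefit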

  5. A Method of Fire Scenarios Identification in a Consolidated Fire Risk Analysis

    International Nuclear Information System (INIS)

    Lim, Ho Gon; Han, Sang Hoon; Yang, Joon Eon

    2010-01-01

    Conventional fire PSA considers only two kinds of fire scenarios: fire without propagation, and fire with a single propagation to a neighboring compartment. Recently, a consolidated fire risk analysis using a single fault tree (FT) was developed. However, fire scenario identification in the new method remained similar to that of the conventional fire analysis method. The present study develops a new method of fire scenario identification for a consolidated fire risk analysis. An equation for fire propagation is developed to identify fire scenarios, and a mapping of the fire scenarios into the internal event risk model is discussed. Finally, an algorithm for an automatic program is suggested.
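
    The two scenario classes are straightforward to enumerate from a compartment adjacency map; the sketch below generates the confined-fire and single-propagation scenario sets for a hypothetical layout (the study's propagation equation and the mapping into the internal event model are not reproduced here).

        # Hypothetical compartment adjacency for one plant area.
        adjacent = {
            "cable room":   ["control room", "switchgear"],
            "control room": ["cable room"],
            "switchgear":   ["cable room", "turbine hall"],
            "turbine hall": ["switchgear"],
        }

        scenarios = []
        for origin, neighbours in adjacent.items():
            scenarios.append((origin,))               # fire confined to origin
            for target in neighbours:                 # single propagation step
                scenarios.append((origin, target))

        for s in scenarios:
            print(" -> ".join(s))
        print(f"{len(scenarios)} scenarios (confined + single propagation)")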

  6. Innovative method of RES integration into the regional energy development scenarios

    International Nuclear Information System (INIS)

    Klevas, Valentinas; Biekša, Kestutis; Murauskaitė, Lina

    2014-01-01

    Scarcity or abundance of energy resources usually depends on physical and geographical conditions in the region. However, the energy flow in the region also depends on the efficient use of energy resources, the rate of energy consumption and the possibility of using local renewable and non-renewable energy resources. Production, distribution and use of energy resources in the region are challenges for central and local government, business and social services, customers and other stakeholders. Development of the regional energy economy should be optimized according to the available energy flow in the region using a network system analysis method, which provides solutions for developing sustainable energy economy models. The network system analysis method makes it possible to optimize the use of local and renewable resources at the regional level and reveals available local energy resources. Efficient use of available regional resources and the use of renewable energy sources (RES) should be the main goals of regional energy system development. RES can compete with traditional fossil fuels on the condition that all hidden aspects are revealed. The network system analysis method makes it possible to map the energy flows in the region and to indicate the pros and cons of using renewable energy technologies. - Highlights: • RES integration into the regional energy development scenarios is done. • Innovative process network system (PNS) analysis method is used. • PNS method is used to optimize the use of local and renewable resources. • Analysis of energy flow in region using PNS method is done.

  7. Multiphysics methods development for high temperature gas cooled reactor analysis

    International Nuclear Information System (INIS)

    Seker, V.; Downar, T. J.

    2007-01-01

    Gas cooled reactors have been characterized as one of the most promising nuclear reactor concepts in the Generation-IV technology road map. Considerable research has been performed on the design and safety analysis of these reactors. However, the calculational tools being used to perform these analyses are not state-of-the-art and are not capable of performing detailed three-dimensional analyses. This paper presents the results of an effort to develop an improved thermal-hydraulic solver for pebble bed type high temperature gas cooled reactors. The solution method is based on the porous medium approach, and the momentum equation, including the modified Ergun resistance model for pebble beds, is solved in three-dimensional geometry. The heat transfer in the pebble bed is modeled considering the local thermal non-equilibrium between the solid and the gas, which results in two separate energy equations, one for each medium. The effective thermal conductivity of the pebble bed can be calculated from either the Zehner-Schluender or the Robold correlation. Both the fluid flow and the heat transfer are modeled in three-dimensional cylindrical coordinates and can be solved in steady-state and time-dependent modes. The spatial discretization is performed using the finite volume method and the theta-method is used in the temporal discretization. A preliminary verification was performed by comparing the results with experiments conducted at the SANA test facility. This facility is located at the Institute for Safety Research and Reactor Technology (ISR), Julich, Germany. Various experimental cases were modeled and good agreement in the gas and solid temperatures is observed. An ongoing effort is to model the control rod ejection scenarios described in the OECD/NEA/NSC PBMR-400 benchmark problem. In order to perform these analyses, the PARCS reactor simulator code will be coupled with the new thermal-hydraulic solver. Furthermore, some of the other anticipated accident scenarios in the benchmark...
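
    As a reduced illustration of the time discretization, the sketch below applies the theta-method to one-dimensional transient conduction with a single energy equation (the solver described above carries two coupled equations, for the solid and the gas, which is omitted here); theta = 0.5 gives Crank-Nicolson, theta = 1 backward Euler. All property values are hypothetical.

        import numpy as np

        # Theta-method for dT/dt = alpha * d2T/dx2, fixed end temperatures.
        n, Lx = 50, 1.0                      # interior nodes, slab length (m)
        dx = Lx / (n + 1)
        k, rho_c = 2.0, 1.0e3                # conductivity, volumetric heat capacity
        alpha = k / rho_c
        dt, theta, steps = 5.0, 0.5, 200
        T_left, T_right = 800.0, 300.0       # boundary temperatures (K)

        Lap = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1)) / dx**2
        A = np.eye(n) - theta * alpha * dt * Lap
        B = np.eye(n) + (1.0 - theta) * alpha * dt * Lap

        T = np.full(n, 300.0)                # initial temperature field (K)
        bc = np.zeros(n)
        bc[0], bc[-1] = T_left, T_right      # constant Dirichlet contributions
        for _ in range(steps):
            T = np.linalg.solve(A, B @ T + alpha * dt / dx**2 * bc)
        print(f"mid-plane temperature after {steps * dt:.0f} s: {T[n // 2]:.1f} K")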

  8. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
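
    The statistical comparison in such method-equivalence studies is a paired t-test over the same samples quantified by both methods; a minimal sketch with invented contents (requires scipy):

        from scipy import stats

        # Hypothetical gamma-oryzanol contents (mg/g) of six oil samples.
        densitometric = [2.31, 2.45, 2.12, 2.60, 2.38, 2.50]
        image_based   = [2.29, 2.48, 2.10, 2.57, 2.41, 2.52]

        t, p = stats.ttest_rel(densitometric, image_based)
        print(f"paired t = {t:.3f}, p = {p:.3f}")
        print("no significant difference" if p > 0.05 else "methods differ")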

  9. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда

    2015-06-01

    In this paper, physical-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of our own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are presented.

  10. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    Mattsson, Soeren; Thomas, Brian J

    2006-01-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)
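
    For the first technique listed, whole-body density is converted to percent fat with a two-compartment relation; the Siri equation is one widely used conversion (shown purely as an illustration, not necessarily the one adopted in this review):

        def percent_fat_siri(density_g_per_ml):
            """Siri two-compartment model: percent body fat from density."""
            return (4.95 / density_g_per_ml - 4.50) * 100.0

        for d in (1.030, 1.050, 1.070):   # hypothetical densitometry results
            print(f"density {d:.3f} g/mL -> {percent_fat_siri(d):.1f}% fat")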

  11. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  12. Development of a preparation and staining method for fetal erythroblasts in maternal blood: Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ

    1998-01-01

    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi

  13. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicology...

  14. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.; Henry, G.

    1999-01-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  15. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering; Behravesh, M.M. [Electric Power Research Institute, Palo Alto, CA (United States); Henry, G. [EPRI NDE Center, Charlotte, NC (United States)

    1999-09-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  16. Development of fuel assembly seismic analysis against vertical and horizontal earthquake

    International Nuclear Information System (INIS)

    Sato, T.; Akitake, J.; Kobayashi, H.; Azumi, S.; Koike, H.; Takeda, N.; Suzuki, S.

    2001-01-01

    Vertical vibration with large acceleration was observed in the KOBE earthquake in 1995. For PWR fuel assemblies, the vertical response has so far been calculated by a static analysis, but it should be calculated in detail by a dynamic analysis. Furthermore, mutual effects between horizontal and vertical motions attract our attention. For these reasons, a dynamic analysis method for the vertical direction was developed and linked with the previously developed method for the horizontal direction. This method takes the effect of vertical vibration into the horizontal vibration analysis as a change of horizontal stiffness, brought about by the axial compressive force. In this paper, fundamental test results used in developing the method are introduced, and a summary of the advanced method's procedure and analysis results is also given. (authors)
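
    The coupling mechanism described, lateral stiffness reduced by axial compressive force, has the classical first-order form k_eff = k0 (1 - P/P_cr); the sketch below evaluates it for a hypothetical pinned-pinned span and illustrates only the principle, not the authors' fuel assembly model.

        import math

        E, I, span = 2.0e11, 1.0e-8, 0.5       # Pa, m^4, m (hypothetical)
        k0 = 48 * E * I / span**3              # mid-span lateral stiffness
        P_cr = math.pi**2 * E * I / span**2    # Euler critical load

        for P in (0.0, 0.2 * P_cr, 0.5 * P_cr):
            k_eff = k0 * (1.0 - P / P_cr)      # first-order stiffness reduction
            print(f"P = {P / 1e3:5.1f} kN: lateral stiffness {k_eff / 1e6:.3f} MN/m")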

  17. A method for studying decision-making by guideline development groups.

    Science.gov (United States)

    Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan

    2009-08-05

    Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best suited to capturing influences on GDG decision-making. A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.

  18. Cost–benefit analysis method for building solutions

    International Nuclear Information System (INIS)

    Araújo, Catarina; Almeida, Manuela; Bragança, Luís; Barbosa, José Amarilio

    2016-01-01

    Highlights: • A new cost–benefit method was developed to compare building solutions. • The method considers energy performance, life cycle costs and investment willingness. • The graphical analysis helps stakeholders to easily compare building solutions. • The method was applied to a case study showing consistency and feasibility. - Abstract: The building sector is responsible for approximately 40% of the final energy consumption in Europe. However, more than 50% of this consumption can be reduced through energy-efficient measures. Our society is facing not only a severe and unprecedented environmental crisis but also an economic crisis of similar magnitude. In light of this, the EU has developed legislation promoting the use of the Cost-Optimal (CO) method in order to improve building energy efficiency, in which the selection criterion is based on life cycle costs. Nevertheless, studies show that the implementation of energy-efficient solutions is far from ideal. Therefore, it is very important to analyse the reasons for this gap between theory and implementation as well as to improve selection methods. This study aims to develop a methodology based on a cost-effectiveness analysis, which can be seen as an improvement to the CO method as it considers the investment willingness of stakeholders in the selection process of energy-efficient solutions. The method uses a simple graphical display in which the stakeholders' investment willingness is identified as the slope of a reference line, allowing easy selection between building solutions. This method will lead to the selection of solutions that are more desirable from the stakeholders' point of view and more energy efficient than those selected through the CO method.
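
    The selection rule reduces to comparing each solution's saving-to-cost slope against the reference-line slope that encodes investment willingness; a sketch with invented solutions:

        # Each solution: (extra investment cost, yearly energy-cost saving),
        # both relative to a reference solution; all figures hypothetical.
        solutions = {
            "extra roof insulation": (1500.0, 220.0),
            "triple glazing":        (6000.0, 300.0),
            "heat-pump upgrade":     (9000.0, 950.0),
        }
        required_slope = 0.08   # accept if >= 0.08 saved per unit invested, per yr

        for name, (cost, saving) in solutions.items():
            slope = saving / cost
            verdict = "select" if slope >= required_slope else "reject"
            print(f"{name:22s} slope {slope:.3f}  {verdict}")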

  19. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    Science.gov (United States)

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.

  20. Development and in house validation of a new thermogravimetric method for water content analysis in soft brown sugar.

    Science.gov (United States)

    Ducat, Giseli; Felsner, Maria L; da Costa Neto, Pedro R; Quináia, Sueli P

    2015-06-15

    Recently the use of brown sugar has increased due to its nutritional characteristics, thus requiring more rigid quality control. The development of a method for water content analysis in soft brown sugar by TG/DTA, with the application of different statistical tests, is carried out for the first time. The results of the optimization study suggest that a heating rate of 5 °C/min and an alumina sample holder improve the efficiency of the drying process. The validation study showed that thermogravimetry presents good accuracy and precision for water content analysis in soft brown sugar samples. This technique offers advantages over other analytical methods as it does not use toxic and costly reagents or solvents, it does not need any sample preparation, and it allows the identification of the temperature at which water is completely eliminated relative to other volatile degradation products. This is an important advantage over the official method (loss on drying).

  1. Sensitivity Analysis of Structures by Virtual Distortion Method

    DEFF Research Database (Denmark)

    Gierlinski, J.T.; Holnicki-Szulc, J.; Sørensen, John Dalsgaard

    1991-01-01

    ... are used in structural optimization, see Haftka [4]. The recently developed Virtual Distortion Method (VDM) is a numerical technique which offers an efficient approach to the calculation of the sensitivity derivatives. This method was originally applied to structural remodelling and collapse analysis, see...

  2. Analysis and synthesis of a logic control circuit by binary analysis methods

    International Nuclear Information System (INIS)

    Chicheportiche, Armand

    1974-06-01

    The analytical study of the logic circuits described in this report clearly demonstrates the efficiency of the methods proposed by binary analysis. This study is a very new approach to logic, and these mathematical methods are systematically precise in their applications. The detailed operation of an automatic system can be studied in a way that cannot be matched by other methods. The definition and use of transition equations allow the determination of the different commutations in the auxiliary switch functions of a sequential system. This new way of analysing digital circuits will certainly develop in the near future.

  3. A Comparison of Card-sorting Analysis Methods

    DEFF Research Database (Denmark)

    Nawaz, Ather

    2012-01-01

    This study investigates how the choice of analysis method for card sorting studies affects the suggested information structure for websites. In the card sorting technique, a variety of methods are used to analyse the resulting data. The analysis of card sorting data helps user experience (UX) designers to discover the patterns in how users make classifications and thus to develop an optimal, user-centred website structure. During analysis, the recurrence of patterns of classification between users influences the resulting website structure. However, the algorithm used in the analysis influences the recurrent patterns found and thus has consequences for the resulting website design. This paper draws attention to the choice of card sorting analysis methods and techniques and shows how it impacts the results. The research focuses on how the same card sorting data can lead to different website structures...
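
    The algorithm dependence the paper points at can be reproduced in a few lines: build the participants' co-occurrence matrix, convert it to distances, and cluster it under different linkage rules (requires scipy; the data are invented).

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        cards = ["prices", "delivery", "returns", "contact", "about us"]
        # How many of 10 hypothetical participants grouped each pair together.
        co = np.array([[10,  7,  6,  1,  0],
                       [ 7, 10,  8,  2,  1],
                       [ 6,  8, 10,  1,  1],
                       [ 1,  2,  1, 10,  6],
                       [ 0,  1,  1,  6, 10]], dtype=float)

        distance = 1.0 - co / 10.0            # similarity -> distance
        condensed = squareform(distance, checks=False)
        for method in ("average", "complete"):
            labels = fcluster(linkage(condensed, method=method),
                              t=2, criterion="maxclust")
            print(method, dict(zip(cards, labels)))

    With noisier data the two linkage rules can assign borderline cards to different groups, which is precisely the sensitivity of the resulting website structure to the analysis algorithm that the paper examines.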

  4. Development of Analytical Method for Detection of Some ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a simple method using solid-phase extraction along with liquid...

  5. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    Science.gov (United States)

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
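
    As a baseline for the univariate methods surveyed, fixed-effect inverse-variance pooling of per-study effect estimates looks as follows (the log odds ratios and standard errors are invented):

        import math

        # Per-study (log odds ratio, standard error), hypothetical values.
        studies = [(0.25, 0.12), (0.10, 0.20), (0.32, 0.15), (0.05, 0.25)]

        w = [1.0 / se**2 for _, se in studies]       # inverse-variance weights
        pooled = sum(wi * b for wi, (b, _) in zip(w, studies)) / sum(w)
        se_pooled = math.sqrt(1.0 / sum(w))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"pooled log OR = {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f})")
        print(f"pooled OR     = {math.exp(pooled):.2f}")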

  6. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    Moulin, D.

    1987-01-01

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  7. Development of fuel assembly seismic analysis against vertical and horizontal earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Sato, T.; Akitake, J.; Kobayashi, H. [Nuclear Development Corporation, Ibaraki (Japan); Azumi, S. [Kansai Electric Power co., inc., Osaka (Japan); Koike, H.; Takeda, N.; Suzuki, S. [Kobe Shipyard and Machinery Works, Mitsubishi Heavy Industries, LTD., Kobe (Japan)

    2001-07-01

    Vertical vibration with large acceleration was observed in the KOBE earthquake in 1995. For PWR fuel assemblies, the vertical response has so far been calculated by a static analysis, but it should be calculated in detail by a dynamic analysis. Furthermore, mutual effects between horizontal and vertical motions attract our attention. For these reasons, a dynamic analysis method for the vertical direction was developed and linked with the previously developed method for the horizontal direction. This method takes the effect of vertical vibration into the horizontal vibration analysis as a change of horizontal stiffness, brought about by the axial compressive force. In this paper, fundamental test results used in developing the method are introduced, and a summary of the advanced method's procedure and analysis results is also given. (authors)

  8. Development of a simple method for classifying the degree of importance of components in nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2006-01-01

    In order to analyze large amounts of trouble information from overseas nuclear power plants, it is necessary to select the information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a method of quantitatively and simply judging the significance of trouble information from overseas nuclear power plants was developed. (author)

  9. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Bioanalytical methods are widely used for the quantitative estimation of drugs and their metabolites in physiological matrices. These methods can be applied to studies in the areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article surveys the various reported methods to help analysts in choosing crucial parameters for new method development for carbamazepine and its derivatives, and also enumerates the metabolites and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography

  10. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    International Nuclear Information System (INIS)

    Svenson, Ola

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility to arrest the development between each two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems: a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis covers the suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standard, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. For example, the AEB method puts more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic, stand-alone method that has also been applied in fields other than nuclear power, such as traffic accident analysis.
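
    A minimal data-structure sketch of the barrier-function bookkeeping described above: each suggested barrier carries an effectiveness, an implementation probability and a cost, from which a crude value-per-cost ranking can be derived. The scoring rule here is illustrative, not the AEB manual's.

        from dataclasses import dataclass

        @dataclass
        class BarrierFunction:
            description: str
            effectiveness: float         # 0-1: chance the barrier arrests the sequence
            implementation_prob: float   # 0-1: chance it is actually implemented
            cost: float                  # implementation cost, arbitrary units

            def value_per_cost(self) -> float:
                return self.effectiveness * self.implementation_prob / self.cost

        barriers = [
            BarrierFunction("interlock blocks valve mis-alignment", 0.9, 0.8, 50.0),
            BarrierFunction("second operator verifies line-up",     0.6, 0.9, 10.0),
            BarrierFunction("revised procedure with checklist",     0.4, 0.7,  2.0),
        ]
        for b in sorted(barriers, key=lambda b: -b.value_per_cost()):
            print(f"{b.value_per_cost():.3f}  {b.description}")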

  11. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola [Stockholm Univ. (Sweden). Dept. of Psychology

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility to arrest the development between each two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems: a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis covers the suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standard, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. For example, the AEB method puts more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic, stand-alone method that has also been applied in fields other than nuclear power, such as traffic accident analysis.

  12. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action.

  13. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  14. Development of medical application methods using radiation. Radionuclide therapy

    International Nuclear Information System (INIS)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C.; Oh, B. H.; Hong, H. J.

    1999-04-01

    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials. (2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry or other basic life science research. (3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis methods of radioactivity, can be applied in basic research, such as radiation oncology and radiation biology.

  15. Development of medical application methods using radiation. Radionuclide therapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul, (Korea, Republic of); Oh, B. H. [Seoul National University. Hospital, Seoul (Korea, Republic of); Hong, H. J. [Antibody Engineering Research Unit, Taejon (Korea, Republic of)

    1999-04-01

    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials. (2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry or other basic life science research. (3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis methods of radioactivity, can be applied in basic research, such as radiation oncology and radiation biology.

  16. Coupling Neumann development and component mode synthesis methods for stochastic analysis of random structures

    Directory of Open Access Journals (Sweden)

    Driss Sarsri

    2014-05-01

    In this paper, we propose a method to calculate the first two moments (mean and variance) of the structural dynamic response of a structure with uncertain variables subjected to random excitation. For this, the Newmark method is used to transform the equation of motion of the structure into a quasi-static equilibrium equation in the time domain. The Neumann development method was coupled with Monte Carlo simulations to calculate the statistical values of the random response. The use of modal synthesis methods can reduce the dimensions of the model before integration of the equation of motion. Numerical applications have been developed to highlight the effectiveness of the method developed for analyzing the stochastic response of large structures.
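
    The core of the approach, expanding each perturbed quasi-static solve in a Neumann series inside a Monte Carlo loop, can be sketched as follows; the spring-supported chain model, the perturbation size and the sample count are hypothetical and chosen so that the series converges.

        import numpy as np

        rng = np.random.default_rng(0)
        n, n_samples, n_terms = 20, 2000, 4

        # Nominal stiffness: a spring-supported chain (diagonally dominant, so
        # the Neumann series converges for small perturbations).
        K0 = (np.diag(np.full(n, 4.0)) + np.diag(np.full(n - 1, -1.0), 1)
              + np.diag(np.full(n - 1, -1.0), -1))
        K0_inv = np.linalg.inv(K0)
        f = np.ones(n)                               # quasi-static load vector

        mid = np.empty(n_samples)
        for s in range(n_samples):
            dK = np.diag(rng.normal(0.0, 0.05, n))   # random stiffness deviation
            # Neumann series: (K0 + dK)^-1 f = sum_k (-K0^-1 dK)^k K0^-1 f
            term = K0_inv @ f
            u = term.copy()
            for _ in range(n_terms - 1):
                term = -K0_inv @ (dK @ term)
                u += term
            mid[s] = u[n // 2]

        print(f"mean mid-node response: {mid.mean():.4f}")
        print(f"std  mid-node response: {mid.std():.4f}")

    For larger perturbations the series no longer converges and a direct solve per sample is the safer choice; the attraction of the series is that the single factorization of K0 is reused across all samples.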

  17. Review of various dynamic modeling methods and development of an intuitive modeling method for dynamic systems

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2008-01-01

    Conventional static reliability analysis methods are inadequate for modeling dynamic interactions between components of a system. Various techniques such as dynamic fault tree, dynamic Bayesian networks, and dynamic reliability block diagrams have been proposed for modeling dynamic systems based on improvement of the conventional modeling methods. In this paper, we review these methods briefly and introduce dynamic nodes to the existing Reliability Graph with General Gates (RGGG) as an intuitive modeling method to model dynamic systems. For a quantitative analysis, we use a discrete-time method to convert an RGGG to an equivalent Bayesian network and develop a software tool for generation of probability tables.

  18. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state {alpha}-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.

  19. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    International Nuclear Information System (INIS)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting; Feng, Tao; Xu, Xueming; Jin, Zhengyu; Tian, Yaoqi

    2012-01-01

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the respective guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well corroborated by the previously reported X-ray diffraction (XRD) method and by NMR confirmatory experiments, except that the SR of decanoic acid, which has a larger size and a longer chain, was not consistent. It is therefore suggested that the TGA-based method is applicable for following the stoichiometric ratio of polycrystalline α-CD-based inclusion complexes with smaller and shorter-chain guests.
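
    The SR formulas deduced in the paper are not reproduced in the record, but the underlying conversion is a mass-to-mole ratio between the TGA step attributed to guest release and the cyclodextrin residue. The sketch below is a generic illustration of that conversion, not the authors' formulas; the masses are invented, and the molar masses are standard values for 4-cresol and α-CD.

```python
# Generic sketch: converting TGA mass-loss fractions to a guest:host mole ratio.
M_GUEST = 108.14   # g/mol, 4-cresol
M_CD    = 972.84   # g/mol, alpha-cyclodextrin

def stoichiometric_ratio(mass_loss_guest, mass_residue_cd):
    """Guest-to-host mole ratio from the masses attributed to each TGA step."""
    return (mass_loss_guest / M_GUEST) / (mass_residue_cd / M_CD)

# e.g. a 10.0 mg sample losing 1.00 mg in the guest-release step, 9.00 mg alpha-CD
print(f"SR (guest:host) = {stoichiometric_ratio(1.00, 9.00):.2f}")  # ~1.00, i.e. 1:1
```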

  20. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes, by a suitable method, the response relations between input parameter uncertainties and output uncertainties. The application of parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy with reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, applying mathematical statistics theory, yields the largest safety margin due to the reduction of conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)

  1. Analysis and development of the method for calculating calibration of the working plank in the cold tube roller rolling mills

    Directory of Open Access Journals (Sweden)

    S. V. Pilipenko

    2017-05-01

    Full Text Available This paper presents an analysis and development of the existing method for calculating the calibrated profile of the working planks of cold tube roller rolling (CTRR) mills, so as to ensure the required distribution of the energy-power parameters along the deformation cone. Based on the analysis, it is proposed to use Bezier curves when building the profile of the plank working surface. It was established that the use of Bezier spline curves for calculating the calibration of supporting planks makes it possible to calculate the parameters proceeding from the reduction over the external diameter. The proposed method for calculating the deformation parameters in CTRR mills is a development of the existing method and as such shows scientific novelty. Comparison of the plots of the distribution of the force parameters of the CTRR process along the deformation cone demonstrates the advantage of the proposed method. The decrease of the reduction value at the end of the deformation zone favors the manufacture of tubes with a smaller wall thickness deviation (especially the longitudinal one caused by the waviness induced by the cold pilgering process). The practical significance of the proposed method consists in the fact that calculating all zones of the plank by means of one dependence simplifies the manufacture of the plank on machines with programmed numerical control. In this case the change of the reduction parameters over the wall thickness will not exert a considerable influence on the character (and not the value) of the distribution of the force parameters along the

  2. Scientific methods for developing ultrastable structures

    International Nuclear Information System (INIS)

    Gamble, M.; Thompson, T.; Miller, W.

    1990-01-01

    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 μrad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs

  3. Instrumental neutron activation analysis as a routine method for rock analysis

    International Nuclear Information System (INIS)

    Rosenberg, R.J.

    1977-06-01

    Instrumental neutron activation methods for the analysis of geological samples have been developed. Special emphasis has been laid on the improvement of sensitivity and accuracy in order to maximize the quality of the analyses. Furthermore, the procedures have been automated as far as possible in order to minimize the cost of the analysis. A short review of the basic literature is given, followed by a description of the principles of the method. All aspects concerning the sensitivity are discussed thoroughly with regard to the analyst's possibilities of influencing them. Experimentally determined detection limits for Na, Al, K, Ca, Sc, Cr, Ti, V, Mn, Fe, Ni, Co, Rb, Zr, Sb, Cs, Ba, La, Ce, Nd, Sm, Eu, Gd, Tb, Dy, Yb, Lu, Hf, Ta, Th and U are given. The errors of the method are discussed, followed by the actions taken to avoid them. The most significant error was caused by flux deviation, but this was avoided by building a rotating sample holder for rotating the samples during irradiation. A scheme for the INAA of 32 elements is proposed. The method has been automated as far as possible, and an automatic γ-spectrometer and a computer program for the automatic calculation of the results are described. Furthermore, a completely automated uranium analyzer based on delayed neutron counting is described. The methods are discussed in view of their applicability to rock analysis. The sensitivity varies considerably from element to element: instrumental activation analysis is an excellent method for the analysis of some specific elements like the lanthanides, thorium and uranium, but less so for many other elements. The accuracy is good, varying from 2% to 10% for most elements. For most elements instrumental activation analysis is a rather expensive method; there are, however, a few exceptions. The most important of these is uranium. The analysis of uranium by delayed neutron counting is an inexpensive means for the analysis of the large numbers of samples needed for

  4. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R and D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking, for example, Bayesian methods combining expert judgement and feedback data. Finally, they propose a functional and material representation of the plant. This dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and the latter's insertion into the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  5. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    Recent decades have seen an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands on the energy consumption of buildings. The publications in the field refer to many different approaches to environmentally sustainable architecture, such as: ecological, green, bio-climatic, sustainable, passive, low-energy and environmental architecture. This PhD project sets out to gain a better understanding of environmentally sustainable architecture and the methodical approaches applied in the development of this type of architecture. The research methodology applied in the project combines a literature study of descriptions of methodical approaches and built examples with a sensitivity analysis and a qualitative interview with two designers from a best-practice example of a practice that has achieved environmentally sustainable architecture.

  6. Recent developments in methods for analysis of perfluorinated persistent pollutants

    International Nuclear Information System (INIS)

    Trojanowicz, Marek; Koc, Mariusz

    2013-01-01

    Perfluoroalkyl substances (PFASs) have proliferated into the environment on a global scale and are present in the organisms of animals and humans even in remote locations. Persistent organic pollutants of this kind have therefore stimulated substantial improvements in analytical methods. The aim of this review is to present recent achievements in PFASs determination in various matrices with different methods, and their comparison to measurements of total organic fluorine (TOF). Analytical methods used for PFASs determination are dominated by chromatography, mostly in combination with mass spectrometric detection. However, HPLC may also be hyphenated with conductivity or fluorimetric detection, and gas chromatography may be combined with flame ionization or electron capture detection. The presence of a large number of PFASs species in environmental and biological samples necessitates parallel attempts to develop a total PFASs index that reflects the total content of PFASs in various matrices. Increasing attention is currently paid to the determination of branched isomers of PFASs, and to their determination in food. (author)

  7. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    Science.gov (United States)

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  8. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    Science.gov (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-05

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Development of the numerical method for liquid metal magnetohydrodynamics (I). Investigation of the method and development of the 2D method

    International Nuclear Information System (INIS)

    Ohira, H.; Ara, K.

    2002-11-01

    Advanced electromagnetic components are being investigated in the Feasibility Studies on Commercialized FR Cycle System for application to the main cooling systems of a liquid metal fast reactor. Although many experiments and numerical analyses have been carried out at both high Reynolds numbers and high magnetic Reynolds numbers, the complex phenomena could not be evaluated in detail. As the first step in the development of numerical methods for liquid metal magnetohydrodynamics, we investigated numerical methods that could be applied to electromagnetic components with both complex structures and highly turbulent magnetic fields. As a result, we selected the GSMAC (Generalized-Simplified MArker and Cell) method for calculating the liquid metal fluid dynamics, because it can easily be applied to complex flow fields. We also selected the vector-FEM for calculating the magnetic field of large components, because the method requires no iteration procedure. For highly turbulent magnetic fields, dynamic-SGS models would also be promising, because they can calculate the field directly without any experimental constants. In order to verify the GSMAC method and the vector-FEM, we developed 2D numerical models and calculated the magnetohydrodynamics in a large electromagnetic pump. It was estimated from these results that the methods are basically reasonable, because the calculated pressure differences showed tendencies similar to the experimental ones. (author)

  10. Development of inelastic design method for liquid metal reactor plants

    International Nuclear Information System (INIS)

    Takahashi, Yukio; Take, Kohji; Kaguchi, Hitoshi; Fukuda, Yoshio; Uno, Tetsuro.

    1991-01-01

    Effective utilization of inelastic analysis in structural design assessment is expected to play an important role in avoiding overly conservative designs of liquid metal reactor plants. Studies have been conducted by the authors to develop a guideline for the application of detailed inelastic analysis in design assessment. Both fundamental material characteristics tests and structural failure tests were conducted. Fundamental investigations were made of the inelastic analysis method and the creep-fatigue life prediction method based on the results of the material characteristics tests. It was demonstrated through the structural failure tests that the design method constructed on the basis of these fundamental investigations can predict failure lives of structures subjected to cyclic thermal loadings with sufficient accuracy. (author)

  11. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new analytical method for the quantitative analysis of miconazole ... a simple, reliable and robust method for the characterization of a mixture of the drugs in a dosage form.

  12. Development of headspace solid-phase microextraction method for ...

    African Journals Online (AJOL)

    A headspace solid-phase microextraction (HS-SPME) method was developed as a preliminary investigation using univariate approach for the analysis of 14 multiclass pesticide residues in fruits and vegetable samples. The gas chromatography mass spectrometry parameters (desorption temperature and time, column flow ...

  13. Linear Algebraic Method for Non-Linear Map Analysis

    International Nuclear Information System (INIS)

    Yu, L.; Nash, B.

    2009-01-01

    We present a newly developed method to analyze some non-linear dynamics problems, such as the Henon map, using matrix analysis methods from linear algebra. Choosing the Henon map as an example, we analyze the spectral structure, the tune-amplitude dependence, the variation of tune and amplitude during the particle motion, etc., using the method of Jordan decomposition, which is widely used in conventional linear algebra.
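
    As a hedged illustration of the linear-algebraic viewpoint (not the authors' code), the sketch below iterates the Henon map, locates its fixed point, and examines the eigenvalues of the Jacobian, i.e. the matrix whose spectral structure such an analysis builds on; the standard parameters a = 1.4, b = 0.3 are assumed.

```python
import numpy as np

def henon(x, y, a=1.4, b=0.3):
    """One iteration of the Henon map."""
    return 1.0 - a * x**2 + y, b * x

def jacobian(x, a=1.4, b=0.3):
    """Linearization (Jacobian matrix) of the map at a point with abscissa x."""
    return np.array([[-2.0 * a * x, 1.0],
                     [b,            0.0]])

a, b = 1.4, 0.3

# A short trajectory settling onto the attractor
x, y = 0.1, 0.1
for _ in range(1000):
    x, y = henon(x, y, a, b)
print("point on the attractor:", (round(x, 4), round(y, 4)))

# Fixed point: x* = 1 - a*x*^2 + b*x*  =>  a*x^2 + (1 - b)*x - 1 = 0
x_fp = (-(1 - b) + np.sqrt((1 - b)**2 + 4 * a)) / (2 * a)
eigvals = np.linalg.eigvals(jacobian(x_fp, a, b))
print("eigenvalues of the linearized map at the fixed point:", eigvals)
# |lambda| > 1 marks the unstable direction of this saddle point.
```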

  14. Applicability of soil-structure interaction analysis methods for earthquake loadings (IV)

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Kim, J. K.; Yoon, J. Y.; Chin, B. M.; Yang, T. S.; Park, D. H.; Chung, W.; Park, J. Y.

    1996-07-01

    The ultimate goals of this research are to cultivate the capability for accurate SSI analysis and to develop an effective soil-structure interaction analysis method and computer program by comparing analysis results obtained in the Lotung/Hualien LSST projects. In this research, a computer analysis program using hyper-elements was developed to analyze the forced vibration tests and seismic tests of the ongoing Hualien LSST project. Prediction and post-prediction analyses of the Hualien LSST forced vibration and seismic responses were executed with the developed program. This report is therefore mainly composed of two parts: a summary of the theoretical background of the hyper-element, and the prediction and post-prediction analysis results for the Hualien LSST forced vibration and seismic response tests executed with the developed program. Also, a coupling method between the hyper-element and a generalized three-dimensional finite element or a general axisymmetric finite element is presented for the further development of computer analysis programs for three-dimensional hybrid soil-structure interaction; for verification, the dynamic stiffnesses of rigid circular/rectangular foundations are calculated. It is confirmed that the program using hyper-elements is efficient and practical because it can easily take non-homogeneity into account and executes the analysis in a short time by using an analytic solution in the horizontal direction

  15. Elastic and inelastic methods of piping systems analysis: a preliminary review

    International Nuclear Information System (INIS)

    Reich, M.; Esztergar, E.P.; Spence, J.; Boyle, J.; Chang, T.Y.

    1975-02-01

    A preliminary review of the methods used for elastic and inelastic piping system analysis is presented. The following principal conclusions are reached: techniques for the analysis of complex piping systems operating in the high-temperature creep regime should be further developed; accurate analysis of a complete pipework system in creep using the ''complete shell finite element method'' is not feasible at present, and the ''reduced shell finite element method'' still requires excessive computer time and also requires further investigation of the compatibility problems associated with the pipe bend element, particularly when applied to cases involving general loading conditions; and with the current size of proposed high-temperature systems requiring the evaluation of long-term operating life (30 to 40 years), it is important to adopt a simplified analysis method. A design procedure for a simplified analysis method based on currently available techniques applied in a three-stage approach is outlined. The work required for implementation of these procedures, together with desirable future developments, is also briefly discussed. Other proposed simplified approximations are reviewed in the text as well. 101 references. (U.S.)

  16. Vibration analysis of the piping system using the modal analysis method, 1

    International Nuclear Information System (INIS)

    Fujikawa, Takeshi; Kurohashi, Michiya; Inoue, Yoshio

    1975-01-01

    A modal analysis method was developed for the vibration analysis of piping systems in nuclear or chemical plants, based on finite element theory, and verified by the sinusoidal vibration method. The natural vibration equation for piping was derived with stiffness, damping and mass matrices, and the eigenvalues were obtained by the usual method; the forced vibration equation for piping was then derived in the same manner, and particular solutions are given by the modal method from the eigenvalues of the natural vibration equation. Three simple piping models (one-, two- and three-dimensional) were made, and the natural vibration frequencies were measured with forced input from an electrodynamic shaker and a loudspeaker. The experimental values of the natural vibration frequencies showed good agreement with the results of the analytical method. Therefore the theoretical approach for piping system vibration was proved to be valid. (Iwase, T.)
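
    The eigenvalue step described above amounts to solving the generalized eigenproblem K*phi = omega^2*M*phi assembled from the stiffness and mass matrices. A minimal sketch for an illustrative three-node lumped-mass pipe model follows; the masses and stiffnesses are assumptions, not values from the tests.

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative 3-DOF lumped-mass model of a pipe run, fixed at one end
# (the mass and stiffness values are assumptions)
m = 2.0        # kg per node
k = 1.0e5      # N/m per segment
M = np.diag([m, m, m])
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])

# Generalized eigenproblem K*phi = w^2 * M * phi gives the natural frequencies
w2, phi = eigh(K, M)
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
print("natural frequencies [Hz]:", np.round(freqs_hz, 2))
# The forced response then follows by projecting the load onto the
# mode shapes phi (the modal method described in the abstract).
```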

  17. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    International Nuclear Information System (INIS)

    Kenley, C. Robert; Collins, John W.; Beck, John M.; Heydt, Harold J.; Garcia, Chad B.

    2001-01-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.

  18. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, and stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  19. Development of an evaluation method for seismic isolation systems of nuclear power facilities. Seismic design analysis methods for crossover piping system

    International Nuclear Information System (INIS)

    Tai, Koichi; Sasajima, Keisuke; Fukushima, Shunsuke; Takamura, Noriyuki; Onishi, Shigenobu

    2014-01-01

    This paper provides seismic design analysis methods suitable for crossover piping systems, which connect the seismically isolated building and a non-isolated building in a seismically isolated nuclear power plant. Through a numerical study focused on the main steam crossover piping system, seismic response spectrum analysis applying the ISM (Independent Support Motion) method with SRSS combination, or the CCFS (Cross-oscillator, Cross-Floor response Spectrum) method, was found to be quite effective for the seismic design of multiply supported crossover piping systems. (author)

  20. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Full Text Available Biological samples present a range of complexities from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL; it is well-suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  1. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
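
    Step (vi) of the recommended methodology purifies the scale using item analysis. As a minimal illustration of the classical item statistics mentioned there (difficulty and discrimination), the sketch below computes the proportion-correct difficulty index and the corrected item-total correlation on a synthetic 0/1 response matrix; the data are invented, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_resp, n_items = 200, 10

# Synthetic 0/1 answers: ability-driven so that items correlate with the total
ability = rng.normal(size=n_resp)
item_difficulty_true = np.linspace(-1.5, 1.5, n_items)
responses = (ability[:, None] + rng.normal(size=(n_resp, n_items))
             > item_difficulty_true).astype(int)

# Classical item statistics
difficulty = responses.mean(axis=0)          # proportion correct (p-value)
total = responses.sum(axis=1)
rest = total[:, None] - responses            # total score excluding the item itself
discrimination = np.array([np.corrcoef(responses[:, j], rest[:, j])[0, 1]
                           for j in range(n_items)])  # corrected item-total r

for j in range(n_items):
    print(f"item {j + 1:2d}: difficulty = {difficulty[j]:.2f}  "
          f"discrimination = {discrimination[j]:.2f}")
```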

  2. A Review of Classical Methods of Item Analysis.

    Science.gov (United States)

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…

  3. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  4. Numerical analysis of melting/solidification phenomena using a moving boundary problem analysis method X-FEM

    International Nuclear Information System (INIS)

    Uchibori, Akihiro; Ohshima, Hiroyuki

    2008-01-01

    A numerical analysis method for melting/solidification phenomena has been developed to evaluate the feasibility of several candidate techniques in the nuclear fuel cycle. Our method is based on the eXtended Finite Element Method (X-FEM), which has been used for moving boundary problems. The key technique of the X-FEM is to incorporate a signed distance function into the finite element interpolation to represent the discontinuous gradient of the temperature at a moving solid-liquid interface. The construction of the finite element equation, the quadrature technique and the method used to solve the equation are reported here. The numerical solutions of the one-dimensional Stefan problem, solidification in a two-dimensional square corner and melting of pure gallium are compared to the exact solutions or to experimental data. Through these analyses, the validity of the newly developed numerical analysis method has been demonstrated. (author)
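
    The enrichment named above can be illustrated in one dimension. A common X-FEM choice for a gradient discontinuity is the "ridge" function built from nodal signed distances, psi = sum_i |phi_i| N_i - |sum_i phi_i N_i|, which vanishes at the nodes and has a kink exactly where the signed distance changes sign. The sketch below evaluates it on a two-node element; this is a generic textbook construction, not necessarily the authors' exact formulation.

```python
import numpy as np

def shape_1d(xi):
    """Linear shape functions on a 2-node element, xi in [0, 1]."""
    return np.array([1.0 - xi, xi])

def ridge_enrichment(xi, phi_nodes):
    """Ridge enrichment psi = sum_i |phi_i| N_i - |sum_i phi_i N_i|.
    psi is zero at the nodes and has a gradient kink where phi changes
    sign, which lets the element represent the temperature-gradient
    discontinuity at the solid-liquid interface."""
    N = shape_1d(xi)
    return np.dot(np.abs(phi_nodes), N) - abs(np.dot(phi_nodes, N))

# Element cut by the interface: signed distances of opposite sign at the nodes
phi_nodes = np.array([-0.3, 0.7])     # the interface sits at xi = 0.3
for xi in np.linspace(0.0, 1.0, 11):
    print(f"xi = {xi:.1f}   psi = {ridge_enrichment(xi, phi_nodes):.4f}")
```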

  5. Code development for eigenvalue total sensitivity analysis and total uncertainty analysis

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Zu, Tiejun; Shen, Wei

    2015-01-01

    Highlights: • We develop a new code for total sensitivity and uncertainty analysis. • The implicit effects of cross sections can be considered. • The results of our code agree well with TSUNAMI-1D. • Detailed analysis of the origins of implicit effects is performed. - Abstract: The uncertainties of multigroup cross sections notably impact the eigenvalue of the neutron-transport equation. We report on a total sensitivity analysis and total uncertainty analysis code named UNICORN that has been developed by applying the direct numerical perturbation method and the statistical sampling method. In order to consider the contributions of the various basic cross sections and the implicit effects, which are indirect results of multigroup cross sections through the resonance self-shielding calculation, an improved multigroup cross-section perturbation model is developed. The DRAGON 4.0 code, with application of the WIMSD-4 format library, is used by UNICORN to carry out the resonance self-shielding and neutron-transport calculations. In addition, the bootstrap technique has been applied to the statistical sampling method in UNICORN to obtain much steadier and more reliable uncertainty results. The UNICORN code has been verified against TSUNAMI-1D by analyzing the case of a TMI-1 pin-cell. The numerical results show that the total uncertainty of the eigenvalue caused by cross sections can be up to about 0.72%. Therefore the contributions of the basic cross sections and their implicit effects are not negligible
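
    A minimal sketch of the bootstrap step described above, applied to synthetic stand-ins for the eigenvalue results that statistical sampling of the cross sections would produce (in UNICORN these would come from transport calculations): resampling the sample set gives both the relative uncertainty and a measure of how steady that estimate is. The sample size and the 0.72%-like spread are assumptions chosen to echo the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for k-eff values from 200 statistically sampled
# cross-section sets (real values would come from transport runs)
k_eff = rng.normal(loc=1.000, scale=0.0072, size=200)

def bootstrap_rel_std(samples, n_boot=5000, rng=rng):
    """Bootstrap estimate (mean and spread) of the relative std of k-eff."""
    n = len(samples)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        resample = samples[rng.integers(0, n, n)]   # resample with replacement
        stats[b] = resample.std(ddof=1) / resample.mean()
    return stats.mean(), stats.std()

rel_u, rel_u_err = bootstrap_rel_std(k_eff)
print(f"relative k-eff uncertainty: {100 * rel_u:.2f}% +/- {100 * rel_u_err:.2f}%")
```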

  6. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    Science.gov (United States)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (3rd year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second aspect of the research project is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for the visualization, analysis and archiving of flow data. We report first on our work on the development of numerical methods for tangent curve computation.
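
    Tangent curves are integral curves of the field, dx/dt = v(x); a common way to compute them is classical fourth-order Runge-Kutta integration. The sketch below traces one such curve on an analytic test vortex; the report's actual techniques for curvilinear grids are not reproduced, and the field here is an assumption.

```python
import numpy as np

def velocity(p):
    """Analytic 2D test field (a rigid-body vortex); real data would be
    sampled on a (curvilinear) grid and interpolated."""
    x, y = p
    return np.array([-y, x])

def tangent_curve(p0, dt=0.01, n_steps=628):
    """Trace a tangent curve dx/dt = v(x) with classical 4th-order Runge-Kutta."""
    pts = [np.asarray(p0, dtype=float)]
    for _ in range(n_steps):
        p = pts[-1]
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * dt * k1)
        k3 = velocity(p + 0.5 * dt * k2)
        k4 = velocity(p + dt * k3)
        pts.append(p + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(pts)

curve = tangent_curve([1.0, 0.0])   # one full revolution takes t = 2*pi
print("start:", curve[0], " end:", curve[-1])  # ends close to the start point
```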

  7. Development of the next generation reactor analysis code system, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Nagaya, Yasunobu; Chiba, Go; Kugo, Teruhiko; Ishikawa, Makoto; Tatsumi, Masahiro; Hirai, Yasushi; Hyoudou, Hideaki; Numata, Kazuyuki; Iwai, Takehiko; Jin, Tomoyuki

    2011-03-01

    A next-generation reactor analysis code system, MARBLE, has been developed. MARBLE is a successor of the fast reactor neutronics analysis code systems JOINT-FR and SAGEP-FR (the conventional systems), which were developed for the so-called JUPITER standard analysis methods. MARBLE has analysis capability equivalent to the conventional systems, because MARBLE can utilize the sub-codes included in the conventional systems without any change. On the other hand, the burnup analysis functionality for power reactors is improved compared with the conventional systems by introducing models for fuel exchange treatment, control rod operation and so on. In addition, MARBLE has newly developed solvers and some new features, such as burnup calculation by the Krylov subspace method and nuclear design accuracy evaluation by the extended bias factor method. In the development of MARBLE, object-oriented technology was adopted from the viewpoint of improving software qualities such as flexibility and extensibility, facilitating verification by modularization, and assisting co-development. A software structure called the two-layer system, consisting of a scripting language and a system development language, was also applied. As a result, MARBLE is not an independent analysis code system which simply receives input and returns output, but an assembly of components for building an analysis code system (i.e. a framework). Furthermore, MARBLE provides some pre-built analysis code systems, such as the fast reactor neutronics analysis code system SCHEME, which corresponds to the conventional code, and the fast reactor burnup analysis code system ORPHEUS. (author)

  8. New design procedure development of future reactor critical power estimation. (1) Practical design-by-analysis method for BWR critical power design correlation

    International Nuclear Information System (INIS)

    Yamamoto, Yasushi; Mitsutake, Toru

    2007-01-01

    For present BWR fuels, full mock-up thermal-hydraulic tests, such as critical power measurement tests, pressure drop measurement tests and so on, have been needed. However, the full mock-up test requires high costs and a large-scale test facility. At present, there are only a few test facilities in the world able to perform full mock-up thermal-hydraulic tests. Moreover, for future BWRs, the bundle size tends to be larger in order to reduce plant construction costs and minimize the routine inspection period. For instance, AB1600, an improved ABWR, was proposed by Toshiba, whose bundle size is 1.2 times larger than the conventional BWR fuel size. It is too expensive, and far from realistic, to perform a full mock-up thermal-hydraulic test for such a large fuel bundle. A new design procedure is required to make the development of large-scale bundle designs feasible, especially for future reactors. Therefore, a new design procedure, the Practical Design-by-Analysis (PDBA) method, has been developed. This procedure consists of a partial mock-up test and numerical analysis. At present, a subchannel analysis method based on a three-fluid two-phase flow model is the only realistic choice. Firstly, the partial mock-up test is performed, for instance with a 1/4 partial mock-up bundle. Then, the first-step critical power correlation coefficients are evaluated with the measured data. The input data for the subchannel analysis, such as the spacer effect model coefficient, are also estimated from the data. Next, the radial power effect on the critical power of the full-size bundle is estimated with the subchannel analysis. Finally, the critical power correlation is modified using the subchannel analysis results. In the present study, the critical power correlation of the conventional 8x8 BWR fuel was developed with the PDBA method from 4x4 partial mock-up tests and the subchannel analysis code. The accuracy of the estimated critical power was 3.8%. Several themes remain to

  9. Development of task analysis method for operator tasks in main control room of an advanced nuclear power plant

    International Nuclear Information System (INIS)

    Lin Chiuhsiangloe; Hsieh Tsungling

    2016-01-01

    Task analysis methods provide insight for quantitative and qualitative predictions of how people will use a proposed system, though the different versions have different emphases. Most of the methods can attest to the coverage of the functionality of a system, and all provide estimates of task performance time. However, most of the tasks that operators deal with in the digital work environment of the main control room of an advanced nuclear power plant require high mental activity. Such mental tasks overlap and must be dealt with at the same time; most of them can be assumed to be highly parallel in nature. Therefore, the primary aim addressed in this paper was to develop a method that adopts CPM-GOMS (cognitive-perceptual-motor analysis of goals, operators, methods and selection rules) as the basic pattern of mental task analysis for the advanced main control room. A within-subjects experimental design was used to examine the validity of the modified CPM-GOMS. Thirty participants performed two task types, which included high- and low-compatibility types. The results indicated that performance was significantly higher on the high-compatibility task type than on the low-compatibility task type; that is, the modified CPM-GOMS could distinguish the difference between high- and low-compatibility mental tasks. (author)
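
    CPM-GOMS predicts task time by scheduling parallel cognitive, perceptual and motor operators and taking the critical path. The sketch below is a generic critical-path calculation over a small, invented operator network; the operator names and millisecond durations are illustrative assumptions, not values from the study.

```python
# Critical-path (CPM) completion time for parallel mental/motor operators,
# the scheduling idea behind CPM-GOMS (names and durations are assumed).
tasks = {                      # name: (duration_ms, predecessors)
    "perceive_alarm": (100, []),
    "verify_reading": (290, ["perceive_alarm"]),
    "decide_action":  (50,  ["perceive_alarm"]),
    "move_hand":      (300, ["decide_action"]),
    "press_key":      (230, ["verify_reading", "move_hand"]),
}

finish = {}

def finish_time(name):
    """Earliest finish of an operator = its duration + latest predecessor finish."""
    if name not in finish:
        duration, preds = tasks[name]
        finish[name] = duration + max((finish_time(p) for p in preds), default=0)
    return finish[name]

print("predicted task time:", max(finish_time(t) for t in tasks), "ms")  # 680 ms
```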

  10. Principles and methods of neutron activation analysis (NAA) in improved water resources development

    International Nuclear Information System (INIS)

    Dim, L. A.

    2000-01-01

    The methods of neutron activation analysis (NAA) as applied to water resources exploration, exploitation and management are reviewed and their capabilities demonstrated. NAA has been found to be superior to, and to offer higher sensitivity than, many other analytical techniques in the analysis of water. The implications of the chemical and elemental concentrations (water pollution and quality) determined in water for environmental impact assessment, aquatic life and human health are briefly highlighted

  11. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  12. Development of a PSA-based Loss of Large Area Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Mee Jeong; Jung, Woosik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Myungsu [Korea Hydro Nuclear Power, Central Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    As a result of initial post-9/11 assessments, in 2002 the NRC issued an interim safeguards and security compensatory measures order. In Section B.5.b (not publicly available) of this order, 'Interim Compensatory Measures for High Threat Environment', current NPP licensees had to adopt measures to mitigate or restore reactor core cooling, containment, and spent fuel pool (SFP) cooling capabilities to cope with a loss of large area (LOLA) due to large fires and explosions from any cause, including beyond-design-basis threat (BDBT) aircraft impacts. In 2009, the NRC issued amendments to 10CFR Part 52 and Part 73 on power reactor security requirements for operating and new reactors. Newly licensed U.S. commercial nuclear power plant operators are required to provide a LOLA analysis as per the U.S. Code of Federal Regulations, 10CFR50.54(hh)(2). Additionally, 10CFR52.80(d) provides the submittal information required for an applicant for a combined operating license (COL) for a nuclear power plant to meet these requirements. It is necessary to prepare our own guidance for the development of LOLA strategies. In this paper, we propose a method to look for interesting combinations of rooms in certain targets through a VAI model, and we produce insights that could be used to inform LOLA strategies.

  13. Dynamic relaxation method in analysis of reinforced concrete bent elements

    Directory of Open Access Journals (Sweden)

    Anna Szcześniak

    2015-12-01

    Full Text Available The paper presents a method for the analysis of the nonlinear behaviour of reinforced concrete bent elements subjected to short-term static load. Considerations concerning the modelling of the deformation processes of a reinforced concrete element were carried out. The method of structural effort analysis was developed using the finite difference method. The dynamic relaxation method, which - after the introduction of critical damping - allows for a description of the static behaviour of a structural element, was used to solve the system of nonlinear equilibrium equations. In order to increase the effectiveness of the method in the post-critical range, the arc-length parameter on the equilibrium path was introduced into the computational procedure. Keywords: reinforced concrete elements, physical nonlinearity, geometrical nonlinearity, dynamic relaxation method, arc-length method
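
    A minimal sketch of the dynamic relaxation idea named above: the static equilibrium of a nonlinear element is found by integrating a fictitious damped dynamic system until the motion dies out, with damping chosen near the critical value. The single-DOF softening spring and all numerical values are assumptions; the arc-length continuation used in the paper for the post-critical range is not included.

```python
import numpy as np

def dynamic_relaxation(residual, n_dof, mass=1.0, damping=1.0,
                       dt=0.01, tol=1e-8, max_steps=200_000):
    """Solve the static problem residual(u) = 0 by damped pseudo-dynamics:
    the fictitious motion is integrated until it dies out, and damping near
    the critical value gives the fastest decay to the static state."""
    u = np.zeros(n_dof)
    v = np.zeros(n_dof)
    for step in range(max_steps):
        r = residual(u)                      # out-of-balance (residual) force
        if np.linalg.norm(r) < tol:
            return u, step
        a = (r - damping * v) / mass         # fictitious equation of motion
        v += dt * a                          # semi-implicit Euler update
        u += dt * v
    raise RuntimeError("dynamic relaxation did not converge")

# Softening spring f_int = k*u - alpha*u^3 under a constant load (values assumed)
k, alpha, load = 10.0, 0.2, 5.0
residual = lambda u: load - (k * u - alpha * u**3)
c_crit = 2.0 * np.sqrt(k * 1.0)   # critical damping of the linearized system
u_eq, steps = dynamic_relaxation(residual, n_dof=1, damping=c_crit)
print(f"equilibrium displacement u = {u_eq[0]:.5f} found in {steps} steps")
```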

  14. Best-estimate analysis development for BWR systems

    International Nuclear Information System (INIS)

    Sutherland, W.A.; Alamgir, M.; Kalra, S.P.; Beckner, W.D.

    1986-01-01

    The Full Integral Simulation Test (FIST) Program is a three-pronged approach to the development of a best-estimate analysis capability for BWR systems. An experimental program in the FIST BWR system simulator facility extends the LOCA data base and adds operational transient data. An analytical method development program with the BWR-TRAC computer program extends the modeling of BWR-specific components and major interfacing systems, and improves numerical techniques to reduce computer running time. A method qualification program tests TRAC-B against experiments run in the FIST facility and extends the results to reactor system applications. With the completion and integration of these three activities, the objective of a best-estimate analysis capability has been achieved. (author)

  15. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems : Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well

  16. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of the existing HRA methods through an example application. In order to counter the bias toward the quantitative aspect of HRA methods, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories of and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be easily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  17. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor, k/sub eff/, has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank, which is typical of a fuel processing or reprocessing plant, the k/sub eff/ values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments and the development of theoretical methods to predict the experimental observables

  18. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor, k/sub eff/, has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank, which is typical of a fuel processing or reprocessing plant, the k/sub eff/ values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables

  19. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  20. A Product Analysis Method and Its Staging to Develop Redesign Competences

    Science.gov (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  1. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    Science.gov (United States)

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
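
    As a minimal, single-level illustration of the Huber-type weighting used in the second robust method (the paper itself works with a two-level regression model), the sketch below runs iteratively reweighted least squares for a moderation model y = b0 + b1*x + b2*z + b3*x*z with a few gross outliers; the data, the tuning constant c = 1.345 and the MAD-based scale are standard textbook choices, not the authors' exact algorithm.

```python
import numpy as np

def huber_irls(X, y, c=1.345, n_iter=50, tol=1e-8):
    """Iteratively reweighted least squares with Huber weights:
    w_i = 1 if |r_i|/s <= c, else c*s/|r_i| (downweights outlying residuals)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12     # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Moderation model y = b0 + b1*x + b2*z + b3*(x*z) + e, with gross outliers
rng = np.random.default_rng(1)
n = 300
x, z = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(scale=1.0, size=n)
y[:10] += 15.0                                        # contaminated observations
X = np.column_stack([np.ones(n), x, z, x * z])
print("robust coefficient estimates:", np.round(huber_irls(X, y), 3))
```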

  2. DEVELOPMENT OF A RISK SCREENING METHOD FOR CREDITED OPERATOR ACTIONS

    International Nuclear Information System (INIS)

    HIGGINS, J.C.; O'HARA, J.M.; LEWIS, P.M.; PERSENSKY, J.; BONGARRA, J.

    2002-01-01

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors aspects of proposed license amendments that impact human actions credited in a plant's safety analysis. The staff is committed to a graded approach to these reviews that focuses resources on the most risk-important changes. Therefore, a risk-informed screening method was developed based on an adaptation of existing guidance for risk-informed regulation and human factors. The method uses both quantitative and qualitative information to divide the amendment requests into different levels of review. The method was evaluated using a variety of tests. This paper summarizes the development of the methodology and the evaluations that were performed to verify its usefulness

  3. Development of a New RP-UPLC Method for the Determination of ...

    African Journals Online (AJOL)

    Erah

    Results: The developed method was linear for rabeprazole sodium from 0.03 - 30 µg/ml ... regression obtained was > 0.999. ... cost-effective for routine analysis in the pharmaceutical industry. ... a simple UPLC method for the determination.

  4. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    Science.gov (United States)

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  5. The Dynamic Monte Carlo Method for Transient Analysis of Nuclear Reactors

    NARCIS (Netherlands)

    Sjenitzer, B.L.

    2013-01-01

    In this thesis a new method for the analysis of power transients in a nuclear reactor is developed, which is more accurate than the present state-of-the-art methods. Transient analysis is an important tool in nuclear reactor design, since it predicts the behaviour of a reactor during changing conditions.

  6. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance, vacuum) and its limits. Expensive in time and in investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use, or the answer to a precise question. After recalling the fundamental principles that govern these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples submitted to treatments in aggressive media. (O.M.)

  7. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    In this paper, we review basic stochastic methods that can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and that stochastic methods should therefore be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
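
    As a hedged sketch of the idea (not the paper's worked example), uncertain FMECA scores can be sampled rather than fixed, propagating input uncertainty into the risk priority number; the distributions below are assumptions for illustration:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000
        # expert-elicited ranges modelled as triangular distributions (assumed values)
        severity   = rng.triangular(6, 7, 9, size=N)
        occurrence = rng.triangular(3, 4, 6, size=N)
        detection  = rng.triangular(2, 3, 5, size=N)

        rpn = severity * occurrence * detection     # risk priority number per draw
        print(f"mean RPN = {rpn.mean():.0f}")
        print("90% interval:", np.percentile(rpn, [5, 95]).round(0))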

  8. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error and the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA method (misdiagnosis tree analysis technique) and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  9. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    Science.gov (United States)

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg⁻¹ for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes.

  10. Development of thermal analysis method for the near field of HLW repository using ABAQUS

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1998-10-01

    An appropriate tool is needed to evaluate the thermo-mechanical stability of a high level radioactive waste (HLW) repository. In this report a thermal analysis methodology for the near field of an HLW repository is developed using ABAQUS, a multi-purpose FEM code that has been used in many engineering areas. The main contents of this methodology development are the structural and material modelling needed to simulate a repository; the setup of side conditions, e.g., boundary conditions, load conditions and initial conditions; and the procedure for selecting proper material parameters. In addition, interface programs were developed for the effective production of input data and for effective changes of model size in sensitivity analyses for disposal concept development. The results of this work will be applied to evaluate the thermal stability of, and will serve as main input data for the mechanical analysis of, an HLW repository. (author). 20 refs., 15 figs., 5 tabs.
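
    The report's models are built in ABAQUS and are not reproduced here; the ingredients it lists (initial conditions, boundary conditions, material parameters) can be seen in a minimal one-dimensional explicit finite-difference sketch of near-field heating, with every property value below assumed for illustration:

        import numpy as np

        k, rho, cp = 1.1, 2000.0, 900.0       # conductivity, density, heat capacity (assumed)
        alpha = k / (rho * cp)                # thermal diffusivity
        dx, dt, nx = 0.05, 900.0, 60          # grid spacing (m), time step (s), nodes
        assert alpha * dt / dx**2 < 0.5       # explicit stability criterion

        T = np.full(nx, 15.0)                 # initial rock temperature (deg C)
        q = 30.0                              # canister wall heat flux (W/m2, assumed)
        for _ in range(35040):                # one year in 900 s steps
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            Tn[0] = Tn[1] + q * dx / k        # imposed-flux boundary at the canister wall
            T = Tn                            # far node is never updated: fixed at 15 deg C
        print(f"peak near-field temperature after one year: {T.max():.1f} deg C")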

  11. An exploratory survey of methods used to develop measures of performance

    Science.gov (United States)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was the most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that result in continuous process improvement.

  12. Analysis of slippery droplet on tilted plate by development of optical correction method

    Science.gov (United States)

    Ko, Han Seo; Gim, Yeonghyeon; Choi, Sung Ho; Jang, Dong Kyu; Sohn, Dong Kee

    2017-11-01

    Because of distortion effects at the surface of a sessile droplet, the inner flow field of the droplet measured by a PIV (particle image velocimetry) method has low reliability. To solve this problem, many researchers have studied and developed optical correction methods. However, such methods cannot be applied to cases such as tilted droplets or other asymmetrically shaped droplets, since most were devised only for axisymmetrically shaped droplets. For the optical correction of an asymmetrically shaped droplet, the surface function was calculated by three-dimensional reconstruction using an ellipse curve-fitting method. The optical correction using the surface function was verified by numerical simulation. The developed method was then applied to reconstruct the inner flow field of a droplet on a tilted plate. A colloidal water droplet on the tilted surface was used, and the distortion effect at the droplet surface was calculated. Using the obtained results and the PIV method, the corrected flow field for the inner and interface regions of the droplet was reconstructed. Consequently, the error caused by the distortion effect in the velocity vectors located near the apex of the droplet was removed. National Research Foundation (NRF) of Korea (2016R1A2B4011087).

  13. A method for data base management and analysis for wind tunnel data

    Science.gov (United States)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  14. Field Sample Preparation Method Development for Isotope Ratio Mass Spectrometry

    International Nuclear Information System (INIS)

    Leibman, C.; Weisbrod, K.; Yoshida, T.

    2015-01-01

    Non-proliferation and International Security (NA-241) established a working group of researchers from Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to evaluate the utilization of in-field mass spectrometry for safeguards applications. A survey of commercial off-the-shelf (COTS) mass spectrometers (MS) revealed that no instrumentation existed capable of meeting all the potential safeguards requirements for performance, portability, and ease of use. Additionally, fieldable instruments are unlikely to meet the International Target Values (ITVs) for accuracy and precision of isotope ratio measurements achieved with laboratory methods. The major gaps identified for in-field actinide isotope ratio analysis were in the areas of: 1. sample preparation and/or sample introduction, 2. size reduction of mass analyzers and ionization sources, 3. system automation, and 4. decreased system cost. Development work in areas 2 through 4, enumerated above, continues in the private and public sectors. LANL is focusing on developing sample preparation/sample introduction methods for use with the different sample types anticipated for safeguards applications. Addressing sample handling and sample preparation methods for MS analysis will enable the use of new MS instrumentation as it becomes commercially available. As one example, we have developed a rapid sample preparation method for the dissolution of uranium and plutonium oxides using ammonium bifluoride (ABF). ABF is a significantly safer and faster alternative to digestion with boiling combinations of highly concentrated mineral acids. Actinides digested with ABF yield fluorides, which can then be analyzed directly or chemically converted and separated using established column chromatography techniques as needed prior to isotope analysis. The reagent volumes and the sample processing steps associated with ABF sample digestion lend themselves to automation and field deployment.

  15. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
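
    One of the estimation methods the review covers, generalized estimating equations with an exchangeable working correlation, is available off the shelf; a minimal sketch for a simulated parallel-arm GRT (all sizes and effects are assumptions, not data from the review):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        clusters = np.repeat(np.arange(20), 30)              # 20 clusters of 30 subjects
        arm = np.repeat(rng.permutation([0, 1] * 10), 30)    # cluster-level randomization
        u = rng.normal(0, 0.5, 20)[clusters]                 # shared cluster effect
        y = (rng.normal(size=600) + u + 0.4 * arm > 0).astype(int)

        fit = sm.GEE(y, sm.add_constant(arm), groups=clusters,
                     family=sm.families.Binomial(),
                     cov_struct=sm.cov_struct.Exchangeable()).fit()
        print(fit.summary())   # robust SEs account for within-cluster correlation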

  16. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    As a new formulation in structural analysis, the Integrated Force Method (IFM) has been successfully applied to many structures in civil, mechanical, and aerospace engineering, owing to its accurate estimation of forces. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A stochastic sensitivity analysis formulation of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to existing programs, since the models of stochastic finite elements and stochastic design sensitivity are almost identical.

  17. The development of methods of analysis of documents on the basis of the methods of Raman spectroscopy and fluorescence analysis

    Science.gov (United States)

    Gorshkova, Kseniia O.; Tumkin, Ilya I.; Kirillova, Elizaveta O.; Panov, Maxim S.; Kochemirovsky, Vladimir A.

    2017-05-01

    An investigation of the natural aging of writing inks on paper using Raman spectroscopy was performed. Based on the obtained dependencies of the ratios of Raman peak intensities on the exposure time, a dye degradation model was proposed. It was suggested that several competing bond-breaking and bond-forming reactions, corresponding to the characteristic vibration frequencies of the dye molecule, occur simultaneously during the ink aging process. We also propose a methodology based on the study of the optical properties of paper, particularly changes in the fluorescence of the optical brighteners included in its composition, as well as the paper reflectivity, using spectrophotometric methods. These results can be used to develop a novel and promising method for forensic document examination.

  18. Direct methods of soil-structure interaction analysis for earthquake loadings (III)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J B; Lee, S R; Kim, J M; Park, K R; Choi, J S; Oh, S B [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1995-06-15

    In this study, direct methods for the seismic analysis of soil-structure interaction systems have been studied. A computer program, 'KIESSI-QK', has been developed based on the finite element technique coupled with an infinite element formulation. A substructuring method isolating the displacement solution of the near-field soil region was adopted. The computer program developed was verified using a free-field site response problem. The post-correlation analysis for the forced vibration tests after backfill of the Hualien LSST project has been carried out. Seismic analyses for the Hualien and Lotung LSST structures have also been performed utilizing the developed computer program 'KIESSI-QK'.

  19. α-Cut method based importance measure for criticality analysis in fuzzy probability based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and an area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applicable in nuclear power plant probabilistic safety assessment, FPFTA needs a corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are benchmarked against those generated by four well-known importance measures in conventional fault tree analysis.
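
    The paper's exact measure is not reproduced here; a generic sketch of the underlying machinery, α-cut interval arithmetic on triangular fuzzy probabilities followed by a crude area-style defuzzification (all numbers assumed for illustration):

        import numpy as np

        def alpha_cut_tri(a, b, c, alpha):
            """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
            return np.array([a + alpha * (b - a), c - alpha * (c - b)])

        def mul(cut1, cut2):
            """Interval multiplication of two alpha-cuts (AND gate on probabilities)."""
            p = np.outer(cut1, cut2).ravel()
            return np.array([p.min(), p.max()])

        alphas = np.linspace(0.0, 1.0, 101)
        top = np.array([mul(alpha_cut_tri(0.01, 0.02, 0.04, a),
                            alpha_cut_tri(0.05, 0.08, 0.10, a)) for a in alphas])

        # crude defuzzification: average interval midpoint over the alpha levels
        crisp = top.mean(axis=1).mean()
        print(f"defuzzified top-event probability = {crisp:.5f}")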

  1. Development of measurement and analysis method for long-term monitoring of 41K

    Energy Technology Data Exchange (ETDEWEB)

    Yuita, Koichi; Miyagawa, Saburo [National Inst. of Agro-Environmental Sciences, Tsukuba, Ibaraki (Japan)]

    2000-02-01

    This study aimed to develop a double labeling method with 41K and 15N for animal feed and excreta. Guinea pigs were used as subjects in the preliminary experiment. Feces and urine were collected separately once a day; the feces were dried at 70°C and the urine was lyophilized. The samples were submitted to analysis after mixing. A 41KCl solution and a 15NH4SO4 solution were absorbed into conventional guinea pig feed, and 1.0 g of the feed was given once a day. The amount of 41K in feces was determined using a flame photometric detector, and 15N was determined with an ANCA-SL mass spectrometer. The isotope abundances of 41K and 15N in the feed were 6.11% and 0.829%, respectively, and the excess % was -0.062% and 0.46% for 41K and 15N, respectively. The results showed that the 15N labeling of feces was fairly successful, but the 41K labeling was insufficient. It is therefore thought necessary to use a K tracer with a larger excess % (-0.3% or more) and to raise the accuracy of the analysis for total K and 41K. (M.N.)

  2. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment average pressure, evaluated from the pressures at both ends. Nodal mass and energy conservation determined nodal flows and enthalpies, accounting for fluid compression and thermal expansion.

  3. Development of a method for the analysis of perfluoroalkylated compounds in whole blood

    Energy Technology Data Exchange (ETDEWEB)

    Kaerrman, A.; Bavel, B. van; Lindstroem, G. [Oerebro Univ. (Sweden). Man-Technology-Environmental Research Centre]; Jaernberg, U. [Stockholm Univ. (Sweden). Inst. of Applied Environmental Research]

    2004-09-15

    The commercialisation of interfaced high performance liquid chromatography mass spectrometry (HPLC-MS) has made the selective and sensitive analysis of perfluoroalkylated (PFA) acids, a group of very persistent and biologically active compounds frequently used, for example, as industrial surfactants, more convenient than before. Since then, a number of reports on PFA compounds found in humans and wildlife have been published. The most widely used technique for the analysis of perfluoroalkylated compounds has been ion-pair extraction followed by high performance liquid chromatography (HPLC) and negative electrospray tandem mass spectrometry (MS/MS). The tetrabutylammonium ion as the counter ion in the ion-pair extraction has been used together with GC analysis, LC-fluorescence and LC-MS/MS. Recently, solid phase extraction (SPE) has been used instead of ion-pair extraction for the extraction of human serum. Previously reported studies on human exposure have mainly been on serum, probably because there are indications that PFA acids bind to plasma proteins. We here present a fast and simple method that involves SPE and is suitable for extracting whole blood samples. Furthermore, 13 PFAs were included in the method, which uses HPLC and single quadrupole mass spectrometry.

  4. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-03

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate. The determination of these furfurals will contribute to characterising and quantifying their presence in the human diet.

  5. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  6. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for the safety design and evaluation of LWRs. The database will also be used for validating and improving the core analysis codes developed by JNES. JNES has progressed with the following projects: (1) improving the Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels, and (4) the guide for reviewing the core analysis codes, among others. (author)

  7. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clément

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  8. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the method are included. In addition, published work on the analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200 °C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  9. Development of a method for statistical analysis of the accuracy and stability of the production process of epoxy resin ED-20

    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya

    2015-01-01

    Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are among the most important elements of a quality assurance system and of total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the specified product accuracy, and assess process stability. Most random events, particularly in manufacturing and scientific research, arise from a large number of random factors and are described by the normal distribution, which is central to many practical studies. Modern statistical methods are difficult to grasp and to apply widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all the characteristics of a batch of products can be obtained, including the mean value and the variance. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process for the production of epoxy resin ED-20. Numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects in the investigated products was determined. For the stability assessment of the manufacturing process of epoxy resin ED-20, Shewhart control charts for quantitative data were selected: charts of individual values X and moving range R. Pareto charts were used to identify the causes that most strongly affect low dynamic viscosity. The causes of defects associated with low dynamic viscosity values were analysed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method in production.
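
    The control limits for an individuals/moving-range (X-MR) chart of the kind selected in the paper follow standard constants; a minimal sketch with illustrative viscosity readings (not the paper's data):

        import numpy as np

        x = np.array([52.1, 51.8, 52.4, 52.0, 51.6, 52.3,
                      51.9, 52.2, 52.5, 51.7, 52.0, 52.1])   # illustrative readings
        mr = np.abs(np.diff(x))                              # moving ranges, n = 2
        mr_bar = mr.mean()

        ucl_x = x.mean() + 3 * mr_bar / 1.128                # d2 = 1.128 for n = 2
        lcl_x = x.mean() - 3 * mr_bar / 1.128
        ucl_mr = 3.267 * mr_bar                              # D4 = 3.267 for n = 2

        print(f"X chart:  CL={x.mean():.2f}  LCL={lcl_x:.2f}  UCL={ucl_x:.2f}")
        print(f"MR chart: CL={mr_bar:.2f}  UCL={ucl_mr:.2f}")
        print("out-of-control points:", np.where((x > ucl_x) | (x < lcl_x))[0])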

  10. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Dapagliflozin was the first of its class (inhibitors of the sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies for the quality evaluation and assurance of tablets. The dissolution methodology was developed with apparatus II (paddle) in 900 mL of medium (simulated gastric fluid, pH 1.2), with the temperature set at 37±0.5°C and a stirring speed of 50 rpm. For the quantification, a spectrophotometric (λ=224 nm) method was developed and validated. In validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r2=0.998). The precision showed results with RSD values lower than 2%. Recoveries of 80.72, 98.47, and 119.41% proved the accuracy of the method. Through a systematic approach applying a 2³ factorial design, the robustness of the method was confirmed (p>0.05). The studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution efficiency analyses. The developed method can thus be used for the quality evaluation of dapagliflozin tablets and can be considered a scientific basis for future official pharmacopoeial methods.
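
    The f1 and f2 comparisons the authors apply use standard formulas; a minimal sketch with illustrative dissolution profiles (not the study's measurements):

        import numpy as np

        def f1(ref, test):
            """Difference factor; values below 15 suggest similar profiles."""
            return 100 * np.sum(np.abs(ref - test)) / np.sum(ref)

        def f2(ref, test):
            """Similarity factor; values above 50 suggest similar profiles."""
            msd = np.mean((ref - test) ** 2)
            return 50 * np.log10(100 / np.sqrt(1 + msd))

        ref  = np.array([35.0, 58.0, 77.0, 90.0, 96.0])   # % dissolved (illustrative)
        test = np.array([33.0, 55.0, 75.0, 89.0, 95.0])
        print(f"f1 = {f1(ref, test):.1f}, f2 = {f2(ref, test):.1f}")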

  11. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    Kojima, Shigeo; Onoue, Akira; Kawai, Katsunori

    1998-01-01

    This study intends to develop a more sophisticated tool that advances the current event tree method used in all PSAs, focusing on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload must therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To realize scenario generation as a technical tool, a simulation model incorporating AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, were added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)

  12. Analysis of QCD sum rule based on the maximum entropy method

    International Nuclear Information System (INIS)

    Gubler, Philipp

    2012-01-01

    The QCD sum rule was developed about thirty years ago and has been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function. Application of the method therefore runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. Section 2 discusses why the analysis of the QCD sum rule based on the MEM is so effective. Section 3 describes the MEM analysis process: subsection 3.1 considers the likelihood function and the prior probability, and subsection 3.2 takes up the numerical analyses. Section 4 describes some applications, starting with ρ mesons, then charmonium at finite temperature, and finally recent developments. Some figures of the spectral functions are shown. Section 5 gives a summary of the present analysis method and an outlook. (S. Funahashi)

  13. Retrospective analysis of freight car repair organization methods in the depot and the ways of their further development

    Directory of Open Access Journals (Sweden)

    V. V. Myamlin

    2010-05-01

    A critical analysis of existing methods for the repair of freight wagons is presented. It is concluded that, given the probabilistic nature of repair activities, the 'classic' type of 'rigid' production line with a regulated cycle step is inexpedient in the long term. The further development of production-line wagon repair is seen in the creation of advanced enterprises equipped with multi-object flexible asynchronous systems with a high level of mechanization and automation of technological processes.

  14. Validation of spectral methods for the seismic analysis of multi-supported structures

    International Nuclear Information System (INIS)

    Viola, B.

    1999-01-01

    There are many methodologies for the seismic analysis of buildings. When an earthquake occurs, structures such as piping systems in nuclear power plants are subjected to motions that may be different at each support point. It is therefore necessary to develop methods that take the multi-support effect into account. First, a literature review of the different existing methods was carried out, with the aim of finding a method applicable to the study of piping systems. The second step of this work consisted in developing a program that can be used to test and compare the selected methods. Spectral methods have the advantage of giving an estimate of the maximum strains in the structure at reduced calculation time. Time history analysis is used as the reference for the tests. (author)

  15. Determination of gold in lump by the gamma-activation analysis method

    International Nuclear Information System (INIS)

    Yantsen, V.A.; Ermakov, K.S.

    2006-01-01

    Full text: This report describes the installation used in the Central Gamma-Activation Analysis Laboratory (CGAAL) for the express quantitative determination of gold concentration in large powdered samples, together with a method for determining the gold content of non-crushed samples (pieces up to 100 mm). This gamma-activation analysis method is widely used in the mining industry and in research related to the selection of optimal technological circuits designed for sorting pieces of ore and rock materials. With this method it is now possible to create large technological collections of separated pieces, imitating various grades (by gold concentration) and types (by elemental composition) of ores and, based on these collections, to compare the efficiencies of various enrichment methods, since the gold concentrations in the lumps that constitute the final sorting products are known in advance. The gamma-activation analysis of large pieces mainly serves as the foundation for the X-ray radiometric (XRR) separation of pieces of gold-bearing ores from the deposits mined by the Navoi mining combine. It allows a significant increase in the rate of research and development work on the selection of the most reliable separation characteristics. Based on these, optimal technological circuits for ore enrichment with portion sorting methods can be developed. (author)

  16. Quantitative analysis of the relations between transportation and socio-economic development

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that can measure the development of transportation and of the socio-economy, and uses correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to analyze the relationship between them quantitatively, so that the results can provide practical guidance for future national development planning.
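
    Transport elasticity, one of the analyses named above, is the ratio of the relative change in transport volume to the relative change in economic output; a minimal sketch with assumed index series (not the article's data):

        import numpy as np

        gdp     = np.array([100.0, 107.0, 115.0, 124.0, 133.0])   # GDP index (assumed)
        freight = np.array([200.0, 212.0, 226.0, 243.0, 260.0])   # tonne-km index (assumed)

        # year-over-year arc elasticity of transport demand with respect to GDP
        elasticity = (np.diff(freight) / freight[:-1]) / (np.diff(gdp) / gdp[:-1])
        print(np.round(elasticity, 2))   # > 1: transport grows faster than the economy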

  17. Regional frequency analysis of extreme rainfalls using partial L moments method

    Science.gov (United States)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    Approaches based on regional frequency analysis using L moments and LH moments are revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for the estimation of large return period events.
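
    The PL moments themselves (which censor part of the lower sample) are not reproduced here; as a reference point, the ordinary sample L-moments they generalise can be computed from probability-weighted moments, sketched below with synthetic annual maxima (all parameters assumed):

        import numpy as np

        def sample_l_moments(x):
            """First three sample L-moments via probability-weighted moments."""
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            i = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((i - 1) / (n - 1) * x) / n
            b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
            l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2          # mean, L-scale, L-skewness

        rng = np.random.default_rng(7)
        annual_max = rng.gumbel(loc=80, scale=25, size=40)   # synthetic rainfall maxima
        print(sample_l_moments(annual_max))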

  1. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to extend the method through the introduction of quantitative analysis, which attempts to characterise the examined defect in detail, and this is the design driver for the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is determining the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and the new approach, taking into account the factor of divergent illumination and other geometrical factors. The differences between the measurement systems can be attributed to these error factors. (Author)

  2. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with a parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.

  3. Methods for Force Analysis of Overconstrained Parallel Mechanisms: A Review

    Science.gov (United States)

    Liu, Wen-Lan; Xu, Yun-Dou; Yao, Jian-Tao; Zhao, Yong-Sheng

    2017-11-01

    The force analysis of overconstrained parallel mechanisms (PMs) is relatively complex and difficult, and the methods for it have long been a research hotspot. However, few publications analyze the characteristics and application scopes of the various methods, which makes it inconvenient for researchers and engineers to master and adopt them properly. A review of the methods for the force analysis of both passive and active overconstrained PMs is presented. The existing force analysis methods for these two kinds of overconstrained PMs are classified according to their main ideas. Each category is briefly demonstrated and evaluated with respect to aspects such as the computational cost, the comprehensiveness with which the limbs' deformation is considered, and the existence of explicit expressions for the solutions, which provides an important reference for researchers and engineers to quickly find a suitable method. The similarities and differences between the statically indeterminate problem of passive overconstrained PMs and that of active overconstrained PMs are discussed, and a universal method for these two kinds of overconstrained PMs is pointed out. The existing deficiencies and development directions of force analysis methods for overconstrained systems are indicated based on this overview.

  4. Competitive intelligence analysis - scenarios method

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Keeping a company among the top performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by the customer and the competition, but also on its ability to protect its strategic information and to know in advance the strategic information of the competition. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies, but also between domestic and foreign companies, the issue of economic competition moves from the national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Each player therefore needs to know, as early as possible, how to react to the strategies of others in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is to acknowledge the role of early warning and the prevention of surprises that could have a major impact on the market share, reputation, turnover and profitability of a company in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis, the scenarios method, in a version that combines several types of analysis in order to reveal interconnected aspects of the factors governing a company's activity.

  5. The colour analysis method applied to homogeneous rocks

    Directory of Open Access Journals (Sweden)

    Halász Amadé

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for the examination of the Boda Claystone Formation, which is the formation in Hungary most suitable for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here can be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
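
    The scanning workflow itself is instrument-specific and not described in the abstract; as an illustration of the digital colour log idea, the mean colour per depth increment of a scanned core strip can be reduced to a redness curve for cycle detection (the file name below is a hypothetical placeholder):

        import numpy as np
        from PIL import Image

        # hypothetical core scan; depth runs along the image height (assumed file)
        img = np.asarray(Image.open("core_scan.png").convert("RGB"), dtype=float)

        mean_rgb = img.mean(axis=1)                        # mean colour per image row
        redness = mean_rgb[:, 0] / mean_rgb.sum(axis=1)    # red fraction downcore

        kernel = np.ones(25) / 25                          # simple smoothing window
        redness_log = np.convolve(redness, kernel, mode="same")
        print(redness_log[:10])   # a cyclic colour signal shows up as oscillations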

  6. Development of a segmentation method for analysis of Campos basin typical reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Rego, Eneida Arendt; Bueno, Andre Duarte [Universidade Estadual do Norte Fluminense Darcy Ribeiro (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao de Petroleo (LENEP)]. E-mails: eneida@lenep.uenf.br; bueno@lenep.uenf.br

    2008-07-01

    This paper presents a master's thesis proposal in Exploration and Reservoir Engineering whose objective is to develop a specific segmentation method for digital images of reservoir rocks that produces better results than the global methods available in the literature for the determination of physical rock properties such as porosity and permeability. (author)
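
    The proposal's own method is not described in the abstract; for contrast, the global-threshold baseline it aims to improve on can be sketched with Otsu's method on a hypothetical micro-CT slice (the file name is an assumed placeholder):

        import numpy as np
        from skimage.filters import threshold_otsu

        img = np.load("rock_slice.npy")           # 2-D grayscale slice (assumed file)

        t = threshold_otsu(img)                   # single global threshold
        pores = img < t                           # darker voxels taken as pore space
        print(f"estimated porosity: {pores.mean():.1%}")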

  7. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT]; Webster, Mort [MIT]

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  8. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described as well as the mathematics on which they are based and also how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  9. Direct methods of soil-structure interaction analysis for earthquake loadings (III)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Lee, S. R.; Kim, J. M.; Park, K. R.; Choi, J. S.; Oh, S. B. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1995-06-15

    In this study, direct methods for seismic analysis of soil-structure interaction systems have been studied. A computer program, 'KIESSI-QK', has been developed based on the finite element technique coupled with an infinite element formulation. A substructuring method isolating the displacement solution of the near-field soil region was adopted. The computer program developed was verified using a free-field site response problem. The post-correlation analysis for the forced vibration tests after backfill of the Hualien LSST project has been carried out. The seismic analyses for the Hualien and Lotung LSST structures have also been performed utilizing the developed computer program 'KIESSI-QK'.

  10. Analysis of human serum and whole blood for mineral content by ICP-MS and ICP-OES: development of a mineralomics method.

    Science.gov (United States)

    Harrington, James M; Young, Daniel J; Essader, Amal S; Sumner, Susan J; Levine, Keith E

    2014-07-01

    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This paper describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision for the mineral content of the blood (250 μL) and serum (250 μL) samples was measured for eight essential minerals--sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se)--by plasma spectrometric methods and ranged from 0.635 to 10.1% relative standard deviation (RSD) for serum and 0.348-5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples provided good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals and potentially of other minerals.

  11. Review of multi-physics temporal coupling methods for analysis of nuclear reactors

    International Nuclear Information System (INIS)

    Zerkak, Omar; Kozlowski, Tomasz; Gajev, Ivan

    2015-01-01

    Highlights: • Review of the numerical methods used for multi-physics temporal coupling. • Review of high-order improvements to the Operator Splitting coupling method. • Analysis of truncation error due to the temporal coupling. • Recommendations on best-practice approaches for multi-physics temporal coupling. - Abstract: The advanced numerical simulation of a realistic physical system typically involves a multi-physics problem. For example, analysis of a LWR core involves the intricate simulation of neutron production and transport, heat transfer throughout the structures of the system, and the flowing, possibly two-phase, coolant. Such analysis involves the dynamic coupling of multiple simulation codes, each devoted to solving one of the coupled physics. Multiple temporal coupling methods exist, yet the accuracy of such coupling is generally driven by the least accurate numerical scheme. The goal of this paper is to review in detail the approaches and numerical methods that can be used for multi-physics temporal coupling, including a comprehensive discussion of the issues associated with the temporal coupling, and to define approaches that can be used to perform multi-physics analysis. The paper is not limited to any particular multi-physics process or situation, but is intended to provide a generic description of multi-physics temporal coupling schemes for any development stage of the individual (single-physics) tools and methods. This includes a wide spectrum of situations, where the individual (single-physics) solvers are based on pre-existing computation codes embedded as individual components, or a new development where the temporal coupling can be developed and implemented as part of code development. The discussed coupling methods are demonstrated in the framework of LWR core analysis.
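
    As a toy illustration of the Operator Splitting coupling reviewed here (assumptions: a linear two-variable system standing in for two coupled physics; first-order Lie splitting), the following sketch advances each "physics" separately over a time step and compares the result with the fully coupled solution:

    ```python
    # Toy illustration (not from the paper) of first-order (Lie) operator
    # splitting for a coupled two-physics system dy/dt = (A + B) y, where A
    # and B stand for the two single-physics operators advanced by separate
    # solvers over each coupling time step.
    import numpy as np

    def expm(M):
        """Matrix exponential via eigendecomposition (adequate for this toy case)."""
        w, V = np.linalg.eig(M)
        return (V * np.exp(w)) @ np.linalg.inv(V)

    A = np.array([[-1.0, 0.5], [0.0, 0.0]])   # "physics 1", driven by variable 2
    B = np.array([[0.0, 0.0], [0.3, -0.4]])   # "physics 2", driven by variable 1
    y0 = np.array([1.0, 1.0])
    dt, steps = 0.1, 50

    y_ref = expm((A + B) * dt * steps) @ y0   # fully coupled reference solution

    y = y0.copy()
    for _ in range(steps):
        y = expm(A * dt) @ y    # advance physics 1 alone over dt
        y = expm(B * dt) @ y    # then physics 2, seeing the updated state
    print("first-order splitting error:", np.linalg.norm(y - y_ref))
    ```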

  12. Method Development in Forensic Toxicology.

    Science.gov (United States)

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. The present article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems, as well as establishing a versatile sample preparation. Method development is concluded by an optimization process, after which the new method is subject to method validation.

  13. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition: "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere." - IIE Transactions. Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  14. Comparability of river suspended-sediment sampling and laboratory analysis methods

    Science.gov (United States)

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low. The difference in laboratory analysis methods was slightly greater than field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream. The results indicate there is less of a difference among samples collected with grab field sampling and analyzed for TSS and concentration of fines in SSC. Even though differences are present, the presence of strong correlations between SSC and TSS concentrations provides the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
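
    As an illustration of the site-specific relations suggested in the closing sentence, the following sketch fits a linear SSC-TSS relation to paired measurements; the numbers are made-up placeholders, not data from the study.

    ```python
    # Sketch of deriving a site-specific relation between paired TSS and SSC
    # results; the numbers are made-up placeholders, not data from the study.
    import numpy as np

    tss = np.array([12.0, 30.0, 55.0, 80.0, 120.0])   # grab/TSS results, mg/L
    ssc = np.array([15.0, 41.0, 70.0, 112.0, 168.0])  # EWDI/SSC results, mg/L

    slope, intercept = np.polyfit(tss, ssc, 1)        # least-squares line
    r = np.corrcoef(tss, ssc)[0, 1]
    print(f"SSC ~ {slope:.2f} * TSS + {intercept:.1f}  (r = {r:.3f})")
    ```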

  15. Development and experience of quality control methods for digital breast tomosynthesis systems.

    Science.gov (United States)

    Strudley, Cecilia J; Young, Kenneth C; Looney, Padraig; Gilbert, Fiona J

    2015-01-01

    To develop tomosynthesis quality control (QC) test methods and use them alongside established two-dimensional (2D) QC tests to measure the performance of digital breast tomosynthesis (DBT) systems used in a comparative trial with 2D mammography. DBT QC protocols and associated analysis were developed, incorporating adaptations of some 2D tests as well as some novel tests. The tomosynthesis tests were: mean glandular dose to the standard breast model; contrast-to-noise ratio in reconstructed focal planes; geometric distortion; artefact spread; threshold contrast detail detection in reconstructed focal planes; alignment of the X-ray beam to the reconstructed image and missed tissue; reproducibility of the tomosynthesis exposure; and homogeneity of the reconstructed focal planes. Summaries of results from the tomosynthesis QC tests are presented together with some 2D results for comparison. The tomosynthesis QC tests and analysis methods developed were successfully applied. The lessons learnt, which are detailed in the Discussion section, may be helpful to others embarking on DBT QC programmes. DBT performance test equipment and analysis methods have been developed. The experience gained has contributed to the subsequent drafting of DBT QC protocols in the UK and Europe.

  16. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

    Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple values. The final goal is decision making. The paper describes the models and methods of multivariate analysis.

  17. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, the correlations among event items and the development trend, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using the data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.

  18. CASE METHOD: THE STORY OF THE DEVELOPMENT AND USE OF THE METHOD IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Светлана Юрьевна Грузкова

    2013-08-01

    Full Text Available This article deals with the history of the origin and adoption of the case-study method (case method) in the practice of professional education; the method is based on the analysis of case studies. The distinctive feature of the case-study method is the creation of a problem situation on the basis of facts from real life. As a method of situation analysis it became widespread in the world in the 1970s and 1980s, and in the same period it became known in the USSR. The method was used at first in the training of managers, mainly at economics universities, to form the students' ability to make decisions. This is connected with the changes taking place in the economy at that time, which generated substantial demand for specialists who know how to act in situations of uncertainty and high risk and who can analyze and make decisions. On the one hand, as the authors note, the wide dissemination of this method in education is due to its orientation: the case method is focused not so much on the development of specific knowledge or skills as on the development of the general intellectual and communicative capacity of the trainee and the trainer. In addition, the case method is quite effective in training and can be combined easily enough with other learning methods. On the other hand, there have been certain difficulties in introducing the case-study method into the practice of professional education, among them the general developmental orientation of education, the evolving quality requirements, and the superficial preparation of specialist teachers in the methodological basis of the method. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-24

  19. The Analysis Methods Of 3-Monochloropropane-1,2-Diol and Glycidyl Esters in Foods, Mitigation Studies, and Current Developments About their Effects on Health

    Directory of Open Access Journals (Sweden)

    Aslı Yıldırım

    2017-12-01

    Full Text Available Chloropropanols are undesired food contaminants liberated during the processing of various food products. When the adverse effects of chloropropanols, especially 3-monochloropropane-1,2-diol (3-MCPD), 2-monochloropropane-1,3-diol (2-MCPD) and glycidol, along with their esters, were first understood, studies on the detection and mitigation of these compounds accelerated. 3-MCPD, which is detected in food products in higher amounts than other chloropropanols, usually occurs during the refining of vegetable oils, especially in the deodorisation step. The methods for the analysis of 3-MCPD and other chloropropanols are continuously updated; however, there are two basic approaches today, namely direct and indirect methods. Direct methods enable all of the esters to be detected individually, yet, due to the necessity of a huge number of reference standards, indirect methods are currently preferred. The first essential step in reducing chloropropanols in food products is to determine the proper analysis method. In this review, general information, new developments in analysis methods, mitigation studies and the toxicological data about various chloropropanols are summarized.

  20. Safety relief valve alternate analysis method

    International Nuclear Information System (INIS)

    Adams, R.H.; Javid, A.; Khatua, T.P.

    1981-01-01

    An experimental test program was started in the United States in 1976 to define and quantify Safety Relief Valve (SRV) phenomena in General Electric Mark I Suppression Chambers. The testing considered several discharge devices and was used to correlate SRV load prediction models. The program was funded by utilities with Mark I containments and has resulted in a detailed SRV load definition as a portion of the Mark I containment program Load Definition Report (LDR). The US Nuclear Regulatory Commission (USNRC) has reviewed and approved the LDR SRV load definition. In addition, the USNRC has permitted calibration of structural models used for predicting torus response to SRV loads. Model calibration is subject to confirmatory in-plant testing. The SRV methodology given in the LDR requires that transient dynamic pressures be applied to a torus structural model that includes a fluid added mass matrix. Preliminary evaluations of torus response have indicated order-of-magnitude conservatisms, with respect to test results, which could result in unrealistic containment modifications. In addition, structural response trends observed in full-scale tests between cold pipe, first valve actuation and hot pipe, subsequent valve actuation conditions have not been duplicated using current analysis methods. It was suggested by others that an energy approach using current fluid models be utilized to define loads. An alternate SRV analysis method is defined to correct suppression chamber structural response to a level that permits economical but conservative design. Simple analogs are developed for the purpose of correcting the analytical response obtained from LDR analysis methods. The analogs evaluated considered forced vibration and free vibration structural response. The corrected response correlated well with in-plant test response. The correlation of the analytical model at test conditions permits application of the alternate analysis method at design conditions. (orig./HP)

  1. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from a wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on diverse set of images and each method has its own strengths and limitations. We hope that our benchmark resource would be of considerable help to both the bioimaging researchers looking for novel image processing methods and image processing researchers exploring application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular, cellular to tissue level, each of which pose their own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.

  2. Development of a framework for the neutronics analysis system for next generation (3)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hirai, Yasushi; Hyoudou, Hideaki; Tatsumi, Masahiro

    2010-02-01

    Development of innovative analysis methods and models in fundamental studies for next-generation nuclear reactor systems is in progress. In order to efficiently and effectively reflect the latest analysis methods and models in the primary design of commercial reactors and/or in-core fuel management for power reactors, a next-generation analysis system, MARBLE, has been developed. The next-generation analysis system provides solutions to the following requirements: (1) flexibility, extensibility and user-friendliness, so that new methods and models can be applied rapidly and effectively in fundamental studies; (2) quantitative proof of solution accuracy and adaptive scoping range for design studies; (3) coupling analysis among different study domains for the purpose of rationalization of plant systems and improvement of reliability; (4) maintainability and reusability of system extensions for the purpose of total quality management and development efficiency. The next-generation analysis system supports many fields, such as thermal-hydraulic analysis, structural analysis, reactor physics, etc., and the reactor physics analysis system for fast reactors has been studied first. As for reactor physics analysis methods for fast reactors, the JUPITER standard analysis methods were established in past studies. However, the conventional analysis system suffered from extreme inefficiency, due to a lack of functionality, when changing analysis targets and/or modeling levels. For this reason, the next-generation analysis system for reactor physics has been developed; it reproduces the JUPITER standard analysis method developed so far and newly realizes burnup and design analysis for fast reactors and functions for cross-section adjustment. In the present study, we examined in detail the existing design and implementation of the ZPPR critical experiment analysis database, followed by unification of models within the framework of the next-generation analysis system by

  3. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could be significant in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, but the existing delineation methods in four-step models have some problems in reflecting the travel characteristics of urban transit. This paper aims to propose a Transit Traffic Analysis Zone delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of Transit Traffic Analysis Zones (TTAZs) were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. The analysis result shows that TTAZs based on Thiessen polygons can reflect transit travel characteristics and are worthy of in-depth research.
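
    A minimal sketch of the geometric core of the method, Thiessen (Voronoi) polygon construction around transit stations, using SciPy; the coordinates are made-up placeholders, not the Beijing data:

    ```python
    # Minimal sketch of Thiessen (Voronoi) polygon construction around
    # transit stations with SciPy; coordinates are made-up placeholders.
    import numpy as np
    from scipy.spatial import Voronoi

    stations = np.array([
        [0.0, 0.0], [2.0, 0.5], [1.0, 2.0],
        [3.0, 2.5], [0.5, 3.0],
    ])

    vor = Voronoi(stations)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if -1 in region:   # unbounded hull cell; clip to the study-area boundary
            print(f"station {i}: unbounded cell (needs clipping)")
        else:
            print(f"station {i}: cell vertices {vor.vertices[region].round(2).tolist()}")
    ```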

  4. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics – or researchers seeking a no-nonsense approach –, it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singular perturbed reaction advection diffusion equations in one and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  5. Development of a chromatographic separation method hyphenated to electro-spray ionization mass spectrometry (ESI-MS) and inductively coupled plasma mass spectrometry (ICP-MS): application to the lanthanides speciation analysis

    International Nuclear Information System (INIS)

    Beuvier, Ludovic

    2015-01-01

    This work focuses on the development of a chromatographic separation method coupled to both ESI-MS and ICP-MS in order to achieve comprehensive speciation analysis of lanthanides in aqueous phases representative of the back-extraction phases of advanced spent nuclear fuel treatment processes. This analytical method allowed the separation, characterization and quantitation of lanthanide complexes holding poly-aminocarboxylic ligands, such as DTPA and EDTA, used as complexing agents in these processes. A HILIC separation method for lanthanide complexes was developed with an amide-bonded stationary phase. A screening of a wide range of mobile phase compositions demonstrated that the adsorption mechanism was predominant, and allowed optimized separation conditions to be obtained. Faster analysis conditions with a shorter amide column packed with sub-2 μm particles reduced the analysis time by a factor of 2.5 and solvent consumption by 25%. Isotopic and structural characterization by HILIC ESI-MS was performed, as well as the development of an external calibration quantitation method, whose analytical performance was determined. Finally, the development of the HILIC coupling to ESI-MS and ICP-MS was achieved. A simultaneous quantitation method by ESI-MS and ICP-MS was performed to determine the quantitative distribution of species in solution, and the analytical performance of this quantitation method was also determined. (author) [fr]

  6. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace element concentrations of samples collected monthly at urban and rural sites were determined, and then statistics and factor analysis were calculated to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  7. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are software that has been scientifically proven and is complete. The results of VSOP and MCNPX are huge and complex text files, so in the analysis process users need additional processing, such as Microsoft Excel, to show informative results. This research develops a user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu{sup 239} and Pu{sup 241}. Burn-up analysis uses the mass inventory values of actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.
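
    As a hedged illustration of the kind of post-processing described, the sketch below extracts k-eff values from a large text output with a regular expression; the filename and line format are hypothetical, not the real VSOP/MCNPX output syntax.

    ```python
    # Hedged sketch of the post-processing idea: extract k-eff values from a
    # large text output. The filename and the "k-eff = 1.02345" line format
    # are hypothetical, not the real VSOP/MCNPX output syntax.
    import re

    pattern = re.compile(r"k-eff\s*=\s*([0-9.]+)")

    keff_values = []
    with open("vsop_output.txt") as f:   # placeholder filename
        for line in f:
            match = pattern.search(line)
            if match:
                keff_values.append(float(match.group(1)))

    print(f"found {len(keff_values)} k-eff values")
    if keff_values:
        print(f"last value: {keff_values[-1]}")
    ```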

  8. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are software that has been scientifically proven and is complete. The results of VSOP and MCNPX are huge and complex text files, so in the analysis process users need additional processing, such as Microsoft Excel, to show informative results. This research develops a user interface software for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of actinides (thorium, plutonium, neptunium and uranium). Values are visualized in graphical form to support analysis.

  9. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
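
    A simplified sketch of the response surface idea (not the paper's slope-stability model): fit a quadratic performance function to a small design of "simulation" results, then estimate the probability of failure cheaply on the surrogate. Here plain Monte Carlo is run on the fitted surface where the paper applies FORM; the toy margin function and parameter distributions are assumptions.

    ```python
    # Illustrative sketch (not the paper's model): fit a quadratic response
    # surface to a small design of "simulation" results, then estimate the
    # probability of failure cheaply on the surrogate. A toy closed-form
    # safety margin g(c, phi) stands in for the numerical slope model.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(c, phi):
        """Placeholder for one expensive numerical run: a toy safety margin."""
        return 0.9 * c + 1.4 * np.tan(phi) - 1.0

    # 3x3 design around the means of cohesion c and friction angle phi (rad).
    c_pts, phi_pts = np.meshgrid([0.4, 0.6, 0.8], [0.30, 0.35, 0.40])
    X = np.column_stack([c_pts.ravel(), phi_pts.ravel()])
    g = np.array([simulate(c, p) for c, p in X])

    # Quadratic surface g ~ b0 + b1 c + b2 phi + b3 c^2 + b4 phi^2 + b5 c phi.
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    beta, *_ = np.linalg.lstsq(A, g, rcond=None)

    # Monte Carlo on the surrogate (FORM could equally be applied to it).
    c_s = rng.normal(0.6, 0.1, 200_000)
    p_s = rng.normal(0.35, 0.03, 200_000)
    G = (beta[0] + beta[1] * c_s + beta[2] * p_s
         + beta[3] * c_s ** 2 + beta[4] * p_s ** 2 + beta[5] * c_s * p_s)
    print("estimated probability of failure:", (G < 0).mean())
    ```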

  10. Developing the Students’ English Speaking Ability Through Impromptu Speaking Method.

    Science.gov (United States)

    Lumettu, A.; Runtuwene, T. L.

    2018-01-01

    Having multiple purposes, English mastery has become a necessity for us. Of the four language skills, speaking should get first priority in English teaching, and the development of speaking skills cannot be separated from listening. One communicative way of developing speaking skill is impromptu speaking, a method of sudden speaking which depends only on experience and insight, applying spontaneity or improvisation. It is delivered based on the need of the moment of speaking, using simple language. This research aims to know (1) why impromptu speaking is necessary in teaching speaking and (2) how impromptu speaking can develop the students' speaking skills. The method of this research is a qualitative method, and the techniques of data collection are observation, interview and documentation. The results of data analysis using correlation show a strong relation between the students' speaking ability and the impromptu speaking method (r = 0.80). The research shows that by using the impromptu speaking method, the students are trained to interact faster, naturally and spontaneously, and to enrich their vocabulary and general knowledge to support speaking development through interview, speech, presentation, discussion and storytelling.

  11. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
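
    For reference, the total harmonic distortion computed for each bus voltage once the harmonic propagation is solved is conventionally defined as follows (standard definition; the paper's own notation may differ):

    ```latex
    \mathrm{THD}_V \,=\, \frac{\sqrt{\sum_{h=2}^{H} \lvert V_h \rvert^{2}}}{\lvert V_1 \rvert} \times 100\%
    ```

    Here V_1 is the fundamental-frequency component of a bus voltage, V_h its h-th harmonic component, and H the highest harmonic order considered.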

  12. Response Matrix Method Development Program at Savannah River Laboratory

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1976-01-01

    The Response Matrix Method Development Program at Savannah River Laboratory (SRL) has concentrated on the development of an effective system of computer codes for the analysis of Savannah River Plant (SRP) reactors. The most significant contribution of this program to date has been the verification of the accuracy of diffusion theory codes as used for routine analysis of SRP reactor operation. This paper documents the two steps carried out in achieving this verification: confirmation of the accuracy of the response matrix technique through comparison with experiment and Monte Carlo calculations; and establishment of agreement between diffusion theory and response matrix codes in situations which realistically approximate actual operating conditions

  13. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)

  14. Isolation and HPLC method development of azafrin from Alectra parasitica var. chitrakutensis.

    Science.gov (United States)

    Agrawal, Poonam; Laddha, Kirti; Tiwari, Ashok

    2014-01-01

    This study was undertaken to isolate and quantify azafrin in Alectra parasitica (Scrophulariaceae) rhizomes. A simple method for the isolation of carotenoid, azafrin, involves solvent extraction of the dried rhizome powder using a single solvent and further purification by recrystallisation. The structure of the compound was elucidated and confirmed by thin-layer chromatography, infrared spectroscopy, mass spectroscopy and nuclear magnetic resonance spectral analysis. A specific and rapid reversed-phase high-performance liquid chromatography (HPLC) method was developed for the analysis of azafrin. The method was validated for accuracy, precision, linearity and specificity. Validation revealed that the method is specific, accurate, precise, reliable and reproducible. The proposed HPLC method can be used for the identification and quantitative analysis of azafrin in A. parasitica rhizomes.

  15. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
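
    A toy sketch of the two goals as defined here, with a three-input placeholder model rather than a real PRA: percentiles of the outcome quantify imprecision (uncertainty analysis), and rank correlations identify the major contributors to that imprecision (sensitivity analysis).

    ```python
    # Toy sketch of the two goals named above, with a three-input placeholder
    # model rather than a real PRA: percentiles of the outcome quantify
    # imprecision (uncertainty analysis); rank correlations identify the
    # major contributors to that imprecision (sensitivity analysis).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    x = np.column_stack([
        rng.lognormal(-6.0, 0.5, n),   # e.g. an initiating-event frequency
        rng.beta(2.0, 8.0, n),         # e.g. a conditional failure probability
        rng.uniform(0.5, 2.0, n),      # e.g. a consequence multiplier
    ])
    y = x[:, 0] * x[:, 1] * x[:, 2]    # toy risk outcome

    print("5th/50th/95th percentiles:", np.percentile(y, [5, 50, 95]))

    def rank(a):
        return np.argsort(np.argsort(a))   # ranks, for Spearman-style correlation

    for i, name in enumerate(["frequency", "probability", "multiplier"]):
        rho = np.corrcoef(rank(x[:, i]), rank(y))[0, 1]
        print(f"{name}: rank correlation with outcome = {rho:.2f}")
    ```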

  16. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of

  17. Development of Performance Analysis Program for an Axial Compressor with Meanline Analysis

    International Nuclear Information System (INIS)

    Park, Jun Young; Park, Moo Ryong; Choi, Bum Suk; Song, Je Wook

    2009-01-01

    The axial-flow compressor is one of the most important parts of a gas turbine unit, together with the axial turbine and combustor. Therefore, precise prediction of performance is very important for the development of a new compressor or the modification of an existing one. Meanline analysis is a simple, fast and powerful method for performance prediction of axial-flow compressors with different geometries, so it is frequently used in the preliminary design stage and for performance analysis of a given geometry. Many correlations for meanline analysis have been developed theoretically and experimentally over a long period for estimating various types of losses and flow deviation angles. In the present study, a meanline analysis program was developed to estimate compressor losses, incidence angles, deviation angles, and stall and surge conditions with many correlations. Performance prediction of single-stage axial compressors was conducted with this meanline analysis program. The comparison between experimental and numerical results shows good agreement. This meanline analysis program can be used for various types of single-stage axial-flow compressors with different geometries, as well as for multistage axial-flow compressors.

  18. METHODIC OF DEVELOPMENT OF MOTOR GIFTEDNESS OF PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Fedorova Svetlana Yurievna

    2013-04-01

    Full Text Available The education and training of gifted children should today be considered an important strategic task of modern society. In this context, the purpose of the research is the development of motor giftedness, which is particularly relevant at the stage of pre-school education owing to the age characteristics of preschoolers. Preschoolers' motor giftedness is considered by the author as a developing integrated quality, including psychomotor skills, inclinations and increased motivation for motor activity. The following methods are used in the study: the study and analysis of the scientific and methodological literature, questionnaires, interviews, testing of physical fitness, and statistical data processing. The result of the research work is a methodic for the development of motor giftedness in physical education at preschool. The author's methodic consists of four steps: diagnostic, prognostic, practice-and-activity, and social-and-pedagogical. Each step determines the inclusion of preschool children in a sports and developing environment that meets their abilities and needs through the creation of certain social and pedagogical conditions. The results of the author's methodic can be applied in preschools and in the system of improving teachers' professional skill.

  19. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as to soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time would offer many benefits, including improved laboratory reporting times and overall improved clean up
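
    A simplified illustration of the screening step (a full two-level factorial on three factors rather than the paper's six-factor fractional design; the response function is a made-up stand-in for measured analysis time):

    ```python
    # Simplified illustration of two-level factorial screening: main effects
    # of three coded GC factors on a fake "analysis time" response. The real
    # study screened six factors with a fractional design.
    import itertools
    import numpy as np

    factors = ["flow_rate", "oven_ramp", "inj_volume"]
    design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 2^3 runs

    def run_experiment(coded):
        """Placeholder for one GC-FID run; returns a fake analysis time (min)."""
        flow, ramp, vol = coded
        return 14.0 - 2.5 * flow - 3.0 * ramp + 0.2 * vol

    y = np.array([run_experiment(row) for row in design])

    # Main effect of a factor = mean response at +1 minus mean response at -1.
    for j, name in enumerate(factors):
        effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
        print(f"{name}: main effect on analysis time = {effect:+.2f} min")
    ```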

  20. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are discussed, both from on-line trigger selection and from off-line analysis. Statistical training methods are also presented and some new applications are suggested [ru]

  1. Development of precursors recognition methods in vector signals

    Science.gov (United States)

    Kapralov, V. G.; Elagin, V. V.; Kaveeva, E. G.; Stankevich, L. A.; Dremin, M. M.; Krylov, S. V.; Borovov, A. E.; Harfush, H. A.; Sedov, K. S.

    2017-10-01

    Precursor recognition methods in vector signals of plasma diagnostics are presented. Their requirements and possible options for their development are considered. In particular, the variants of using symbolic regression for building a plasma disruption prediction system are discussed. The initial data preparation using correlation analysis and symbolic regression is discussed. Special attention is paid to the possibility of using algorithms in real time.

  2. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine the human reliability analysis (HRA) method by, for example, incorporating consideration of the cognitive process of the operator into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA, which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes the outcomes of the improvement of the HRA method, in which enhancements were made to evaluate how a degraded plant condition affects the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences that investigate the applicability of the HRA method developed. HEPs of the same accident sequences were also estimated using the THERP method, the most popularly used HRA method, and the results obtained using the two methods were compared to depict their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which can integrate an operator cognitive action model into the ATHENA method. In addition, the procedures of the improved method were delineated in detail in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  3. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis and lead to incorrect conclusions. To remedy the problem of outliers, robust methods have been developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the 'good' data to primarily determine the result. This article reviews the most commonly used robust multivariate regression and exploratory methods that have appeared since 1996 in the field of chemometrics. Special emphasis is put on the robust versions of chemometric standard tools like PCA and PLS and the corresponding robust...

  4. Image segmentation and particles classification using texture analysis method

    Directory of Open Access Journals (Sweden)

    Mayar Aly Atteya

    Full Text Available Introduction: The ingredients of oily fish include a large amount of polyunsaturated fatty acids, which are important elements in various metabolic processes of humans and have also been used to prevent diseases. However, in an attempt to reduce cost, recent developments are starting to replace fish-oil ingredients with products of microalgae that also produce polyunsaturated fatty acids. To do so, it is important to closely monitor morphological changes in algae cells and to monitor their age in order to achieve the best results. This paper aims to describe an advanced vision-based system to automatically detect, classify and track organic cells, using a recently developed SOPAT system (Smart On-line Particle Analysis Technology), a photo-optical image acquisition device combined with innovative image analysis software. Methods: The proposed method includes image de-noising, binarization and enhancement, as well as object recognition, localization and classification based on the analysis of particle size and texture. Results: The method correctly computed the size of each particle separately. By computing an area histogram for the input images (1 h, 18 h and 42 h), the variation could be observed, showing a clear increase in cell size. Conclusion: The proposed method allows algae particles to be correctly identified with accuracies up to 99% and classified correctly with accuracies up to 100%.

  5. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software (not yet commercially available) for analyzing the spectra of multiple elements in one analysis. The analysis had been carried out using a single-spectrum software analyzer and comparing each result manually; this method degrades the quality of the analysis significantly. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter, which improves the signal-to-noise ratio, before the deconvolution operation. The software was developed using the G language and named PASAN-K. The testing result of the developed software was benchmarked against the IAEA spectrum and operated well, with less than 10% deviation.

  6. Perfection Of Methods Of Mathematical Analysis For Increasing The Completeness Of Subsoil Development

    Science.gov (United States)

    Fokina, Mariya

    2017-11-01

    The economy of Russia is based to a high degree on the mineral raw material complex, and the mining industry is a prioritized and important area. Given the high competitiveness of businesses in this sector, increasing the efficiency of completed work and manufactured products becomes a central issue. Improvement of planning and management in this sector should be based on multivariant study and the optimization of planning decisions, the appraisal of their immediate and long-term results, taking the dynamics of economic development into account. All of this requires the use of economic-mathematical models and methods. Applying an economic-mathematical model to determine optimal ore mine production capacity, we obtain a figure of 4,712,000 tons. The production capacity of the Uchalinsky ore mine is 1,560 thousand tons, and of the Uzelginsky ore mine, 3,650 thousand tons. Conducting a corresponding analysis of the production of OAO "Uchalinsky Gok", an optimal production plan was obtained: the optimal production of copper, 77,961.4 rubles; the optimal production of zinc, 17,975.66 rubles. The residual production volume of the two main ore mines of OAO "UGOK" is 160 million tons of ore.

  7. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.
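
    In outline, a pathways analysis of this kind converts a residual soil concentration into an annual dose by summing pathway-specific dose-to-source ratios, and a guideline concentration is the dose limit divided by that sum. A toy illustration follows; the factors and the dose limit are invented numbers, not RESRAD output.

    ```python
    # Hypothetical dose-to-source ratios for one radionuclide, in mSv/yr per Bq/g,
    # one entry per exposure pathway.
    pathway_factors = {"external": 0.8, "inhalation": 0.05, "ingestion": 0.3}

    dose_limit = 0.25  # regulatory annual dose limit, mSv/yr (hypothetical)

    # Guideline soil concentration: the activity that just meets the dose limit.
    guideline = dose_limit / sum(pathway_factors.values())
    print(f"guideline concentration: {guideline:.3f} Bq/g")
    ```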

  8. A New Boron Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Weitman, J; Daaverhoeg, N; Farvolden, S

    1970-07-01

    In connection with fast neutron (n, α) cross-section measurements, a novel boron analysis method has been developed. The boron concentration is inferred from the mass-spectrometrically determined number of helium atoms produced in the thermal and epithermal B-10 (n, α) reaction. The relation between helium amount and boron concentration is given, including corrections for self-shielding effects and background levels. Direct and diffusion losses of helium are calculated, and losses due to gettering, adsorption and HF ionization in the release stage are discussed. A series of boron determinations is described and the results are compared with those obtained by other methods, showing excellent agreement. The lower limit of boron concentration which can be measured varies with the type of sample. In steel, for example, concentrations below 10^-5 % boron in samples of 0.1-1 gram may be determined.
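
    To first order, the helium-boron relation is simple activation arithmetic: each B-10(n, α) capture yields one helium atom, so N_He ≈ N_B · f_B10 · σ · Φ for a thin sample, with σ the effective capture cross-section and Φ the neutron fluence. The back-of-envelope sketch below uses illustrative values and omits the paper's self-shielding and helium-loss corrections.

    ```python
    # Infer boron mass from a measured helium atom count (toy numbers).
    N_A     = 6.022e23   # atoms/mol
    M_B     = 10.81      # g/mol, natural boron
    f_B10   = 0.199      # natural isotopic abundance of B-10
    sigma   = 3840e-24   # cm^2, thermal B-10(n,alpha) cross-section
    fluence = 1.0e17     # n/cm^2, assumed thermal-neutron fluence

    n_helium = 4.5e11    # helium atoms measured mass-spectrometrically (example)

    # One helium atom per B-10 capture: N_He = N_B * f_B10 * sigma * fluence.
    n_boron = n_helium / (f_B10 * sigma * fluence)
    mass_boron = n_boron * M_B / N_A
    print(f"boron mass ~ {mass_boron:.2e} g")
    ```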

  9. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating the biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth.
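
    Both metrics reduce to overlaps between sets of significant pathways. A small sketch of how they might be computed, under the simplifying assumption that each analysis run returns a set of pathway identifiers; the thresholds and aggregation used in the paper are not reproduced here.

    ```python
    def recall(full_run, subset_run):
        """Consistency: fraction of pathways found on the full dataset that are
        re-identified when the same method runs on a sub-dataset."""
        return len(full_run & subset_run) / len(full_run) if full_run else 0.0

    def discrimination(run_a, run_b):
        """Specificity: how distinct the identified pathways are between
        datasets from two unrelated experiments (1 = fully distinct)."""
        union = run_a | run_b
        return 1.0 - len(run_a & run_b) / len(union) if union else 0.0

    full  = {"apoptosis", "cell_cycle", "p53", "wnt"}
    sub   = {"apoptosis", "p53", "mapk"}
    other = {"insulin", "wnt"}
    print(recall(full, sub), discrimination(full, other))  # 0.5, 0.8
    ```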

  10. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    Science.gov (United States)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.

  11. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objectives of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to the hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine the polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs were a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, the management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI, and information related to the sample and the system status is presented to the analyst via graphical icons.

  12. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, passing through their origins, using the least squares method.
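
    The slope of a calibration line forced through the origin has the closed form b = Σxy / Σx², so the weight fraction follows from the ratio of two such slopes. A small numerical sketch with invented intensity data (not the Sheffield data):

    ```python
    import numpy as np

    def slope_through_origin(x, y):
        """Least-squares slope of y = b*x (no intercept): b = sum(xy) / sum(x^2)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        return (x * y).sum() / (x * x).sum()

    # Hypothetical XRD intensity data for the analysis and reference lines.
    x = [0.1, 0.2, 0.3, 0.4]
    y_analysis  = [0.21, 0.39, 0.62, 0.80]  # mixture being analysed
    y_reference = [0.50, 1.01, 1.49, 2.02]  # pure-phase reference

    w = slope_through_origin(x, y_analysis) / slope_through_origin(x, y_reference)
    print(f"estimated weight fraction: {w:.3f}")
    ```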

  13. The reaction π-p → π-π-π+p: Development of the analysis methods and selected results

    Science.gov (United States)

    Ryabchikov, D.

    2016-01-01

    We present a description of the analysis methods and the results of applying them to the exclusive diffractive reaction π-p → π-π-π+p, with 50 × 10^6 events measured with the COMPASS detector. The large statistics of π-π-π+ events enables a two-dimensional partial-wave analysis, carried out independently in 100 bins of m(3π) with 0.5 < m(3π) < 2.5 GeV/c^2 and in 11 intervals of squared momentum transfer with 0.1 < t' < 1 GeV^2/c^2. The partial-wave-analysis sub-density matrix is subject to further mass-dependent fits describing the data in terms of resonances in the 3π system and coherent background contributions. A novel approach of extracting JPC = 0++ (π+π-)S isobar amplitudes as model-free functions, different for several JPC 3π states, is used. It demonstrates the presence of the processes π(1800) → f0(980)π and π(1800) → f0(1500)π as well as π2(1880) → f0(980)π and a new narrow signal a1(1420) → f0(980)π, without any established shapes used for the (π+π-)S isobars. The presented analysis is subject to further development and refinements, which are currently taking place.

  14. IoT System Development Methods

    NARCIS (Netherlands)

    Giray, G.; Tekinerdogan, B.; Tüzün, E.

    2018-01-01

    It is generally believed that the application of methods plays an important role in developing quality systems. A development method is mainly necessary for structuring the process of producing large-scale and complex systems that involve high costs. Similar to the development of other systems, it is

  15. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA), which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of CASIA. CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in CASIA's database, HRA analyzers will be able to share and spread their analysis experience freely and smoothly, and thereby the quality of HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  16. Dynamic Sustainability. Sustainability Window Analysis of Chinese Poverty-Environment Nexus Development

    Directory of Open Access Journals (Sweden)

    Jyrki Luukkanen

    2015-10-01

    Sustainability Window is a new analysis tool for assessing the sustainability of development simultaneously in all of its three dimensions (environmental, economic, and social). The analysis method provides information on the maximum and minimum economic development that is required to maintain the direction of social and environmental development towards more sustainable targets. With the Sustainability Window method it is possible to easily analyze sustainability using different indicators and different time periods, making comparative analyses easy. The new method also makes it possible to analyze the dynamics of sustainability and the changes over time in the width of the window. This provides a new perspective for analyzing the trends of sustainability and the impacts of underlying sustainability policies. As an illustration of the method, we have carried out an analysis of Chinese development using CO2 and SO2 emissions as indicators of the environmental dimension, the number of non-poor people as an indicator of the social dimension, and GDP as an indicator of the economic dimension.

  17. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing that developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and the geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  18. Application of Looped Network Analysis Method to Core of Prismatic VHTR

    International Nuclear Information System (INIS)

    Lee, Jeong-Hun; Cho, Hyoung-Kyu; Park, Goon-Cherl

    2016-01-01

    Most of the reactor coolant flows through the coolant channels within the fuel blocks, but some portion of it bypasses into the interstitial gaps. The vertical and horizontal gaps are called the bypass gap and the cross gap, respectively, as shown in Fig. 1. CFD simulation of the full core of a VHTR might be possible, but it requires vast computational cost and time. Moreover, it is hard to cover all the cases corresponding to the various bypass gap distributions in the whole VHTR core. In order to solve this problem, in this study the flow network analysis code FastNet (Flow Analysis for Steady-state Network) was developed using the looped network analysis method. The applied method was validated by comparison with the SNU VHTR multi-block experiment. A 3-dimensional network model was constructed, representing flow paths as flow resistances. The FastNet code was developed to evaluate the core bypass flow distribution by means of the looped network analysis method. A complex flow network could be solved simply by converting the non-linear momentum equation into a linearized equation. The FastNet code predicted the flow distribution of the SNU multi-block experiment accurately.
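
    The linearization at the heart of such solvers: with a quadratic branch law ΔP = R·Q·|Q|, one iterates an effective linear conductance g = 1/(R·|Q|) and solves a linear nodal system until the flows stop changing. A sketch for a toy two-branch network with invented resistances follows; this illustrates the generic looped-network scheme, not the FastNet code itself.

    ```python
    import numpy as np

    # Two parallel branches from a plenum at pressure P to an outlet at 0.
    # Branch law dP = R * Q * |Q|; find the flow split for a fixed total flow
    # by iterating linearized conductances (with under-relaxation for stability).
    R = np.array([4.0, 9.0])  # hypothetical branch resistances
    Q_total = 1.0             # total flow entering the network

    Q = np.full(2, Q_total / 2)  # initial guess: even split
    for _ in range(100):
        g = 1.0 / (R * np.abs(Q) + 1e-12)  # linearized conductances g = 1/(R|Q|)
        P = Q_total / g.sum()              # nodal balance: sum(g) * P = Q_total
        Q_new = g * P                      # branch flows from the linear law
        if np.max(np.abs(Q_new - Q)) < 1e-10:
            break
        Q = 0.5 * (Q + Q_new)              # under-relaxed Picard update
    print("branch flows:", Q, "plenum pressure:", P)
    ```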

  19. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of the mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered [ru]

  20. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    The purpose of the article is to present a method for virtual team performance evaluation based on the intelligent analysis of team members' collaboration data. The motivation for the research is the possibility of creating an evaluation method that approximates ambiguous expert evaluations. The hierarchical fuzzy-rule-based method aims to evaluate data from virtual team interaction instances related to the implementation of project tasks. The suggested method is designed for project managers or virtual team leaders, to help with virtual teamwork evaluation based on the analysis of captured data. The main point of the method is its ability to reproduce human thinking and the expert valuation process in data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy-set principle used in the method allows the numerical values of evaluation criteria to be transformed into linguistic terms for use in constructing fuzzy rules. Using a fuzzy signature makes it possible to construct a hierarchical criteria structure, which helps to solve the problem of the exponential increase in the number of fuzzy rules as more input variables are included. The suggested method is intended to be applied in virtual collaboration software as a real-time teamwork evaluation tool. The research shows that by applying fuzzy logic to team collaboration data analysis it is possible to obtain evaluations equivalent to expert insights. The method covers the analysis of virtual team, project task and team collaboration data. The advantage of the suggested method is the possibility of using variables obtained from virtual collaboration systems as inputs to the fuzzy rules. Fuzzy-logic-based evaluation of virtual teamwork collaboration provides evidence that can be investigated further, and the method can be seen as a next step in the development of virtual collaboration software.
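
    A minimal flavour of the machinery described: a triangular membership function turns a numeric criterion into linguistic degrees, each rule takes the minimum over its antecedents, and a weighted centroid defuzzifies. Everything below (criteria, scales, rule set) is an invented toy, not the authors' rule base.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def evaluate_team(activity, timeliness):
        # Fuzzify two collaboration criteria (0-10 scales, hypothetical).
        act_high  = tri(activity, 4, 10, 16)
        act_low   = tri(activity, -6, 0, 6)
        time_good = tri(timeliness, 4, 10, 16)
        # Two toy rules: IF activity high AND timeliness good THEN score 9;
        #                IF activity low THEN score 3.
        rules = [(min(act_high, time_good), 9.0), (act_low, 3.0)]
        # Weighted-centroid defuzzification over the fired rules.
        total = sum(w for w, _ in rules)
        return sum(w * s for w, s in rules) / total if total else 5.0

    print(evaluate_team(activity=8.0, timeliness=7.0))  # -> 9.0
    ```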

  1. Nuclear Fuel Cycle Analysis Technology to Develop Advanced Nuclear Fuel Cycle

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byung Heung [Chungju National University, Chungju (Korea, Republic of); Ko, Won IL [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-12-15

    Nuclear fuel cycle (NFC) analysis is a study to set an NFC policy and to promote systematic research by analyzing technologies and deriving requirements at each stage of the fuel cycle. System analysis techniques are utilized for the comparative analysis and assessment of options for a considered system. When an NFC is considered, various system analysis methods can be applied, depending on the range of interest. This study presented NFC analysis strategies for the development of a domestic advanced NFC, together with analysis techniques applicable to the different phases of the analysis. Strategically, NFC analysis necessitates linkage with technology analyses, domestic and international interests, and a national energy program. In this respect, a trade-off study is readily applicable, since it includes various aspects of the NFC as metrics and then analyzes the considered NFC options according to the derived metrics. In this study, the trade-off study was identified as a method for NFC analysis with the derived strategies, and it is expected to be used for the development of an advanced NFC. A technology readiness level (TRL) method and NFC simulation codes could be utilized to obtain the required metrics and data for assessment in the trade-off study. These methodologies would guide the direction of technology development by comparing and assessing the technological, economic, environmental, and other aspects of the alternatives. Consequently, they would contribute to the systematic development and deployment of an appropriate advanced NFC.

  2. Nuclear Fuel Cycle Analysis Technology to Develop Advanced Nuclear Fuel Cycle

    International Nuclear Information System (INIS)

    Park, Byung Heung; Ko, Won IL

    2011-01-01

    Nuclear fuel cycle (NFC) analysis is a study to set an NFC policy and to promote systematic research by analyzing technologies and deriving requirements at each stage of the fuel cycle. System analysis techniques are utilized for the comparative analysis and assessment of options for a considered system. When an NFC is considered, various system analysis methods can be applied, depending on the range of interest. This study presented NFC analysis strategies for the development of a domestic advanced NFC, together with analysis techniques applicable to the different phases of the analysis. Strategically, NFC analysis necessitates linkage with technology analyses, domestic and international interests, and a national energy program. In this respect, a trade-off study is readily applicable, since it includes various aspects of the NFC as metrics and then analyzes the considered NFC options according to the derived metrics. In this study, the trade-off study was identified as a method for NFC analysis with the derived strategies, and it is expected to be used for the development of an advanced NFC. A technology readiness level (TRL) method and NFC simulation codes could be utilized to obtain the required metrics and data for assessment in the trade-off study. These methodologies would guide the direction of technology development by comparing and assessing the technological, economic, environmental, and other aspects of the alternatives. Consequently, they would contribute to the systematic development and deployment of an appropriate advanced NFC.

  3. METHODS TO DEVELOP A TOROIDAL SURFACE

    Directory of Open Access Journals (Sweden)

    DANAILA Ligia

    2017-05-01

    The paper presents two practical methods for drawing the development of a surface that cannot be developed by the classical methods of Descriptive Geometry: the toroidal surface, frequently met in technical practice. The described methods are approximate; the development is obtained with the help of points, and the accuracy of the methods is determined by the number of points used in the drawing. As with any other approximate method, the development may need to be adjusted on site when practically manufactured.

  4. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-04

    Analyses of complex cosmetic samples, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains rather non-polar lipid components; thus cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and the varied additional parameters acting on both retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, keeping the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened under isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider-diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of the mobile phase composition, with isocratic elution conditions or, when

  5. A complex neutron activation method for the analysis of biological materials

    International Nuclear Information System (INIS)

    Ordogh, M.

    1978-01-01

    The aim of the present work was to deal primarily with a few essential trace elements and to obtain reliable results of adequate accuracy and precision in the analysis of biological samples. A few elements other than trace elements were determined by the nondestructive technique, as they can be well evaluated from the gamma spectra. In the development of the method, BOWEN's kale was chosen as the model material. To confirm the reliability of the method, two samples proposed by the IAEA in the frame of an international comparative analysis series were analysed. The comparative analysis shows the present method to be reliable; the precision and accuracy are good. (author)

  6. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede

    2000-01-01

    This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously, by virtue of cloth stretch and flexing. The sail shape determines the pressure distribution and, at the same time, the pressure distribution on the sail stretches and flexes the sail material, determining its shape. This characteristic non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. Firstly, pressure loads for a known sail shape are computed (aerodynamic analysis). Secondly, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained by using an iterative analysis process involving both the aerodynamic and the structural analysis. When the solution converges, it is possible to make design modifications.
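
    The iterative strategy described is a fixed-point loop between the two analyses. A schematic sketch with scalar stand-ins for "shape" and "pressure" follows; the real system exchanges full pressure fields and finite element displacements, and the relaxation factor is an assumption.

    ```python
    def aerodynamic_analysis(shape):
        """Stand-in: pressure load as a decreasing function of camber."""
        return 1.0 / (1.0 + shape)

    def structural_analysis(load, flexibility=0.8):
        """Stand-in: cloth stretch moves the shape toward the loaded state."""
        return flexibility * load

    shape, relax = 0.5, 0.5  # initial guess and under-relaxation factor
    for it in range(100):
        load = aerodynamic_analysis(shape)     # step 1: loads for the known shape
        new_shape = structural_analysis(load)  # step 2: shape under those loads
        if abs(new_shape - shape) < 1e-10:     # equilibrium: shape unchanged
            break
        shape += relax * (new_shape - shape)   # damped update for stability
    print(f"equilibrium shape {shape:.6f} after {it + 1} iterations")
    ```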

  7. Gravimetric and titrimetric methods of analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    Gravimetric and titrimetric methods of analysis are considered. Methods of complexometric titration are mentioned, as well as methods of increasing sensitivity in titrimetry. Gravimetry and titrimetry are applied in the analysis of trace elements in geological materials.

  8. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG&G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process.

  9. Analysis of live cell images: Methods, tools and opportunities.

    Science.gov (United States)

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are also discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.

  10. METHODIC OF DEVELOPMENT OF MOTOR GIFTEDNESS OF PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Светлана Юрьевна Федорова

    2013-05-01

    The education and training of gifted children should today be considered an important strategic task of modern society. In this context, the purpose of the research is the development of motor giftedness, which is particularly relevant at the stage of preschool education owing to the age characteristics of preschoolers. Preschoolers' motor giftedness is considered by the author as a developing integrated quality, including psychomotor skills, inclinations, and increased motivation for motor activity. The following methods are used in the study: the study and analysis of the scientific and methodological literature, questionnaires, interviews, testing of physical fitness, and statistical data processing. The result of the research work is a methodic for the development of motor giftedness in physical education in preschool. The author's methodic consists of four steps: diagnostic, prognostic, practice-and-activity, and social-and-pedagogical. Each step determines the inclusion of preschool children in a sports and developmental environment that meets their abilities and needs through the creation of certain social and educational conditions. The results of the author's methodic can be used in preschools and in the system of improving teachers' professional skills. DOI: http://dx.doi.org/10.12731/2218-7405-2013-4-31

  11. A DATA-MINING BASED METHOD FOR THE GAIT PATTERN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Marcelo Rudek

    2015-12-01

    The paper presents a method developed for gait classification based on the analysis of the trajectory of the centre of pressure (CoP) extracted from the contact points of the feet with the ground during walking. The data acquisition is performed by means of a walkway with embedded tactile sensors. The proposed method includes capture procedures, standardization of the data, creation of an organized repository (data warehouse), and the development of a data-mining process. A graphical analysis is applied to look at the footprint signature patterns. The aim is to obtain a visual interpretation of the grouping by situating it among the normal walking patterns or the deviations associated with an individual way of walking. The method automates data classification, dividing subjects into healthy and non-healthy, in order to assist in rehabilitation treatments for people with related mobility problems.
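
    One simple way to turn a CoP trajectory into classifiable numbers is to extract summary features (path length, mediolateral sway) and threshold or cluster them. A toy sketch with synthetic trajectories follows; the feature choice and threshold are illustrative, not the paper's model.

    ```python
    import numpy as np

    def cop_features(xy):
        """Summary features of a centre-of-pressure trajectory (N x 2 array)."""
        steps = np.diff(xy, axis=0)
        path_length = np.linalg.norm(steps, axis=1).sum()
        ml_sway = xy[:, 0].std()  # mediolateral spread
        return path_length, ml_sway

    rng = np.random.default_rng(2)
    t = np.linspace(0, 1, 200)
    steady   = np.c_[0.02 * rng.standard_normal(200), t]  # straight progression
    unsteady = np.c_[0.10 * rng.standard_normal(200), t]  # larger lateral sway

    for name, traj in [("steady", steady), ("unsteady", unsteady)]:
        length, sway = cop_features(traj)
        label = "typical" if sway < 0.05 else "deviating"
        print(f"{name}: path={length:.2f}, sway={sway:.3f} -> {label}")
    ```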

  12. Method Development for Pesticide Residue Analysis in Farmland Soil using High Performance Liquid Chromatography

    Science.gov (United States)

    Theresia Djue Tea, Marselina; Sabarudin, Akhmad; Sulistyarti, Hermin

    2018-01-01

    A method for the determination of diazinon and chlorantraniliprole in soil samples has been developed. The analytes were extracted with acetonitrile from a farmland soil sample. Determination and quantification of diazinon and chlorantraniliprole were performed by high performance liquid chromatography (HPLC) with a UV detector. Several parameters of the HPLC method were optimized with respect to sensitivity, high resolution of separation, and accurate determination of diazinon and chlorantraniliprole. The optimum conditions for the separation of the two pesticides were an eluent composition of acetonitrile:water 60:40, a flow rate of 0.4 mL/min, and a wavelength of 220 nm. Under the optimum conditions, diazinon linearity was in the range of 1-25 ppm with an R2 of 0.9976, an LOD of 1.19 mg L-1, and an LOQ of 3.98 mg L-1, while the linearity of chlorantraniliprole was in the range of 0.2-5 mg L-1 with an R2 of 0.9972, an LOD of 0.39 mg L-1, and an LOQ of 1.29 mg L-1. When the method was applied to the soil samples, both pesticides showed acceptable recoveries of more than 85% for real samples; thus, the developed method meets the validation requirements. Under this developed method, the concentrations of both pesticides in the soil samples were below the LOD and LOQ (0.577 mg L-1 for diazinon and 0.007 mg L-1 for chlorantraniliprole). Therefore, it can be concluded that the soil samples used in this study contained neither diazinon nor chlorantraniliprole.
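
    Figures of merit like these come straight from the calibration line: with slope S and residual standard deviation s, the usual conventions are LOD = 3.3s/S and LOQ = 10s/S. A sketch with made-up calibration points (not the paper's data):

    ```python
    import numpy as np

    # Hypothetical calibration: concentration (mg/L) vs. HPLC peak area.
    conc = np.array([1.0, 5.0, 10.0, 15.0, 20.0, 25.0])
    area = np.array([12.1, 60.3, 119.8, 181.0, 240.5, 301.2])

    slope, intercept = np.polyfit(conc, area, 1)
    resid = area - (slope * conc + intercept)
    s = resid.std(ddof=2)  # residual SD; ddof=2 for the two fitted parameters
    r = np.corrcoef(conc, area)[0, 1]

    print(f"R^2 = {r**2:.4f}")
    print(f"LOD = {3.3 * s / slope:.3f} mg/L, LOQ = {10 * s / slope:.3f} mg/L")
    ```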

  13. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  14. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    Science.gov (United States)

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  15. Using mixed methods to develop and evaluate complex interventions in palliative care research.

    Science.gov (United States)

    Farquhar, Morag C; Ewing, Gail; Booth, Sara

    2011-12-01

    There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research, where the majority of interventions are complex and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design, data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed-method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination, and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed-method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.

  16. Development of HANARO Activation Analysis System and Utilization Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Y. S.; Moon, J. H.; Cho, H. J. (and others)

    2007-06-15

    1. Establishment of an evaluation system using data for neutron activation analysis: improvement of the NAA measurement system and its identification; development of a combined data evaluation code for NAA/PGAA; international technical cooperation project. 2. Development of techniques for the industrial application of high-precision gamma nuclide spectroscopic analysis: analytical quality control; development of industrial application techniques and their identification. 3. Industrial application research on prompt gamma-ray activation analysis: improvement of the Compton suppression counting system (PGAA); development of applied technology using the PGAA system. 4. Establishment of the NAA user supporting system and KOLAS management: development and validation of KOLAS/ISO accreditation testing and identification methods; cooperative research for industrial applications; establishment of an integrated user analytical supporting system; completion of a sample irradiation facility.

  17. Development of HANARO Activation Analysis System and Utilization Technology

    International Nuclear Information System (INIS)

    Chung, Y. S.; Moon, J. H.; Cho, H. J.

    2007-06-01

    1. Establishment of an evaluation system using data for neutron activation analysis: improvement of the NAA measurement system and its identification; development of a combined data evaluation code for NAA/PGAA; international technical cooperation project. 2. Development of techniques for the industrial application of high-precision gamma nuclide spectroscopic analysis: analytical quality control; development of industrial application techniques and their identification. 3. Industrial application research on prompt gamma-ray activation analysis: improvement of the Compton suppression counting system (PGAA); development of applied technology using the PGAA system. 4. Establishment of the NAA user supporting system and KOLAS management: development and validation of KOLAS/ISO accreditation testing and identification methods; cooperative research for industrial applications; establishment of an integrated user analytical supporting system; completion of a sample irradiation facility.

  18. Development of Multigrid Methods for diffusion, Advection, and the incompressible Navier-Stokes Equations

    Energy Technology Data Exchange (ETDEWEB)

    Gjesdal, Thor

    1997-12-31

    This thesis discusses the development and application of efficient numerical methods for the simulation of fluid flows, in particular the flow of incompressible fluids. The emphasis is on practical aspects of algorithm development and on application of the methods either to linear scalar model equations or to the non-linear incompressible Navier-Stokes equations. The first part deals with cell-centred multigrid methods and the linear correction scheme, and presents papers on (1) generalization of the method to arbitrarily sized grids for diffusion problems, (2) a low-order method for advection-diffusion problems, (3) an attempt to extend the basic method to advection-diffusion problems, (4) Fourier smoothing analysis of multicolour relaxation schemes, and (5) analysis of high-order discretizations for advection terms. The second part discusses a multigrid method based on pressure correction, the non-linear full approximation scheme, and presents papers on (1) a systematic comparison of the performance of different pressure correction smoothers and some other algorithmic variants at low to moderate Reynolds numbers, and (2) a systematic study of implementation strategies for high-order advection schemes in high-Re flow. An appendix contains Fortran 90 data structures for multigrid development. 160 refs., 26 figs., 22 tabs.
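
    For readers unfamiliar with the correction scheme the thesis builds on: a multigrid V-cycle smooths the error on the fine grid, restricts the residual to a coarser grid, solves (recursively) for a correction, and prolongs it back. The compact 1D Poisson sketch below, with a weighted-Jacobi smoother and injection restriction, is purely illustrative and unrelated to the thesis' Fortran 90 code.

    ```python
    import numpy as np

    def smooth(u, f, h, iters=3, w=2/3):
        """Weighted-Jacobi smoothing for -u'' = f with zero boundary values."""
        for _ in range(iters):
            u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        return u

    def v_cycle(u, f, h):
        u = smooth(u, f, h)
        if len(u) <= 3:                 # coarsest grid: smoothing suffices
            return u
        r = np.zeros_like(u)            # residual r = f + u''
        r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
        ec = v_cycle(np.zeros_like(r[::2]), r[::2].copy(), 2 * h)  # coarse correction
        e = np.zeros_like(u)            # prolongation by linear interpolation
        e[::2] = ec
        e[1::2] = 0.5 * (ec[:-1] + ec[1:])
        return smooth(u + e, f, h)

    n = 65
    x = np.linspace(0.0, 1.0, n)
    f = np.pi ** 2 * np.sin(np.pi * x)  # exact solution: sin(pi * x)
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, 1.0 / (n - 1))
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())
    ```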

  19. Implantation of the method of quantitative analysis by proton induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects of the implementation of the method of quantitative analysis by proton-induced X-ray spectroscopy are discussed. The calibration of the system was performed by determining a response coefficient for selected elements, both by irradiating known amounts of these elements and by the use of theoretical and experimental parameters. The results obtained by these two methods agree within 5% for the analysed elements. A computer-based technique of spectrum decomposition was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (Author) [pt]

  20. Challenges on innovations of newly-developed safety analysis codes

    International Nuclear Information System (INIS)

    Yang, Yanhua; Zhang, Hao

    2016-01-01

    With the development of safety analysis methods, safety analysis codes face more challenges. Three challenges are presented in this paper: the mathematical models, the code design, and the user interface. Taking the independently developed safety analysis code named COSINE as an example, ways of meeting these requirements are suggested: to develop multi-phase, multi-field and multi-dimensional models; to adopt object-oriented code design; and to improve the modeling, calculation control and data post-processing in the user interface.

  1. Challenges on innovations of newly-developed safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua [Shanghai Jiao Tong Univ. (China). School of Nuclear Science and Engineering; Zhang, Hao [State Nuclear Power Software Development Center, Beijing (China). Beijing Future Science and Technology City

    2016-05-15

    With the development of safety analysis methods, safety analysis codes face more challenges. Three challenges are presented in this paper: the mathematical models, the code design, and the user interface. Taking the independently developed safety analysis code named COSINE as an example, ways of meeting these requirements are suggested: to develop multi-phase, multi-field and multi-dimensional models; to adopt object-oriented code design; and to improve the modeling, calculation control and data post-processing in the user interface.

  2. GEM simulation methods development

    International Nuclear Information System (INIS)

    Tikhonov, V.; Veenhof, R.

    2002-01-01

    A review of methods used in the simulation of processes in gas electron multipliers (GEMs) and in the accurate calculation of detector characteristics is presented. Detector characteristics such as effective gas gain, transparency, charge collection and losses have been calculated and optimized for a number of GEM geometries and compared with experiment. A method and a new special program for the calculation of detector macro-characteristics, such as the signal response in a real detector readout structure and the spatial and time resolution of detectors, have been developed and used for detector optimization. A detailed treatment of signal induction on the readout electrodes and of the electronics characteristics is included in the new program. A method for the simulation of charging-up effects in GEM detectors is described. All methods show good agreement with experiment.

  3. QUALITATIVE ANALYSIS METHOD OF DETECTION OF WAX CONTENT IN GORENGAN USING SMARTPHONE

    Directory of Open Access Journals (Sweden)

    Yulia Yulia

    2018-05-01

    Wax is one of the compounds that can be misused as an additive in gorengan, Indonesian fritters, to keep them crispy. Gorengan containing wax is difficult to identify visually, so a quick and easy method of detecting wax content is required. The purpose of this research is to develop and evaluate the analytical performance of the detection of wax content in gorengan using a smartphone. A gorengan sample was dissolved in hexane, a reagent giving a discoloration was added, and the result was analyzed using a smartphone. Several analytical performance parameters were evaluated in terms of linearity and detection limit, qualitative analysis capability, precision, and selectivity. The developed method was also applied to some gorengan samples. The results show that the detection of wax content in gorengan can be conducted using a reagent consisting of NaOH, Schiff's reagent, and curcumin (1:2:2). Performance analysis shows that the linearity measurement at concentrations between 10% and 25% has a correlation coefficient (r) of 0.9537, with a detection limit at a concentration of 2% and a precision (%RSD) of less than 3%. The developed method can be applied to the detection of wax content in gorengan on the market.

  4. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, where people's needs and requests regarding information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfil such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. We then demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing will be an indispensable tool.

  5. Analysis of factors affecting the development of food crop varieties bred by mutation method in China

    International Nuclear Information System (INIS)

    Wang Zhidong; Hu Ruifa

    2002-01-01

    The research developed a production function for crop varieties bred by the mutation method in order to explore the factors affecting the development of new varieties. It was found that research investment, human capital and radiation facilities were the most important factors affecting the development and cultivation area of new varieties bred through the mutation method. It is concluded that not all institutions involved in breeding activities using the mutation method must have radiation facilities, and that the national government only needs to invest in those key research institutes which have strong research capacities. The saved research budgets can be used to entrust the institutes that have stronger research capacities with irradiating additional breeding materials developed by the institutes that have weaker research capacities, by which more opportunities to breed better varieties can be created.

  6. A Method for Developing Enterprise Architecture Frameworks: An Interpretive Phenomenology Study

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-03-01

    Nowadays, many organizations involved in enterprise architecting make their own architecture framework or customize existing frameworks. These endeavors are based on the knowledge and experience of each organization, and there is no defined method for developing an enterprise architecture framework. Therefore, a method for developing architecture frameworks is presented in this qualitative research. For this purpose, 15 versions of the 5 most used architecture frameworks are analyzed based on interpretive phenomenology. Based on this analysis, a method for developing architecture frameworks is introduced which contains 8 disciplines and 6 phases. The analysis of the qualitative data and the validation of the research are carried out using Van Manen's guidelines for interpretive phenomenology.

  7. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    Science.gov (United States)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider the diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations, including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented, with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as the most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are

  8. Application of computational aerodynamics methods to the design and analysis of transport aircraft

    Science.gov (United States)

    Da Costa, A. L.

    1978-01-01

    The application and validation of several computational aerodynamic methods in the design and analysis of transport aircraft are established. An assessment is made of more recently developed methods that solve three-dimensional transonic flow and boundary layers on wings. The capabilities of subsonic aerodynamic methods are demonstrated by several design and analysis efforts. Among the examples cited are the B747 Space Shuttle Carrier Aircraft analysis, nacelle integration for transport aircraft, and winglet optimization. The accuracy and applicability of a new three-dimensional viscous transonic method are demonstrated by comparison of computed results with experimental data.

  9. Statistical Analysis of the labor Market in Ukraine Using Multidimensional Classification Methods: the Regional Aspect

    Directory of Open Access Journals (Sweden)

    Korepanov Oleksiy S.

    2017-12-01

    The aim of the article is to study the labor market in Ukraine in the regional context using cluster analysis methods. The current state of the labor market in the regions of Ukraine is analyzed, and a system of statistical indicators that influence the state and development of this market is formed. The expediency of using cluster analysis for grouping regions according to the level of development of the labor market is substantiated. The essence of cluster analysis is revealed; its main goal and the key tasks that can be solved by means of such analysis are presented, and the basic stages of the analysis are considered. The main clustering methods are described and, based on the results of the simulation, the advantages and disadvantages of each method are justified. In this work, the clustering of the regions of Ukraine by the level of labor market development is carried out using different methods of cluster analysis, conclusions on the results of the calculations performed are presented, and the main directions for further research are outlined.
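
    As a flavour of this kind of regional clustering, here is a minimal scikit-learn sketch: standardize a few labor-market indicators per region, run k-means, and read off the groupings. The indicator values are invented placeholders, not Ukrainian statistics.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    regions = ["Region A", "Region B", "Region C", "Region D", "Region E"]
    # Columns: employment rate (%), unemployment rate (%), average wage (index).
    X = np.array([
        [68.0,  7.5, 115.0],
        [55.0, 12.1,  82.0],
        [70.5,  6.8, 120.0],
        [58.3, 11.4,  88.0],
        [66.2,  8.1, 108.0],
    ])

    # Standardization keeps large-valued indicators from dominating the distances.
    Xs = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
    for region, cluster in zip(regions, labels):
        print(f"{region}: cluster {cluster}")
    ```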

  10. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  11. Scenario development, qualitative causal analysis and system dynamics

    Directory of Open Access Journals (Sweden)

    Michael H. Ruge

    2009-02-01

    The aim of this article is to demonstrate that technology assessments can be supported by methods such as scenario modeling and qualitative causal analysis. At Siemens, these techniques are used to develop preliminary, purely qualitative models. These comprehensive models, or parts of them, may be extended to system dynamics models. While it is currently not possible to automatically generate a system dynamics model (or, vice versa, to obtain a qualitative simulation model from a system dynamics model), the two techniques, scenario development and qualitative causal analysis, provide valuable indications of how to proceed towards a system dynamics model. For the qualitative analysis phase, the Siemens-proprietary prototype Computer-Aided Technology Assessment Software (CATS) supports complete cycle and submodel analysis. Keywords: health care, telecommunications, qualitative model, sensitivity analysis, system dynamics.

  12. A rapid chemical method for lysing Arabidopsis cells for protein analysis

    Directory of Open Access Journals (Sweden)

    Takano Tetsuo

    2011-07-01

    Background: Protein extraction is a frequent procedure in biological research. For the preparation of plant cell extracts, plant materials usually have to be ground and homogenized to physically break the robust cell wall, but this step is laborious and time-consuming when a large number of samples are handled at once. Results: We developed a chemical method for lysing Arabidopsis cells without grinding. In this method, plants are boiled for just 10 minutes in a solution containing a Ca2+ chelator and detergent. Cell extracts prepared by this method were suitable for SDS-PAGE and immunoblot analysis. The method was also applicable to genomic DNA extraction for PCR analysis. Our method was applied to many other plant species and worked well for some of them. Conclusions: Our method is rapid and economical, and allows many samples to be prepared simultaneously for protein analysis. It is useful not only for Arabidopsis research but also for research on certain other species.

  13. Direct methods of soil-structure interaction analysis for earthquake loadings (IV)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J B; Kim, D S; Choi, J S; Kwon, K C; Kim, Y J; Lee, H J; Kim, S B; Kim, D K [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method, incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI-QK' has been developed. The program has been verified using a free-field site-response problem. The Hualien FVT stochastic finite element analysis after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed program. The earthquake response analysis for the LSST structure has also been performed and compared with the measured data.

  15. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Desai, Meera Jay [Iowa State Univ., Ames, IA (United States)

    2004-01-01

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of the pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reverse phase and polar organic modes for chiral LC-API/MS method development. Reverse phase methods with high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity, by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane, a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases for LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  16. Analysis of investment appeal of the industrial enterprise by eigenstate method

    Directory of Open Access Journals (Sweden)

    Buslaeva O.S.

    2017-01-01

    Full Text Available An analysis of enterprise performance is considered. The problem is solved through analysis of the basic indicators of the functioning of the enterprise, with a view to improving economic stability and developing a self-regulation mechanism for the economic stability of the enterprise. An eigenstate method is proposed for the analysis of the basic indicators of the enterprise, as it allows an economic stability model of the enterprise to be constructed. A methodology for the economic stability analysis of an enterprise on the basis of the eigenstate method is described, and the formulas for calculating the complex indicator of economic stability are given. The effectiveness of the methodology is demonstrated on the example of the economic stability analysis of a large trading company.
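
    The record does not reproduce the formulas for the complex indicator, so the Python sketch below shows one plausible reading of an eigenstate-style composite: standardised base indicators weighted by the dominant eigenvector of their correlation matrix. The data and weighting scheme are assumptions for illustration, not the paper's actual formulas.

        import numpy as np

        # Hedged sketch: weight standardised base indicators by the dominant
        # eigenvector of their correlation matrix to form a single composite
        # stability indicator. Data and scheme are illustrative assumptions.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(24, 5))            # 24 periods x 5 base indicators
        X = (X - X.mean(axis=0)) / X.std(axis=0)
        C = np.corrcoef(X, rowvar=False)        # indicator correlation matrix
        vals, vecs = np.linalg.eigh(C)          # eigenvalues in ascending order
        w = np.abs(vecs[:, -1])                 # dominant "eigenstate" weights
        w /= w.sum()
        stability = X @ w                       # composite indicator per period
        print(stability[:3])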

  17. Development of methods for nuclear power plant personnel qualifications and training

    International Nuclear Information System (INIS)

    Jorgensen, C.C.; Carter, R.J.

    1985-01-01

    The Nuclear Regulatory Commission (NRC) has proposed that additions and revisions should be made to Title 10 of the ''Code of Federal Regulations,'' Parts 50 and 55, and to Regulatory Guides 1.8 and 1.149. Oak Ridge National Laboratory (ORNL) is developing methods and some aspects of the technical basis for the implementation and assessment of training programs, personnel qualifications, and simulation facilities to be designed in accordance with the proposed rule changes. The paper describes the three methodologies developed during the FY-1984 research: (1) a task sort procedure (TSORT); (2) a simulation facility evaluation methodology; and (3) a task analysis profiling system (TAPS). TAPS is covered in detail in this paper. The task analysis profiling system has been designed to support training research. It draws on artificial intelligence concepts of pattern matching to provide an automated task analysis of normal English descriptions of job behaviors. TAPS development consisted of creating a precise method for the definition of skills, knowledge, abilities, and attitudes (SKAA), and generating SKAA taxonomic elements. It systematically outputs skills, knowledge, abilities, and attitudes, together with information associated with them.
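
    As a hedged illustration of the pattern-matching idea behind TAPS (not its actual implementation, which the record does not detail), the Python sketch below maps plain-English task descriptions to SKAA elements with keyword patterns; the taxonomy and patterns are invented.

        import re

        # Hypothetical keyword pattern matching in the spirit of TAPS: mapping
        # plain-English job behaviour descriptions to SKAA elements. The
        # patterns and taxonomy below are invented for illustration.
        SKAA_PATTERNS = {
            "skill: manual control":  r"\b(adjust|operate|manipulate)\b",
            "knowledge: procedures":  r"\b(procedure|checklist|instruction)\b",
            "ability: monitoring":    r"\b(monitor|observe|detect)\b",
        }

        def profile(task_description):
            text = task_description.lower()
            return [element for element, pattern in SKAA_PATTERNS.items()
                    if re.search(pattern, text)]

        print(profile("Monitor coolant flow and adjust the valve per procedure"))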

  18. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes a methodology for developing new test methods and for forming solutions in the development of new test methods. The basis of the methodology is formed by individual elements of the system and process approaches, which contribute to the development of an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and mutual influence; this makes it possible to solve the assigned tasks and achieve the goal. The methodology is based on the use of fuzzy cognitive maps. The choice of the method on which the solution-forming model is based is also considered. The methodology provides for recording a model of a new test method as a finite set of objects, these being the characteristics significant for the test method. A causal relationship is then established between the objects. Further, the values of fitness indicators, the observability of the method, and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
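
    To make the fuzzy-cognitive-map machinery concrete, here is a minimal Python sketch of the standard FCM update x(t+1) = f(Wᵀx(t)) with a sigmoid squashing function; the concepts and edge weights are invented and are not the paper's map.

        import numpy as np

        # Minimal fuzzy cognitive map: concept state updated as
        # x(t+1) = sigmoid(W^T x(t)). Concepts and weights are invented.
        W = np.array([[0.0,  0.6,  0.0],   # rig accuracy -> repeatability
                      [0.0,  0.0,  0.7],   # repeatability -> fitness indicator
                      [-0.4, 0.0,  0.0]])  # fitness indicator -> rig accuracy

        def step(x, lam=2.0):
            return 1.0 / (1.0 + np.exp(-lam * (W.T @ x)))   # sigmoid squashing

        x = np.array([0.8, 0.2, 0.1])       # initial concept activations
        for _ in range(25):
            x = step(x)
        print(x)                            # fixed-point concept state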

  19. LightCDD: Application of a Capability-Driven Development Method for Start-ups Development

    Directory of Open Access Journals (Sweden)

    Hasan Koç

    2017-04-01

    Full Text Available Novice innovators and entrepreneurs face the risk of designing naive business models. In fact, lack of viability in business models is perceived to be a major threat to start-up success; both the literature and the responses we gathered from experts in incubation present evidence of this problem. The LightCDD method helps entrepreneurs in the analysis, design and specification of start-ups that are context-aware and adaptive to contextual changes and evolution. In this article we describe the LightCDD method, a context-aware enterprise modeling method that is tailored for business model generation. LightCDD applies a lightweight Capability-Driven Development (CDD) methodology. It reduces the set of modeling constructs and guidelines to facilitate its adoption by entrepreneurs, yet keeps it expressive enough for their purposes and, at the same time, compatible with the CDD methodology. We provide a booklet with the LightCDD method for start-up development. The feasibility of the LightCDD method is validated by means of its application to one start-up development case. From a practitioner viewpoint (entrepreneurs and experts in incubation), it is important to provide integrative modeling perspectives to specify business ideas, but it is vital to keep them light; LightCDD is a step forward in this direction. From a researcher point of view, the LightCDD booklet facilitates the application of LightCDD to different start-up development cases. The feasibility validation has produced important feedback for further empirical validation exercises, in which it is necessary to study the scalability and sensitivity of LightCDD.

  20. Development of an Objective Measurement Method for Situation Awareness of Operation Teams in NPPs

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Kim, Ar Ryum; Kim, Hyoung Ju; Seong, Poong Hyun; Park, Jin Kyun

    2011-01-01

    Situation awareness (SA) continues to receive a considerable amount of attention from the ergonomics community, since the need for operators to maintain SA is frequently cited as a key to effective and efficient performance. Even though complex and dynamic environments such as the main control room (MCR) in nuclear power plants (NPPs) are operated by teams, so that the SA a team possesses is important, research has so far focused on individual SA rather than on team situation awareness (TSA). Since few measurement methods have been developed for TSA, individual SA measurement methods are first reviewed and the critical requirements that new TSA measurements should satisfy are derived. With the assumption that TSA is an integration of individual SA, a new and objective TSA measurement method is developed. The method is based mainly on logical connections between TSA and team communication, and implements verbal protocol analysis. It provides a measure for each level of TSA. A preliminary analysis showed that the method is feasible to some extent.

  1. Development of an Objective Measurement Method for Situation Awareness of Operation Teams in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Woo; Kim, Ar Ryum; Kim, Hyoung Ju; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun [KAERI, Daejeon (Korea, Republic of)

    2011-08-15

    Situation awareness (SA) continues to receive a considerable amount of attention from the ergonomics community, since the need for operators to maintain SA is frequently cited as a key to effective and efficient performance. Even though complex and dynamic environments such as the main control room (MCR) in nuclear power plants (NPPs) are operated by teams, so that the SA a team possesses is important, research has so far focused on individual SA rather than on team situation awareness (TSA). Since few measurement methods have been developed for TSA, individual SA measurement methods are first reviewed and the critical requirements that new TSA measurements should satisfy are derived. With the assumption that TSA is an integration of individual SA, a new and objective TSA measurement method is developed. The method is based mainly on logical connections between TSA and team communication, and implements verbal protocol analysis. It provides a measure for each level of TSA. A preliminary analysis showed that the method is feasible to some extent.

  2. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) in a probabilistic safety assessment (PSA) includes identifying human actions that are significant from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses on developments in human reliability analysis methods and data, with the aim of supporting PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion of human actions in the nuclear power plant (NPP) environment, followed by a condensed discussion of the results of the attached publications, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  3. CARBON SEQUESTRATION: A METHODS COMPARATIVE ANALYSIS

    International Nuclear Information System (INIS)

    Christopher J. Koroneos; Dimitrios C. Rovas

    2008-01-01

    All human activities are related to energy consumption. Energy requirements will continue to rise, due to modern lifestyles and the growth of developing countries. Most of the energy demand is met by fossil fuels. Fossil fuel combustion has negative environmental impacts, dominated by the production of CO2. Fulfillment of the Kyoto protocol criteria requires the minimization of CO2 emissions, so the management of CO2 emissions is an urgent matter. The use of appliances with low energy consumption and the adoption of an energy policy that prevents unnecessary energy use can lead to a reduction of carbon emissions. A different route is the introduction of ''clean'' energy sources, such as renewable energy sources. Last but not least, the development of carbon sequestration methods is a promising technique with large future potential. The objective of this work is the analysis and comparison of different carbon sequestration and deposit methods. Ocean deposits, land ecosystem deposits, geological formation deposits, and radical biological and chemical approaches are analyzed.

  4. Development of new HRA methods based upon operational experience

    International Nuclear Information System (INIS)

    Cooper, S.E.; Luckas, W.J.; Barriere, M.T.; Wreathall, J.

    2004-01-01

    Under the auspices of the US Nuclear Regulatory Commission (NRC), previously unaddressed human reliability issues are being investigated in order to support the development of human reliability analysis (HRA) methods for both low power and shutdown (LP and S) and full-power conditions. Actual operational experience, such as that reported in Licensee Event Reports (LERs), has been used to gain insights and provide a basis for the requirements of new HRA methods. In particular, operational experience has shown that new HRA methods for LP and S must address human-induced initiators, errors of commission, mistakes (vs. slips), dependencies, and the effects of multiple performance shaping factors (PSFs). (author)

  5. Optically stimulated luminescence (OSL) dating of shallow marine sediments to develop an analysis method of late Quaternary geodynamics

    International Nuclear Information System (INIS)

    Hataya, Ryuta; Shirai, Masaaki

    2003-01-01

    To develop an analysis method for geodynamics, we have examined the applicability of OSL dating to marine terrace deposits. We have performed OSL dating, using the multiple-aliquot additive-dose technique, of shallow marine sediments from the upper part of the Kioroshi Formation in Ibaraki Prefecture, which are correlated to Marine Oxygen Isotope Stage (MIS) 5e-5c. Marine terrace deposits consist mainly of shallow marine sediment. OSL ages of the foreshore and foreshore-shoreface beds are 88-112 ka and are in good agreement with the geological/geomorphological data. On the other hand, OSL ages of the backshore bed are younger, and those of the shoreface bed are older, than the geologically estimated ages. These results show that the OSL dating method can date shallow marine sediment using samples from foreshore and foreshore-shoreface beds, and that the method can distinguish terrace deposits formed in MIS 5 from those formed in MIS 7 when geomorphological information is taken into account. These results contribute to the characterization of long-term geological movement in coastal areas. (author)
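
    For readers unfamiliar with the multiple-aliquot additive-dose technique, the Python sketch below illustrates its core arithmetic: regress the OSL signal against the added laboratory dose and take the magnitude of the dose-axis intercept as the equivalent dose. The data points and dose rate are invented, not the Kioroshi Formation measurements.

        import numpy as np

        # Additive-dose evaluation: regress OSL signal on added laboratory
        # dose; the magnitude of the dose-axis intercept is the equivalent
        # dose. Data points and dose rate are invented.
        added_dose = np.array([0.0, 50.0, 100.0, 150.0, 200.0])     # Gy
        signal = np.array([210.0, 305.0, 412.0, 498.0, 601.0])      # counts
        slope, intercept = np.polyfit(added_dose, signal, 1)
        de = intercept / slope              # equivalent dose, Gy
        dose_rate = 1.2                     # assumed environmental dose, Gy/ka
        print("equivalent dose ~", de, "Gy; age ~", de / dose_rate, "ka")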

  6. Comparative Analysis Of Dempster Shafer Method With Certainty Factor Method For Diagnose Stroke Diseases

    Directory of Open Access Journals (Sweden)

    Erwin Kuit Panggabean

    2018-02-01

    Full Text Available The development of artificial intelligence technology has allowed expert systems to be applied to disease detection using programming languages. One application is providing information about diseases that have recently alarmed Indonesian society, notably stroke. The expert system methods used are Dempster-Shafer and certainty factor, and the two methods are compared for the diagnosis of stroke. Based on the analysis, the certainty factor method is found to be better than Dempster-Shafer and more accurate in handling the knowledge representation of stroke according to the disease symptoms obtained from a hospital in Medan, a difference that follows from the distinct algorithms underlying the two methods.
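
    The two combination rules being compared are standard and easy to state in code. The Python sketch below gives the MYCIN-style certainty factor combination and Dempster's rule on a two-hypothesis frame; the masses and factors are invented examples, not the hospital data used in the paper.

        # Side-by-side sketch of the two evidence-combination rules; the
        # masses and factors below are invented, not the hospital data.
        def combine_cf(cf1, cf2):
            # MYCIN-style combination of two positive certainty factors
            return cf1 + cf2 * (1.0 - cf1)

        def combine_ds(m1, m2):
            # Dempster's rule on the frame {"S", "notS"}; "theta" = ignorance
            def intersect(a, b):
                if a == "theta":
                    return b
                if b == "theta":
                    return a
                return a if a == b else None      # None: empty intersection
            combined, conflict = {}, 0.0
            for a, pa in m1.items():
                for b, pb in m2.items():
                    c = intersect(a, b)
                    if c is None:
                        conflict += pa * pb       # conflicting mass
                    else:
                        combined[c] = combined.get(c, 0.0) + pa * pb
            return {h: v / (1.0 - conflict) for h, v in combined.items()}

        print(combine_cf(0.6, 0.4))                                   # 0.76
        print(combine_ds({"S": 0.6, "theta": 0.4},
                         {"S": 0.4, "theta": 0.6}))                   # S: 0.76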

  7. Annular dispersed flow analysis model by Lagrangian method and liquid film cell method

    International Nuclear Information System (INIS)

    Matsuura, K.; Kuchinishi, M.; Kataoka, I.; Serizawa, A.

    2003-01-01

    A new annular dispersed flow analysis model was developed in which both droplet behavior and liquid film behavior are analyzed simultaneously. Droplet behavior in turbulent flow is analyzed by the Lagrangian method with a refined stochastic model, while liquid film behavior is simulated using a moving-rough-wall boundary condition and a liquid film cell model, which is used to estimate the liquid film flow rate. The height of the moving rough wall is estimated by a disturbance wave height correlation. In each liquid film cell, the liquid film flow rate is calculated by considering the droplet deposition and entrainment flow rates; the droplet deposition flow rate is calculated by the Lagrangian method and the entrainment flow rate by an entrainment correlation. For the verification of the moving rough wall model, turbulent flow analysis results under annular flow conditions were compared with experimental data, and the agreement was fairly good. Furthermore, annular dispersed flow experiments were analyzed in order to verify the droplet behavior model and the liquid film cell model. The experimental results for the radial distribution of droplet mass flux were compared with analysis results. The agreement was good under low liquid flow rate conditions and poor under high liquid flow rate conditions; however, by modifying the entrainment rate correlation, the agreement became good even at high liquid flow rates. This indicates that the basic analysis method for droplet and liquid film behavior is sound. In future work, verification calculations should be carried out under different experimental conditions, and the entrainment ratio correlation should also be corrected.

  8. Seismic design and analysis methods

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1993-01-01

    Seismic load is, in many areas of the world, the most important loading situation from the point of view of structural strength. Taking this into account, it is understandable that there has been a strong allocation of resources to seismic analysis during the past ten years. This study centers on three areas: (1) random vibrations; (2) soil-structure interaction; and (3) methods for determining structural response. The solution of random vibration problems is clarified with the aid of applications; for the mathematical treatment and formulations it is deemed sufficient to cite the relevant sources. In the soil-structure interaction analysis, the focus has been the significance of frequency-dependent impedance functions. The results showed that describing the soil with frequency-dependent impedance functions decreases the structural response; it is thus always preferable to more conservative analysis types. Of the methods for determining the structural response, the following four were tested: (1) the time history method; (2) the complex frequency-response method; (3) the response spectrum method; and (4) the equivalent static force method. The time history method proved the most accurate, and the complex frequency-response method had the widest area of application. (orig.). (14 refs., 35 figs.)
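
    Of the four response methods listed, the time history method is the most readily illustrated. The following Python sketch integrates a single-degree-of-freedom system under a toy ground motion with average-acceleration Newmark integration; the structural parameters and excitation are invented for illustration, not taken from the study.

        import numpy as np

        # Average-acceleration Newmark time-history integration of a
        # single-degree-of-freedom system; parameters and ground motion
        # are invented for illustration.
        m, c, k = 1.0, 0.05, 40.0              # mass, damping, stiffness
        beta, gamma, dt = 0.25, 0.5, 0.01      # unconditionally stable scheme
        t = np.arange(0.0, 10.0, dt)
        ag = 0.3 * np.sin(2.0 * np.pi * t)     # toy ground acceleration

        u = v = 0.0
        a = (-m * ag[0] - c * v - k * u) / m
        keff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
        umax = 0.0
        for n in range(len(t) - 1):
            peff = (-m * ag[n + 1]
                    + m * (u / (beta * dt ** 2) + v / (beta * dt)
                           + (0.5 / beta - 1.0) * a)
                    + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                           + dt * (0.5 * gamma / beta - 1.0) * a))
            u_new = peff / keff
            v_new = (gamma / (beta * dt) * (u_new - u)
                     + (1.0 - gamma / beta) * v
                     + dt * (1.0 - 0.5 * gamma / beta) * a)
            a_new = ((u_new - u) / (beta * dt ** 2)
                     - v / (beta * dt) - (0.5 / beta - 1.0) * a)
            u, v, a = u_new, v_new, a_new
            umax = max(umax, abs(u))
        print("peak relative displacement:", umax)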

  9. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
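
    The adaptive importance sampling used by the summarized program is not spelled out in the record, but the underlying idea can be sketched. The Python example below estimates a small failure probability P[g(X) < 0] by sampling from a density shifted toward the failure region and reweighting with the likelihood ratio; the limit state and shift point are assumptions for illustration.

        import numpy as np

        # Importance-sampling estimate of P[g(X) < 0] with X ~ N(0, 1);
        # the limit state and sampling shift are invented for illustration.
        rng = np.random.default_rng(1)
        g = lambda x: 3.0 - x               # failure when x > 3
        shift = 3.0                         # centre samples near failure region
        y = rng.normal(loc=shift, size=20000)
        # likelihood ratio of the true N(0,1) to the biased N(shift,1) density
        w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - shift)**2)
        pf = np.mean((g(y) < 0) * w)
        print(pf)                           # exact: 1 - Phi(3) ~ 1.35e-3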

  10. Relativity Concept Inventory: Development, Analysis, and Results

    Science.gov (United States)

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  11. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages over traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Units (CNT-BLU) were analyzed to verify the utility of the proposed method.

  12. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  13. Methodical Approaches To Analysis And Forecasting Of Development Fuel And Energy Complex And Gas Industry In The Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov

    2014-12-01

    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, and the interests of all economic entities intertwine within it. To ensure the economic growth of a region, an internal balance of energy resources should be maintained, developed with account of the regional specifics of economic growth and energy security. The study examines the status of this equilibrium as reflected in the fuel and energy balance of the region (TEB). The aim of the research is the development of the fuel and energy balance, which makes it possible to determine exactly how many and which resources are insufficient to support the regional development strategy and which resources need to be brought in. The energy balance brings all issues of regional development into focus, so the TEB is necessary both as a mechanism for analyzing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, energy threats, and ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that the development of the fuel and energy balance of a region has to go beyond the energy sector itself, involving the analysis of other sectors of the economy as well as systems such as banking, budgeting, legislation, and taxation. Due to the complexity of the problems discussed, there is an obvious need to develop an appropriate forecasting and analytical system that allows regional authorities to make evidence-based predictions of the consequences of management decisions. Such a system should support multivariant scenario studies of the development of the fuel and energy complex, and of the gas industry in particular, the use of project-based management methods, and the harmonized application of state strategic regulation and market mechanisms to the operational directions of development of the fuel and energy complex and the gas industry in the regional economy.

  14. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    Ahmad, M.; Usman, M.; Madni, A.; Akhtar, N.; Khalid, N.; Asghar, W.

    2010-01-01

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, with a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed phase C18 column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL/min. The method was linear over the concentration range 0.15-20 microgram/mL. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 microgram/mL, respectively. The accuracy of the method was 98.68%. This method can be applied to bioequivalence studies and therapeutic drug monitoring as well as to the routine analysis of cephradine. (author)
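
    Figures such as the quoted LOD and LOQ typically come from a calibration regression; below is a minimal Python sketch of the standard computation (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope). The calibration points are invented, not the paper's cephradine data.

        import numpy as np

        # ICH-style LOD/LOQ from a calibration regression: LOD = 3.3*sigma/S,
        # LOQ = 10*sigma/S. The calibration points below are invented.
        conc = np.array([0.15, 0.5, 1.0, 5.0, 10.0, 20.0])    # microgram/mL
        resp = np.array([1.1, 3.6, 7.1, 35.3, 70.4, 141.0])   # peak areas
        slope, intercept = np.polyfit(conc, resp, 1)
        resid = resp - (slope * conc + intercept)
        sigma = resid.std(ddof=2)           # residual standard deviation
        print("LOD:", 3.3 * sigma / slope, "LOQ:", 10.0 * sigma / slope)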

  15. The development of chemical speciation analysis

    International Nuclear Information System (INIS)

    Martin, R.; Santana, J.L.; Lima, L.; De La Rosa, D.; Melchor, K.

    2003-01-01

    The speciation of many metals in the environment, their bioaccumulation, quantification, and their effects on the human body have been studied by a wide range of research groups over the last two decades. The development of speciation analysis has advanced rapidly, closely following the development of novel analytical techniques. Separation and quantification at low levels is a problem that has been addressed by coupling high resolution chromatographic techniques such as HPLC and HRGC with specific detection methods (ICP-MS or CV-AAS). This methodological approach has made possible the present success of chemical speciation.

  16. Shielding methods development in the United States

    International Nuclear Information System (INIS)

    Mynatt, F.R.

    1977-01-01

    A generalized shielding methodology has been developed in the U.S.A. that is adaptable to the shielding analyses of all reactor types. Thus far used primarily for liquid-metal fast breeder reactors, the methodology includes several component activities: (1) developing methods for calculating radiation transport through reactor-shield systems; (2) processing cross-section libraries; (3) performing design calculations for specific systems; (4) performing and analyzing pertinent integral experiments; (5) performing sensitivity studies on both the design calculations and the experimental analyses; and, finally, (6) calculating shield design parameters and their uncertainties. The criteria for the methodology are a 5 to 10 percent accuracy for responses at locations near the core and a factor of 2 accuracy for responses at distant locations. The methodology has been successfully adapted to most in-vessel and ex-vessel problems encountered in the shield analyses of the Fast Flux Test Facility and the Clinch River Breeder Reactor; however, improved techniques are needed for calculating regions in which radiation streaming is dominant. Areas of the methodology in which significant progress has recently been made are those involving the development of cross-section libraries, sensitivity analysis methods, and transport codes

  17. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out a qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by the two methods is compared.

  18. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. Before application, it is necessary to review the effectiveness of each technique. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, as it requires only limited time and supports sound decision-making shortly after a disaster. Exposed areas and areas potentially vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to inform policies for disaster management and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  19. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in an effective economic indicator. Two methods of integro-differentiation of non-integral order are proposed for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results, compared with the standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration), for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed, namely the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for the factor analysis of economic processes. The proposed method of integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow a power law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes
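
    For reference, the standard definitions underlying integro-differentiation of non-integral order, which the abstract invokes but does not reproduce, are the Riemann-Liouville fractional integral and the Caputo fractional derivative:

        I_{a}^{\alpha} f(t) = \frac{1}{\Gamma(\alpha)}
            \int_{a}^{t} (t-\tau)^{\alpha-1} f(\tau)\, d\tau ,
        \qquad \alpha > 0 ,

        {}^{C}\!D_{a}^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)}
            \int_{a}^{t} (t-\tau)^{n-\alpha-1} f^{(n)}(\tau)\, d\tau ,
        \qquad n-1 < \alpha < n .

    For integral α both reduce to ordinary integration and differentiation; the kernel (t-τ)^{α-1} is what introduces the memory effect the abstract refers to.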

  20. Eddy current analysis by the finite element circuit method

    International Nuclear Information System (INIS)

    Kameari, A.; Suzuki, Y.

    1977-01-01

    The analysis of transient eddy currents in conductors by the ''Finite Element Circuit Method'' is developed. This method can easily be applied to various geometrical shapes of thin conductors. The eddy currents on the vacuum vessel and the upper and lower support plates of the JT-60 machine (now under construction by the Japan Atomic Energy Research Institute) are calculated by this method. The magnetic field induced by the eddy currents is estimated in the domain occupied by the plasma, and the force exerted on the vacuum vessel is also estimated

  1. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strongly dielectric ceramic with piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since their development, various electronic parts utilizing the piezoelectric characteristics have been put to practical use. The characteristics can be tuned by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metallic ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have demerits, so the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectro-analysis apparatus, which has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric and hydrofluoric acids, unstable lead is masked with disodium ethylenediaminetetraacetate, and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  2. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations that maximize correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image.... This is a generalization of the minimum/maximum autocorrelation factors (MAFs), which is a linear method. The non-linear method is compared to the linear method in the analysis of a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...

  3. Evaluation of sample extraction methods for proteomics analysis of green algae Chlorella vulgaris.

    Science.gov (United States)

    Gao, Yan; Lim, Teck Kwang; Lin, Qingsong; Li, Sam Fong Yau

    2016-05-01

    Many protein extraction methods have been developed for plant proteome analysis, but information is limited on the optimal protein extraction method for algae species. This study evaluated four protein extraction methods, i.e. the direct lysis buffer method, the TCA-acetone method, the phenol method, and the phenol/TCA-acetone method, using the green alga Chlorella vulgaris for proteome analysis. The data presented show that the phenol/TCA-acetone method was superior to the other three tested methods with regard to shotgun proteomics. Proteins identified using shotgun proteomics were validated using the sequential window acquisition of all theoretical fragment-ion spectra (SWATH) technique. Additionally, SWATH provides protein quantitation information across the different methods, and protein abundance under the different protein extraction methods was evaluated. These results highlight the importance of the green algae protein extraction method for subsequent MS analysis and identification. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Analysis methods for fast impurity ion dynamics data

    International Nuclear Information System (INIS)

    Den Hartog, D.J.; Almagri, A.F.; Prager, S.C.; Fonck, R.J.

    1994-08-01

    A high resolution spectrometer has been developed and used on the MST reversed-field pinch (RFP) to passively measure impurity ion temperatures and flow velocities with 10 μs temporal resolution. Such measurements of MHD-scale fluctuations are particularly relevant in the RFP because the transport of current induced by flow velocity fluctuations (the ''MHD dynamo'') may produce the magnetic field reversal characteristic of an RFP. This instrument will also be used to measure rapid changes in the equilibrium flow velocity, such as occur during locking and the H-mode transition. The precision of measurements made to date is <0.6 km/s. The authors are developing accurate analysis techniques appropriate to the reduction of these fast ion dynamics data. Moment analysis and curve-fitting routines have been evaluated for noise sensitivity and robustness. Also presented is an analysis method which correctly separates the flux-surface average of the correlated fluctuations in u and B from the fluctuations due to rigid shifts of the plasma column
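
    The moment analysis mentioned above has a simple computational core: the zeroth, first and second moments of an emission line give the total intensity, the line centre (Doppler shift, hence flow velocity) and the line width (hence ion temperature). The Python sketch below applies this to a synthetic line; the rest wavelength, width and shift are invented values, not MST data.

        import numpy as np

        # Moment analysis of a synthetic emission line: moments give total
        # intensity, line centre (Doppler shift -> flow velocity) and width.
        # Wavelengths, width and shift are invented values.
        c_kms = 2.998e5                     # speed of light, km/s
        lam0 = 460.0                        # illustrative rest wavelength, nm
        lam = np.linspace(lam0 - 0.2, lam0 + 0.2, 400)
        spec = np.exp(-0.5 * ((lam - (lam0 + 0.01)) / 0.03) ** 2)

        m0 = np.trapz(spec, lam)                        # intensity
        m1 = np.trapz(lam * spec, lam) / m0             # line centre
        var = np.trapz((lam - m1) ** 2 * spec, lam) / m0
        v_flow = c_kms * (m1 - lam0) / lam0             # flow velocity, km/s
        print("flow ~", v_flow, "km/s; width ~", np.sqrt(var), "nm")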

  5. The surface analysis methods; Les methodes d'analyse des surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Deville, J.P. [Institut de Physique et Chimie, 67 - Strasbourg (France)

    1998-11-01

    Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use, or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-rays) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use, and probably the most productive for the analysis of surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.) 11 refs.

  6. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  7. Overview of the South African mechanistic pavement design analysis method

    CSIR Research Space (South Africa)

    Theyse, HL

    1996-01-01

    Full Text Available A historical overview of the South African mechanistic pavement design method, from its development in the early 1970s to the present, is presented. Material characterization, structural analysis, and pavement life prediction are discussed...

  8. Optimization Models and Methods Developed at the Energy Systems Institute

    OpenAIRE

    N.I. Voropai; V.I. Zorkaltsev

    2013-01-01

    The paper briefly presents some optimization models of energy system operation and expansion that have been created at the Energy Systems Institute of the Siberian Branch of the Russian Academy of Sciences. Consideration is given to optimization models of energy development in Russia, a software package intended for analysis of power system reliability, and a model of flow distribution in hydraulic systems. A general idea of the optimization methods developed at the Energy Systems Institute...

  9. Evaluation of prognostic models developed using standardised image features from different PET automated segmentation methods.

    Science.gov (United States)

    Parkinson, Craig; Foley, Kieran; Whybra, Philip; Hills, Robert; Roberts, Ashley; Marshall, Chris; Staffurth, John; Spezi, Emiliano

    2018-04-11

    Prognosis in oesophageal cancer (OC) is poor; the 5-year overall survival (OS) rate is approximately 15%. Personalised medicine is hoped to increase the 5- and 10-year OS rates. Quantitative analysis of PET is gaining substantial interest in prognostic research but requires accurate definition of the metabolic tumour volume. This study compares prognostic models developed in the same patient cohort using individual PET segmentation algorithms and assesses the impact on patient risk stratification. Consecutive patients (n = 427) with biopsy-proven OC were included in the final analysis. All patients were staged with PET/CT between September 2010 and July 2016. Nine automatic PET segmentation methods were studied, and all tumour contours were subjectively analysed for accuracy; of the segmentation methods studied, the clustering means (KM2), general clustering means (GCM3), adaptive thresholding (AT) and watershed thresholding (WT) methods were included for analysis. Known clinical prognostic factors (age, treatment and staging) were significant in all of the developed prognostic models. The AT and KM2 segmentation methods produced identical prognostic models. Patient risk stratification was dependent on the segmentation method used to develop the prognostic model, with up to 73 patients (17.1%) changing risk stratification group. Prognostic models incorporating quantitative image features are thus dependent on the method used to delineate the primary tumour, with a subsequent effect on risk stratification: patients change groups depending on the image segmentation method used.

  10. Methods for Rapid Screening in Woody Plant Herbicide Development

    Directory of Open Access Journals (Sweden)

    William Stanley

    2014-07-01

    Full Text Available Methods for woody plant herbicide screening were assayed with the goal of reducing the resources and time required to conduct preliminary screenings for new products. The rapid screening methods tested included greenhouse seedling screening, germinal screening, and seed screening. Triclopyr and eight experimental herbicides from Dow AgroSciences (DAS 313, 402, 534, 548, 602, 729, 779, and 896) were tested on black locust, loblolly pine, red maple, sweetgum, and water oak. Screening results detected differences in herbicide and species in all experiments in much less time (days to weeks) than traditional field screenings and consumed significantly fewer resources (<500 mg acid equivalent per herbicide per screening). Using regression analysis, the various rapid screening methods were linked into a system capable of rapidly and inexpensively assessing herbicide efficacy and spectrum of activity. Implementation of such a system could streamline early-stage herbicide development leading to field trials, potentially freeing resources for use in the development of beneficial new herbicide products.

  11. Developing Teaching Material Software Assisted for Numerical Methods

    Science.gov (United States)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision shows the importance of two things in school mathematics: knowing the mathematics of the 21st century and the need to continue to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing of the existence of various tools for mathematical activity. One of the significant challenges in mathematical learning is how to teach students abstract concepts. In this case, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity more accessible to students. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without spending time manually calculating complex computing problems. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. The development of the teaching materials starts from the defining step; the learning materials are then designed based on information obtained from the early analysis of learners, materials, and supporting tasks; the last step is development. The developed software-assisted teaching materials for numerical methods are valid in content, and the validator assessment is good, so the materials can be used with little revision.

  12. Error analysis of motion correction method for laser scanning of moving objects

    Science.gov (United States)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned should be static. The need to scan moving objects has resulted in the development of new methods capable of generating the correct 3D geometry of moving objects. Limited literature is available on the few methods capable of catering to the problem of object motion during scanning, and all the existing methods utilize their own models or sensors; studies on error modelling or analysis of any of the motion correction methods are lacking in the literature. In this paper, we develop the error budget and present the analysis of one such 'motion correction' method. This method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by using tracking devices. It then uses this information along with the laser scanner data to apply a correction to the laser data, resulting in correct geometry despite the object being mobile during scanning. The major application of this method lies in the shipping industry, to scan ships either moving or parked in the sea, and to scan other objects such as hot air balloons or aerostats. It is to be noted that the other methods of "motion correction" explained in the literature cannot be applied to scan the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to obtain insights into the optimal utilization of available components for achieving the best results.
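
    The essential step of such a motion correction method can be sketched compactly: each measured point is transformed using the pose interpolated from POS samples to the measurement timestamp. The Python example below is a hedged illustration with invented poses and a yaw-only rotation; it is not the error model analysed in the paper.

        import numpy as np

        # Pose-corrected transform of scanner points: interpolate the pose
        # (from a POS/tracking system) to each point's timestamp. Poses,
        # points and the yaw-only rotation are invented; for a moving object
        # seen by a static scanner the inverse pose applies, same algebra.
        def yaw_matrix(theta):
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

        pos_t = np.array([0.0, 1.0])                    # POS timestamps, s
        pos_yaw = np.array([0.00, 0.10])                # yaw samples, rad
        pos_xyz = np.array([[0.0, 0.0, 0.0],
                            [0.5, 0.0, 0.0]])           # position samples, m

        def correct(point_scanner, t):
            yaw = np.interp(t, pos_t, pos_yaw)
            xyz = np.array([np.interp(t, pos_t, pos_xyz[:, i])
                            for i in range(3)])
            return yaw_matrix(yaw) @ point_scanner + xyz

        print(correct(np.array([10.0, 2.0, 1.5]), t=0.5))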

  13. Method development of damage detection in asymmetric buildings

    Science.gov (United States)

    Wang, Yi; Thambiratnam, David P.; Chan, Tommy H. T.; Nguyen, Andy

    2018-01-01

    Aesthetics and functionality requirements have caused most buildings to be asymmetric in recent times. Such buildings exhibit complex vibration characteristics under dynamic loads, as there is coupling between the lateral and torsional components of vibration, and are referred to as torsionally coupled buildings; they require three-dimensional modelling and analysis. In spite of much recent research and some successful applications of vibration-based damage detection methods to civil structures, application to asymmetric buildings has remained a challenging task for structural engineers, and there has been relatively little research on detecting and locating damage specific to torsionally coupled asymmetric buildings. This paper aims to compare the vibration behaviour of symmetric and asymmetric buildings and then use the vibration characteristics for predicting damage in them. The need to develop a special method to detect damage in asymmetric buildings thus becomes evident. Towards this end, this paper modifies the traditional modal strain energy based damage index by decomposing the mode shapes into their lateral and vertical components to form component-specific damage indices. The improved approach is then developed by combining the modified strain energy based damage indices with the modal flexibility method, adapted to suit three-dimensional structures, to form a new damage indicator. The procedure is illustrated through numerical studies conducted on three-dimensional five-storey symmetric and asymmetric frame structures with the same layout, after validating the modelling techniques through experimental testing of a laboratory-scale asymmetric building model. Vibration parameters obtained from finite element analysis of the intact and damaged building models are then applied to the proposed algorithms for detecting and locating single and multiple damage in these buildings. The results

  14. Development of flow network analysis code for block type VHTR core by linear theory method

    International Nuclear Information System (INIS)

    Lee, J. H.; Yoon, S. J.; Park, J. W.; Park, G. C.

    2012-01-01

    VHTR (Very High Temperature Reactor) is a high-efficiency nuclear reactor capable of generating hydrogen with a high coolant temperature. A PMR (Prismatic Modular Reactor) type reactor consists of hexagonal prismatic fuel blocks and reflector blocks. The flow paths in the prismatic VHTR core consist of coolant holes, bypass gaps and cross gaps. Complicated flow paths are formed in the core since the coolant holes and bypass gaps are connected by the cross gaps. The distributed coolant is mixed in the core through the cross gaps, so the flow characteristics cannot be modeled as a simple parallel pipe system. Analyzing the core flow with CFD requires a lot of effort and takes a very long time. Hence, it is important to develop a code for VHTR core flow which can predict the core flow distribution quickly and accurately. In this study, a steady state flow network analysis code is developed using a flow network algorithm. The developed code was named FLASH and was validated with experimental data and CFD simulation results. (authors)
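
    The linear theory method named in the title linearises the quadratic pipe law around the previous iterate so that each pass solves a linear system. The Python sketch below applies the idea to a toy junction with two parallel branches; the geometry and resistances are invented and are not FLASH code data.

        import numpy as np

        # Linear theory iteration for a toy junction with two parallel
        # branches obeying dp = r * Q**2; conductances are re-linearised
        # around the previous iterate each pass. Values are invented.
        r = np.array([4.0, 1.0])            # branch resistances
        q_total = 1.0                       # total flow into the junction
        q = np.array([0.5, 0.5])            # initial guess
        for _ in range(30):
            g = 1.0 / (r * np.abs(q))       # linearised branch conductances
            dp = q_total / g.sum()          # common pressure drop
            q = 0.5 * (q + g * dp)          # damped update aids convergence
        print(q, "check equal dp:", r * q**2)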

  15. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review ☆

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  16. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  17. A methodology for developing high-integrity knowledge base using document analysis and ECPN matrix analysis with backward simulation

    International Nuclear Information System (INIS)

    Park, Joo Hyun

    1999-02-01

    When transients occur in large systems such as nuclear power plants (NPPs) or industrial process plants, it is often difficult to diagnose them. Various computer-based operator-aiding systems have been developed in order to help operators diagnose plant transients. In the procedures for developing knowledge base systems such as operator-aiding systems, knowledge acquisition and knowledge base verification are the core activities. This dissertation describes a knowledge acquisition method and a knowledge base verification method for developing high-integrity knowledge base systems for NPP expert systems. Knowledge acquisition is one of the most difficult and time-consuming activities in developing knowledge base systems. There are two kinds of knowledge acquisition methods with respect to knowledge sources. One is acquisition from human experts. This method, however, is not adequate for acquiring the knowledge of NPP expert systems because the number of experts is insufficient. In this work, we propose a novel knowledge acquisition method based on document analysis. The knowledge base can be built correctly, rapidly, and partially automatically with this method, which is especially useful when it is difficult to find domain experts. The reliability of knowledge base systems depends on the quality of their knowledge base. Petri Nets have been used to verify knowledge bases owing to their formal outputs. Methods using Petri Nets, however, are difficult to apply to large and complex knowledge bases because the net becomes very large and complex. Also, with Petri Nets, it is difficult to find the input patterns that make anomalies occur. In order to overcome this difficulty, in this work, anomaly-candidate detection methods are developed based on Extended CPN (ECPN) matrix analysis. This work also defines the backward simulation of CPN to find compact input patterns for anomaly detection, which starts simulation from the anomaly candidates

  18. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The report on the long-term evaluation of active faults was published by the Headquarters for Earthquake Research Promotion in Nov. 2010. After the occurrence of the 2011 Tohoku-oki earthquake, the safety review guide with regard to the geology and ground of a site was revised by the Nuclear Safety Commission in Mar. 2012, incorporating scientific knowledge of the earthquake. The Nuclear Regulation Authority, established in Sep. 2012, is newly planning the New Safety Design Standard related to Earthquakes and Tsunamis of Light Water Nuclear Power Reactor Facilities. With respect to those guides and standards, our investigations for developing methods of evaluating active faults are as follows: (1) For better evaluation of offshore fault activity, we proposed a work flow to date marine terraces (indicators of offshore fault activity) during the last 400,000 years. We also developed fault-related fold analysis for the evaluation of blind faults. (2) To clarify the activity of active faults without superstrata, we carried out color analysis of fault gouge and divided the activities into thousands of years and tens of thousands of years. (3) To reduce uncertainties in fault activities and earthquake frequencies, we compiled the survey data and possible errors. (4) To improve seismic hazard analysis, we compiled the fault activities of the Yunotake and Itozawa faults, induced by the 2011 Tohoku-oki earthquake. (author)

  19. Development of plant dynamic analysis code for integrated self-pressurized water reactor (ISPDYN), and comparative study of pressure control methods

    International Nuclear Information System (INIS)

    Kusunoki, Tsuyoshi; Yokomura, Takeyoshi; Nabeshima, Kunihiko; Shimazaki, Junya; Shinohara, Yoshikuni.

    1988-01-01

    This report describes the development of a plant dynamic analysis code (ISPDYN) for an integrated self-pressurized water reactor, and a comparative study of pressure control methods performed with this code. ISPDYN was developed for the integrated self-pressurized water reactor, one of the trial designs by JAERI. In the transient responses, the results calculated by ISPDYN are in good agreement with the DRUCK calculations. In addition, this report presents some sensitivity studies for selected cases. The computing time of this code is very short, about one fifth of real time. The comparative study of the self-pressurized system with a forced-pressurized system using this code, for rapid load decrease and increase cases, has provided useful information. (author)

  20. Interface and thin film analysis: Comparison of methods, trends

    International Nuclear Information System (INIS)

    Werner, H.W.; Torrisi, A.

    1990-01-01

    Thin film properties are governed by a number of parameters such as: Surface and interface chemical composition, microstructure and the distribution of defects, dopants and impurities. For the determination of most of these aspects sophisticated analytical methods are needed. An overview of these analytical methods is given including: - Features and modes of analytical methods; - Main characteristics, advantages and disadvantages of the established methods [e.g. ESCA (Electron Spectroscopy for Chemical Analysis), AES (Auger Electron Spectroscopy), SIMS (Secondary Ion Mass Spectrometry), RBS (Rutherford Backscattering Spectrometry), SEM (Scanning Electron Microscopy), TEM (Transmission Electron Microscopy), illustrated with typical examples]; - Presentation of relatively new methods such as XRM (X-ray Microscopy) and SCAM (Scanning Acoustic Microscopy). Some features of ESCA (chemical information, insulator analysis, non-destructive depth profiling) have been selected for a more detailed presentation, viz. to illustrate the application of ESCA to practical problems. Trends in instrumental development and analytical applications of the techniques are discussed; the need for a multi-technique approach to solve complex analytical problems is emphasized. (orig.)

  1. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lim, J.M.; Ji, Y.Y.; Lee, H.; Park, J.H.; Jang, M.; Chung, K.H.; Kang, M.J.; Choi, G.S. [Korea Atomic Energy Research Institute (Korea, Republic of)

    2014-07-01

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment. The concern with radioactivity from these materials has therefore been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide on proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., {sup 238}U, {sup 235}U, {sup 232}Th, {sup 226}Ra, and {sup 40}K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process consequently plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of {sup 238}U, {sup 235}U, {sup 232}Th, and {sup 226}Ra. Contrary to the α-spectrometry method, a measurement technique using ICP-MS allows the radioactivity in many samples to be measured in a short time period with a high degree of accuracy and precision. In this study, a method was developed for a rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO{sub 2} fusion and Fe co-precipitation. A magnetic
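
    For the conversion from an ICP-MS mass measurement to radioactivity, the standard specific-activity relation A = ln(2)·N/T½ applies. A short sketch follows; the {sup 238}U half-life and molar mass are physical constants, while the sample concentration is an assumed example value.

        import numpy as np

        NA = 6.02214076e23   # Avogadro constant (1/mol)

        def specific_activity(half_life_y, molar_mass):
            # A = lambda * N = ln(2)/T_half * NA/M, in Bq per gram of nuclide
            lam = np.log(2.0) / (half_life_y * 3.15576e7)   # decay const (1/s)
            return lam * NA / molar_mass

        a_u238 = specific_activity(4.468e9, 238.05)   # ~1.24e4 Bq/g of 238U
        conc = 2.0e-6                                 # assumed: 2 ug 238U per g
        print(conc * a_u238, "Bq per g of sample")    # ~0.025 Bq/g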

  2. Development and application of a biorelevant dissolution method using USP apparatus 4 in early phase formulation development.

    Science.gov (United States)

    Fang, Jiang B; Robertson, Vivian K; Rawat, Archana; Flick, Tawnya; Tang, Zhe J; Cauchon, Nina S; McElvain, James S

    2010-10-04

    Dissolution testing is frequently used to determine the rate and extent at which a drug is released from a dosage form, and it plays many important roles throughout drug product development. However, the traditional dissolution approach often emphasizes its application in quality control testing and usually strives to obtain 100% drug release. As a result, dissolution methods are not necessarily biorelevant and meaningful application of traditional dissolution methods in the early phases of drug product development can be very limited. This article will describe the development of a biorelevant in vitro dissolution method using USP apparatus 4, biorelevant media, and real-time online UV analysis. Several case studies in the areas of formulation selection, lot-to-lot variability, and food effect will be presented to demonstrate the application of this method in early phase formulation development. This biorelevant dissolution method using USP apparatus 4 provides a valuable tool to predict certain aspects of the in vivo drug release. It can be used to facilitate the formulation development/selection for pharmacokinetic (PK) and clinical studies. It may also potentially be used to minimize the number of PK studies, and to aid in the design of more efficient PK and clinical studies.

  3. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  4. Method of quantitative analysis of superconducting metal-containing composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    A technique for the quantitative analysis of superconducting metal-containing composite materials, SnO2-InSn, WO3-InW and ZnO-InZn in particular, has been developed. The method for determining the metal content in a composite is based on the dependence of the superconducting transition temperature on the alloy composition. The sensitivity of the temperature determination is 0.02 K; the analysis error for the InSn system is 0.5%
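
    The idea reduces to inverting a monotonic Tc(composition) calibration curve. A minimal sketch in Python, where only the transition temperature of pure indium (about 3.41 K) is a physical fact and the alloy calibration points are invented:

        import numpy as np

        sn_content = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # at.% Sn (assumed)
        tc_calib = np.array([3.41, 3.55, 3.70, 3.86, 4.02])   # Tc in K (assumed)

        def sn_from_tc(tc_sample):
            # invert the monotonic Tc(composition) curve by interpolation
            return np.interp(tc_sample, tc_calib, sn_content)

        print(sn_from_tc(3.78))   # composition of an unknown sample from its Tc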

  5. Using Multidimensional Methods to Understand the Development, Interpretation and Enactment of Quality Assurance Policy within the Educational Development Community

    Science.gov (United States)

    Smith, Karen

    2018-01-01

    Policy texts are representations of practice that both reflect and shape the world around them. There is, however, little higher education research that critically analyses the impact of higher education policy on educational developers and educational development practice. Extending methods from critical discourse analysis by combining textual…

  6. Direct methods of soil-structure interaction analysis for earthquake loadings (V)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Choi, J. S.; Lee, J. J.; Park, D. U. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI' has been developed. The computer program has been verified using a free-field site-response problem. Post-correlation analysis for the Hualien FVT after backfill and blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analyses for three LSST structures (Hualien, Lotung and Tepsco structures) have also been performed and compared with the measured data.

  7. Developments of the neutron scattering analysis method for the determination of magnetic structures

    Energy Technology Data Exchange (ETDEWEB)

    Park, Je-Geun; Chung, Jae Gwan; Park, Jung Hwan; Kong, Ung Girl [Inha Univ., Incheon (Korea); So, Ji Yong [Seoul National University, Seoul(Korea)

    2001-04-01

    Neutron diffraction is up to now almost the only, and a very important, experimental method for determining the magnetic structure of materials. Unlike studies of crystallographic structure, however, the use of neutron diffraction for magnetic structure determination is not easily accessible to non-experts because of the complexity of magnetic group theory, which is central to magnetic structure analysis. With the recent development of computer codes for magnetic groups, it is now time to rethink these difficulties. In this work, we have used the computer code for the magnetic group (Mody-2) and the Fullprof refinement program in order to study the magnetic structure of YMnO{sub 3} and other interesting materials. YMnO{sub 3} forms in the hexagonal structure and shows both ferroelectric and antiferromagnetic phase transitions. Since it was recently found that YMnO{sub 3} can be used as a nonvolatile memory device, there has been a large amount of applied research on this material. We used neutron diffraction to determine the magnetic structure and, in particular, to investigate the correlation between the order parameters of the ferroelectric and antiferromagnetic phase transitions. From this study, we have demonstrated that with proper use of the computer code for the magnetic group one can overcome most of the difficulties arising from magnetic group theory. 4 refs., 8 figs., 5 tabs. (Author)

  8. Sternal instability measured with radiostereometric analysis. A study of method feasibility, accuracy and precision

    DEFF Research Database (Denmark)

    Vestergaard, Rikke Falsig; Søballe, Kjeld; Hasenkam, John Michael

    2018-01-01

    BACKGROUND: A small, but unstable, saw-gap may hinder bone-bridging and induce development of painful sternal dehiscence. We propose the use of Radiostereometric Analysis (RSA) for evaluation of sternal instability and present a method validation. METHODS: Four bone analogs (phantoms) were sterno… modality feasible for clinical evaluation of sternal stability in research. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02738437, retrospectively registered.

  9. PIXE - a new method for elemental analysis

    International Nuclear Information System (INIS)

    Johansson, S.A.E.

    1983-01-01

    By elemental analysis we mean the determination of which chemical elements are present in a sample and of their concentrations. This is an old and important problem in chemistry. The earliest methods were purely chemical, and many such methods are still used. However, various methods based on physical principles have gradually become more and more important. One such method is neutron activation. When the sample is bombarded with neutrons it becomes radioactive, and the various radioactive isotopes produced can be identified by the radiation they emit. From the measured intensity of the radiation one can calculate how much of a certain element is present in the sample. Another possibility is to study the light emitted when the sample is excited in various ways. A spectroscopic investigation of the light can identify the chemical elements and also allows a determination of their concentrations in the sample. In the same way, if a sample can be brought to emit X-rays, this radiation is also characteristic of the elements present and can be used to determine the elemental concentrations. One such X-ray method which has been developed recently is PIXE. The name is an acronym for Particle Induced X-ray Emission and indicates the principle of the method. Particles in this context mean heavy, charged particles such as protons and α-particles of rather high energy. Hence, in PIXE analysis the sample is irradiated in the beam of an accelerator and the emitted X-rays are studied. (author)

  10. Development of in vitro assay method with radioisotope

    International Nuclear Information System (INIS)

    Choi, Chang Woon; Lim, S. M.; An, S. H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Hong, S. W.; Oh, O. D.

    1999-04-01

    Radioimmunoassay (RIA) and related competitive protein-binding methods began a little over 20 years ago as a cumbersome research methodology in a few specialized laboratories. Endocrinology has been greatly enriched by the new knowledge that has come as a direct result of RIA methods. Establishment of the taxol RIA system is expected to support the development of RIA for drug monitoring. Scintillation proximity assay (SPA) is useful since no separation step is required, and it has the advantage of handling multiple samples. The increased sensitivity of the new assay in determining HCV RT ([125I]dUTP) suggests that it would be worth investigating whether the system can be applied to such analyses. [125I]Iodotyramine was synthesized with 98.5% radiochemical purity and purified by HPLC. Optimal background counts were established using varied radioactivities of the radionuclides. An appropriate standard curve was obtained with the SPA method, and the concentration of hCG in unknown serum was determined from this standard curve. Titers of monoclonal anti-thyroglobulin antibodies produced by hybridomas were measured by ELISA. These studies play an important role in the development of in vitro assays with radionuclides

  11. Development of in vitro assay method with radioisotope

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; An, S. H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Hong, S. W. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul (Korea, Republic of); Oh, O. D. [Yonsei University, Seoul (Korea, Republic of)

    1999-04-01

    Radioimmunoassay (RIA) and related competitive protein-binding methods began a little over 20 years ago as a cumbersome research methodology in a few specialized laboratories. Endocrinology has been greatly enriched by the new knowledge that has come as a direct result of RIA methods. Establishment of the taxol RIA system is expected to support the development of RIA for drug monitoring. Scintillation proximity assay (SPA) is useful since no separation step is required, and it has the advantage of handling multiple samples. The increased sensitivity of the new assay in determining HCV RT ([{sup 125}I]dUTP) suggests that it would be worth investigating whether the system can be applied to such analyses. [{sup 125}I]Iodotyramine was synthesized with 98.5% radiochemical purity and purified by HPLC. Optimal background counts were established using varied radioactivities of the radionuclides. An appropriate standard curve was obtained with the SPA method, and the concentration of hCG in unknown serum was determined from this standard curve. Titers of monoclonal anti-thyroglobulin antibodies produced by hybridomas were measured by ELISA. These studies play an important role in the development of in vitro assays with radionuclides.

  12. Dependent data in social sciences research: forms, issues, and methods of analysis

    CERN Document Server

    Eye, Alexander; Wiedermann, Wolfgang

    2015-01-01

    This volume presents contributions on handling data in which the postulate of independence in the data matrix is violated. When this postulate is violated and when the methods assuming independence are still applied, the estimated parameters are likely to be biased, and statistical decisions are very likely to be incorrect. Problems associated with dependence in data have been known for a long time, and led to the development of tailored methods for the analysis of dependent data in various areas of statistical analysis. These methods include, for example, methods for the analysis of longitudinal data, corrections for dependency, and corrections for degrees of freedom. This volume contains the following five sections: growth curve modeling, directional dependence, dyadic data modeling, item response modeling (IRT), and other methods for the analysis of dependent data (e.g., approaches for modeling cross-section dependence, multidimensional scaling techniques, and mixed models). Researchers and graduate stud...

  13. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
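
    The borehole problem mentioned in the abstract is commonly represented by the standard eight-parameter borehole test function, and the DUA idea of propagating derivative information can be sketched as first-order variance propagation with finite-difference sensitivities. The nominal values and the uniform 5% standard deviations below are assumed for illustration; this sketch is not the GRESS/ADGEN implementation.

        import numpy as np

        def borehole(x):
            # classic borehole test function: water flow rate (m^3/yr)
            rw, r, Tu, Hu, Tl, Hl, L, Kw = x
            ln_rr = np.log(r / rw)
            return (2 * np.pi * Tu * (Hu - Hl) /
                    (ln_rr * (1 + 2 * L * Tu / (ln_rr * rw**2 * Kw) + Tu / Tl)))

        x0 = np.array([0.10, 25050.0, 89335.0, 1050.0,
                       89.55, 760.0, 1400.0, 10950.0])   # nominal inputs
        sigma = 0.05 * x0                                # assumed 5% std devs

        grad = np.empty_like(x0)          # central-difference sensitivities
        for i in range(x0.size):
            h = 1e-6 * x0[i]
            xp, xm = x0.copy(), x0.copy()
            xp[i] += h
            xm[i] -= h
            grad[i] = (borehole(xp) - borehole(xm)) / (2 * h)

        var_f = np.sum((grad * sigma) ** 2)   # first-order variance propagation
        print(borehole(x0), np.sqrt(var_f))   # nominal flow and its std dev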

  14. A laser ablation ICP-MS based method for multiplexed immunoblot analysis

    DEFF Research Database (Denmark)

    de Bang, Thomas Christian; Petersen, Jørgen; Pedas, Pai Rosager

    2015-01-01

    …developed a multiplexed antibody-based assay and analysed selected PSII subunits in barley (Hordeum vulgare L.). A selection of antibodies were labelled with specific lanthanides and immunoreacted with thylakoids exposed to Mn deficiency after western blotting. Subsequently, western blot membranes were analysed by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), which allowed selective and relative quantitative analysis via the different lanthanides. The method was evaluated against established liquid chromatography electrospray ionization tandem mass spectrometry (LC… by more than one technique. The developed method enables a higher number of proteins to be multiplexed in comparison to existing immunoassays. Furthermore, multiplexed protein analysis by LA-ICP-MS provides an analytical platform with high throughput appropriate for screening large collections of plants.

  15. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  16. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  17. [Adverse events management. Methods and results of a development project].

    Science.gov (United States)

    Rabøl, Louise Isager; Jensen, Elisabeth Brøgger; Hellebek, Annemarie H; Pedersen, Beth Lilja

    2006-11-27

    This article describes the methods and results of a project in the Copenhagen Hospital Corporation (H:S) on preventing adverse events. The aim of the project was to raise awareness about patients' safety, test a reporting system for adverse events, develop and test methods of analysis of events and propagate ideas about how to prevent adverse events. H:S developed an action plan and a reporting system for adverse events, founded an organization and developed an educational program on theories and methods of learning from adverse events for both leaders and employees. During the three-year period from 1 January 2002 to 31 December 2004, the H:S staff reported 6011 adverse events. In the same period, the organization completed 92 root cause analyses. More than half of these dealt with events that had been optional to report, the other half events that had been mandatory to report. The number of reports and the front-line staff's attitude towards reporting shows that the H:S succeeded in founding a safety culture. Future work should be centred on developing and testing methods that will prevent adverse events from happening. The objective is to suggest and complete preventive initiatives which will help increase patient safety.

  18. Developing Vulnerability Analysis Method for Climate Change Adaptation on Agropolitan Region in Malang District

    Science.gov (United States)

    Sugiarto, Y.; Perdinan; Atmaja, T.; Wibowo, A.

    2017-03-01

    Agriculture plays a strategic role in strengthening sustainable development. Based on the agropolitan concept, the village becomes the center of economic activities by combining agriculture, agro-industry, agribusiness and tourism, which is able to create a high value-added economy. The impact of climate change on agriculture and water resources may increase the pressure on agropolitan development. An assessment method is therefore required to measure the vulnerability of area-based communities in the agropolitan region to climate change impacts. An analysis of agropolitan vulnerability was conducted in Malang district based on four aspects, considering the availability and distribution of water as the central problem. The indicators used comprised a vulnerability component, consisting of sensitivity and adaptive capacity, and an exposure component. The study yielded 21 indicators derived from 115 items of village-based data. The results of the vulnerability assessment showed that most of the villages were categorised at a moderate level. Around 20% of the 388 villages were categorized at a high to very high level of vulnerability due to the low level of their agricultural economy. In the agropolitan region within the sub-district of Poncokusumo, the vulnerability of the villages varies between very low and very high. Most of the villages were vulnerable due to low adaptive capacity, even though the levels of sensitivity and exposure of all villages were relatively similar. The existence of water resources was the biggest contributor to the high exposure of the villages in Malang district, while the receipt of credit facilities and the source of family income were among the indicators leading to a high sensitivity component.

  19. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    Science.gov (United States)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
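
    As a concrete illustration of the workhorse algorithm behind such analyses, here is a textbook random-walk Metropolis sampler for the mean of Gaussian data with known unit variance; it is a minimal sketch, not the software distributed with the review, and the data, step size and chain length are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.normal(2.0, 1.0, size=50)   # synthetic data, sigma = 1

        def log_post(mu):
            # flat prior on mu; Gaussian likelihood with unit variance
            return -0.5 * np.sum((data - mu) ** 2)

        n_steps, step = 20000, 0.5
        chain = np.empty(n_steps)
        mu = 0.0
        for i in range(n_steps):
            prop = mu + step * rng.normal()        # random-walk proposal
            if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
                mu = prop                          # Metropolis accept
            chain[i] = mu

        burn = 5000   # discard burn-in before summarizing the posterior
        print(chain[burn:].mean(), chain[burn:].std())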

  20. Negotiating a Systems Development Method

    Science.gov (United States)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods for short) are often applied in a tailored version to fit the actual situation. Method tailoring is in most of the existing literature viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): they show that the driver role rotates between the project members, and that design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  1. Implementation of statistical analysis methods for medical physics data

    International Nuclear Information System (INIS)

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.

    2009-01-01

    The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnostics and the development of therapeutic techniques. The main benefits are: the cure of tumors through therapy, the early detection of diseases through diagnostics, use as a prophylactic means for blood transfusion, etc. Therefore, a better understanding of the biological interactions occurring after exposure to radiation is necessary for the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in Medical and Environmental Physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high values of standard deviation found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods for the optimization of experimental data analysis. The specialized programs for this analysis are proprietary; another objective of this work is therefore the implementation of a code which is free and can be shared with other research groups. As the program developed since the
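
    The descriptive-statistics step the abstract calls for is straightforward with free, shareable tools; a minimal sketch using only NumPy follows (the concentration values are invented for illustration).

        import numpy as np

        # hypothetical elemental concentrations (ppm) for two sample groups
        group_a = np.array([12.1, 11.8, 13.0, 12.6, 14.2])
        group_b = np.array([15.3, 14.9, 16.1, 15.7, 15.0])

        for name, g in (("A", group_a), ("B", group_b)):
            print(name, g.mean(), g.std(ddof=1))  # mean, sample std deviation

        # Welch's t statistic for comparing the two group means
        na, nb = group_a.size, group_b.size
        t = (group_a.mean() - group_b.mean()) / np.sqrt(
            group_a.var(ddof=1) / na + group_b.var(ddof=1) / nb)
        print("Welch t =", t)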

  2. Computational Methods for ChIP-seq Data Analysis and Applications

    KAUST Repository

    Ashoor, Haitham

    2017-04-25

    The development of Chromatin immunoprecipitation followed by sequencing (ChIP-seq) technology has enabled the construction of genome-wide maps of protein-DNA interaction. Such maps provide information about transcriptional regulation at the epigenetic level (histone modifications and histone variants) and at the level of transcription factor (TF) activity. This dissertation presents novel computational methods for ChIP-seq data analysis and applications. The work of this dissertation addresses four main challenges. First, I address the problem of detecting histone modifications from ChIP-seq cancer samples. The presence of copy number variations (CNVs) in cancer samples results in statistical biases that lead to inaccurate predictions when standard methods are used. To overcome this issue I developed HMCan, an algorithm specially designed to handle ChIP-seq cancer data by accounting for the presence of CNVs. When using ChIP-seq data from cancer cells, HMCan demonstrates unbiased and accurate predictions compared to the standard state-of-the-art methods. Second, I address the problem of identifying changes in histone modifications between two ChIP-seq samples with different genetic backgrounds (for example cancer vs. normal). In addition to CNVs, different antibody efficiencies between samples and the presence of sample replicates are challenges for this problem. To overcome these issues, I developed the HMCan-diff algorithm as an extension of HMCan. HMCan-diff implements robust normalization methods to address the challenges listed above. HMCan-diff significantly outperforms other state-of-the-art methods on data containing cancer samples. Third, I investigate and analyze predictions of different methods for enhancer prediction based on ChIP-seq data. The analysis shows that predictions generated by different methods overlap poorly. To overcome this issue, I developed DENdb, a database that integrates enhancer predictions from different methods. DENdb also

  3. The Development and Numerical Analysis of the Conical Radiator Extrusion Process

    Directory of Open Access Journals (Sweden)

    Michalczyk J.

    2017-12-01

    Full Text Available The article presents a newly developed method for single-operation extrusion of conical radiators. This is the author's radiator manufacturing method, which is the subject of a patent application. The proposed method enables the manufacture of radiators either with or without an inner opening and with an integral plate. Selected results of numerical computations made with Forge®3D, a finite element method (FEM) based software program, are presented in the analysis of the process. A comparative analysis of the proposed manufacturing method against the double-sided extrusion method was also made.

  4. Research for developing precise tsunami evaluation methods. Probabilistic tsunami hazard analysis/numerical simulation method with dispersion and wave breaking

    International Nuclear Information System (INIS)

    2007-01-01

    The present report introduces the main results of investigations on precise tsunami evaluation methods, which were carried out from the viewpoint of safety evaluation for nuclear power facilities and deliberated by the Tsunami Evaluation Subcommittee. A framework for probabilistic tsunami hazard analysis (PTHA) based on a logic tree is proposed, and a calculation for the Pacific side of northeastern Japan is performed as a case study. Tsunami motions with dispersion and wave breaking were investigated both experimentally and numerically. The numerical simulation method is verified for its practicability by applying it to a historical tsunami. Tsunami forces are also investigated, and formulae for the tsunami pressure acting on breakwaters and on buildings due to an inundating tsunami are proposed. (author)
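
    The logic-tree part of a PTHA reduces, at its core, to weight-averaging the exceedance-probability curves produced by alternative modelling choices. A minimal sketch, with purely invented branch weights and hazard curves:

        import numpy as np

        heights = np.linspace(0.0, 10.0, 6)   # tsunami heights (m) of interest

        # hypothetical logic-tree branches: weight + annual exceedance curve
        weights = np.array([0.5, 0.3, 0.2])
        curves = np.array([1e-2 * np.exp(-heights / 2.0),
                           1e-2 * np.exp(-heights / 1.5),
                           1e-2 * np.exp(-heights / 2.5)])

        mean_hazard = weights @ curves   # weight-averaged hazard curve
        for h, p in zip(heights, mean_hazard):
            print(f"{h:4.1f} m : {p:.3e} per yr")

    Fractile hazard curves (e.g. the median) follow from the same weighted set of branch curves.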

  5. Thermoelastic stress analysis system developed for industrial applications

    DEFF Research Database (Denmark)

    Haldorsen, Lars Magne

    The thesis is divided into three parts. The first part describes an extensive evaluation of the existing thermoelastic theory. The second part describes the development and results of a reliable numerical simulation code for the thermoelastic effect and the associated heat transfer effects. Finally, theories, methods and additional equipment are developed in order to adapt a commercial IR-imaging system to perform Thermoelastic Stress Analysis (TSA).

  6. ATTA - A new method of ultrasensitive isotope trace analysis

    International Nuclear Information System (INIS)

    Bailey, K.; Chen, C.Y.; Du, X.; Li, Y.M.; Lu, Z.-T.; O'Connor, T.P.; Young, L.

    2000-01-01

    A new method of ultrasensitive isotope trace analysis has been developed. This method, based on the technique of laser manipulation of neutral atoms, has been used to count individual 85Kr and 81Kr atoms present in a natural krypton gas sample with isotopic abundances in the range of 10^-11 and 10^-13, respectively. This method is free of contamination from other isotopes and elements and can be applied to various isotope tracers for a wide range of applications. The demonstrated detection efficiency is 1x10^-7. System improvements could increase the efficiency by many orders of magnitude

  7. Machine Learning Methods for Production Cases Analysis

    Science.gov (United States)

    Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.

    2018-03-01

    An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing the applied models. The k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
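
    A minimal sketch of such a pipeline, using scikit-learn with synthetic stand-in data in place of the production-network descriptors, is shown below; every dataset parameter here is invented.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score, precision_score, recall_score
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier

        # synthetic stand-in for production event descriptors and labels
        X, y = make_classification(n_samples=500, n_features=12, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)

        for model in (KNeighborsClassifier(n_neighbors=5),
                      RandomForestClassifier(n_estimators=100, random_state=0)):
            y_hat = model.fit(X_tr, y_tr).predict(X_te)
            print(type(model).__name__,
                  accuracy_score(y_te, y_hat),    # overall correctness
                  precision_score(y_te, y_hat),   # purity of positive calls
                  recall_score(y_te, y_hat))      # coverage of true positives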

  8. Development of a cause analysis system for a CPCS trip by using the rule-base deduction method.

    Science.gov (United States)

    Park, Je-Yun; Koo, In-Soo; Sohn, Chang-Ho; Kim, Jung-Seon; Cho, Gi-Ho; Park, Hee-Seok

    2009-07-01

    A Core Protection Calculator System (CPCS) was developed by Combustion Engineering to initiate a reactor trip under certain transients. The major function of the Core Protection Calculator System is to generate contact outputs for a Departure from Nucleate Boiling Ratio (DNBR) trip and a Local Power Density (LPD) trip. In a Core Protection Calculator System, however, a trip cause cannot be identified; only trip signals are transferred to the Plant Protection System (PPS) and only the trip status is displayed. It can take a considerable amount of time and effort for a plant operator to analyze the trip causes of a Core Protection Calculator System. So, a Cause Analysis System for a Core Protection Calculator System (CASCPCS) has been developed using the rule-base deduction method to assist operators in a nuclear power plant. CASCPCS consists of three major parts. The inference engine controls the search of the knowledge base, executes the rules, and tracks the inference process using the depth-first search method. The knowledge base consists of four major parts: rules, database constants, trip buffer variables and causes. The user interface is implemented using menu-driven and window display techniques. The advantage of CASCPCS is that it saves the time and effort needed to diagnose the trip causes of a Core Protection Calculator System, increases a plant's availability and reliability, and is easy to manage because only cursor control is used.
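
    The rule-base deduction idea with depth-first search can be conveyed by a toy backward-chaining engine; the rule names and trip-buffer facts below are invented for illustration and bear no relation to the actual CASCPCS knowledge base.

        # toy backward chainer: rule head -> list of subgoals it requires
        RULES = {
            "dnbr_trip_cause_low_flow": ["dnbr_trip", "low_coolant_flow"],
            "dnbr_trip_cause_power": ["dnbr_trip", "high_local_power"],
            "lpd_trip_cause_power": ["lpd_trip", "high_local_power"],
        }

        def prove(goal, facts, rules):
            # depth-first: a goal holds if it is a fact, or if some rule
            # with that head has all of its subgoals provable
            if goal in facts:
                return True
            return any(all(prove(g, facts, rules) for g in body)
                       for head, body in rules.items() if head == goal)

        facts = {"dnbr_trip", "low_coolant_flow"}   # trip-buffer snapshot
        print([c for c in RULES if prove(c, facts, RULES)])
        # -> ['dnbr_trip_cause_low_flow']

    A production engine would also need cycle detection and an explanation trace; this sketch only shows the depth-first control strategy.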

  9. Cooperative Experimental System Development - cooperative techniques beyond initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification oriented methods usually assume that system design can be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis…

  10. Developing methods of controlling quality costs

    OpenAIRE

    Gorbunova A. V.; Maximova O. N.; Ekova V. A.

    2017-01-01

    The article examines issues of managing quality costs, problems of applying economic methods of quality control, and the implementation of progressive methods of quality cost management in enterprises, with a view to improving the efficiency of their evaluation and analysis. With the aim of increasing the effectiveness of the cost management mechanism, the authors introduce controlling as a tool of deviation analysis from the standpoint of the process approach. A list of processes and corresponding eva...

  11. APPLICATION OF QUALITY ECONOMY METHODS IN MANAGING INNOVATION CAPACITY DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    V. V. Okrepilov

    2017-01-01

    Full Text Available Purpose: to reveal the possibilities of applying the methods of the quality economy to improve the efficiency of managing the development of the region's innovative potential. Results: the article considers topical questions of the development of the innovative potential of a region, one of the important factors in improving the quality of life of the population and the sustainable development of a territory. The content of the concept of "innovation potential of the region" is disclosed and its components are described. The purposes of developing innovative potential are considered. A comparative analysis of the methods of managing the development of potential in market and planned economies is carried out. Particular attention is paid to the application of quality management methods to improve management effectiveness. In conclusion, the advantages and disadvantages of the economy of the macro-region "North-West" from the point of view of transition to an innovative development path, as well as scenarios for the development of the economy, are discussed. Conclusions and relevance: the use of quality economy tools to manage the development of innovative capacity will help to avoid mistakes and choose the right directions for development. In the era of globalization, improving quality is a prerequisite for successful competition in world markets. Therefore, issues of innovation capacity assessment should be taken into account in the development of regional plans and programs, as well as socio-economic policy.

  12. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    Science.gov (United States)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight OPtimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain some insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  13. Analysis of flexible-membrane aerofoils by a method of velocity singularities

    International Nuclear Information System (INIS)

    Mateescu, D.; Newman, B.G.

    1985-01-01

    Two-dimensional sails were originally treated as flexible, impervious, inextensible membranes. These methods are developed in the context of thin aerofoil theory, the membrane being replaced by a vortex sheet and the boundary conditions satisfied at the corresponding positions on the aerofoil chord. The present method is developed as a linear potential theory, although it may be further extended to include non-linear and viscous effects. The new analysis is based on the method of velocity singularities associated with the changes in aerofoil slope developed for rigid aerofoils; it eliminates the need to formally solve an integral equation

  14. Methods for the reactivity evaluation in subcritical systems analysis: a review

    International Nuclear Information System (INIS)

    Dulla, S.; Picca, P.; Carta, M.

    2011-01-01

    The assessment of subcritical source-driven system technology for waste incineration and power production requires the development of reliable and efficient techniques for reactivity evaluation and monitoring. Starting from the standard methods developed for close-to-criticality systems, extensive research activities have been carried out to analyze the behavior of subcritical assemblies in time-dependent conditions and to infer the subcriticality level from local flux values. In the present work, a review of some key aspects of method development for ADS analysis is proposed, with special attention to techniques for reactivity evaluation. (author)

  15. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    Foutes, C.E.; Queenan, C.J. III

    1987-01-01

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables

  16. Study of nasal swipe analysis methods at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Metcalf, R.A.

    1996-01-01

    The Health Physics Analysis Laboratory (HPAL) performs around 30,000 nasal swipe analyses for transuranic nuclides each year in support of worker health and safety at the Los Alamos National Laboratory (LANL). The analysis method used employs cotton swabs swiped inside a nostril and liquid scintillation analyses of the swabs. The technical basis of this method was developed at LANL and has been in use for over 10 years. Recently, questions regarding the usefulness of a non-homogeneous mixture in liquid scintillation analyses have created a need for re-evaluation of the method. A study of the validity of the method shows it provides reliable, stable, and useful data as an indicator of personnel contamination. The study has also provided insight into the underlying process which occurs to allow the analysis. Further review of this process has shown that similar results can be obtained with different sample matrices, using less material than the current analysis method. This reduction can save HPAL the cost of materials as well as greatly reduce the waste created. Radionuclides of concern include Am-241, Pu-239, and Pu-238

  17. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available The objective of the article is an analysis of the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The necessity of developing a comprehensive methodical approach that accounts for indeterminacy and multiple criteria in the task of providing economic security, on the basis of fuzzy logic theory (the fuzzy sets theory), is argued. The results of using the fuzzy set method to assess the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
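
    The fuzzy-set machinery involved can be illustrated with triangular membership functions over normalized indicators; the indicator values and membership parameters below are invented, and the min-operator aggregation is just one common choice of fuzzy AND.

        import numpy as np

        def tri(x, a, b, c):
            # triangular membership function rising on [a, b], falling on [b, c]
            return float(np.maximum(np.minimum((x - a) / (b - a),
                                               (c - x) / (c - b)), 0.0))

        # hypothetical normalized indicators (0 = worst, 1 = best) for one year
        economic, social, ecological = 0.42, 0.55, 0.61

        # membership of each indicator in the set "satisfactory development"
        mu = [tri(v, 0.2, 0.6, 1.0) for v in (economic, social, ecological)]

        print("membership per component:", mu)
        print("overall level (min operator):", min(mu))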

  18. Identification of Hepatoprotective Constituents in Limonium tetragonum and Development of Simultaneous Analysis Method using High-performance Liquid Chromatography

    Science.gov (United States)

    Lee, Jae Sun; Kim, Yun Na; Kim, Na-Hyun; Heo, Jeong-Doo; Yang, Min Hye; Rho, Jung-Rae; Jeong, Eun Ju

    2017-01-01

    Background: Limonium tetragonum, a naturally salt-tolerant halophyte, has been studied recently and is of much interest to researchers due to its potent antioxidant and hepatoprotective activities. Objective: In the present study, we attempted to elucidate bioactive compounds from the ethyl acetate (EtOAc) soluble fraction of L. tetragonum extract. Furthermore, a simultaneous analysis method for the bioactive EtOAc fraction of L. tetragonum has been developed using high-performance liquid chromatography (HPLC). Materials and Methods: Thirteen compounds have been successfully isolated from the EtOAc fraction of L. tetragonum, and the structures of 1–13 were elucidated by extensive one-dimensional and two-dimensional spectroscopic methods including 1H-NMR, 13C-NMR, 1H-1H COSY, heteronuclear single quantum coherence, heteronuclear multiple bond correlation, and nuclear Overhauser effect spectroscopy. Hepatoprotection of the isolated compounds against liver fibrosis was evaluated by measuring inhibition of proliferation in hepatic stellate cells (HSCs). Results: Compounds 1–13 were identified as gallincin (1), apigenin-3-O-β-D-galactopyranoside (2), quercetin (3), quercetin-3-O-β-D-galactopyranoside (4), (−)-epigallocatechin (5), (−)-epigallocatechin-3-gallate (6), (−)-epigallocatechin-3-(3″-O-methyl) gallate (7), myricetin-3-O-β-D-galactopyranoside (8), myricetin-3-O-(6″-O-galloyl)-β-D-galactopyranoside (9), myricetin-3-O-α-L-rhamnopyranoside (10), myricetin-3-O-(2″-O-galloyl)-α-L-rhamnopyranoside (11), myricetin-3-O-(3″-O-galloyl)-α-L-rhamnopyranoside (12), and myricetin-3-O-α-L-arabinopyranoside (13), respectively. All compounds except for 4, 8, and 10 are reported for the first time from this plant. Conclusion: Myricetin glycosides possessing a galloyl substituent (9, 11, and 12) showed the most potent inhibitory effects on the proliferation of HSCs. SUMMARY In the present study, we have successfully isolated 13 compounds from bioactive fraction

  19. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Full Text Available Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered to be a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied for the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use the developed method in the analysis of menthol in pharmaceutical dosage formulations and raw materials.
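
    A short sketch of two of the ICH validation checks named in the abstract: linearity, judged from the calibration-curve coefficient of determination, and precision, reported as the relative standard deviation of replicates. The concentrations, peak areas, and replicate recoveries are invented for illustration.

```python
import numpy as np

# Hypothetical menthol calibration standards and FID peak areas.
conc = np.array([50.0, 100.0, 150.0, 200.0, 250.0])   # µg/mL
area = np.array([102.0, 205.0, 309.0, 398.0, 507.0])  # arbitrary units

# Linearity: least-squares calibration line and R^2.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Precision: %RSD of replicate assay results (hypothetical recoveries).
replicates = np.array([98.9, 100.4, 99.7, 100.9, 99.2, 100.1])
rsd = 100 * replicates.std(ddof=1) / replicates.mean()

print(f"R^2 = {r2:.4f}, precision = {rsd:.2f}% RSD")
```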

  20. Development of a traceability analysis method based on case grammar for NPP requirement documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Seong, Poong Hyun; Kim, Man Cheol

    2004-01-01

    Software inspection is widely believed to be an effective method for software verification and validation (V and V). However, software inspection is labor-intensive and, since it uses little technology, it is viewed as unsuitable for a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The KAIST Nuclear I and C and Information Engineering Laboratory (NICIEL) has developed software management and inspection support tools, collectively named 'SIS-RT.' SIS-RT is designed to partially automate the software inspection processes. SIS-RT supports the analyses of traceability between a given set of specification documents. To make SIS-RT compatible with documents written in Korean, certain techniques in natural language processing have been studied. Among the techniques considered, case grammar is the most suitable for analyses of the Korean language. In this paper, we propose a methodology that uses a case grammar approach to analyze the traceability between documents written in Korean. A discussion regarding some examples of such an analysis will follow.

  1. Development of soil-structure interaction analysis method (II) - Volume 1

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others

    1994-02-01

    This project includes the following six items: free-field analysis for the determination of site input motions; impedance analysis, which represents the effects of soil-structure interaction with lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil, which depends on the level of strain; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item covers the following: a state-of-the-art review and a database of past research; a theoretical review of soil-structure interaction analysis technology; a proposal of the preferred technology and an assessment of its domestic applicability; and proposed guidelines for safety evaluation and analysis schemes.

  2. Evaluating public involvement in research design and grant development: Using a qualitative document analysis method to analyse an award scheme for researchers.

    Science.gov (United States)

    Baxter, Susan; Muir, Delia; Brereton, Louise; Allmark, Christine; Barber, Rosemary; Harris, Lydia; Hodges, Brian; Khan, Samaira; Baird, Wendy

    2016-01-01

    money was used, including a description of the aims and outcomes of the public involvement activities. The purpose of this study was to analyse the content of these reports. We aimed to find out what researchers' views and experiences of public involvement activities were, and what lessons might be learned. Methods We used an innovative method of data analysis, drawing on group participatory approaches, qualitative content analysis, and Framework Analysis to sort and label the content of the reports. We developed a framework of categories and sub-categories (or themes and sub-themes) from this process. Results Twenty-five documents were analysed. Four main themes were identified in the data: the added value of public involvement; planning and designing involvement; the role of public members; and valuing public member contributions. Within these themes, sub-themes related to the timing of involvement (prior to the research study/intended during the research study), and also to specific benefits of public involvement such as: validating ideas; ensuring appropriate outcomes; ensuring the acceptability of data collection methods/tools; and advice regarding research processes. Other sub-themes related to: finding and approaching public members; timing of events; training/support; the format of sessions; setting up public involvement panels; use of public contributors in analysis and interpretation of data; and using public members to assist with dissemination and translation into practice. Conclusions The analysis of reports submitted by researchers following involvement events provides evidence of the value of public involvement during the development of applications for research funding, and details a method for involving members of the public in data analysis which could be of value to other researchers. The findings of the analysis indicate recognition amongst researchers of the variety in potential roles for public members in research, and also an acknowledgement of how

  3. Development of laboratory acceleration test method for service life prediction of concrete structures

    International Nuclear Information System (INIS)

    Cho, M. S.; Song, Y. C.; Bang, K. S.; Lee, J. S.; Kim, D. K.

    1999-01-01

    Service life prediction of nuclear power plants depends on the service history of the structures, field inspections and tests, and the development of laboratory acceleration tests together with their analysis methods and predictive models. In this study, a laboratory acceleration test method for the service life prediction of concrete structures and the application of experimental test results are introduced. This study is concerned with the environmental conditions of concrete structures and aims to develop acceleration test methods for durability factors of concrete structures, e.g., carbonation, sulfate attack, freeze-thaw cycles, and shrinkage-expansion.

  4. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...
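
    One way to make the contrast with non-parametric order tracking concrete is the following sketch, which is not necessarily the authors' model: given the shaft phase reconstructed from a tachometer, the vibration signal is modeled as a sum of sinusoids locked to integer orders, and the order amplitudes are found by linear least squares. The run-up profile and signal are simulated.

```python
import numpy as np

fs = 1000.0
t = np.arange(0.0, 5.0, 1.0 / fs)
rpm = 600.0 + 120.0 * t                         # simulated linear run-up
phase = 2 * np.pi * np.cumsum(rpm / 60.0) / fs  # shaft angle in radians

# Simulated vibration: order 1 at amplitude 1.0, order 2 at 0.5, plus noise.
signal = np.cos(phase) + 0.5 * np.sin(2 * phase) + 0.05 * np.random.randn(t.size)

# Parametric model: cosine/sine pair per tracked order, solved by least squares.
orders = [1, 2, 3]
H = np.column_stack([f(k * phase) for k in orders for f in (np.cos, np.sin)])
coef, *_ = np.linalg.lstsq(H, signal, rcond=None)
amplitudes = np.hypot(coef[0::2], coef[1::2])
print(dict(zip(orders, amplitudes.round(3))))   # ~{1: 1.0, 2: 0.5, 3: 0.0}
```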

  5. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    Science.gov (United States)

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  6. Development and validation of AccuTOF-DART™ as a screening method for analysis of bank security device and pepper spray components.

    Science.gov (United States)

    Pfaff, Allison M; Steiner, Robert R

    2011-03-20

    Analysis of bank security devices, containing 1-methylaminoanthraquinone (MAAQ) and o-chlorobenzylidenemalononitrile (CS), and pepper sprays, containing capsaicin, is a lengthy process with no specific screening technique to aid in identifying samples of interest. Direct Analysis in Real Time (DART™) ionization coupled with an Accurate Time of Flight (AccuTOF) mass detector is a fast, ambient ionization source that could significantly reduce time spent on these cases and increase the specificity of the screening process. A new method for screening clothing for bank dye and pepper spray, using AccuTOF-DART™ analysis, has been developed. Detection of MAAQ, CS, and capsaicin was achieved via extraction of each compound onto cardstock paper, which was then sampled in the AccuTOF-DART™. All results were verified using gas chromatography coupled with electron impact mass spectrometry. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Why Map Issues? On Controversy Analysis as a Digital Method.

    Science.gov (United States)

    Marres, Noortje

    2015-09-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital "move beyond impartiality." I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter.

  8. Methods in carbon K-edge NEXAFS: Experiment and analysis

    International Nuclear Information System (INIS)

    Watts, B.; Thomsen, L.; Dastoor, P.C.

    2006-01-01

    Near-edge X-ray absorption spectroscopy (NEXAFS) is widely used to probe the chemistry and structure of surface layers. Moreover, using ultra-high brilliance polarised synchrotron light sources, it is possible to determine the molecular alignment of ultra-thin surface films. However, the quantitative analysis of NEXAFS data is complicated by many experimental factors and, historically, the essential methods of calibration, normalisation and artefact removal are presented in the literature in a somewhat fragmented manner, thus hindering their integrated implementation as well as their further development. This paper outlines a unified, systematic approach to the collection and quantitative analysis of NEXAFS data with a particular focus upon carbon K-edge spectra. As a consequence, we show that current methods neglect several important aspects of the data analysis process, which we address with a combination of novel and adapted techniques. We discuss multiple approaches in solving the issues commonly encountered in the analysis of NEXAFS data, revealing the inherent assumptions of each approach and providing guidelines for assessing their appropriateness in a broad range of experimental situations

  9. Learning from environmental data: Methods for analysis of forest nutrition time series

    Energy Technology Data Exchange (ETDEWEB)

    Sulkava, M. (Helsinki Univ. of Technology, Espoo (Finland). Computer and Information Science)

    2008-07-01

    Data analysis methods play an important role in increasing our knowledge of the environment as the amount of data measured from the environment increases. This thesis fits under the scope of environmental informatics and environmental statistics, fields in which data analysis methods are developed and applied for the analysis of environmental data. The environmental data studied in this thesis are time series of nutrient concentration measurements of pine and spruce needles. In addition, there are data on laboratory quality and related environmental factors, such as the weather and atmospheric depositions. The most important methods used for the analysis of the data are based on the self-organizing map and linear regression models. First, a new clustering algorithm for the self-organizing map is proposed. It is found to provide better results than two other methods for clustering of the self-organizing map. The algorithm is used to divide the nutrient concentration data into clusters, and the result is evaluated by environmental scientists. Based on the clustering, the temporal development of the forest nutrition is modeled and the effect of nitrogen and sulfur deposition on the foliar mineral composition is assessed. Second, regression models are used for studying how much environmental factors and properties of the needles affect the changes in the nutrient concentrations of the needles between their first and second year of existence. The aim is to build understandable models with good prediction capabilities. Sparse regression models are found to outperform more traditional regression models in this task. Third, fusion of laboratory quality data from different sources is performed to estimate the precision of the analytical methods. Weighted regression models are used to quantify how much the precision of observations can affect the time needed to detect a trend in environmental time series. The results of power analysis show that improving the
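
    The sparse-regression finding lends itself to a small illustration. The sketch below, with simulated data and invented predictor roles, contrasts an L1-penalized (lasso) fit with ordinary least squares: the lasso shrinks irrelevant environmental factors to exactly zero, which is what makes such models understandable in the sense of the abstract.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: temperature, precipitation, N deposition, S deposition, ...
X = rng.normal(size=(n, 6))
true_coef = np.array([0.8, 0.0, 0.0, -0.5, 0.0, 0.0])   # only two factors matter
y = X @ true_coef + 0.1 * rng.normal(size=n)             # change in needle nutrient

sparse = Lasso(alpha=0.05).fit(X, y)
dense = LinearRegression().fit(X, y)
print("lasso:", sparse.coef_.round(2))   # small effects shrunk to exactly zero
print("ols:  ", dense.coef_.round(2))    # every predictor gets a nonzero weight
```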

  10. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  11. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies.

    Science.gov (United States)

    Torrens, George Edward

    2018-01-01

    Summative content analysis was used to define methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as being applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used (under 10 within the 20 case studies) when hundreds were available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: •The study defined over 200 well-established research and design methods and design heuristics that are available to those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; •The review within the study suggests only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and •Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  12. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

    Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations, and thus speeds up the gene set analysis process. We compare the GSZ-scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences in the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical P-values. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
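
    To give a feel for what a gene set z-scoring statistic does, here is a deliberately simplified illustration (not the actual GSZ function of mGSZ): the sum of per-gene scores over a set is standardized against what a random set of the same size would give, and a large standardized value suggests enrichment. Standardization of this kind is what opens the door to asymptotic rather than purely empirical P-values.

```python
import numpy as np

rng = np.random.default_rng(1)
gene_scores = rng.normal(size=5000)               # per-gene t-statistics (simulated)
set_idx = rng.choice(5000, size=40, replace=False)
gene_scores[set_idx] += 0.5                       # inject modest enrichment into one set

m = set_idx.size
mu, sigma = gene_scores.mean(), gene_scores.std(ddof=1)
# Standardize the set sum against a random set of size m (ignoring the
# finite-population correction for simplicity).
z = (gene_scores[set_idx].sum() - m * mu) / (sigma * np.sqrt(m))
print(f"set z-score = {z:.2f}")                   # large |z| suggests enrichment
```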

  13. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in method engineering procedures, which have improved the efficiency of the process. The initiative has also allowed researchers to overcome many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling.

  14. Non-contact method of search and analysis of pulsating vessels

    Science.gov (United States)

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods for recording the human pulse and a solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods based on advanced image processing has caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During testing of the method, several series of experiments were carried out both with artificial oscillating objects and with the target signal source (a human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.
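
    The heart of i-PPG can be sketched in a few lines: average the pixel intensity of a skin region frame by frame, then take the dominant spectral peak within the physiological band as the pulse rate. The trace below is simulated rather than extracted from real video, and the band limits are conventional assumptions rather than the authors' settings.

```python
import numpy as np

fps = 30.0                                     # camera frame rate
t = np.arange(0.0, 20.0, 1.0 / fps)
heart_hz = 1.2                                 # 72 beats/min, simulated ground truth
# Mean ROI intensity per frame: baseline + tiny pulsatile component + noise.
roi_mean = 100 + 0.4 * np.sin(2 * np.pi * heart_hz * t) + 0.2 * np.random.randn(t.size)

x = roi_mean - roi_mean.mean()                 # remove the DC level
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)

band = (freqs > 0.7) & (freqs < 3.0)           # roughly 40-180 beats/min
pulse_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated pulse: {60 * pulse_hz:.0f} beats/min")
```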

  15. Development of a new extraction technique and HPLC method for the analysis of non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp).

    Science.gov (United States)

    Brighenti, Virginia; Pellati, Federica; Steinbach, Marleen; Maran, Davide; Benvenuti, Stefania

    2017-09-05

    The present work was aimed at the development and validation of a new, efficient and reliable technique for the analysis of the main non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp) inflorescences belonging to different varieties. This study was designed to identify samples with a high content of bioactive compounds, with a view to underscoring the importance of quality control in derived products as well. Different extraction methods, including dynamic maceration (DM), ultrasound-assisted extraction (UAE), microwave-assisted extraction (MAE) and supercritical-fluid extraction (SFE) were applied and compared in order to obtain a high yield of the target analytes from hemp. Dynamic maceration for 45 min with ethanol (EtOH) at room temperature proved to be the most suitable technique for the extraction of cannabinoids in hemp samples. The analysis of the target analytes in hemp extracts was carried out by developing a new reversed-phase high-performance liquid chromatography (HPLC) method coupled with diode array (UV/DAD) and electrospray ionization-mass spectrometry (ESI-MS) detection, by using an ion trap mass analyser. An Ascentis Express C18 column (150 mm × 3.0 mm I.D., 2.7 μm) was selected for the HPLC analysis, with a mobile phase composed of 0.1% formic acid in both water and acetonitrile, under gradient elution. The application of the fused-core technology allowed us to obtain a significant improvement of the HPLC performance compared with that of conventional particulate stationary phases, with a shorter analysis time and a remarkable reduction of solvent usage. The analytical method optimized in this study was fully validated to show compliance with international requirements. Furthermore, it was applied to the characterization of nine hemp samples and six hemp-based pharmaceutical products. As such, it was demonstrated to be a very useful tool for the analysis of cannabinoids in both the plant material and its derivatives for

  16. Moral counselling: a method in development.

    Science.gov (United States)

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  17. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    Science.gov (United States)

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  18. Principles of Developing Multi-Pesticide Methods Based on HPLC Determination

    Energy Technology Data Exchange (ETDEWEB)

    Dudar, E. [Plant Protection & Soil Conservation Service of Budapest, Budapest (Hungary)

    2009-07-15

    Principles for the development of multi-pesticide methods based on HPLC determination are outlined. Flow charts and block diagrams give guidance on how to proceed stepwise in the set-up of respective analytical methods. Detailed information is provided on what to take into consideration for setting up a pesticide formulation analysis method. HPLC variables like the types of column, solvents and their strength, pH value, eluent modifiers, column temperature, etc, and the influence on the separation and resolution of chromatographic peaks are discussed as well as the necessity and benefits of internal standardization. Examples of system suitability testing experiments are given for illustration. (author)

  19. Three-dimensional wake field analysis by boundary element method

    International Nuclear Information System (INIS)

    Miyata, K.

    1987-01-01

    A computer code HERTPIA was developed for the calculation of electromagnetic wake fields excited by charged particles travelling through arbitrarily shaped accelerating cavities. This code solves transient wave problems for a Hertz vector. The numerical analysis is based on the boundary element method. This program is validated by comparing its results with analytical solutions in a pill-box cavity

  20. Finite element random vibration method for soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Romo-Organista, M.P.; Lysmer, J.; Seed, H.B.

    1977-01-01

    The authors present a method in which the seismic environment is defined directly in terms of the given design response spectrum. Response spectra cannot be used directly for random analysis; thus, using extreme value theory, a new procedure has been developed for converting the design response spectrum into a design power spectrum. This procedure is reversible and can also be used to compute response spectra whose distribution can be expressed in terms of confidence limits. Knowing the design power spectrum, the resulting output power spectra and their statistical distribution can be computed by a response analysis of the soil-structure system in the frequency domain. Due to the complexity of soil-structure systems, this is most conveniently done by the finite element method. Having obtained the power spectra for all motions in the system, these spectra can be used to determine other statistical information about the response, such as maximum accelerations, stresses, bending moments, etc., all with appropriate confidence limits. This type of information is actually more useful for design than corresponding deterministic values. The authors have developed a computer program, PLUSH, which can perform the above procedures. Results obtained by the new method are in excellent agreement with the results of corresponding deterministic analyses. Furthermore, the probabilistic results can be obtained at a fraction of the cost of deterministic results.
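
    The frequency-domain step described here can be illustrated with textbook random-vibration algebra (this is not the PLUSH code): the response power spectrum is the input power spectrum multiplied by the squared modulus of the transfer function, and the RMS response is the square root of the area under the response spectrum. The oscillator parameters and input level are hypothetical.

```python
import numpy as np

f = np.linspace(0.1, 30.0, 2000)                    # frequency axis, Hz
S_in = np.where(f < 15.0, 0.01, 0.0)                # assumed flat design input PSD (g^2/Hz)

fn, zeta = 5.0, 0.05                                # natural frequency, damping ratio
r = f / fn
H2 = 1.0 / ((1 - r**2) ** 2 + (2 * zeta * r) ** 2)  # |H(f)|^2 of a 1-DOF oscillator

S_out = H2 * S_in                                   # response power spectrum
rms = np.sqrt(np.trapz(S_out, f))                   # RMS response from the PSD area
print(f"RMS response = {rms:.3f} g")
```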

  1. Methods for air cleaning system design and accident analysis

    International Nuclear Information System (INIS)

    Gregory, W.S.; Nichols, B.D.

    1987-01-01

    This paper describes methods, in the form of a handbook and five computer codes, that can be used for nuclear facility air cleaning system design and accident analysis. Four of the codes were developed primarily at the Los Alamos National Laboratory, and one was developed in France. Tools such as these are used to design ventilation systems in the mining industry but do not seem to be commonly used in the nuclear industry. For example, the Nuclear Air Cleaning Handbook is an excellent design reference, but it fails to include information on computer codes that can be used to aid in the design process. These computer codes allow the analyst to use the handbook information to form all the elements of a complete system design. Because these analysis methods are in the form of computer codes, they allow the analyst to investigate many alternative designs. In addition, the effects of many accident scenarios on the operation of the air cleaning system can be evaluated. These tools originally were intended for accident analysis, but they have been used mostly as design tools by several architect-engineering firms. The Cray, VAX, and personal computer versions of the codes, an accident analysis handbook, and the codes' availability will be discussed. The application of these codes to several design operations of nuclear facilities will be illustrated, and their use to analyze the effects of several accident scenarios will also be described.

  2. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
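
    The pattern in the claim above translates almost directly into a toy data structure: evidence items are associated with a hypothesis through indicators that support or refute it, each association carries a weight, and the weighted tallies stand in for "information regarding the accuracy of the hypothesis." Everything below — the hypothesis, items, and weights — is invented for illustration.

```python
# Toy sketch of weighted evidence aggregation for a hypothesis.
hypothesis = "Sensor network outage was caused by a firmware fault"
evidence = [
    # (description, +1 supports / -1 refutes, association weight in [0, 1])
    ("outage followed a firmware rollout by hours", +1, 0.8),
    ("unaffected nodes run the same firmware", -1, 0.5),
    ("vendor advisory describes a matching defect", +1, 0.7),
]

support = sum(w for _, d, w in evidence if d > 0)
refute = sum(w for _, d, w in evidence if d < 0)
net = support - refute
print(f"{hypothesis!r}: support={support:.1f}, refute={refute:.1f}, net={net:+.1f}")
```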

  3. Development of digital gamma-activation autoradiography for analysis of samples of large area

    International Nuclear Information System (INIS)

    Kolotov, V.P.; Grozdov, D.S.; Dogadkin, N.N.; Korobkov, V.I.

    2011-01-01

    Gamma-activation autoradiography is a prospective method for the screening detection of inclusions of precious metals in geochemical samples. Its characteristics allow the analysis of thin sections of large size (tens of cm2), which favourably distinguishes it among the other methods for local analysis. At the same time, the activating field of the accelerator bremsstrahlung displays a sharp intensity decrease with distance along the axis. A method for activation dose "equalization" during irradiation of large-size thin sections has been developed. The method is based on the usage of a hardware-software system. This includes a device for moving the sample during the irradiation, a program for computer modelling of the acquired activating dose for the chosen kinematics of the sample movement, and a program for pixel-by-pixel correction of the autoradiographic images. For the detection of inclusions of precious metals, a method for analysis of the acquired dose dynamics during sample decay has been developed. The method is based on pixel-by-pixel software processing of a time series of coaxial autoradiographic images and the generation of secondary meta-images, allowing interpretation regarding the presence of interesting inclusions based on half-lives. The method was tested on the analysis of copper-nickel polymetallic ores. The developed solutions considerably expand the possible applications of digital gamma-activation autoradiography. (orig.)

  4. Development of digital gamma-activation autoradiography for analysis of samples of large area

    Energy Technology Data Exchange (ETDEWEB)

    Kolotov, V.P.; Grozdov, D.S.; Dogadkin, N.N.; Korobkov, V.I. [Russian Academy of Sciences, Moscow (Russian Federation). Vernadsky Inst. of Geochemistry and Analytical Chemistry

    2011-07-01

    Gamma-activation autoradiography is a prospective method for the screening detection of inclusions of precious metals in geochemical samples. Its characteristics allow the analysis of thin sections of large size (tens of cm2), which favourably distinguishes it among the other methods for local analysis. At the same time, the activating field of the accelerator bremsstrahlung displays a sharp intensity decrease with distance along the axis. A method for activation dose "equalization" during irradiation of large-size thin sections has been developed. The method is based on the usage of a hardware-software system. This includes a device for moving the sample during the irradiation, a program for computer modelling of the acquired activating dose for the chosen kinematics of the sample movement, and a program for pixel-by-pixel correction of the autoradiographic images. For the detection of inclusions of precious metals, a method for analysis of the acquired dose dynamics during sample decay has been developed. The method is based on pixel-by-pixel software processing of a time series of coaxial autoradiographic images and the generation of secondary meta-images, allowing interpretation regarding the presence of interesting inclusions based on half-lives. The method was tested on the analysis of copper-nickel polymetallic ores. The developed solutions considerably expand the possible applications of digital gamma-activation autoradiography. (orig.)

  5. 3D spatially-adaptive canonical correlation analysis: Local and global methods.

    Science.gov (United States)

    Yang, Zhengshi; Zhuang, Xiaowei; Sreenivasan, Karthik; Mishra, Virendra; Curran, Tim; Byrd, Richard; Nandy, Rajesh; Cordes, Dietmar

    2018-04-01

    Local spatially-adaptive canonical correlation analysis (local CCA) with spatial constraints has been introduced to fMRI multivariate analysis for improved modeling of activation patterns. However, current algorithms require complicated spatial constraints that have only been applied to 2D local neighborhoods, because the computational time would increase exponentially if the same method were applied to 3D spatial neighborhoods. In this study, an efficient and accurate line search sequential quadratic programming (SQP) algorithm has been developed to solve the 3D local CCA problem with spatial constraints. In addition, a spatially-adaptive kernel CCA (KCCA) method is proposed to increase the accuracy of fMRI activation maps. With oriented 3D spatial filters, anisotropic shapes can be estimated during the KCCA analysis of fMRI time courses. These filters are orientation-adaptive, providing rotational invariance to better match arbitrarily oriented fMRI activation patterns, resulting in improved sensitivity of activation detection while significantly reducing spatial blurring artifacts. The kernel method in its basic form does not require any spatial constraints and analyzes the whole-brain fMRI time series to construct an activation map. Finally, we have developed a penalized kernel CCA model that involves spatial low-pass filter constraints to increase the specificity of the method. The kernel CCA methods are compared with the standard univariate method and with two different local CCA methods that were solved by the SQP algorithm. Results show that SQP is the most efficient algorithm to solve the local constrained CCA problem, and the proposed kernel CCA methods outperformed univariate and local CCA methods in detecting activations for both simulated and real fMRI episodic memory data. Copyright © 2017 Elsevier Inc. All rights reserved.
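
    Stripped of the spatial constraints, kernels, and SQP machinery developed in the paper, the underlying CCA step can be illustrated as follows: the time series of a small voxel neighborhood (X) are correlated against a task design regressor (Y), and the first canonical correlation measures how well some linear combination of voxels follows the task. All signals are simulated.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
n_scans = 120
design = (np.arange(n_scans) // 10 % 2).astype(float)  # boxcar task regressor
Y = design.reshape(-1, 1)

X = rng.normal(size=(n_scans, 9))                      # 3x3 voxel neighborhood, noise
X[:, 4] += 0.8 * design                                # centre voxel is "active"

cca = CCA(n_components=1).fit(X, Y)
u, v = cca.transform(X, Y)
r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]                # first canonical correlation
print(f"canonical correlation = {r:.2f}")
```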

  6. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper provides a method of elasto-plastic analysis for the practical seismic design of nuclear piping systems. JSME started a task to establish a method of elasto-plastic analysis for nuclear piping systems. Benchmark analyses have been performed within the task to investigate methods of elasto-plastic analysis, and our company has participated in them. As a result, we have settled on a method that accurately simulates the results of piping excitation tests. The recommended method of elasto-plastic analysis is as follows: 1) The elasto-plastic analysis is composed of a dynamic analysis of the piping system modeled using beam elements and a static analysis of the deformed elbow modeled using shell elements. 2) A bi-linear elasto-plastic property is applied: the yield point is the standardized yield point multiplied by 1.2, the second gradient is 1/100 of Young's modulus, and kinematic hardening is used as the hardening rule. 3) The fatigue life is evaluated from the strain ranges obtained by the elasto-plastic analysis, using the rain flow method and the fatigue curves of previous studies. (author)

  7. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity. But the dependency of human operators on I and C systems and the dependency of I and C systems on human operators are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. But, unfortunately, it seems that we do not have appropriate methods for incorporating the interdependency between I and C systems and human operators in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and the modeling of the interdependency using conventional event tree analysis and fault tree analysis seems to be, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI, and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of the Ulchin 5 and 6 nuclear power plants located in the Republic of Korea. The latest version of the fault tree for DPPS, which was developed by the Integrated Safety Assessment team at the Korea Atomic Energy Research Institute (KAERI), consists of 64

  8. A NRC-BNL benchmark evaluation of seismic analysis methods for non-classically damped coupled systems

    International Nuclear Information System (INIS)

    Xu, J.; DeGrassi, G.; Chokshi, N.

    2004-01-01

    Under the auspices of the U.S. Nuclear Regulatory Commission (NRC), Brookhaven National Laboratory (BNL) developed a comprehensive program to evaluate state-of-the-art methods and computer programs for seismic analysis of typical coupled nuclear power plant (NPP) systems with non-classical damping. In this program, four benchmark models of coupled building-piping/equipment systems with different damping characteristics were developed and analyzed by BNL for a suite of earthquakes. The BNL analysis was carried out by the Wilson-θ time domain integration method with the system-damping matrix computed using a synthesis formulation as presented in a companion paper [Nucl. Eng. Des. (2002)]. These benchmark problems were subsequently distributed to and analyzed by program participants applying their uniquely developed methods and computer programs. This paper is intended to offer a glimpse at the program and provide a summary of major findings and principal conclusions with some representative results. The participants' analysis results established using complex modal time history methods showed good agreement with the BNL solutions, while the analyses produced with either complex-mode response spectrum methods or the classical normal-mode response spectrum method, in general, produced more conservative results when averaged over a suite of earthquakes. However, when coupling due to damping is significant, complex-mode response spectrum methods performed better than the classical normal-mode response spectrum method. Furthermore, as part of the program objectives, a parametric assessment is also presented in this paper, aimed at evaluating the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled NPP systems. It is believed that the findings and insights learned from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving license

  9. [A factor analysis method for contingency table data with unlimited multiple choice questions].

    Science.gov (United States)

    Toyoda, Hideki; Haiden, Reina; Kubo, Saori; Ikehara, Kazuya; Isobe, Yurie

    2016-02-01

    The purpose of this study is to propose a method of factor analysis for analyzing contingency tables developed from the data of unlimited multiple-choice questions. This method assumes that the element of each cell of the contingency table has a binomial distribution, and a factor analysis model is applied to the logit of the selection probability. A scree plot and WAIC are used to decide the number of factors, and the standardized residual (the standardized difference between the sample proportion and the estimated proportion) is used to select items. The proposed method was applied to real product impression research data on advertised chips and energy drinks. The results of the analysis showed that this method can be used in conjunction with the conventional factor analysis model and that the extracted factors were fully interpretable, which suggests the usefulness of the proposed method in psychological studies using unlimited multiple-choice questions.

  10. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    Science.gov (United States)

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  11. Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings

    Science.gov (United States)

    Dhruv, Akash V.

    Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned towards bio-inspiration to design more efficient and robust aerodynamic systems to enhance the ability of Unmanned Aerial Vehicles (UAVs) to be operated in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools to perform preliminary investigations in design, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods have been one of the earliest forms of computational methods for aerodynamic analysis to be developed. Although the early editions of this method performed only inviscid analysis, the algorithm has matured over the years as a result of contributions made by prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than traditional Computational Fluid Dynamics (CFD) solvers, and thus it is effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed

  12. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  13. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  14. Sensitivity analysis methods and a biosphere test case implemented in EIKOS

    Energy Technology Data Exchange (ETDEWEB)

    Ekstroem, P.A.; Broed, R. [Facilia AB, Stockholm, (Sweden)

    2006-05-15

    Computer-based models can be used to approximate real-life processes. These models are usually based on mathematical equations, which depend on several variables. The predictive capability of models is therefore limited by the uncertainty in the values of these variables. Sensitivity analysis is used to apportion the relative importance each uncertain input parameter has on the output variation. Sensitivity analysis is therefore an essential tool in simulation modelling and for performing risk assessments. Simple sensitivity analysis techniques based on fitting the output to a linear equation are often used, for example correlation or linear regression coefficients. These methods work well for linear models, but for non-linear models their sensitivity estimates are not accurate. Usually, models of complex natural systems are non-linear. Within the scope of this work, various sensitivity analysis methods, which can cope with linear, non-linear, as well as non-monotone problems, have been implemented in a software package, EIKOS, written in the Matlab language. The following sensitivity analysis methods are supported by EIKOS: the Pearson product moment correlation coefficient (CC), the Spearman rank correlation coefficient (RCC), partial (rank) correlation coefficients (PCC), standardized (rank) regression coefficients (SRC), the Sobol' method, Jansen's alternative, the Extended Fourier Amplitude Sensitivity Test (EFAST) as well as the classical FAST method, and the Smirnov and Cramer-von Mises tests. A graphical user interface has also been developed, from which the user can easily load or call the model and perform a sensitivity analysis as well as an uncertainty analysis. The implemented sensitivity analysis methods have been benchmarked with well-known test functions and compared with other sensitivity analysis software, with successful results. An illustration of the applicability of EIKOS is added to the report. The test case used is a landscape model consisting of several

  15. Sensitivity analysis methods and a biosphere test case implemented in EIKOS

    International Nuclear Information System (INIS)

    Ekstroem, P.A.; Broed, R.

    2006-05-01

    Computer-based models can be used to approximate real-life processes. These models are usually based on mathematical equations, which depend on several variables. The predictive capability of models is therefore limited by the uncertainty in the values of these variables. Sensitivity analysis is used to apportion the relative importance each uncertain input parameter has on the output variation. Sensitivity analysis is therefore an essential tool in simulation modelling and for performing risk assessments. Simple sensitivity analysis techniques based on fitting the output to a linear equation are often used, for example correlation or linear regression coefficients. These methods work well for linear models, but for non-linear models their sensitivity estimates are not accurate. Usually, models of complex natural systems are non-linear. Within the scope of this work, various sensitivity analysis methods, which can cope with linear, non-linear, as well as non-monotone problems, have been implemented in a software package, EIKOS, written in the Matlab language. The following sensitivity analysis methods are supported by EIKOS: the Pearson product moment correlation coefficient (CC), the Spearman rank correlation coefficient (RCC), partial (rank) correlation coefficients (PCC), standardized (rank) regression coefficients (SRC), the Sobol' method, Jansen's alternative, the Extended Fourier Amplitude Sensitivity Test (EFAST) as well as the classical FAST method, and the Smirnov and Cramer-von Mises tests. A graphical user interface has also been developed, from which the user can easily load or call the model and perform a sensitivity analysis as well as an uncertainty analysis. The implemented sensitivity analysis methods have been benchmarked with well-known test functions and compared with other sensitivity analysis software, with successful results. An illustration of the applicability of EIKOS is added to the report. The test case used is a landscape model consisting of several linked
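
    The simplest of the listed measures are easy to reproduce. The sketch below computes CC, RCC, and SRC for a Monte Carlo sample of a made-up nonlinear model; it only illustrates the kind of quantities EIKOS reports, not the package itself.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, size=(1000, 3))                      # three uncertain inputs
y = 2 * X[:, 0] + X[:, 1] ** 3 + 0.1 * rng.normal(size=1000)   # X[:, 2] is inert

for i in range(3):
    cc = pearsonr(X[:, i], y)[0]    # Pearson product moment correlation (CC)
    rcc = spearmanr(X[:, i], y)[0]  # Spearman rank correlation (RCC)
    print(f"x{i}: CC={cc:+.2f}, RCC={rcc:+.2f}")

# Standardized regression coefficients (SRC): regression on standardized data.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print("SRC:", src.round(2))
```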

  16. Development of method quantitative content of dihydroquercetin. Report 1

    Directory of Open Access Journals (Sweden)

    Олександр Юрійович Владимиров

    2016-01-01

    Full Text Available Scientific interest in the study of flavonoids in plant materials is increasing markedly today due to their high biological activity. In this regard, an urgent task for analytical chemistry is the development of accessible analytical techniques for the determination of flavonoids in plant materials. Aim. The aim was to develop a specific technique for the quantitative determination of dihydroquercetin and to establish its validation characteristics. Methods. A technique for the photocolorimetric quantification of dihydroquercetin (DQW) was elaborated, based on the specific reaction of cyanidin chloride formation when zinc powder is added to a dihydroquercetin solution in an acidic medium. Results. A photocolorimetric technique for determining flavonoids, expressed as DQW, has been developed, and its basic validation characteristics have been determined. The obtained metrological characteristics of the photocolorimetric technique for determining DQW did not exceed the admissibility criteria in accordance with the requirements of the State Pharmacopoeia of Ukraine. Conclusions. The results of the statistical analysis of the experimental data show that the developed technique can be used for the quantification of DQW. The metrological data indicate that when the technique was reproduced under the conditions of two different laboratories, the determined value at a confidence probability of 95 % was 101.85 ± 2.54 %.
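
    The reproducibility figure quoted above (101.85 ± 2.54 % at a 95 % confidence level) has the standard form of a mean recovery with a t-based confidence half-width. The sketch below computes that form from replicate determinations; the replicate values are invented, so the resulting numbers differ from the paper's.

```python
import numpy as np
from scipy import stats

recoveries = np.array([100.2, 103.1, 99.8, 104.0, 101.5, 102.5])  # % of nominal, invented
n = recoveries.size
mean = recoveries.mean()
half_width = stats.t.ppf(0.975, n - 1) * recoveries.std(ddof=1) / np.sqrt(n)
print(f"{mean:.2f} ± {half_width:.2f} % (95 % confidence)")
```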

  17. Reactor physics methods development at Westinghouse

    International Nuclear Information System (INIS)

    Mueller, E.; Mayhue, L.; Zhang, B.

    2007-01-01

    The current state of reactor physics methods development at Westinghouse is discussed. The focus is on the methods that have been or are under development within the NEXUS project which was launched a few years ago. The aim of this project is to merge and modernize the methods employed in the PWR and BWR steady-state reactor physics codes of Westinghouse. (author)

  18. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it req...

  19. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. Since RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs... For the transcriptome, 5' end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...

  20. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  1. Improvement of numerical analysis method for FBR core characteristics. 3

    International Nuclear Information System (INIS)

    Takeda, Toshikazu; Yamamoto, Toshihisa; Kitada, Takanori; Katagi, Yousuke

    1998-03-01

    As part of the improvement of numerical analysis methods for FBR core characteristics, studies on several topics have been conducted: the multiband method, Monte Carlo perturbation, and the nodal transport method. This report is composed of the following three parts. Part 1: Improvement of the Reaction Rate Calculation Method in the Blanket Region Based on the Multiband Method. A method was developed for precise evaluation of the reaction rate distribution in the blanket region using the multiband method. With the 3-band parameters obtained from the ordinary fitting method, major reaction rates such as the U-238 capture, U-235 fission, Pu-239 fission and U-238 fission rate distributions were analyzed. Part 2: Improvement of the Estimation Method for Reactivity Based on Monte Carlo Perturbation Theory. Monte Carlo perturbation theory has been investigated and introduced into the calculational code. The Monte Carlo perturbation code was applied to the MONJU core and the calculational results were compared to the reference. Part 3: Improvement of the Nodal Transport Calculation for Hexagonal Geometry. A method to evaluate the intra-subassembly power distribution from the nodal averaged neutron flux and the surface fluxes at the node boundaries was developed based on transport theory. (J.P.N.)

  2. Development and validation of a rapid chromatographic method for the analysis of flunarizine and its main production impurities

    Directory of Open Access Journals (Sweden)

    Niamh O’Connor

    2013-06-01

    A rapid, selective method for the analysis of flunarizine and its associated impurities was developed and validated according to ICH guidelines. The separation was carried out using a Thermo Scientific Hypersil Gold C18 column (50 mm × 4.6 mm i.d., 1.9 μm particle size) with a gradient mobile phase of acetonitrile–ammonium acetate–tetrabutylammonium hydrogen sulfate buffer, at a flow rate of 1.8 mL/min and UV detection at 230 nm. Naturally aged samples were also tested to determine sample stability, and a profile of sample and impurity breakdown is presented. Keywords: Flunarizine, Sub-2 μm column, Active pharmaceutical ingredient, HPLC

  3. Regional analysis of annual maximum rainfall using TL-moments method

    Science.gov (United States)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information on the distribution of rainfall amounts is of great importance for the design of water-related structures, and one concern of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, the approach to regional frequency analysis using L-moments is revisited, and an alternative regional frequency analysis using the TL-moments method is employed; the results from both methods are then compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed to determine the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the TL-moments method was more efficient for lower quantile estimation than the L-moments method.
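
    For readers unfamiliar with TL-moments, the sketch below computes sample TL-moments with the standard Elamir-Seheult estimator; the synthetic annual-maximum series and the symmetric trimming level t=1 are illustrative assumptions, not the study's data.

```python
# Sample TL-moments via the Elamir-Seheult order-statistic estimator.
# Trimming t=0 recovers the ordinary L-moments; t=1 gives TL-moments.
import numpy as np
from scipy.special import comb

def tl_moment(sample, r, t=1):
    """r-th sample TL-moment with symmetric trimming t."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    total = 0.0
    for k in range(r):
        # Weight of each order statistic in this term of the sum
        # (comb returns 0 for out-of-range arguments).
        w = comb(i - 1, r + t - 1 - k) * comb(n - i, t + k)
        total += (-1) ** k * comb(r - 1, k) * np.sum(w * x)
    return total / (r * comb(n, r + 2 * t))

rng = np.random.default_rng(0)
amax = rng.gumbel(loc=80.0, scale=25.0, size=40)  # synthetic annual maxima
l1, l2, l3 = (tl_moment(amax, r) for r in (1, 2, 3))
print(f"TL-mean={l1:.2f}  TL-scale={l2:.2f}  TL-skewness={l3 / l2:.3f}")
```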

  4. Development of a method for holistic energy renovation

    DEFF Research Database (Denmark)

    Morelli, Martin

    ...recovery. The long-term performance of the renovation may be reduced due to mould growth behind the interior insulation or decay of the wooden beams. The energy saving potential in two multi-family buildings was investigated by parameter studies of existing energy saving measures for both the building... Measurements of temperature and relative humidity showed that conditions for mould growth were present. However, no signs of mould growth were documented at dismantling of the interior insulation. A method was developed for the design of energy saving measures based on both Failure Mode and Effect Analysis...

  5. A posteriori error analysis of multiscale operator decomposition methods for multiphysics models

    International Nuclear Information System (INIS)

    Estep, D; Carey, V; Tavener, S; Ginting, V; Wildey, T

    2008-01-01

    Multiphysics, multiscale models present significant challenges both in computing accurate solutions and in estimating the error in information computed from numerical solutions. In this paper, we describe recent advances in extending the techniques of a posteriori error analysis to multiscale operator decomposition solution methods. While the particulars of the analysis vary considerably with the problem, several key ideas underlie a general approach being developed to treat operator decomposition multiscale methods. We explain these ideas in the context of three specific examples.
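
    The usual starting point for such a posteriori analyses is the adjoint-weighted residual representation of the error in a quantity of interest; the generic statement below is an illustration of that idea, not a quotation from the paper.

```latex
% Error representation for a functional of interest Q, numerical solution
% u_h, weak residual rho, and adjoint solution varphi (generic form; the
% operator decomposition analyses add per-component and transfer terms).
\[
  Q(u) - Q(u_h) \;=\; \rho(u_h)(\varphi) \;\approx\; \rho(u_h)(\varphi_h)
\]
```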

  6. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

    The article identifies methods for the analysis of electronic money, formalises its instruments, and offers an integral indicator to be calculated by issuing banks and by banks that carry out operations with electronic money issued by other banks. Calculation of the integral indicator allows a complex assessment of a bank's activity with electronic money and a comparison of the parameters of different banks by an aggregate of indicators for studying the electronic money market, its level of development, etc. The article presents methods for the economic analysis of electronic money in banks along the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank, and social responsibility. The indicators proposed for each of these directions are intended to be taken into account when building the integral indicators with whose help banks are studied: business activity, profitability, solvency, liquidity and so on.
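
    As an illustration of how such an integral indicator might be assembled, the sketch below combines four normalized sub-indicators with a weighted geometric mean; the indicator names, values, and weights are invented, not the article's.

```python
# Composite (integral) indicator from normalized sub-indicators in [0, 1].
import numpy as np

indicators = {
    "solvency_liquidity": 0.82,
    "issue_efficiency": 0.64,
    "business_activity": 0.71,
    "social_responsibility": 0.55,
}
weights = {
    "solvency_liquidity": 0.35,
    "issue_efficiency": 0.25,
    "business_activity": 0.25,
    "social_responsibility": 0.15,
}

# Weighted geometric mean: penalizes weakness in any single direction
# more strongly than a plain weighted average would.
keys = list(indicators)
vals = np.array([indicators[k] for k in keys])
w = np.array([weights[k] for k in keys])  # weights sum to 1.0
integral = float(np.prod(vals ** w))
print(f"integral indicator = {integral:.3f}")
```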

  7. Development of a DNBR evaluation method for the CEA ejection accident in SMART core

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Yoo, Y. J.; In, W. K.; Chang, M. H. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-12-01

    A methodology applicable to the analysis of the CEA ejection accident in SMART is developed for the evaluation of the fraction of fuel failures caused by DNB. The transient behavior of the core thermal-hydraulic conditions is calculated by the subchannel analysis code MATRA. The minimum DNBR during the accident is calculated by the KRB-1 CHF correlation, considering the 1/8 symmetry of the hot assembly. The variation of hot assembly power during the accident is simulated by the LTC (Limiting Transient Curve), which is determined from the analysis of power distribution data resulting from the three-dimensional core dynamics calculations. The initial condition of the accident is determined by considering the LCO (Limiting Conditions for Operation) of the SMART core. Two different methodologies for the evaluation of the DNB failure rate are established: a deterministic method based on the DNB envelope, and a probabilistic method based on the DNB probability of each fuel rod. The methodology developed in this study is applied to the analysis of the CEA ejection accident in the preliminary design core of SMART. As a result, the fractions of DNB fuel failure by the deterministic method and the probabilistic method are calculated as 38.7% and 7.8%, respectively. 16 refs., 16 figs., 5 tabs. (Author)
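
    The contrast between the two failure-fraction methodologies can be shown with a toy calculation; the rod MDNBR population, design limit, and uncertainty model below are invented for illustration and are not SMART design values.

```python
# Toy comparison of deterministic vs. probabilistic DNB failure fractions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
mdnbr = rng.normal(loc=1.5, scale=0.25, size=10_000)  # per-rod minimum DNBR

# Deterministic: every rod whose MDNBR falls below the design limit (set
# conservatively so a rod at the limit has ~5 % DNB probability) is failed.
design_limit = 1.30
det_fraction = np.mean(mdnbr < design_limit)

# Probabilistic: a rod actually reaches DNB when the true ratio drops below
# 1.0; each rod contributes its own probability, given an assumed relative
# uncertainty of the CHF correlation.
sigma = 0.14  # assumed, chosen so the probability at the limit is ~5 %
p_dnb = norm.cdf((1.0 - mdnbr) / (sigma * mdnbr))
prob_fraction = p_dnb.mean()

print(f"deterministic: {det_fraction:.1%}, probabilistic: {prob_fraction:.1%}")
```

    As in the study above, the deterministic envelope count is the more conservative of the two.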

  8. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives of the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and similar problems; to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods; to reach consensus on the issues identified as far as possible, while not avoiding the controversial aspects; to identify unreconciled differences as clearly as possible; and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed: the need for uncertainty analysis; identification and ranking of uncertainties; characterisation, quantification and combination of uncertainties; and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  9. NOLB: Nonlinear Rigid Block Normal Mode Analysis Method

    OpenAIRE

    Hoffmann, Alexandre; Grudinin, Sergei

    2017-01-01

    We present a new, conceptually simple and computationally efficient method for nonlinear normal mode analysis called NOLB. It relies on the rotations-translations of blocks (RTB) theoretical basis developed by Y.-H. Sanejouand and colleagues. We demonstrate how to physically interpret the eigenvalues computed in the RTB basis in terms of angular and linear velocities applied to the rigid blocks and how to construct a nonlinear extrapolation of motion out of these veloci...

  10. The Impact of Normalization Methods on RNA-Seq Data Analysis

    Science.gov (United States)

    Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.

    2015-01-01

    High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. The massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of data. Data normalization is one of the most crucial steps of data processing, and it must be carefully considered as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors, as well as generation of the diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
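
    As one concrete instance of a sequencing-depth normalization of the kind compared in such studies, the sketch below implements median-of-ratios size factors (the approach popularized by DESeq) on an invented count matrix.

```python
# Median-of-ratios normalization for an RNA-seq count matrix
# (rows = genes, columns = samples); counts are invented.
import numpy as np

counts = np.array([
    [100, 220, 95],
    [50, 110, 48],
    [800, 1600, 790],
    [10, 25, 9],
], dtype=float)

# Reference: per-gene geometric mean across samples. (In practice, genes
# with a zero count anywhere are excluded first, since their geometric
# mean is zero; this toy matrix has none.)
log_counts = np.log(counts)
ref = log_counts.mean(axis=1, keepdims=True)

# Size factor per sample: median ratio of its counts to the reference.
size_factors = np.exp(np.median(log_counts - ref, axis=0))
normalized = counts / size_factors

print("size factors:", np.round(size_factors, 3))
```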

  11. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state of the art in uncertainty quantification and sensitivity analysis. ► We overview new developments in these areas using hybrid methods. ► We give a tutorial introduction to these areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript is intended to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important, as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next-generation reactors. In addition to covering the basics, an overview of the current state of the art is given, with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective focuses on presenting our own development of hybrid subspace methods, intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.
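
    The core subspace idea can be demonstrated in a few lines: probe a high-dimensional model with random input directions and let an SVD of the collected responses reveal the low effective dimensionality. The linear surrogate below is an invented stand-in for a reactor model, not the authors' method.

```python
# Randomized probing to expose a low-dimensional response subspace.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, n_samples = 200, 50, 40

# Hypothetical model with an intrinsic rank of ~5.
A = rng.normal(size=(n_out, 5)) @ rng.normal(size=(5, n_in))

def model(x):
    """Invented high-dimensional model with low intrinsic rank."""
    return A @ x

# Probe the model with random directions and collect the responses.
X = rng.normal(size=(n_in, n_samples))
Y = np.column_stack([model(X[:, j]) for j in range(n_samples)])

# Singular values expose the effective dimensionality of the response,
# i.e. how few directions suffice for UQ/SA of this model.
s = np.linalg.svd(Y, compute_uv=False)
effective_rank = int(np.sum(s > 1e-8 * s[0]))
print("effective rank of response subspace:", effective_rank)
```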

  12. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components for the probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The expert system captures and utilizes PSAM knowledge and experience: NESSUS/EXPERT is an interactive, menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). NESSUS also contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities; a broad range of analysis capabilities and an extensive element library are included.
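
    The task that a fast probability integrator accelerates is the evaluation of a failure-probability integral. A plain Monte Carlo baseline for that task, with an invented load-resistance limit state, looks like this:

```python
# Monte Carlo estimate of a structural failure probability P[g(X) < 0].
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Uncertain load L and resistance R of a hypothetical component.
load = rng.lognormal(mean=2.0, sigma=0.25, size=n)
resistance = rng.normal(loc=12.0, scale=1.0, size=n)

g = resistance - load          # limit state: failure when g < 0
pf = np.mean(g < 0.0)
se = np.sqrt(pf * (1 - pf) / n)  # standard error of the estimate
print(f"P(failure) = {pf:.2e} +/- {se:.1e}")
```

    Methods such as FPI exist because, for realistic small failure probabilities, this brute-force estimate requires a prohibitive number of full structural analyses.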

  13. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra 1, the 'large event tree/small fault tree' approach was adopted for the analysis of the plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations which had to be made in order to attain a correct description of the safety system performances according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program') developed for constructing and quantifying event trees is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly contributes to decreasing the time spent in event tree analyses, making the practical application of the aforementioned analysis approach viable. (author)
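
    The arithmetic that a code like ETAP automates is the enumeration and quantification of accident sequences; a hedged sketch with invented systems and probabilities (not Angra 1 data) follows.

```python
# Event tree quantification: each sequence is a path through the tree,
# with frequency = initiating-event frequency x branch probabilities.
from itertools import product

init_freq = 1.0e-3               # initiating-event frequency (per year)
failure_prob = {"HPIS": 1e-2, "AFWS": 3e-3, "LPIS": 5e-3}
systems = list(failure_prob)

sequences = {}
for outcome in product((True, False), repeat=len(systems)):  # True = success
    f = init_freq
    label = []
    for sysname, ok in zip(systems, outcome):
        p = failure_prob[sysname]
        f *= (1.0 - p) if ok else p
        label.append(sysname if ok else sysname + "*")  # * marks failure
    sequences["/".join(label)] = f

# Illustrative success logic: core damage if both injection systems fail.
cdf = sum(f for seq, f in sequences.items()
          if "HPIS*" in seq and "LPIS*" in seq)
print(f"illustrative core damage frequency: {cdf:.2e} /yr")
```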

  14. Developing rapid methods for analyzing upland riparian functions and values.

    Science.gov (United States)

    Hruby, Thomas

    2009-06-01

    Regulators protecting riparian areas need to understand the integrity, health, beneficial uses, functions, and values of this resource. Up to now most methods providing information about riparian areas are based on analyzing condition or integrity. These methods, however, provide little information about functions and values. Different methods are needed that specifically address this aspect of riparian areas. In addition to information on functions and values, regulators have very specific needs that include: an analysis at the site scale, low cost, usability, and inclusion of policy interpretations. To meet these needs a rapid method has been developed that uses a multi-criteria decision matrix to categorize riparian areas in Washington State, USA. Indicators are used to identify the potential of the site to provide a function, the potential of the landscape to support the function, and the value the function provides to society. To meet legal needs fixed boundaries for assessment units are established based on geomorphology, the distance from "Ordinary High Water Mark" and different categories of land uses. Assessment units are first classified based on ecoregions, geomorphic characteristics, and land uses. This simplifies the data that need to be collected at a site, but it requires developing and calibrating a separate model for each "class." The approach to developing methods is adaptable to other locations as its basic structure is not dependent on local conditions.
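
    A hedged sketch of such a multi-criteria categorization is given below; the indicator scores and category cut-offs are invented for illustration, not the calibrated Washington State models.

```python
# Toy multi-criteria decision matrix for categorizing assessment units.
def categorize(site_potential, landscape_potential, value_to_society):
    """Combine three 0-10 indicator scores into a category I-IV."""
    total = site_potential + landscape_potential + value_to_society
    if total >= 24:
        return "Category I (highest protection)"
    elif total >= 16:
        return "Category II"
    elif total >= 8:
        return "Category III"
    return "Category IV"

# Example assessment unit scored from field indicators.
print(categorize(site_potential=8, landscape_potential=7, value_to_society=6))
```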

  15. Development of a method of continuous improvement of services using the Business Intelligence tools

    Directory of Open Access Journals (Sweden)

    Svetlana V. Kulikova

    2018-01-01

    The purpose of the study was to develop a method for the continuous improvement of services using Business Intelligence tools. Materials and methods: the work draws on the concept of the Deming Cycle, Business Intelligence methods and technologies, the Agile methodology, and SCRUM. Results: the article considers the problem of continuous improvement of services and offers solutions using Business Intelligence methods and technologies. Here, the purpose of this technology is to support the final decision about what needs to be improved in the current organization of services; in other words, Business Intelligence helps the product manager see what is hidden from the "human eye" in the received and processed data. The method is developed on the basis of the Deming Cycle, Agile methodologies, and SCRUM. The article describes the main stages of developing the method from the activity of the enterprise. To identify bottlenecks, to justify the need for their elimination and, in general, to improve the services continuously, a complete Business Intelligence system must be built in the enterprise; this process is represented in DFD notation. The article also presents a scheme for the selection of suitable Agile methodologies, the proposed concept of the solution (including methods for identifying problems through Business Intelligence technology, a system for troubleshooting, and analysis of the results of the introduced changes), and a technical description of the project. Conclusion: from this work the authors formed the concept of a method for the continuous improvement of services using Business Intelligence technology, with the specifics of enterprises offering SaaS solutions. It was also found that, when using this method, the recommended development methodology is SCRUM. The result of this scientific

  16. The use of experimental design in the development of an HPLC-ECD method for the analysis of captopril.

    Science.gov (United States)

    Khamanga, Sandile M; Walker, Roderick B

    2011-01-15

    An accurate, sensitive and specific high performance liquid chromatography-electrochemical detection (HPLC-ECD) method, developed and validated for captopril (CPT), is presented. Separation was achieved using a Phenomenex® Luna 5 μm C18 column and a mobile phase comprising phosphate buffer (adjusted to pH 3.0) and acetonitrile in a ratio of 70:30 (v/v). Detection was accomplished using a full-scan, multichannel ESA Coulometric detector in the "oxidative-screen" mode, with the upstream electrode (E1) set at +600 mV, the downstream (analytical) electrode (E2) set at +950 mV, and the potential of the guard cell maintained at +1050 mV. The detector gain was set at 300. Experimental design using a central composite design (CCD) was used to facilitate method development. Mobile phase pH, molarity, and concentration of acetonitrile (ACN) were considered the critical factors to be studied to establish the retention times of CPT and of cyclizine (CYC), which was used as the internal standard. Twenty experiments including centre points were undertaken, and a quadratic model for the retention time of CPT was derived from the experimental data. The method was validated for linearity, accuracy, precision, and limits of quantitation and detection, as per the ICH guidelines. The system was found to produce sharp and well-resolved peaks for CPT and CYC, with retention times of 3.08 and 7.56 min, respectively. Linear regression analysis for the calibration curve showed a good linear relationship, with a regression coefficient of 0.978 in the concentration range of 2-70 μg/mL. The linear regression equation was y=0.0131x+0.0275. The limits of quantitation (LOQ) and detection (LOD) were found to be 2.27 and 0.6 μg/mL, respectively. The method was used to analyze CPT in tablets. The wide linearity range, accuracy, sensitivity, short retention time and composition of the mobile phase indicated that this method is better for the quantification of CPT than the
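
    The twenty runs mentioned above are consistent with a standard three-factor CCD (8 factorial + 6 axial + 6 centre runs). The sketch below generates such a design in coded units; the axial distance and centre-point count are typical textbook choices assumed here, not taken from the paper.

```python
# Generate a rotatable central composite design for three factors
# (e.g. mobile-phase pH, buffer molarity, %ACN) in coded units.
import numpy as np
from itertools import product

k = 3                       # number of factors
alpha = 2 ** (k / 4)        # rotatable axial distance, ~1.682 for k=3
n_center = 6                # assumed number of centre points

factorial = np.array(list(product([-1.0, 1.0], repeat=k)))   # 8 runs
axial = np.vstack([a * row for a in (-alpha, alpha)
                   for row in np.eye(k)])                     # 6 runs
center = np.zeros((n_center, k))                              # 6 runs

design = np.vstack([factorial, axial, center])
print(f"{len(design)} runs")                                  # 20 runs
print(np.round(design, 3))
```

    A quadratic response-surface model fitted to the responses at these points yields the kind of retention-time model described in the abstract.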

  17. Microarray Analysis of the Developing Rat Mandible

    Institute of Scientific and Technical Information of China (English)

    Hideo KABURAGI; Naoyuki SUGANO; Maiko OSHIKAWA; Ryosuke KOSHI; Naoki SENDA; Kazuhiro KAWAMOTO; Koichi ITO

    2007-01-01

    To analyze the molecular events that occur in the developing mandible, we examined the expression of 8803 genes in samples taken at different time points during rat postnatal mandible development. Total RNA was extracted from the mandibles of 1-day-old, 1-week-old, and 2-week-old rats. Complementary RNA (cRNA) was synthesized from cDNA and biotinylated. Fragmented cRNA was hybridized to RGU34A GeneChip arrays. Among the 8803 genes tested, 4344 were detectable. We identified 148 genes with significantly increased expression and 19 genes with significantly decreased expression. A comprehensive analysis appears to be an effective method of studying the complex process of development.

  18. A comparison of three methods of Nitrogen analysis for feedstuffs

    African Journals Online (AJOL)

    Unknown

    Introduction. The Kjeldahl method for determining crude protein is very widely used for the analysis of feed samples. However, it has its drawbacks, and hence new techniques without some of these disadvantages are considered desirable. One such modification was developed by Hach et al. (1987). This promising ...

  19. An analytic data analysis method for oscillatory slug tests.

    Science.gov (United States)

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extreme points, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
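
    The extreme-point bookkeeping underlying such an analysis can be sketched as follows; the synthetic damped record is illustrative, and the snippet stops at estimating the damped frequency and damping coefficient rather than reproducing the van der Kamp conversion to hydraulic conductivity.

```python
# Locate the extreme points of an oscillatory slug-test record and
# estimate the damped angular frequency and the damping coefficient.
import numpy as np
from scipy.signal import argrelextrema

t = np.linspace(0.0, 30.0, 3000)
h = 0.5 * np.exp(-0.12 * t) * np.cos(1.8 * t)   # synthetic damped response

# Occurrence times and displacements of the extreme points.
idx = np.sort(np.concatenate([argrelextrema(h, np.greater)[0],
                              argrelextrema(h, np.less)[0]]))
t_ext, h_ext = t[idx], h[idx]

# Successive extrema are about half a period apart: omega_d = pi / spacing.
omega_d = np.pi / np.mean(np.diff(t_ext))

# Damping from the slope of log|amplitude| versus time at the extrema.
gamma = -np.polyfit(t_ext, np.log(np.abs(h_ext)), 1)[0]
print(f"omega_d = {omega_d:.3f} rad/s, damping = {gamma:.3f} 1/s")
```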

  20. The development of quantitative and qualitative analysis methods for suppositories with Maclura pomifera extract

    Directory of Open Access Journals (Sweden)

    V. A. Korotkov

    2014-08-01

    Chronic prostatitis and BPH remain very common diseases. In recent years, herbal preparations have been widely used in the treatment of diseases of the prostate gland. The effectiveness of herbal medicinal products derived from Maclura pomifera is associated with their content of phytosterols and terpenes. The oil extract derived from the fruit of the Osage orange (Maclura pomifera, Moraceae) is a rich source of terpenes and phytosterols. Previous studies indicated the content of substances such as lupeol and β-sitosterol, known for their prostatoprotective properties, as well as the presence of isoflavones possessing anti-inflammatory and antioxidant properties. Aim of the work. The aim of this work is to develop methods that allow qualitative and quantitative assessment of suppositories with Maclura pomifera extract. Materials and methods. The objects of this study are suppositories with the oil extract of Maclura pomifera. Thin layer chromatography was used for the identification of the suppository ingredients. Quantitative determination of the active compounds was carried out by spectrophotometry in the ultraviolet and visible spectrum, using Thermo Scientific Evolution S60 (USA) and Apel PD303S (Japan) spectrophotometers. The amount of phytosterols and triterpenes was determined as the equivalent of a reliable sample of lupeol ('Santa Cruz Biotechnology', USA; CAS: 545-47-1). The amount of isoflavones was determined as the equivalent of a reliable sample of osajin ('BioBioPha Co., Ltd.', China; CAS: 482-53-1). Results and discussion. For the identification of phytosterols and isoflavones in the suppository composition, a method of detecting their joint presence by TLC has been developed. In the alcoholic extraction from the suppository mass, two purple spots with Rf 0.8 and 0.57 are observed at the level of the spots of the reference solutions (lupeol and β-sitosterol), and two yellow spots with Rf 0.45 and 0.21 are observed at the level