WorldWideScience

Sample records for analysis methods developed

  1. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analyses were performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the task analysis results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance times, as well as analyses for the design of the information and interaction structures, will be necessary.
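
    The report itself gives no implementation details, but as a rough illustration of how a procedure-based hierarchical task decomposition with a simple performance-time workload estimate might be represented, the sketch below uses a hypothetical task tree; the task names and time values are assumptions, not data from the report.

    ```python
    from __future__ import annotations
    from dataclasses import dataclass, field

    @dataclass
    class Task:
        """One node of a hierarchical task analysis (HTA) tree."""
        name: str
        performance_time_s: float = 0.0          # estimated time for a leaf task
        subtasks: list[Task] = field(default_factory=list)

        def total_time(self) -> float:
            """Naive workload proxy: sum of leaf performance times."""
            if not self.subtasks:
                return self.performance_time_s
            return sum(t.total_time() for t in self.subtasks)

    # Hypothetical decomposition of a single MCR procedure step
    verify_alarm = Task("Verify alarm condition", subtasks=[
        Task("Locate alarm tile on HSI", 3.0),
        Task("Read associated process variable", 4.0),
        Task("Cross-check redundant indication", 5.0),
    ])
    print(f"Estimated task time: {verify_alarm.total_time():.1f} s")
    ```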

  2. Development and analysis of finite volume methods

    International Nuclear Information System (INIS)

    Omnes, P.

    2010-05-01

    This document is a synthesis of a set of works concerning the development and the analysis of finite volume methods used for the numerical approximation of partial differential equations (PDEs) stemming from physics. In the first part, the document deals with co-localized Godunov-type schemes for the Maxwell and wave equations, with a study of the loss of precision of this scheme at low Mach number. In the second part, discrete differential operators are built on fairly general, in particular very distorted or nonconforming, two-dimensional meshes. These operators are used to approximate the solutions of PDEs modelling diffusion, electrostatics, magnetostatics and electromagnetism by the discrete duality finite volume (DDFV) method on staggered meshes. The third part presents the numerical analysis and some a priori as well as a posteriori error estimates for the discretization of the Laplace equation by the DDFV scheme. The last part is devoted to the order of convergence in the L2 norm of the finite volume approximation of the solution of the Laplace equation in one dimension, and on meshes with orthogonality properties in two dimensions. Necessary and sufficient conditions on the mesh geometry and on the regularity of the data are provided that ensure second-order convergence of the method. (author)
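
    As a toy illustration of the last point (second-order L2 convergence of a finite volume scheme for the one-dimensional Laplace problem), the sketch below solves -u'' = f on (0,1) with a cell-centered finite volume scheme and prints the discrete L2 error on successively refined meshes. The manufactured solution and the specific boundary treatment are choices made here, not taken from the document.

    ```python
    import numpy as np

    def fv_error(n):
        """Cell-centered finite volume for -u'' = f on (0,1), u(0)=u(1)=0; returns L2 error."""
        h = 1.0 / n
        x = (np.arange(n) + 0.5) * h                 # cell centers
        A = np.zeros((n, n))
        for i in range(n):
            A[i, i] = 2.0 / h
            if i > 0:
                A[i, i - 1] = -1.0 / h
            else:
                A[i, i] += 1.0 / h                   # Dirichlet flux through the left half-cell
            if i < n - 1:
                A[i, i + 1] = -1.0 / h
            else:
                A[i, i] += 1.0 / h                   # Dirichlet flux through the right half-cell
        f = np.pi ** 2 * np.sin(np.pi * x)           # manufactured right-hand side
        u = np.linalg.solve(A, h * f)                # flux balance: sum of face fluxes = h * f_i
        return np.sqrt(h * np.sum((u - np.sin(np.pi * x)) ** 2))

    for n in (20, 40, 80, 160):
        print(f"n={n:4d}  L2 error = {fv_error(n):.3e}")   # error should drop ~4x per refinement
    ```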

  3. Method development for trace and ultratrace analysis

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Method development, that is, selection of a mode of chromatography and the right column and mobile-phase combination for trace and ultratrace analysis, requires several main considerations. The method should be useful for resolving various trace and ultratrace components present in the sample. If the nature of these components is known, the choice of method may be straightforward, that is, a selection can be made from the following modes of HPLC: (1) adsorption chromatography; (2) normal-phase chromatography; (3) reversed-phase chromatography; (4) ion-pair chromatography; (5) ion-exchange chromatography; (6) ion chromatography. Unfortunately, the nature of all of the components is frequently unknown. However, several intelligent judgments can be made on the nature of impurities. This chapter deals with some basic approaches to mobile-phase selection and optimization. More detailed information may be found in basic texts. Techniques for separation of high-molecular-weight compounds (macromolecules) and chiral compounds may be found elsewhere. Mainly compounds with molecular weight lower than 2,000 are discussed here. 123 refs

  4. Development of Ultraviolet Spectrophotometric Method for Analysis ...

    African Journals Online (AJOL)

    HP

    Method for Analysis of Lornoxicam in Solid Dosage Forms. Sunit Kumar Sahoo ... testing. Mean recovery was 100.82% for tablets. Low values of %RSD indicate ... Saharty E, Refaat YS, Khateeb ME. Stability-Indicating Spectrophotometric ...

  5. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    Yoo, Bong; Lee, Jae-Han; Koo, Gyeng-Hoi

    2002-01-01

    This report briefly describes KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out during 1996-1999 under the IAEA CRP on the intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures. The aim was to develop numerical analysis methods and to compare the analysis results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by the participating countries. Certain progress in the analysis procedures for isolation bearings and isolated nuclear structures has been made throughout the IAEA CRPs, and the analysis methods developed can be improved for future nuclear facility applications. (author)

  6. Development of motion image prediction method using principal component analysis

    International Nuclear Information System (INIS)

    Chhatkuli, Ritu Bhusal; Demachi, Kazuyuki; Kawai, Masaki; Sakakibara, Hiroshi; Kamiaka, Kazuma

    2012-01-01

    Respiratory motion limits the accuracy of the irradiated area during lung cancer radiation therapy. Many methods have been introduced to minimize the irradiation of healthy tissue caused by lung tumor motion. The purpose of this research is to develop an algorithm that improves image guided radiation therapy by predicting motion images. We predict the motion images by using principal component analysis (PCA) and the multi-channel singular spectral analysis (MSSA) method. The images/movies were successfully predicted and verified using the developed algorithm. With the proposed prediction method it is possible to forecast the tumor images over the next breathing period. The implementation of this method in real time is believed to be significant for a higher level of tumor tracking, including the detection of sudden abdominal changes during radiation therapy. (author)
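
    The abstract does not describe the implementation, but a minimal sketch of the PCA part of such a pipeline might look as follows: past image frames are flattened, projected onto a few principal components, and the component scores are extrapolated one step ahead. The trivial linear extrapolation below stands in for the MSSA forecasting step and is purely an assumption, as is the synthetic image sequence.

    ```python
    import numpy as np

    def predict_next_frame(frames: np.ndarray, n_components: int = 3) -> np.ndarray:
        """frames: (T, H, W) array of past frames; returns a predicted (H, W) frame."""
        T, H, W = frames.shape
        X = frames.reshape(T, H * W).astype(float)
        mean = X.mean(axis=0)
        # PCA via SVD of the centered data matrix
        U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
        scores = U[:, :n_components] * S[:n_components]      # (T, k) temporal coefficients
        next_scores = 2 * scores[-1] - scores[-2]             # placeholder one-step forecast
        next_frame = mean + next_scores @ Vt[:n_components]
        return next_frame.reshape(H, W)

    # Synthetic breathing-like intensity modulation, for demonstration only
    frames = np.stack([np.outer(np.hanning(32), np.hanning(32)) * (1 + 0.2 * np.sin(0.5 * ti))
                       for ti in range(12)])
    pred = predict_next_frame(frames)
    print(pred.shape)
    ```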

  7. Development of a general method for photovoltaic system analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nolay, P

    1987-01-01

    Photovoltaic conversion for energy applications is now widely used, but its development still requires the resolution of many problems concerning the sizing and the real operation of installations. The precise analysis of the behaviour of the components and of the whole system has led to the development of accurate models for the simulation of such systems. From this modelling phase, a simulation code has been built. The validation of this software has been achieved using experimental test measurements. Since the quality of the software depends on the precision of the input data, an original method for determining component characteristics, by means of model identification, has been developed. These tools permit the prediction of system behaviour and the dynamic simulation of systems under real conditions. Used for the study of photovoltaic system sizing, this software has allowed the definition of new concepts which will serve as a basis for the development of a sizing method.

  8. Development of sample preparation method for honey analysis using PIXE

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Chiba, Keiko; Sera, Koichiro

    2008-01-01

    We developed an original preparation method for honey samples (samples in a paste-like state) specifically designed for PIXE analysis. The results of PIXE analysis of thin targets, prepared by adding a standard containing nine elements to the honey samples, demonstrated that the preparation method gives sufficient accuracy in the quantitative values. PIXE analysis of 13 kinds of honey was performed, and eight mineral components (Si, P, S, K, Ca, Mn, Cu and Zn) were detected in all honey samples. The principal mineral components were K and Ca, and the quantitative value for K accounted for the majority of the total mineral content. The K content in honey varies greatly depending on the plant source: chestnut honey had the highest K content, 2-3 times that of Manuka, which is known as a high-quality honey, while the K content of false-acacia honey, which is produced in the greatest abundance, was 1/20 that of chestnut. (author)

  9. Development and simulation of various methods for neutron activation analysis

    International Nuclear Information System (INIS)

    Otgooloi, B.

    1993-01-01

    Simple methods for neutron activation analysis have been developed. Results are presented from studies of an installation for the determination of fluorine in fluorite ores, directly on the lorry, by fast neutron activation analysis. Nitrogen in organic materials was determined by 14N and 15N activation. A short description is given of the new equipment 'FLUORITE' for a fluorite factory. Pu and Be isotopes in organic materials, including wheat, were measured. 25 figs, 19 tabs. (Author, Translated by J.U)

  10. Development of rapid urine analysis method for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kuwabara, J.; Noguchi, H. [Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan)]

    2000-05-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires a longer time than conventional analyses, such as fluorescence analysis, because the matrix must be sufficiently removed from the urine sample. To shorten the time required for urine bioassay by ICP-MS, a rapid uranium analysis method using an ICP-MS connected to a flow injection system was developed. Since this method does not involve chemical separation steps, the time required is equivalent to that of the conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample. The required volume of urine solution is 5 ml. The main chemical treatment is only digestion with 5 ml of nitric acid in a microwave oven to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of the digested sample solution was adjusted to 10 ml. The prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system can minimize the matrix effects caused by salts dissolved in a high-matrix solution, such as a urine sample without chemical separation, because it introduces only a micro volume of sample solution into the ICP-MS. The ICP-MS detected uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation. The detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples. In the series of measurements, no memory effect was observed. The present analysis method using the ICP-MS equipped with the flow injection system demonstrated that the shortening of time required on high

  11. Development of rapid urine analysis method for uranium

    International Nuclear Information System (INIS)

    Kuwabara, J.; Noguchi, H.

    2000-01-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires a longer time than conventional analyses, such as fluorescence analysis, because the matrix must be sufficiently removed from the urine sample. To shorten the time required for urine bioassay by ICP-MS, a rapid uranium analysis method using an ICP-MS connected to a flow injection system was developed. Since this method does not involve chemical separation steps, the time required is equivalent to that of the conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample. The required volume of urine solution is 5 ml. The main chemical treatment is only digestion with 5 ml of nitric acid in a microwave oven to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of the digested sample solution was adjusted to 10 ml. The prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system can minimize the matrix effects caused by salts dissolved in a high-matrix solution, such as a urine sample without chemical separation, because it introduces only a micro volume of sample solution into the ICP-MS. The ICP-MS detected uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation. The detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples. In the series of measurements, no memory effect was observed. The present analysis method using the ICP-MS equipped with the flow injection system demonstrated that the shortening of time required on high

  12. Multi-physics methods development for high temperature gas cooled reactor analysis

    International Nuclear Information System (INIS)

    Seker, V.; Downar, T. J.

    2007-01-01

    Gas cooled reactors have been characterized as one of the most promising nuclear reactor concepts in the Generation-IV technology road map. Considerable research has been performed on the design and safety analysis of these reactors. However, the calculational tools being used to perform these analyses are not state-of-the-art and are not capable of performing detailed three-dimensional analyses. This paper presents the results of an effort to develop an improved thermal-hydraulic solver for pebble bed type high temperature gas cooled reactors. The solution method is based on the porous medium approach, and the momentum equation, including the modified Ergun resistance model for the pebble bed, is solved in three-dimensional geometry. The heat transfer in the pebble bed is modeled considering the local thermal non-equilibrium between the solid and the gas, which results in two separate energy equations, one for each medium. The effective thermal conductivity of the pebble bed can be calculated from either the Zehner-Schluender or the Robold correlation. Both the fluid flow and the heat transfer are modeled in three-dimensional cylindrical coordinates and can be solved in steady-state or time-dependent form. The spatial discretization is performed using the finite volume method, and the theta-method is used for the temporal discretization. A preliminary verification was performed by comparing the results with experiments conducted at the SANA test facility, located at the Institute for Safety Research and Reactor Technology (ISR), Julich, Germany. Various experimental cases were modeled, and good agreement in the gas and solid temperatures was observed. An ongoing effort is to model the control rod ejection scenarios described in the OECD/NEA/NSC PBMR-400 benchmark problem. In order to perform these analyses, the PARCS reactor simulator code will be coupled with the new thermal-hydraulic solver. Furthermore, some of the other anticipated accident scenarios in the benchmark
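
    The abstract mentions a modified Ergun resistance model for the pebble bed but does not give its coefficients. As background only, the sketch below evaluates the classical Ergun pressure-gradient correlation with its standard coefficients (150 and 1.75); those coefficients and the pebble-bed and gas properties are assumptions standing in for the modified form used in the paper.

    ```python
    def ergun_pressure_gradient(u, d_p, eps, rho, mu):
        """Pressure gradient [Pa/m] across a packed pebble bed (classical Ergun form).

        u    : superficial gas velocity [m/s]
        d_p  : pebble diameter [m]
        eps  : bed porosity [-]
        rho  : gas density [kg/m^3]
        mu   : gas dynamic viscosity [Pa s]
        """
        viscous = 150.0 * mu * (1.0 - eps) ** 2 / (eps ** 3 * d_p ** 2) * u
        inertial = 1.75 * rho * (1.0 - eps) / (eps ** 3 * d_p) * u * abs(u)
        return viscous + inertial

    # Illustrative values only: 6 cm pebbles, porosity 0.39, helium-like gas properties
    print(ergun_pressure_gradient(u=1.0, d_p=0.06, eps=0.39, rho=3.0, mu=4e-5), "Pa/m")
    ```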

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital instrumentation and control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated in a complementary way, the resulting SSA combination is more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate combination of techniques to perform the analysis on the basis of the available resources. This research evaluated the software safety analysis techniques available nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages lie in completeness and complexity.

  14. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far to analyze diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  15. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed by combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only be limited to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.
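
    The paper's post-processing procedure is not disclosed in the abstract; the sketch below only illustrates the generic individual-risk bookkeeping such a method implies: each scenario contributes frequency times probability of fatality on a 3D grid, and an iso-surface can then be extracted at a chosen risk level. The scenario data, the dose-response surrogate, and the threshold are all assumptions for illustration.

    ```python
    import numpy as np

    # Coarse 3D grid over a hypothetical plant volume
    x, y, z = np.meshgrid(np.linspace(0, 200, 41), np.linspace(0, 200, 41),
                          np.linspace(0, 30, 7), indexing="ij")

    def p_fatality(conc):
        """Placeholder dose-response: probability of fatality from a peak concentration field."""
        return 1.0 - np.exp(-(conc / 5.0e3) ** 2)     # assumed scale, not a real probit

    # Hypothetical scenarios: (annual frequency [1/yr], concentration field from a CFD run)
    scenarios = [
        (1e-4, 1.0e4 * np.exp(-(((x - 50) ** 2 + (y - 60) ** 2) / 800 + z / 10))),
        (5e-5, 2.0e4 * np.exp(-(((x - 120) ** 2 + (y - 90) ** 2) / 1500 + z / 15))),
    ]

    risk = np.zeros_like(x)
    for freq, conc in scenarios:
        risk += freq * p_fatality(conc)               # individual risk [1/yr] at each grid point

    iso_level = 1e-5                                  # example iso-surface level
    print("grid points above 1e-5 /yr:", int((risk > iso_level).sum()))
    ```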

  16. Recent developments in methods for analysis of perfluorinated persistent pollutants

    International Nuclear Information System (INIS)

    Trojanowicz, Marek; Koc, Mariusz

    2013-01-01

    Perfluoroalkyl substances (PFASs) have proliferated into the environment on a global scale and are present in the organisms of animals and humans even in remote locations. Persistent organic pollutants of this kind have therefore stimulated substantial improvement in analytical methods. The aim of this review is to present recent achievements in PFAS determination in various matrices with different methods and to compare them to measurements of total organic fluorine (TOF). Analytical methods used for PFAS determination are dominated by chromatography, mostly in combination with mass spectrometric detection. However, HPLC may also be hyphenated with conductivity or fluorimetric detection, and gas chromatography may be combined with flame ionization or electron capture detection. The presence of a large number of PFAS species in environmental and biological samples necessitates parallel attempts to develop a total PFAS index that reflects the total content of PFASs in various matrices. Increasing attention is currently paid to the determination of branched isomers of PFASs and to their determination in food. (author)

  17. Viscous wing theory development. Volume 1: Analysis, method and results

    Science.gov (United States)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  18. Validation and further development of a novel thermal analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, E.H.; Shuttleworth, A.G.; Rousseau, P.G. [Pretoria Univ. (South Africa). Dept. of Mechanical Engineering]

    1994-12-31

    The design of thermal and energy efficient buildings requires inter alia the investigation of the passive performance, natural ventilation, mechanical ventilation as well as structural and evaporative cooling of the building. Only when these fail to achieve the desired thermal comfort should mechanical cooling systems be considered. Few computer programs have the ability to investigate all these comfort regulating methods at the design stage. The QUICK design program can simulate these options with the exception of mechanical cooling. In this paper, QUICK's applicability is extended to include the analysis of basic air-conditioning systems. Since the design of these systems is based on indoor loads, it was necessary to validate QUICK's load predictions before extending it. This article addresses validation in general and proposes a procedure to establish the efficiency of a program's load predictions. This proposed procedure is used to compare load predictions by the ASHRAE, CIBSE, CARRIER, CHEETAH, BSIMAC and QUICK methods for 46 case studies involving 36 buildings in various climatic conditions. Although significant differences in the results of the various methods were observed, it is concluded that QUICK can be used with the same confidence as the other methods. It was further shown that load prediction programs usually under-estimate the effect of building mass and therefore over-estimate the peak loads. The details for the 46 case studies are available to other researchers for further verification purposes. With the confidence gained in its load predictions, QUICK was extended to include air-conditioning system analysis. The program was then applied to different case studies. It is shown that system size and energy usage can be reduced by more than 60% by using a combination of passive and mechanical cooling systems as well as different control strategies. (author)

  19. Developing Methods of praxeology to Perform Document-analysis

    DEFF Research Database (Denmark)

    Frederiksen, Jesper

    2016-01-01

    This paper provides a contribution to the methodological development of praxeological document analysis of neoliberal welfare state policies. Different institutions related to the Danish healthcare area transform international health policies, and these institutions produce a range of strategies ... is possible. The different works are unique but at the same time part of a common neoliberal welfare state practice. They have a structural similarity as homologous strategies related to an institutional production field of health and social care services. From the construction of these strategies it is thus ... possible to discuss more overall consequences of the neoliberal policies and the impact on nurses and their position as a health profession. ...

  20. Developing the UIC 406 Method for Capacity Analysis

    DEFF Research Database (Denmark)

    Khadem Sameni, Melody; Landex, Alex; Preston, John

    2011-01-01

    This paper applies an improvement cycle for analysing and enhancing the capacity utilisation of an existing timetable. Macro and micro capacity utilisation are defined based on the discrete nature of capacity utilisation, and different capacity metrics are analysed. In the category of macro asset ... utilisation, the CUI and UIC 406 methods are compared with each other. A British and a Danish case study are explored for a periodic and a non-periodic timetable: 1) freeing up capacity by omitting the train that has the highest capacity consumption (British case study); 2) adding trains to use ... the spare capacity (Danish case study). Some suggestions are made to develop meso indices by using the UIC 406 method to decide between the alternatives for adding or removing trains.

  1. DEVELOPMENT OF METHODS FOR STABILITY ANALYSIS OF TOWER CRANES

    Directory of Open Access Journals (Sweden)

    Sinel'shchikov Aleksey Vladimirovich

    2018-01-01

    Tower cranes are one of the main tools for the execution of reloading works during construction. The design of tower cranes is carried out in accordance with RD 22-166-86 “Construction of tower cranes. Rules of analysis”, according to which stability is ensured by requiring that the upper limit of the overturning moment not be exceeded. The calculation of these moments is carried out with the use of empirical coefficients and is quite time-consuming. Moreover, the normative methodology only considers the static position of the crane and does not take into account the dynamic transients due to crane operation (lifting and swinging of the load, boom turning) or the presence of dynamic external loads (e.g. from wind) for different orientations of the crane. This paper proposes a method for determining the stability coefficient of the crane based on the reaction forces acting at the support points, i.e. the points of contact of the wheels with the crane rail track, which allows us, at the design stage, to investigate the stability of a tower crane under variable external loads and operating conditions. Subject: the safety of tower crane operation with regard to compliance with the regulatory requirements for ensuring their stability, both at the design stage and at the operational stage. Research objectives: increasing the safety of operation of tower cranes by improving the design methodology so as to ensure static and dynamic stability. Materials and methods: analysis and synthesis of the regulatory framework and of modern research works on the safe operation of tower cranes, and the method of numerical simulation. Results: a formula is proposed for the stability analysis of tower cranes using the resulting reaction forces at the crane supports at the points of contact of the wheels with the rail track.
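
    The abstract states that stability is judged from the reaction forces at the wheel-rail contact points but does not reproduce the proposed formula. The sketch below shows one common way to monitor those reactions (center of pressure and minimum wheel load as an incipient-tipping indicator); it is an illustration only, not the authors' criterion, and the geometry and wheel loads are made up.

    ```python
    import numpy as np

    def support_reaction_check(support_xy, reactions):
        """Illustrative stability bookkeeping from wheel reactions (not the paper's formula).

        support_xy : (n, 2) rail-contact coordinates in plan [m]
        reactions  : (n,) vertical wheel reactions [N]; a non-positive value means uplift,
                     i.e. incipient tipping about the opposite rail.
        Returns the center of pressure and the minimum reaction as a simple margin indicator.
        """
        support_xy = np.asarray(support_xy, float)
        reactions = np.asarray(reactions, float)
        cop = (reactions[:, None] * support_xy).sum(axis=0) / reactions.sum()
        return cop, reactions.min()

    # Hypothetical 6 m x 6 m rail base with unequal wheel loads from a dynamic load case
    xy = [[0, 0], [6, 0], [6, 6], [0, 6]]
    R = [400e3, 250e3, 80e3, 230e3]                  # N
    cop, r_min = support_reaction_check(xy, R)
    print("center of pressure:", cop, " min wheel reaction [N]:", r_min)
    ```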

  2. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves are based on ray theory. Therefore, these methods have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases, such as the PP reflection phase; (2) impossibility of computing the so-called "W phase", a long-period phase arriving before the S wave; (3) implausibility of the assumption that the distance from the observation points to the hypocenter is large compared to the fault length. To solve the above problems, we have developed a new method which uses synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green's functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) to compute the Green's functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green's functions computed by DSM were accurate at frequencies up to 1 Hz. Next we applied the new method to the rupture process analysis of the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Sinbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that of the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and the reflection phases near the source, and also accurately calculates later phases such as the PP wave.
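
    As background only, a finite-fault waveform inversion of this kind reduces to a linear least-squares problem: the observed seismograms are matched by a weighted sum of subfault Green's functions, with non-negative slip. The sketch below sets up that generic problem with synthetic data; it does not reproduce the Kikuchi et al. (2003) scheme or the DSM Green's functions, and all array sizes and values are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    n_subfaults, n_samples = 12, 400
    # G[:, j] holds the synthetic seismogram (Green's function convolved with a source
    # time function) for unit slip on subfault j, concatenated over all stations.
    G = rng.standard_normal((n_samples, n_subfaults))

    true_slip = np.zeros(n_subfaults)
    true_slip[[2, 3, 8]] = [1.5, 2.0, 1.0]            # two "asperity" patches, for illustration
    d = G @ true_slip + 0.05 * rng.standard_normal(n_samples)   # observed waveforms + noise

    slip, misfit = nnls(G, d)                         # non-negative least-squares slip estimate
    print(np.round(slip, 2), "misfit:", round(misfit, 3))
    ```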

  3. Development of Rotor Diagnosis Method via Motor Current Signature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Seok; Huh, Hyung; Kim, Min Hwan; Jeong, Kyeong Hoon; Lee, Gyu Mhan; Park, Jin Ho; Park, Keun Bae; Lee, Cheol Kwon; Hur, S.

    2006-01-15

    A study of motor current signature analysis has been performed to monitor a journal bearing fault due to increasing clearance. It is known that journal bearing clearance produces side band frequencies in the motor current, at the supply current frequency plus and minus the rotor rotational frequency. However, the mere presence of the side band frequencies is not sufficient to diagnose whether the journal bearing is safe or not. Four journal bearing sets with different clearances were used to measure the side band frequency amplitude and the rotor vibration amplitude as functions of the journal bearing clearance. Both the side band frequency amplitude and the rotor vibration amplitude increase as the journal bearing clearance increases. This trend confirms that the ASME OM vibration guideline can be applied to estimate the journal bearing clearance size. In this research, 2.5 times the reference side band amplitude is suggested as an indicator of a journal bearing fault. Further study is necessary to establish more specific quantitative relations between the side band frequency amplitude and the journal bearing clearance of a motor.
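
    For orientation, the sketch below synthesizes a motor current with small sidebands at the supply frequency plus and minus the rotor rotational frequency and reads their amplitudes from an FFT. The signal parameters are made up and only illustrate the kind of spectral feature the study monitors.

    ```python
    import numpy as np

    fs, T = 5000.0, 10.0                       # sampling rate [Hz], record length [s]
    t = np.arange(0, T, 1 / fs)
    f_supply, f_rotor = 60.0, 29.5             # assumed supply and rotor rotational frequencies

    # Supply component plus weak sidebands at f_supply +/- f_rotor (bearing-related modulation)
    i_motor = (np.sin(2 * np.pi * f_supply * t)
               + 0.01 * np.sin(2 * np.pi * (f_supply - f_rotor) * t)
               + 0.01 * np.sin(2 * np.pi * (f_supply + f_rotor) * t)
               + 0.005 * np.random.default_rng(1).standard_normal(t.size))

    spec = np.abs(np.fft.rfft(i_motor)) / (t.size / 2)   # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    for f in (f_supply - f_rotor, f_supply, f_supply + f_rotor):
        k = np.argmin(np.abs(freqs - f))
        print(f"{f:6.1f} Hz -> amplitude {spec[k]:.4f}")
    ```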

  4. Development of Rotor Diagnosis Method via Motor Current Signature Analysis

    International Nuclear Information System (INIS)

    Park, Jin Seok; Huh, Hyung; Kim, Min Hwan; Jeong, Kyeong Hoon; Lee, Gyu Mhan; Park, Jin Ho; Park, Keun Bae; Lee, Cheol Kwon; Hur, S.

    2006-01-01

    A study of motor current signature analysis has been performed to monitor a journal bearing fault due to increasing clearance. It is known that journal bearing clearance produces side band frequencies in the motor current, at the supply current frequency plus and minus the rotor rotational frequency. However, the mere presence of the side band frequencies is not sufficient to diagnose whether the journal bearing is safe or not. Four journal bearing sets with different clearances were used to measure the side band frequency amplitude and the rotor vibration amplitude as functions of the journal bearing clearance. Both the side band frequency amplitude and the rotor vibration amplitude increase as the journal bearing clearance increases. This trend confirms that the ASME OM vibration guideline can be applied to estimate the journal bearing clearance size. In this research, 2.5 times the reference side band amplitude is suggested as an indicator of a journal bearing fault. Further study is necessary to establish more specific quantitative relations between the side band frequency amplitude and the journal bearing clearance of a motor.

  5. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has always puzzled engineers and designers. At present, several computational design-by-analysis methods for calculating and categorizing the stress field of pressure vessel components have been developed and applied, such as stress equivalent linearization, the two-step approach, the primary structure method, the elastic compensation method, the GLOSS R-node method, and so on. Moreover, the ASME code also gives an inelastic design-by-analysis method for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes huge differences between the results obtained with the different calculation and analysis methods mentioned above. As a consequence, this is the main reason limiting the wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by directly analyzing the various failure mechanisms of the pressure vessel structure on the basis of elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the deviatoric map and the nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational design-by-analysis methods are summarized to aid selection of the proper computational method when designing a pressure vessel component by analysis. (authors)

  6. Development of three-dimensional ENRICHED FREE MESH METHOD and its application to crack analysis

    International Nuclear Information System (INIS)

    Suzuki, Hayato; Matsubara, Hitoshi; Ezawa, Yoshitaka; Yagawa, Genki

    2010-01-01

    In this paper, we describe a method for highly accurate three-dimensional analysis of a crack in a large-scale structure. The Enriched Free Mesh Method (EFMM) is a method for improving the accuracy of the Free Mesh Method (FMM), which is a kind of meshless method. First, we developed an algorithm for the three-dimensional EFMM. An elastic problem was analyzed using the EFMM, and we found that its accuracy compares favorably with that of the FMM and that the number of CG iterations is smaller. Next, we developed a method for calculating the stress intensity factor employing the EFMM. A structure with a crack was analyzed using the EFMM, and the stress intensity factor was calculated by the developed method. The analysis results agreed very well with the reference solution. It was shown that the proposed method is very effective for the analysis of a crack in a large-scale structure. (author)

  7. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  8. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States)]; Jablonowski, Christopher [Shell Exploration and Production Company (United States)]; Lake, Larry [University of Texas at Austin (United States)]

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
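
    To make the comparison concrete, the toy sketch below performs a Monte Carlo style development optimization: for each candidate design (here just the number of wells, a made-up decision variable), the expected NPV is estimated over sampled values of an uncertain recoverable volume and the design with the highest expected value is selected. The economics model and all numbers are assumptions for illustration only, not the paper's example problems.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def npv(n_wells, recoverable_mmbbl, oil_price=70.0, capex_per_well=60e6, discount=0.1):
        """Very simplified NPV [USD] of a development with n_wells producing over 10 years."""
        # Assumed diminishing returns: each extra well recovers less of the remaining volume.
        produced = recoverable_mmbbl * 1e6 * (1 - np.exp(-0.15 * n_wells))
        yearly_revenue = produced * oil_price / 10.0          # spread evenly over 10 years
        revenue_pv = sum(yearly_revenue / (1 + discount) ** y for y in range(1, 11))
        return revenue_pv - n_wells * capex_per_well

    # Uncertain recoverable volume [MMbbl]: lognormal prior (an assumption)
    samples = rng.lognormal(mean=np.log(40.0), sigma=0.4, size=2000)

    designs = range(1, 16)                      # candidate well counts
    expected = {n: np.mean([npv(n, s) for s in samples]) for n in designs}
    best = max(expected, key=expected.get)
    print("best well count:", best, " expected NPV [MUSD]:", round(expected[best] / 1e6, 1))
    ```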

  9. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  10. Development of calculation method for one-dimensional kinetic analysis in fission reactors, including feedback effects

    International Nuclear Information System (INIS)

    Paixao, S.B.; Marzo, M.A.S.; Alvim, A.C.M.

    1986-01-01

    The calculation method used in the WIGLE code is studied. Because such a solution was not readily available, an effort was made to expound the method in detail. The developed method has been applied to the solution of the one-dimensional, two-group diffusion equations in slab geometry for axial analysis, including non-boiling heat transfer and accounting for feedback. A steady-state program (CITER-1D), written in FORTRAN IV, has been implemented, providing excellent results and confirming the quality of the developed work. (Author) [pt]
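
    For readers unfamiliar with the underlying model, the sketch below solves the one-dimensional, two-group diffusion eigenvalue problem for a bare slab by finite differences and source (power) iteration, and compares the resulting k-eff with the one-region analytical value. The group constants, slab width and mesh are illustrative assumptions and do not come from the report; no heat transfer or feedback is included.

    ```python
    import numpy as np

    # Illustrative two-group constants (not from the report): D [cm], Sigma [1/cm]
    D1, D2 = 1.4, 0.4
    Sa1, Sa2 = 0.010, 0.080
    Ss12 = 0.017                      # fast -> thermal scattering
    nSf1, nSf2 = 0.006, 0.110         # nu * Sigma_f
    L, N = 200.0, 100                 # slab width [cm], interior mesh points
    h = L / (N + 1)

    lap = (np.diag(2.0 * np.ones(N)) - np.diag(np.ones(N - 1), 1)
           - np.diag(np.ones(N - 1), -1)) / h**2
    A1 = D1 * lap + (Sa1 + Ss12) * np.eye(N)     # fast-group loss operator
    A2 = D2 * lap + Sa2 * np.eye(N)              # thermal-group loss operator

    phi1 = np.ones(N)
    phi2 = np.ones(N)
    k = 1.0
    S = nSf1 * phi1 + nSf2 * phi2
    for _ in range(500):
        phi1 = np.linalg.solve(A1, S / k)        # fast flux from the fission source
        phi2 = np.linalg.solve(A2, Ss12 * phi1)  # thermal flux from the slowing-down source
        S_new = nSf1 * phi1 + nSf2 * phi2
        k_new = k * S_new.sum() / S.sum()
        if abs(k_new - k) < 1e-7:
            break
        k, S = k_new, S_new

    B2 = (np.pi / L) ** 2                        # analytical buckling of a bare slab
    k_ref = (nSf1 + nSf2 * Ss12 / (D2 * B2 + Sa2)) / (D1 * B2 + Sa1 + Ss12)
    print(f"k_eff (numerical) = {k_new:.5f},  k_eff (one-region analytical) = {k_ref:.5f}")
    ```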

  11. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia)]; Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia)]; Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)]

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of the solid residue remaining after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for analysis with an energy dispersive X-ray fluorescence (EDXRF) spectrometer. Because of the different fuels used, the different compositions and the different locations where the solid residue forms, it was necessary to develop two methods. The first method is used for identifying the composition of the solid residue after fuel oil combustion (Method 1), while the second is used for identifying the composition of the solid residue after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRMs). The CRMs and the test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  12. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Mitsuyasu, T.; Ishii, K.; Hino, T.; Aoyama, M.

    2009-01-01

    Spectral history methods have been developed for a pin-by-pin core analysis method using the three-dimensional direct response matrix. The direct response matrix is formalized by four sub-response matrices in order to respond to the core eigenvalue k, and thus it can be recomposed at each outer iteration of the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One method is to evaluate the nodal burn-up spectrum obtained using the out-going neutron current. The other is to correct the fuel rod neutron production rates through a pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be reduced by half during burn-up and that the errors in the nodal neutron production rates can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. A core analysis with the DRM method was carried out for an ABWR quarter core, and it was found that both the thermal power and coolant-flow distributions converged smoothly. (authors)

  13. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]; McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from the non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  14. Development and validation of a multiresidue method for pesticide analysis in honey by UFLC-MS

    Directory of Open Access Journals (Sweden)

    Adriana M. Zamudio S.

    2017-05-01

    A method for the determination of pesticide residues in honey by ultra-fast liquid chromatography coupled with mass spectrometry was developed. For this purpose, different variations of the QuEChERS method were tested: (i) amount of sample, (ii) type of salt used to control pH, (iii) buffer pH, and (iv) different mixtures for clean-up. In addition, to demonstrate that the method is reliable, different validation parameters were studied: accuracy, limits of detection and quantification, linearity and selectivity. The results showed that, by means of the changes introduced, it was possible to obtain a more selective method that improves the accuracy for about 19 pesticides selected from the original method. It was found that the method is suitable for the analysis of 50 out of 56 pesticides. Furthermore, with the developed method, recoveries between 70 and 120% and relative standard deviations below 15% were obtained.

  15. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  16. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  17. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  18. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  19. The development of methods of analysis of documents on the basis of the methods of Raman spectroscopy and fluorescence analysis

    Science.gov (United States)

    Gorshkova, Kseniia O.; Tumkin, Ilya I.; Kirillova, Elizaveta O.; Panov, Maxim S.; Kochemirovsky, Vladimir A.

    2017-05-01

    An investigation of the natural aging of writing inks printed on paper using Raman spectroscopy was performed. Based on the obtained dependencies of the ratios of Raman peak intensities on the exposure time, a dye degradation model was proposed. It is suggested that there are several competing bond-breaking and bond-forming reactions, corresponding to the characteristic vibration frequencies of the dye molecule, that occur simultaneously during the ink aging process. We also propose a methodology based on the study of the optical properties of paper, particularly changes in the fluorescence of the optical brighteners included in its composition as well as the paper reflectivity, using spectrophotometric methods. These results can be used to develop a novel and promising method for forensic document examination.
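
    The abstract reports only that peak-intensity ratios were tracked against exposure time. As a generic illustration of that kind of analysis, the sketch below fits an assumed first-order (exponential) degradation model to made-up ratio data with scipy; neither the model form nor the data come from the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical data: ratio of two Raman peak intensities vs. exposure time [days]
    t_days = np.array([0, 7, 14, 30, 60, 90, 180])
    ratio = np.array([1.00, 0.93, 0.87, 0.76, 0.62, 0.53, 0.38])

    def first_order(t, r_inf, amp, k):
        """Assumed degradation model: the ratio decays exponentially toward a plateau r_inf."""
        return r_inf + amp * np.exp(-k * t)

    popt, pcov = curve_fit(first_order, t_days, ratio, p0=(0.3, 0.7, 0.01))
    r_inf, amp, k = popt
    print(f"plateau={r_inf:.2f}, amplitude={amp:.2f}, rate={k:.4f} 1/day")
    ```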

  20. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future are discussed. Second, the importance of task analysis methods in rapid e-learning is considered, together with learning technologies such as asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, all steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches, such as m-learning and other delivery systems, are discussed. As a result, the concept of task analysis, rapid e-learning development strategies and the essentials of online course design are discussed, alongside learner interface design features for learners and designers.

  1. Adjoint-based Mesh Optimization Method: The Development and Application for Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    Son, Seongmin; Lee, Jeong Ik

    2016-01-01

    In this research, a method for optimizing the mesh distribution is proposed. The proposed method uses an adjoint-based optimization method (adjoint method). The optimized result is obtained by applying this meshing technique to the existing code input deck and is compared to the results produced with the uniform meshing method. Numerical solutions are calculated with an in-house 1D finite difference method code, neglecting axial conduction. The fuel radial node optimization was performed first, to best match the fuel centerline temperature (FCT). This was followed by optimizing the axial nodes to best match the peak cladding temperature (PCT). After obtaining the optimized radial and axial nodes, the nodalization was implemented in the system analysis code and transient analyses were performed to observe the performance of the optimized nodalization. The adjoint-based mesh optimization method developed in this study is applied to MARS-KS, a nuclear system analysis code. Results show that the newly established method yields better results than the uniform meshing method from the numerical point of view. It is again stressed that a mesh optimized for the steady state can also give better numerical results even during a transient analysis.

  2. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  3. Development of high performance liquid chromatography method for miconazole analysis in powder sample

    Science.gov (United States)

    Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.

    2017-02-01

    A simple high performance liquid chromatography (HPLC) method has been developed in this study for the analysis of miconazole, an antifungal drug, in a powder sample. The optimized HPLC separation was achieved on a C8 column with a mobile phase of methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range from 10 to 50 mg/L with r² of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) obtained were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable to the determination of miconazole in the powder sample, with a recovery of 101.28% (RSD = 0.96%, n = 3). The developed HPLC method provides short analysis time, high reproducibility and high sensitivity.
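
    The following illustrative sketch reproduces the routine calculations behind such a validation (linearity, LOD/LOQ and recovery) using hypothetical peak areas over the record's 10-50 mg/L range; the 3.3σ/S and 10σ/S conventions are an assumption, since the record does not state which definition was used.

    ```python
    # Illustrative only: linear calibration, LOD/LOQ from the common 3.3*sigma/S and
    # 10*sigma/S conventions, and a spiked-sample recovery check. Peak areas are hypothetical.
    import numpy as np

    conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])          # mg/L standards
    area = np.array([152.0, 301.0, 462.0, 608.0, 759.0])      # hypothetical peak areas

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
    sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))  # residual SD

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"r^2 = {r2:.4f}, LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")

    # Recovery of a spiked powder sample (hypothetical measured areas, n = 3).
    spiked_conc = 30.0
    measured = (np.array([459.0, 463.0, 466.0]) - intercept) / slope
    recovery = measured / spiked_conc * 100
    print(f"recovery = {recovery.mean():.2f}% "
          f"(RSD = {recovery.std(ddof=1) / recovery.mean() * 100:.2f}%)")
    ```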

  4. Deaf: A Concept Analysis From a Cultural Perspective Using the Wilson Method of Concept Analysis Development.

    Science.gov (United States)

    Pendergrass, Kathy M; Newman, Susan D; Jones, Elaine; Jenkins, Carolyn H

    2017-07-01

    The purpose of this article is to provide an analysis of the concept Deaf to increase health care provider (HCP) understanding from a cultural perspective. Deaf signers, people with hearing loss who communicate primarily in American Sign Language (ASL), generally define the term Deaf as a cultural heritage. In the health care setting, the term deaf is most often defined as a pathological condition requiring medical intervention. When HCPs are unaware that there are both cultural and pathological views of hearing loss, significant barriers may exist between the HCP and the Deaf individual. The concept of Deaf is analyzed using the Wilsonian method. Essential elements of the concept "Deaf" from a cultural perspective include a personal choice to communicate primarily in ASL and identify with the Deaf community. Resources for HCPs are needed to quickly identify Deaf signers and provide appropriate communication.

  5. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued there, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a "standard-free method for quantitative analysis" made it possible to analyze infinitesimal samples, powdered samples and untreated bio samples, which could not be analyzed quantitatively in the past. The "standard-free method" and a "powdered internal standard method" made the target preparation process much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, avoiding any ambiguity arising from complicated target preparation processes. (author)

  6. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available Lattice composite fuselage structures are developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The layout of lattice structures makes it possible to exploit the advantages of current composite materials to the maximum extent while minimizing their main shortcomings, which provides higher weight efficiency for these structures in comparison with conventional analogues. The development and creation of lattice composite structures require novel methods of strength analysis, since conventional methods, as a rule, are aimed at the strength analysis of thin-walled elements and do not give a confident estimation of the local strength of highly loaded unidirectional composite ribs. In the present work, a method for efficient strength analysis of lattice composite structures is presented, based on specialized FE models of unidirectional composite ribs and their intersections. Within the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition that the stiffness characteristics of the real rib and of the model are equal. This approach allows local strength analysis of highly loaded ribs of the lattice structure without the use of three-dimensional finite elements, which shortens calculation time and significantly simplifies the analysis of the calculation results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype of a lattice composite fuselage section shell were used. The prototype of the lattice section was manufactured at CRISM and tested at TsAGI within the framework of a number of Russian and international scientific projects. The validation results show that the suggested method provides highly efficient strength analysis, keeping

  7. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    International Nuclear Information System (INIS)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs

  8. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

    Full Text Available Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in the Empirical Bayes (EB) method, which estimates safety from both observed crash records and the crash frequency predicted for similar sites. The performance of the EB method is highly dependent on the selection of a reference group of sites (i.e., roadway segments or intersections) similar to the target site, from which the safety performance functions (SPFs) used to predict crash frequency are developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select "similar" sites for building SPFs are developed. The performance of the clustering-based EB methods is then compared using real crash data. The HSID results, computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeneous crash data can further improve HSID accuracy.
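
    A much-simplified sketch of the clustering-based EB idea is given below (it is not the authors' code): K-means, one of the three clustering methods named, groups the sites, the SPF prediction is approximated by the cluster mean crash frequency, and a negative-binomial-style EB weight blends prediction and observation. The overdispersion parameter and the site data are hypothetical.

    ```python
    # Simplified clustering-based EB ranking: cluster sites on attributes, use the
    # cluster mean crash frequency as an SPF proxy, form the EB estimate, rank sites.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n = 200
    features = np.column_stack([rng.uniform(0.2, 3.0, n),    # segment length (mi)
                                rng.uniform(2e3, 3e4, n)])   # AADT
    crashes = rng.poisson(2.0, n)                            # observed crash counts
    phi = 2.5                                                # assumed NB overdispersion

    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
        (features - features.mean(0)) / features.std(0))

    mu = np.array([crashes[labels == labels[i]].mean() for i in range(n)])  # SPF proxy
    w = 1.0 / (1.0 + mu / phi)                 # EB weight toward the SPF prediction
    eb = w * mu + (1.0 - w) * crashes          # EB-adjusted expected crash frequency

    hotspots = np.argsort(eb)[::-1][:10]       # top-ranked candidate hotspots
    print("top 10 site indices:", hotspots)
    ```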

  9. Development of TRU waste mobile analysis methods for RCRA-regulated metals

    International Nuclear Information System (INIS)

    Mahan, C.A.; Villarreal, R.; Drake, L.; Figg, D.; Wayne, D.; Goldstein, S.

    1998-01-01

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Glow-discharge mass spectrometry (GD-MS), laser-induced breakdown spectroscopy (LIBS), dc-arc atomic-emission spectroscopy (DC-ARC-AES), laser-ablation inductively-coupled-plasma mass spectrometry (LA-ICP-MS), and energy-dispersive x-ray fluorescence (EDXRF) were identified as potential solid-sample analytical techniques for mobile characterization of TRU waste. Each technology developer was provided with surrogate TRU waste samples in order to develop an analytical method. Following successful development of the analytical methods, five performance evaluation samples were distributed to each of the researchers in a blind round-robin format. Results of the round robin were compared to known values and to Transuranic Waste Characterization Program (TWCP) data quality objectives. Only two techniques, DC-ARC-AES and EDXRF, were able to complete the entire project. Method development for GD-MS and LA-ICP-MS was halted due to the stand-down at the CMR facility. Results of the round-robin analysis are given for the EDXRF and DC-ARC-AES techniques. While DC-ARC-AES met several of the data quality objectives, the performance of the EDXRF technique by far surpassed that of DC-ARC-AES. EDXRF is a simple, rugged, field-portable instrument that appears to hold great promise for mobile characterization of TRU waste. The performance of this technique needs to be tested on real TRU samples in order to assess interferences from actinide constituents. In addition, mercury and beryllium analysis will require another analytical technique, because the EDXRF method failed to meet the TWCP data quality objectives for these elements. Mercury analysis is easily accomplished on solid samples by cold vapor atomic fluorescence (CVAFS). Beryllium can be analyzed by any of a variety of emission techniques.

  10. Development of a Probabilistic Tsunami Hazard Analysis Method and Application to an NPP in Korea

    International Nuclear Information System (INIS)

    Kim, M. K.; Choi, Ik

    2012-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In the tsunami hazard analysis, evaluation of the tsunami return period is a major task. The tsunami return period was evaluated with an empirical method using historical tsunami records and tidal gauge records. For the tsunami fragility analysis, a procedure was established and the target equipment and structures for the tsunami fragility assessment were selected. A sample fragility calculation was performed for equipment in a nuclear power plant. For the system analysis, the accident sequence of a tsunami event was developed according to the tsunami run-up and draw-down, and the tsunami-induced core damage frequency (CDF) was determined. For application to a real nuclear power plant, the Ulchin 5 and 6 NPP, located on the east coast of the Korean peninsula, was selected. Through this study, the whole tsunami PSA (Probabilistic Safety Assessment) working procedure was established and an example calculation was performed for one nuclear power plant in Korea.
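
    The record does not give the exact empirical formulation; the sketch below shows one common rank-based estimate of annual exceedance frequency and return period from a historical run-up list, with a hypothetical 100-year record and invented heights.

    ```python
    # A minimal empirical return-period estimate: annual exceedance frequency from a
    # ranked historical run-up list using a Weibull-type plotting position.
    import numpy as np

    record_years = 100
    runup_m = np.array([0.3, 0.5, 0.7, 1.2, 1.8, 2.5, 4.1])   # historical run-up heights (m)

    heights = np.sort(runup_m)[::-1]                 # largest first
    rank = np.arange(1, len(heights) + 1)
    annual_exceedance = rank / (record_years + 1)    # exceedances per year (approximate)
    return_period = 1.0 / annual_exceedance

    for h, T in zip(heights, return_period):
        print(f"run-up >= {h:.1f} m: return period ~ {T:.0f} years")
    ```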

  11. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. Online digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several online-measured system parameters. SCOMS calculates the variables of the limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology that statistically combines the uncertainty components of the SMART core protection and monitoring systems. By applying the overall uncertainty factors in the on-line SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative with a 95/95 probability/confidence level. In this paper, the uncertainty analysis method for the SMART core protection and monitoring systems is described.
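
    As a loose illustration only (the KAERI methodology is not reproduced here), the sketch below combines hypothetical 1-sigma uncertainty components by root-sum-of-squares and applies a one-sided 95% normal multiplier as a penalty on DNBR; a rigorous 95/95 statement would instead use a tolerance factor that accounts for sample size.

    ```python
    # Highly simplified: combine independent relative uncertainties and apply the
    # resulting penalty factor to a hypothetical on-line calculated DNBR.
    import math

    components = {                  # hypothetical 1-sigma relative uncertainties
        "measurement": 0.020,
        "code model": 0.035,
        "manufacturing": 0.015,
        "power distribution": 0.025,
    }

    sigma_total = math.sqrt(sum(s ** 2 for s in components.values()))
    penalty = 1.0 + 1.645 * sigma_total        # one-sided 95% multiplier (known-sigma case)

    dnbr_nominal = 1.50                        # hypothetical on-line calculated DNBR
    dnbr_conservative = dnbr_nominal / penalty # penalized value compared against the limit
    print(f"combined sigma = {sigma_total:.3f}, penalty factor = {penalty:.3f}, "
          f"conservative DNBR = {dnbr_conservative:.3f}")
    ```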

  12. Development of seismic analysis method considered FSI effect on a neutron reflector for APWR reactor internals

    Energy Technology Data Exchange (ETDEWEB)

    Hideyuki, Morika; Tomomichi, Nakamura [Mitsubishi Heavy Industries Ltd., Takasago R and D Center, Hyogo (Japan); Toshio, Ichikawa; Kazuo, Hirota; Hiroyuki, Murakiso [Mitsubishi Heavy Industries Ltd., Kobe Shipyard and Machinery Works, Hyogo, Kobe (Japan); Minoru, Murota [Japan Atomic Power Co., Tokyo (Japan)

    2004-07-01

    A Neutron Reflector (NR) is a new structure designed to improve the structural reliability of Advanced Pressurized Water Reactors (APWR). The NR is separated from the Core Barrel (CB) by only a narrow gap. For a structure surrounded by liquid in a narrow gap, the added fluid mass and damping increase compared with those in air. This is known as the Fluid-Structure Interaction (FSI) effect in a narrow gap, and it depends on the vibration displacement of the structure. A new method to estimate the added fluid damping for this case, based on a narrow-passage flow theory, was introduced by some of the authors in 2001 (Morita et al., 2001). Following this theory, a vibration test was performed to assess the appropriateness of the analysis method employed to evaluate the response of the NR during an earthquake (Nakamura et al., 2002). In this paper, results of a model test are shown, comparing the data with values calculated by the new analysis method, which combines the above method with the ANSYS computer code. As a result, a new seismic analysis method using the above theory was developed. The analytical results are in good agreement with the test results. (authors)

  13. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes in human factors studies, traditional methods reveal weaknesses in dealing with the cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working-memory relief points of the procedures, and the working memory load. The proposed method is relatively simple and, possibly being incorporated in a full task analysis scheme, directly applicable to the design and evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system was developed to support the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  15. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek

    2017-12-01

    Full Text Available The paper deals with the evaluation of a 3D scanning method elaborated by the authors, applied to the analysis of the wear of forging tools. The 3D scanning method consists, in the first place, in analyzing changes in the geometry of a forging tool by comparing scans of a worn tool with a CAD model or with a scan of a new tool. The method was evaluated in the context of the important measurement problems resulting from the extreme conditions present in industrial hot forging processes. The method was used to evaluate the wear of tools with an increasing degree of wear, which made it possible to determine the wear characteristics as a function of the number of produced forgings. The following stage was to use it for direct control of the quality and geometry changes of forging tools (without their disassembly) by direct measurement of the geometry of periodically collected forgings (an indirect method based on forgings). The final part of the study points out the advantages and disadvantages of the elaborated method as well as the potential directions of its further development.
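
    The geometry-comparison step can be illustrated with a minimal point-cloud sketch (not the authors' software): wear is quantified as the nearest-neighbour deviation of a scanned worn-tool cloud from a reference cloud, assuming the two clouds are already aligned. Both clouds below are synthetic.

    ```python
    # Synthetic example: deviation of a "worn" scan from a reference surface using a KD-tree.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    # Reference surface: a flat die face sampled on a grid (coordinates in mm).
    x, y = np.meshgrid(np.linspace(0, 50, 80), np.linspace(0, 50, 80))
    reference = np.column_stack([x.ravel(), y.ravel(), np.zeros(x.size)])

    # "Worn" scan: the same face with a shallow wear crater plus scanner noise.
    z_wear = -0.4 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 60.0)
    worn = np.column_stack([x.ravel(), y.ravel(),
                            z_wear.ravel() + rng.normal(0, 0.01, x.size)])

    dist, _ = cKDTree(reference).query(worn)     # deviation of every scanned point
    print(f"mean deviation: {dist.mean():.3f} mm, max (deepest wear): {dist.max():.3f} mm")
    print(f"worn area fraction (> 0.05 mm): {(dist > 0.05).mean():.1%}")
    ```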

  16. k₀-neutron activation analysis based method at CDTN: history, development and main achievements

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Ângela de B.C.; Jacimovic, Radojko; Dalmazio, Ilza, E-mail: menezes@cdtn.br, E-mail: id@cdtn.br, E-mail: radojko.jacimovic@ijs.si [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte - MG (Brazil); Jožef Stefan Institute, Department of Environmental Sciences, Ljubljana (Slovenia)

    2017-11-01

    Neutron Activation Analysis (NAA) is an analytical technique used to assay the elemental chemical composition of samples of several matrices. It has been applied by the Laboratory for Neutron Activation Analysis, located at the Centro de Desenvolvimento da Tecnologia Nuclear (Nuclear Technology Development Centre)/Comissao Nacional de Energia Nuclear (Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the start-up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of this technique, the k₀-standardization method, which was established at CDTN in 1995, is the most efficient; in 2003 it was re-established and optimized. This paper is about the history and the main achievements since then. (author)

  17. Development of Compressive Failure Strength for Composite Laminate Using Regression Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Keon [Agency for Defense Development, Daejeon (Korea, Republic of); Lee, Jeong Won; Yoon, Dong Hyun; Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2016-10-15

    This paper provides the compressive failure strength values of composite laminates developed by using a regression analysis method. The composite material in this document is a Carbon/Epoxy unidirectional (UD) tape prepreg (Cycom G40-800/5276-1) cured at 350°F (177°C). The operating temperature range is –60°F to +200°F (–55°C to +95°C). A total of 56 compression tests were conducted on specimens from eight (8) distinct laminates that were laid up with standard angle layers (0°, +45°, –45° and 90°). The ASTM-D-6484 standard was used as the test method. The regression analysis was performed with the response variable being the laminate ultimate fracture strength and the regressor variables being the two ply orientations (0° and ±45°).
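
    The regression setup described above can be illustrated with a small least-squares sketch; the strength values and ply percentages below are hypothetical, not the paper's 56 test results.

    ```python
    # Illustrative regression: laminate compressive failure strength versus the
    # percentages of 0-degree and +/-45-degree plies. All values are hypothetical.
    import numpy as np

    # Columns: % of 0-degree plies, % of +/-45-degree plies (remainder is 90-degree plies).
    X_raw = np.array([[50, 40], [40, 50], [30, 60], [25, 50],
                      [20, 60], [60, 30], [10, 80], [0, 100]], dtype=float)
    strength = np.array([620, 560, 500, 470, 440, 680, 380, 330], dtype=float)  # MPa

    X = np.column_stack([np.ones(len(X_raw)), X_raw])      # add intercept column
    coef, *_ = np.linalg.lstsq(X, strength, rcond=None)
    b0, b1, b2 = coef
    print(f"strength ~ {b0:.1f} + {b1:.2f}*(%0deg) + {b2:.2f}*(%±45deg)  [MPa]")

    # Predicted strength for a quasi-isotropic layup (25% 0-deg, 50% ±45-deg plies).
    print(f"prediction: {b0 + b1 * 25 + b2 * 50:.0f} MPa")
    ```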

  20. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh

    2012-05-01

    Full Text Available Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse-phase high performance liquid chromatography (RP-HPLC) method for the quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution on a Knauer C18 column (5 μm, 4.6 × 150 mm) with a mobile phase consisting of acetonitrile and ammonium acetate buffer at a flow rate of 1.5 ml/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r² > 0.99) over the analytical range of 125–2000 ng/ml. For all quality control (QC) standards in the intraday and interday assays, the accuracy and precision ranges were −0.96 to 6.30 and 0.86 to 13.74, respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedure. Conclusion: The rapid and sensitive method developed can therefore be used for the routine analysis of sirolimus, such as in dissolution and stability assays of pre- and post-marketing dosage forms.

  1. Development of quantitative methods for spill response planning: a trajectory analysis planner

    International Nuclear Information System (INIS)

    Galt, J.A.; Payton, D.L.

    1999-01-01

    In planning for response to oil spills, a great deal of information must be assimilated. Typically, geophysical flow patterns, ocean turbulence, complex chemical processes, the ecological setting, fisheries activities, the economics of land use, and engineering constraints on response equipment all need to be considered. This presents a formidable analysis problem. It can be shown, however, that if an appropriate set of evaluation data is available, an objective function and appropriate constraints can be formulated. From these equations, the response problem can be cast in terms of game theory or decision analysis, and an optimal solution can be obtained using common scarce-resource allocation methods. The optimal solution obtained by this procedure maximises the expected return over all possible implementations of a given set of response options. While considering the development of an optimal spill response, it is useful to consider whether, in the absence of complete data, implementing some subset of these methods can provide relevant and useful information for the spill planning process, even though it may fall short of a statistically optimal solution. In this work we introduce a trajectory analysis planning (TAP) methodology that can provide a cohesive framework for integrating physical transport processes, the environmental sensitivity of regional sites, and potential response options. This trajectory analysis planning methodology can be shown to implement a significant part of the game theory analysis and to provide 'minimum regret' strategy advice, without actually carrying out the optimisation procedures. (Author)
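
    The scarce-resource allocation step can be illustrated with a toy linear program (it is not the TAP software): a limited boom inventory is split between candidate protection sites to maximize the expected value protected. All site values and quantities are invented.

    ```python
    # Toy allocation of a limited boom inventory across protection sites, solved as
    # a linear program; linprog minimizes, so the value is negated.
    import numpy as np
    from scipy.optimize import linprog

    value_per_m = np.array([3.0, 5.0, 2.0, 4.0])   # expected value protected per metre of boom
    max_need_m = np.array([400, 250, 600, 300])    # boom each site can usefully absorb
    boom_available = 800                           # total boom on hand (m)

    c = -value_per_m                               # maximize value = minimize its negative
    A_ub = np.ones((1, 4)); b_ub = [boom_available]
    bounds = [(0, need) for need in max_need_m]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("boom allocated per site (m):", np.round(res.x, 1))
    print("expected value protected:", -res.fun)
    ```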

  2. Development of a micropulverized extraction method for rapid toxicological analysis of methamphetamine in hair.

    Science.gov (United States)

    Miyaguchi, Hajime; Kakuta, Masaya; Iwata, Yuko T; Matsuda, Hideaki; Tazawa, Hidekatsu; Kimura, Hiroko; Inoue, Hiroyuki

    2007-09-07

    We developed a rapid sample preparation method for the toxicological analysis of methamphetamine and amphetamine (the major metabolite of methamphetamine) in human hair by high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS), to facilitate fast screening and quantitation. Two milligrams of hair were mechanically micropulverized for 5 min in a 2-ml plastic tube together with 100 microl of an aqueous solvent containing 10% acetonitrile, 100 mM trifluoroacetic acid and the corresponding deuterium analogues as internal standards. The pulverizing highly disintegrated the hair components, simultaneously allowing the extraction of any drugs present in the hair. After filtering the suspension with a membrane-filter unit, the clear filtrate was directly analyzed by HPLC-MS/MS. No evaporation processes were required for sample preparation. Method optimization and validation study were carried out using real-case specimens and fortified samples in which the drugs had been artificially absorbed, respectively. Concentration ranges for quantitation were 0.040-125 and 0.040-25 ng/mg for methamphetamine and amphetamine, respectively. Real-case specimens were analyzed by the method presented here and by conventional ones to verify the applicability of our method to real-world analysis. Our method took less than 30 min for a set of chromatograms to be obtained from a washed hair sample.

  3. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  4. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    Full Text Available A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples using solid phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to achieve the enrichment of trace hydrocarbons with low abundance and was coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature, and fiber type, were systematically optimized. The results demonstrated not only a high extraction yield but also that no hydrogen isotopic fractionation was observed during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The applications of the SPME-GC/IRMS method were evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for reproducible measurements, and the hydrogen isotope values from C1 to C9 can be obtained with satisfactory repeatability. The SPME-GC/IRMS method fitted with the PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons, and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.
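
    The isotope-ratio output of such a GC/IRMS measurement is conventionally reported in delta notation relative to VSMOW; the small worked example below uses the standard conversion with hypothetical measured ratios.

    ```python
    # Standard delta-2H conversion; the measured 2H/1H ratios below are hypothetical.
    R_VSMOW = 155.76e-6          # 2H/1H ratio of the VSMOW reference water

    def delta2H(r_sample, r_reference=R_VSMOW):
        """delta-2H in permil relative to the reference."""
        return (r_sample / r_reference - 1.0) * 1000.0

    # Hypothetical measured 2H/1H ratios for methane (C1) and ethane (C2) peaks.
    for name, r in [("C1", 127.0e-6), ("C2", 133.5e-6)]:
        print(f"{name}: delta2H = {delta2H(r):+.0f} permil")
    ```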

  6. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering; Behravesh, M.M. [Electric Power Research Institute, Palo Alto, CA (United States); Henry, G. [EPRI NDE Center, Charlotte, NC (United States)

    1999-09-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  7. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    Science.gov (United States)

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

    Most analytical methods for butyltins are based on high-resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using High Performance Liquid Chromatography (HPLC) with UV detection. The developed method was used to determine tributyltin (TBT), dibutyltin (DBT) and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed using standard GC/MS techniques and verified by a statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) of TBT and DBT were 0.70 and 0.50 microg/mL, respectively. The optimised extraction method for butyltins in water and sediment samples involved using hexane containing 0.05-0.5% tropolone and 0.2% sodium chloride in water at pH 1.7. Quantitative extraction of butyltin compounds from a certified reference material (BCR-646) and naturally contaminated samples was achieved, with recoveries ranging from 95 to 108% and RSD of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the level of butyltin contamination in environmental samples collected from the Forth and Clyde canal, Scotland, UK. The values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high-resolution methods are used extensively for this type of research, the developed method is cheaper in terms of both equipment and running costs, faster in analysis time and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.

  8. Development of 3D CFD simulation method in nuclear reactor safety analysis

    International Nuclear Information System (INIS)

    Rosli Darmawan; Mariah Adam

    2012-01-01

    One of the most prevailing issues in the operation of a nuclear reactor is the safety of the system. Worldwide publicity on a few nuclear accidents, as well as the notorious Hiroshima and Nagasaki bombings, has always brought about public fear of anything related to nuclear technology. Most findings on nuclear reactor accidents are closely related to the reactor cooling system. Thus, understanding the behaviour of the reactor cooling system is very important to ensure that the development and improvement of safety can be carried out continuously. Throughout the development of nuclear reactor technology, investigation and analysis of reactor safety have gone through several phases. In the early days, analytical and experimental methods were employed. For the last three decades, 1D system-level codes have been widely used. The continuous development of nuclear reactor technology has brought about more complex systems and processes in nuclear reactor operation. More detailed, multi-dimensional simulation codes are needed to assess these new reactors. This paper discusses the development of 3D CFD usage in nuclear reactor safety analysis worldwide. A brief review of the usage of CFD at Malaysia's Reactor TRIGA PUSPATI is also presented. (author)

  9. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  10. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    ... The research methodology applied in the project combines a literature study of descriptions of methodical approaches and built examples with a sensitivity analysis and a qualitative interview with two designers from a best-practice example of a practice that has achieved environmentally sustainable ... architecture, such as: ecological, green, bio-climatic, sustainable, passive, low-energy and environmental architecture. This PhD project sets out to gain a better understanding of environmentally sustainable architecture and the methodical approaches applied in the development of this type of architecture ... an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands on the energy consumption of buildings. The publications in the field refer to many different approaches to environmentally sustainable...

  11. Development of Quality Control Method for Glucofarmaka Antidiabetic Jamu by HPLC Fingerprint Analysis

    Directory of Open Access Journals (Sweden)

    Hanifullah Habibie

    2017-04-01

    Full Text Available Herbal medicines are becoming increasingly popular all over the world for preventive and therapeutic purposes. Quality control of herbal medicines is important to ensure their safety and efficacy. Chromatographic fingerprinting has been accepted by the World Health Organization as one reliable strategy for the quality control of herbal medicines. In this study, a high-performance liquid chromatography fingerprint analysis was developed as a quality control method for glucofarmaka antidiabetic jamu. The optimum fingerprint chromatogram was obtained using C18 as the stationary phase and linear gradient elution with 10–95% acetonitrile:water as the mobile phase within 60 minutes of elution and detection at 210 nm. About 20 peaks were detected and could be used as the fingerprint of glucofarmaka jamu. To evaluate the analytical performance of the method, we determined the precision, reproducibility, and stability; the results were reliable. The proposed method could be used as a quality control method for glucofarmaka antidiabetic jamu and also for its raw materials.
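
    A typical follow-on use of such a fingerprint, though not detailed in the record, is similarity scoring of a production batch against the reference chromatogram; the sketch below compares hypothetical peak-area vectors with a correlation coefficient and cosine similarity, and the 0.95 acceptance threshold is an assumption.

    ```python
    # Batch-to-reference fingerprint similarity on invented peak-area vectors.
    import numpy as np

    reference = np.array([12, 35, 8, 50, 22, 5, 17, 40, 9, 28], dtype=float)
    batch     = np.array([11, 33, 9, 47, 25, 4, 18, 38, 10, 26], dtype=float)

    corr = np.corrcoef(reference, batch)[0, 1]
    cosine = reference @ batch / (np.linalg.norm(reference) * np.linalg.norm(batch))
    print(f"correlation = {corr:.3f}, cosine similarity = {cosine:.3f}")
    print("batch accepted" if corr >= 0.95 else "batch flagged for review")
    ```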

  12. Analysis and development of methods of correcting for heterogeneities to cobalt-60: computing application

    International Nuclear Information System (INIS)

    Kappas, K.

    1982-11-01

    The purpose of this work is to analyse the influence of inhomogeneities of the human body on the determination of the dose in Cobalt-60 radiation therapy. The first part is dedicated to the physical characteristics of inhomogeneities and to the conventional methods of correction. New methods of correction are proposed based on an analysis of the scatter. This analysis makes it possible to take into account, with greater accuracy, the physical characteristics of the inhomogeneities and the corresponding modifications of the dose: the "differential TAR method" and the "Beam Subtraction Method". The second part is dedicated to the computer implementation of the second method of correction for routine application in hospital. [fr]

  13. Development of mechanical analysis module for simulation of SFR fuel rod behavior using finite element method

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Andong; Jeong, Hyedong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, Hyochan; Yang, Yongsik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Since the Korean SFR developer decided to adopt metal fuel, the current study focused on metal fuel instead of oxide fuel. The SFR metal fuel has been developed by the Korea Atomic Energy Research Institute (KAERI), and many efforts have focused on designing and manufacturing the metal fuel. Since the nuclear fuel is the first barrier against radioactive isotope release, the fuel's integrity must be secured within an acceptable range during steady-state operation and accident conditions. Whereas the design and evaluation methodologies, code systems and test procedures for light water reactor fuel are sufficiently established, those for SFR fuel need more technical advances. From the regulatory point of view, there are still many challenging issues which must be addressed to secure the safety of the fuel and reactors. For this reason, the Korea Institute of Nuclear Safety (KINS) has launched a new project to develop regulatory technology for the SFR system, including the fuel area. The ALFUS code was developed by CRIEPI and employs mechanistic models for fission gas release and swelling of the fuel slug. In that code system, a finite element method was introduced to analyze the mechanical behavior of the fuel and cladding. The FEAST code is a more advanced code system for SFR which adopted mechanistic FGR and swelling models but still uses an analytical model to simulate fuel and cladding mechanical behavior. Based on the survey of previous studies, the fuel and cladding mechanical models should be improved. Analysis of the mechanical behavior of the fuel rod is crucial to evaluate the overall rod integrity. In addition, contact between the fuel slug and the cladding or over-pressurization by the rod internal pressure can cause rod failure during steady-state and other operating conditions. Most of the reference codes have a simplified mechanical analysis model, a so-called 'analytical model', because a detailed mechanical analysis requires a large amount of calculation time and computing power. Even

  15. Development of a sensitive and rapid method for rifampicin impurity analysis using supercritical fluid chromatography.

    Science.gov (United States)

    Li, Wei; Wang, Jun; Yan, Zheng-Yu

    2015-10-10

    A novel, simple, fast and efficient supercritical fluid chromatography (SFC) method was developed and compared with an RPLC method for the separation and determination of impurities in rifampicin. The separation was performed using a packed diol column and a mobile phase B (modifier) consisting of methanol with 0.1% ammonium formate (w/v) and 2% water (v/v). Overall satisfactory resolution and peak shapes for rifampicin quinone (RQ), rifampicin (RF), rifamycin SV (RSV), rifampicin N-oxide (RNO) and 3-formylrifamycin SV (3-FR) were obtained by optimization of the chromatographic system. With gradient elution of the mobile phase, all of the impurities and the active were separated within 4 min. Taking full advantage of the features of SFC (such as its particular selectivity, non-sloping baseline in gradient elution, and absence of injection solvent effects), the method was successfully used for the determination of impurities in rifampicin, with more impurity peaks detected, better resolution achieved and much less analysis time needed compared with conventional reversed-phase liquid chromatography (RPLC) methods.

  16. DEVELOPMENT OF A METHOD STATISTICAL ANALYSIS ACCURACY AND PROCESS STABILITY PRODUCTION OF EPOXY RESIN ED-20

    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are among the most important elements of the production quality assurance system and the total quality management process. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the accuracy specified for the product, and assess the stability of the process. Most random events in practice, particularly in manufacturing and scientific research, are characterized by the presence of a large number of random factors and are described by a normal distribution, which is the principal distribution in many practical studies. Modern statistical methods are quite difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all the characteristics of a batch of products can be obtained, and the mean value and the variance can be determined. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process for the production of epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects of the investigated products was determined. For the assessment of the stability of the ED-20 manufacturing process, Shewhart control charts for quantitative data were selected: charts of individual values X and of the moving range R. Pareto charts were used to identify the causes that affect low dynamic viscosity to the greatest extent. The causes of defects associated with low values of dynamic viscosity were analyzed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
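
    The individuals / moving-range (X-mR) chart limits mentioned above follow from standard constants; the sketch below computes them for hypothetical dynamic-viscosity measurements of ED-20 batches.

    ```python
    # X-mR control chart limits on hypothetical viscosity data.
    import numpy as np

    viscosity = np.array([14.2, 13.8, 14.5, 13.9, 14.1, 12.6, 14.0, 14.3, 13.7, 14.4])  # Pa*s

    mr = np.abs(np.diff(viscosity))          # moving ranges of consecutive batches
    x_bar, mr_bar = viscosity.mean(), mr.mean()

    ucl_x = x_bar + 2.66 * mr_bar            # standard X-chart limits (3/d2 = 2.66 for n = 2)
    lcl_x = x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar                  # D4 = 3.267 for n = 2

    print(f"X chart: LCL = {lcl_x:.2f}, CL = {x_bar:.2f}, UCL = {ucl_x:.2f}")
    print(f"mR chart: UCL = {ucl_mr:.2f}")
    out = np.where((viscosity > ucl_x) | (viscosity < lcl_x))[0]
    print("out-of-control batches:", out)
    ```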

  17. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the frame of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing plant-specific fire risk analyses (fire PSA) that are as realistic as possible. The present Technical Report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the context of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi-type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study, it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering stochastic effects (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided for use within the PSA. (orig.) [de]

  18. CZECHOSLOVAK FOOTPRINTS IN THE DEVELOPMENT OF METHODS OF THERMOMETRY, CALORIMETRY AND THERMAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pavel Holba

    2012-07-01

    Full Text Available A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in underpinning the special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as that of Vladimir Šatava in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) which were published by noteworthy postwar Czechoslovak scholars and scientists and by their disciples in 1950-1980. The itemized 227 references with titles show a rich scientific productivity, revealing that many of these works were ahead of their time, even in an international context.

  19. Compare the user interface of digital libraries' websites between the developing and developed countries in content analysis method

    Directory of Open Access Journals (Sweden)

    Gholam Abbas Mousavi

    2017-03-01

    Full Text Available Purpose: This study was performed with the goals of determining the items used in designing and developing the user interfaces of digital libraries' websites; determining the best digital libraries' websites and discussing their advantages and disadvantages; and analyzing and comparing digital libraries' websites in developing countries with those in developed countries. Methodology: To do so, 50 digital libraries' websites were selected by the purposive sampling method. Based on the level of development of the countries in the sample, 12 websites were classified as belonging to developing countries and 38 to developed countries. Their content was then studied using qualitative content analysis. The study was conducted using a researcher-constructed checklist containing 12 main categories and 44 items, whose validity was established by the content validity method. The data were analyzed in SPSS (version 16). Findings: The results showed that in terms of "online resources", "library collection" and "navigation", there is a significant relationship between the user interface designs of digital libraries in the two types of countries. Results: The items "online public access catalogue (OPAC)" and "visit statistics" were observed in more of the developing countries' digital libraries' websites, whereas the item "menu and submenus to introduce library sections" was present in more of the developed countries' websites. Moreover, by analyzing the number of items in the selected websites, "American Memory" with 44 items, "International Children Digital Library" with 40 items, and "California" with 39 items were the best websites, and "Berkeley Sun Site" with 10 items was the worst. Despite the greater number and better quality of digital libraries in developed countries, the quality of digital library websites in developing countries is considerable. In general, some of the newly established

  20. Chemical sensors and the development of potentiometric methods for liquid media analysis

    International Nuclear Information System (INIS)

    Vlasov, Yu.G.; Kolodnikov, V.V.; Ermolenko, Yu.E.; Mikhajlova, S.S.

    1996-01-01

    Aspects of applying indirect potentiometric determination to chemical analysis are considered. Among them are the standard and modified addition and subtraction methods, the multiple addition method, and potentiometric titration using ion-selective electrodes as indicators. These methods significantly extend the capabilities of ion-selective potentiometric analysis. Conditions for the applicability of the above-mentioned methods to various samples (Cd, REE, Th, iodides and others) are discussed using all available ion-selective electrodes as examples. 162 refs., 2 figs., 5 tabs
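    As an illustration of the indirect potentiometric techniques mentioned above, the single known-addition method estimates an unknown ion concentration from the potential change caused by spiking the sample with a standard. The sketch below implements the textbook known-addition relation; the electrode slope and the volumes are illustrative values, not taken from the cited work.

```python
def known_addition(delta_e_mv, v_sample_ml, v_standard_ml, c_standard, slope_mv=59.16):
    """Single known-addition method for an ion-selective electrode.

    delta_e_mv : potential change after the addition (mV)
    c_standard : concentration of the added standard (same units as the result)
    slope_mv   : Nernstian slope per decade (59.16 mV for a monovalent ion at 25 C)
    """
    ratio = 10 ** (delta_e_mv / slope_mv)
    return (c_standard * v_standard_ml) / ((v_sample_ml + v_standard_ml) * ratio - v_sample_ml)

# Example: 100 mL sample, 1 mL of 0.100 M standard added, potential rises by 8.5 mV
print(f"estimated concentration = {known_addition(8.5, 100.0, 1.0, 0.100):.4e} M")
```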

  1. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The basic statistical methods used in the genetic analysis of human traits are reviewed, including segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods was developed.
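    Segregation analysis of the kind mentioned above typically compares observed offspring counts with the ratios expected under a Mendelian hypothesis. The sketch below is a generic goodness-of-fit test for a hypothetical 3:1 dominant segregation ratio; it is not code from the cited paper.

```python
from scipy.stats import chisquare

# Hypothetical family data: counts of affected vs. unaffected offspring
observed = [148, 52]                       # affected, unaffected
total = sum(observed)
expected = [0.75 * total, 0.25 * total]    # Mendelian 3:1 hypothesis for a dominant trait

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A large p-value means the data are consistent with the hypothesized segregation ratio.
```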

  2. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    Science.gov (United States)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    Related publications: "…Calculation Method Based on a Hierarchical Simulation", Journal of Disaster Research, Vol. 11, No. 4; T. Arikawa, K. Hamaguchi, K. Kitagawa, T. Suzuki (2009): "Development of Numerical Wave Tank Coupled with Structure Analysis Based on FEM", Journal of J.S.C.E., Ser. B2 (Coastal Engineering), Vol. 65, No. 1; T. Arikawa et al. (2012): "Failure Mechanism of Kamaishi Breakwaters due to the Great East Japan Earthquake Tsunami", 33rd International Conference on Coastal Engineering, No. 1191.

  3. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    Kojima, Shigeo; Onoue, Akira; Kawai, Katsunori

    1998-01-01

    This study intends to develop a more sophisticated tool that will advance the current event tree method used in all PSA, and to focus on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis work therefore needs to be reduced, and an automated method and tool is required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To make scenario generation a practical technical tool, a simulation model employing AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, were added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
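    Event-tree sequence construction of the kind automated by such a scenario generator can be pictured as enumerating all branch combinations of the tree headings and then applying construction logic to prune impossible paths. The sketch below is a generic illustration with made-up headings and success probabilities; it does not reproduce the cited tool.

```python
from itertools import product

# Hypothetical event-tree headings with success probabilities (illustrative values)
headings = {
    "reactor_trip":      0.999,
    "high_pressure_inj": 0.95,
    "low_pressure_inj":  0.98,
    "containment_spray": 0.97,
}

def generate_sequences(headings):
    """Enumerate every success/failure combination with its point probability."""
    names = list(headings)
    for outcome in product([True, False], repeat=len(names)):
        p = 1.0
        for name, ok in zip(names, outcome):
            p *= headings[name] if ok else (1.0 - headings[name])
        yield dict(zip(names, outcome)), p

# A real scenario generator would also apply construction logic to prune
# impossible branch combinations; here every combination is kept.
for seq, p in sorted(generate_sequences(headings), key=lambda t: t[1])[:3]:
    print(f"p = {p:.2e}  {seq}")
```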

  4. Development of a micrometre-scale radiographic measuring method for residual stress analysis

    International Nuclear Information System (INIS)

    Moeller, D.

    1999-01-01

    The radiographic method described uses micrometre-scale X-ray diffraction for high-resolution residual stress analysis in single crystals. The focus is on the application of two X-ray optics (glass capillaries) for shaping a sufficiently fine and intensive primary beam. Owing to the application of a suitable single-grain measuring and analysis method, the achieved resolution is applicable to the characteristic grain sizes of many materials. (orig.) [de]

  5. Development of measurement and analysis method for long-term monitoring of {sup 41}K

    Energy Technology Data Exchange (ETDEWEB)

    Yuita, Koichi; Miyagawa, Saburo [National Inst. of Agro-Environmental Sciences, Tsukuba, Ibaraki (Japan)

    2000-02-01

    This study aimed to develop a double labeling method with {sup 41}K and {sup 15}N for animal feed and excreta. Guinea pigs were used as the subjects for the preliminary experiment. Feces and urine were collected separately once a day; the feces were dried at 70°C and the urine was lyophilized. These samples were submitted to analysis after mixing. Then, {sup 41}KCl solution and {sup 15}N-labelled ammonium sulfate solution were absorbed into the conventional guinea pig feed and 1.0 g of the feed was given once a day. The amount of {sup 41}K in feces was determined using a flame photometric detector, and {sup 15}N was determined with an ANCA-SL mass spectrometer. The isotope abundances of {sup 41}K and {sup 15}N in the feed were 6.11% and 0.829%, respectively, and the excess was -0.062% and 0.46% for {sup 41}K and {sup 15}N, respectively. The present results showed that the {sup 15}N labeling of feces was fairly successful, but the {sup 41}K labeling was insufficient. It is therefore thought necessary to use a K tracer with a larger excess % (-0.3% or more) and to raise the accuracy of the analysis for total K and {sup 41}K. (M.N.)

  6. Development of a reliability-analysis method for category I structures

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Reich, M.

    1983-01-01

    The present paper develops a reliability analysis method for category I nuclear structures, particularly for reinforced concrete containment structures subjected to various load combinations. The loads considered here include dead loads, accidental internal pressure and earthquake ground acceleration. For mathematical tractability, earthquake occurrence is assumed to be governed by the Poisson arrival law, while the acceleration history is idealized as a Gaussian vector process of finite duration. The vector process consists of three component processes, each with zero mean. The second-order statistics of this process are specified by a three-by-three spectral density matrix with a multiplying factor representing the overall intensity of the ground acceleration. With respect to accidental internal pressure, the following assumptions are made: (a) it occurs in accordance with the Poisson law; (b) its intensity and duration are random; and (c) its temporal rise and fall behaviors are such that a quasi-static structural analysis applies. The dead load is considered to be a deterministic constant.
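    For load combinations of the kind treated above, a basic preliminary quantity is the probability that two independent Poisson-arriving events (an earthquake and an accidental pressurization of random duration) overlap at least once during the plant lifetime. The sketch below gives a generic Monte Carlo estimate of that coincidence probability with illustrative rates and durations; it is not the analytical model of the cited paper.

```python
import random

def coincidence_probability(rate_eq_per_yr, rate_pr_per_yr, mean_pr_duration_hr,
                            lifetime_yr=40.0, trials=50000, seed=7):
    """Monte Carlo estimate of P(an earthquake occurs while an accidental
    pressure transient is active) over the plant lifetime."""
    rng = random.Random(seed)
    hours_per_year = 8760.0
    hits = 0
    for _ in range(trials):
        # Sample pressure-transient windows (Poisson arrivals, exponential durations)
        windows = []
        t = rng.expovariate(rate_pr_per_yr)
        while t < lifetime_yr:
            duration_yr = rng.expovariate(1.0 / mean_pr_duration_hr) / hours_per_year
            windows.append((t, t + duration_yr))
            t += rng.expovariate(rate_pr_per_yr)
        # Sample earthquake arrival times and test for overlap with any window
        t = rng.expovariate(rate_eq_per_yr)
        overlap = False
        while t < lifetime_yr and not overlap:
            overlap = any(a <= t <= b for a, b in windows)
            t += rng.expovariate(rate_eq_per_yr)
        hits += overlap
    return hits / trials

# Illustrative rates: 0.1 earthquakes/yr, 0.5 pressure transients/yr lasting ~24 h on average
print(f"coincidence probability over 40 yr = {coincidence_probability(0.1, 0.5, 24.0):.2e}")
```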

  7. Method Development of Cadmium Investigation in Rice by Radiochemical Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Promsawad, Arunee; Pareepart, Ratirot; Laoharojanaphand, Sirinart; Arunee, Kongsakpaisal

    2007-08-01

    Full text: A radiochemical neutron activation analysis method for the determination of cadmium was investigated. The chemical separation of cadmium utilized ion-exchange chromatography on a strongly basic anion-exchange resin, BIO-RAD 1X8 (chloride form). An adsorbing medium of 2M HCl was found to be the most suitable among the concentrations attempted (2, 4, 6, 8 and 10M HCl), and the eluent for desorption of the cadmium from the column was 8M NH{sub 3} solution. A chemical yield of 95% was found. The method was evaluated by analyzing certified reference materials containing 0.5 µg/g (SRM 1577b, Bovine Liver) and 2.48 µg/g (SRM 1566b, Oyster Tissue) cadmium. The agreement of the results with the certified values is within 92% for Bovine Liver and 96% for Oyster Tissue. The method developed was applied to determine the cadmium concentrations in contaminated Thai rice, where the cadmium concentrations were found to range from 7.4 to 578.9 ppb

  8. Analysis and development of stochastic multigrid methods in lattice field theory

    International Nuclear Information System (INIS)

    Grabenstein, M.

    1994-01-01

    We study the relation between the dynamical critical behavior and the kinematics of stochastic multigrid algorithms. The scale dependence of acceptance rates for nonlocal Metropolis updates is analyzed with the help of an approximation formula. A quantitative study of the kinematics of multigrid algorithms in several interacting models is performed. We find that for a critical model with Hamiltonian H(Φ), absence of critical slowing down can only be expected if the expansion of ⟨H(Φ+ψ)⟩ in terms of the shift ψ contains no relevant term (mass term). The prediction of this rule was verified in a multigrid Monte Carlo simulation of the sine-Gordon model in two dimensions. Our analysis can serve as a guideline for the development of new algorithms: we propose a new multigrid method for nonabelian lattice gauge theory, the time-slice blocking. For SU(2) gauge fields in two dimensions, critical slowing down is almost completely eliminated by this method, in accordance with the theoretical prediction. The generalization of the time-slice blocking to SU(2) in four dimensions is investigated analytically and by numerical simulations. Compared to two dimensions, the local disorder in the four-dimensional gauge field leads to kinematical problems. (orig.)

  9. A Roadmap of Risk Diagnostic Methods: Developing an Integrated View of Risk Identification and Analysis Techniques

    National Research Council Canada - National Science Library

    Williams, Ray; Ambrose, Kate; Bentrem, Laura

    2004-01-01

    ...), which is envisioned to be a comprehensive reference tool for risk identification and analysis (RI&A) techniques. Program Managers (PMs) responsible for developing or acquiring software-intensive systems typically identify risks in different ways...

  10. Coupling Neumann development and component mode synthesis methods for stochastic analysis of random structures

    Directory of Open Access Journals (Sweden)

    Driss Sarsri

    2014-05-01

    Full Text Available In this paper, we propose a method to calculate the first two moments (mean and variance) of the structural dynamic response of a structure with uncertain variables subjected to random excitation. For this, the Newmark method is used to transform the equation of motion of the structure into a quasi-static equilibrium equation in the time domain. The Neumann development method is then coupled with Monte Carlo simulations to calculate the statistical values of the random response. The use of modal synthesis methods can reduce the dimensions of the model before integration of the equation of motion. Numerical applications have been developed to highlight the effectiveness of the proposed method for analyzing the stochastic response of large structures.
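    The cited approach combines Newmark time integration with a Neumann development (expansion) so that the system need not be refactorized for every random sample. The sketch below illustrates only the plainer part of that idea, namely Newmark integration of a single-degree-of-freedom system inside a direct Monte Carlo loop over an uncertain stiffness and a random excitation, with illustrative parameter values; the Neumann expansion and the component mode synthesis steps are not reproduced.

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, beta=0.25, gamma=0.5):
    """Newmark time integration of m*u'' + c*u' + k*u = f(t) for a SDOF system."""
    n = len(f)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    a[0] = (f[0] - c * v[0] - k * u[0]) / m
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    for i in range(n - 1):
        rhs = (f[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt) + (1 / (2 * beta) - 1) * a[i])
               + c * (gamma / (beta * dt) * u[i] + (gamma / beta - 1) * v[i]
                      + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = rhs / k_eff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u

# Monte Carlo over an uncertain stiffness and a random excitation (illustrative values)
rng = np.random.default_rng(0)
dt, n_steps, n_samples = 0.01, 1000, 200
peak_responses = []
for _ in range(n_samples):
    k = rng.normal(1.0e4, 1.0e3)            # uncertain stiffness, N/m
    f = rng.normal(0.0, 50.0, n_steps)      # white-noise-like excitation, N
    u = newmark_sdof(m=10.0, c=20.0, k=k, f=f, dt=dt)
    peak_responses.append(np.max(np.abs(u)))
print(f"mean peak = {np.mean(peak_responses):.4e} m, variance = {np.var(peak_responses):.2e}")
```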

  11. Development of the complex of nuclear-physical methods of analysis for geology and technology tasks in Kazakhstan

    International Nuclear Information System (INIS)

    Solodukhin, V.; Silachyov, I.; Poznyak, V.; Gorlachev, I.

    2016-01-01

    The paper describes the development of nuclear-physical methods of analysis and their applications to geological and technological tasks in Kazakhstan. The basic methods of this complex include instrumental neutron activation analysis, X-ray fluorescence analysis and instrumental γ-spectrometry. The following aspects are discussed: applications of the developed and adopted analytical techniques to the assessment and calculation of rare-earth metal reserves at various deposits in Kazakhstan, to technology development for mining and extraction from uranium-phosphorus ore and wastes, to radioactive coal gasification technology, and to studies of rare metal contents in chromite, bauxites, black shales and their processing products. (author)

  12. Analysis of factors affecting the development of food crop varieties bred by mutation method in China

    International Nuclear Information System (INIS)

    Wang Zhidong; Hu Ruifa

    2002-01-01

    This research developed a production function for crop varieties bred by the mutation method in order to explore the factors affecting the development of new varieties. It was found that research investment, human capital and radiation facilities were the most important factors affecting the development and cultivation area of new varieties bred through the mutation method. It is concluded that not all institutions involved in breeding activities using the mutation method must have radiation facilities, and that the national government only needs to invest in those key research institutes which have strong research capacities. The research budgets saved can be used to entrust the institutes that have stronger research capacities with irradiating breeding materials developed by the institutes that have weaker research capacities, by which more opportunities to breed better varieties can be created

  13. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    Science.gov (United States)

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  14. Development of a segmentation method for analysis of Campos basin typical reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Rego, Eneida Arendt; Bueno, Andre Duarte [Universidade Estadual do Norte Fluminense Darcy Ribeiro (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao de Petroleo (LENEP)]. E-mails: eneida@lenep.uenf.br; bueno@lenep.uenf.br

    2008-07-01

    This paper presents a master's thesis proposal in Exploration and Reservoir Engineering whose objective is to develop a specific segmentation method for digital images of reservoir rocks that produces better results than the global methods available in the literature for the determination of rock physical properties such as porosity and permeability. (author)
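    Global segmentation methods of the kind used as a baseline in such work typically threshold the grey-level histogram and then read porosity directly from the binary image. The sketch below implements Otsu's global threshold with NumPy on a synthetic image; it is a generic baseline, not the segmentation method proposed in the thesis.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the grey level that maximizes the between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                      # class probabilities
    mu = np.cumsum(p * centers)            # cumulative means
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b = np.nan_to_num(sigma_b)
    return centers[np.argmax(sigma_b)]

# Synthetic "rock" image: dark pores (mean grey 60) in a bright matrix (mean grey 180)
rng = np.random.default_rng(3)
image = rng.normal(180, 20, (256, 256))
pores = rng.random((256, 256)) < 0.25      # 25% of the pixels are pore space
image[pores] = rng.normal(60, 15, pores.sum())

t = otsu_threshold(image)
porosity = np.mean(image < t)              # fraction of pixels classified as pore
print(f"threshold = {t:.1f}, estimated porosity = {porosity:.3f}")
```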

  15. Development of thermal analysis method for the near field of HLW repository using ABAQUS

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-10-01

    An appropriate tool is needed to evaluate the thermo-mechanical stability of a high level radioactive waste (HLW) repository. In this report a thermal analysis methodology for the near field of an HLW repository is developed using ABAQUS, a multi-purpose FEM code that has been used in many engineering areas. The main contents of this methodology development are the structural and material modelling to simulate a repository; the setup of side conditions, e.g. boundary, load and initial conditions; and the procedure for selecting proper material parameters. In addition, interface programs were developed for the effective production of input data and for effective changes of model size in sensitivity analyses for disposal concept development. The results of this work will be applied to evaluate the thermal stability of the HLW repository and will be used as main input data for its mechanical analysis. (author). 20 refs., 15 figs., 5 tabs.

  16. Principles and methods of neutron activation analysis (NAA) in improved water resources development

    International Nuclear Information System (INIS)

    Dim, L. A.

    2000-01-01

    The method of neutron activation analysis (NAA), as it applies to water resources exploration, exploitation and management, has been reviewed and its capabilities demonstrated. NAA has been found to be superior to, and to offer higher sensitivity than, many other analytical techniques in the analysis of water. The implications of the chemical and element concentrations determined in water (water pollution and quality) for environmental impact assessment, aquatic life and human health are briefly highlighted

  17. Developments of the neutron scattering analysis method for the determination of magnetic structures

    Energy Technology Data Exchange (ETDEWEB)

    Park, Je-Geun; Chung, Jae Gwan; Park, Jung Hwan; Kong, Ung Girl [Inha Univ., Incheon (Korea); So, Ji Yong [Seoul National University, Seoul(Korea)

    2001-04-01

    Neutron diffraction is, up to now, almost the only experimental method of determining the magnetic structure of materials, and a very important one. Unlike studies of crystallographic structure, however, the use of neutron diffraction for magnetic structure determination is not easily accessible to non-experts because of the complexity of magnetic group theory, which is very important in magnetic structure analysis. With the recent development of computer codes for magnetic groups, it is now time to rethink these difficulties. In this work, we have used the computer code for the magnetic group (Mody-2) and the Fullprof refinement program in order to study the magnetic structure of YMnO{sub 3} and other interesting materials. YMnO{sub 3} forms in the hexagonal structure and shows both ferroelectric and antiferromagnetic phase transitions. Since it was recently found that YMnO{sub 3} can be used as a nonvolatile memory device, there has been a large amount of applied research on this material. We used neutron diffraction to determine the magnetic structure and, in particular, to investigate the correlation between the order parameters of the ferroelectric and antiferromagnetic phase transitions. From this study, we have demonstrated that with a proper use of the computer code for the magnetic group one can overcome most of the difficulties arising from magnetic group theory. 4 refs., 8 figs., 5 tabs. (Author)

  18. Perfection Of Methods Of Mathematical Analysis For Increasing The Completeness Of Subsoil Development

    Science.gov (United States)

    Fokina, Mariya

    2017-11-01

    The economy of Russia is based to a high degree on the mineral raw material complex. The mining industry is a prioritized and important area. Given the high competitiveness of businesses in this sector, increasing the efficiency of completed work and manufactured products becomes a central issue. Improvement of planning and management in this sector should be based on multivariant studies and the optimization of planning decisions, and on the appraisal of their immediate and long-term results, taking the dynamics of economic development into account. All of this requires the use of economic-mathematical models and methods. Applying an economic-mathematical model to determine the optimal ore mine production capacity, we obtain a figure of 4,712,000 tons. The production capacity of the Uchalinsky ore mine is 1,560 thousand tons, and that of the Uzelginsky ore mine 3,650 thousand tons. Conducting a corresponding analysis of the production of OAO "Uchalinsky Gok", an optimal production plan was obtained: the optimal production of copper is 77,961.4 rubles; the optimal production of zinc is 17,975.66 rubles. The residual production volume of the two main ore mines of OAO "UGOK" is 160 million tons of ore.
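    Optimal production plans of the kind quoted above are typically obtained from a linear programming model that maximizes revenue subject to capacity constraints. The sketch below is a generic two-product example with made-up prices and processing coefficients; it does not use the data of the cited study.

```python
from scipy.optimize import linprog

# Decision variables: tonnes of copper concentrate (x1) and zinc concentrate (x2)
prices = [520.0, 310.0]          # revenue per tonne (illustrative)
# Resource use per tonne of each product: ore throughput and flotation hours
A_ub = [[8.0, 5.0],              # tonnes of ore required
        [1.2, 0.9]]              # flotation hours required
b_ub = [4_000_000.0,             # available ore, tonnes (illustrative)
        700_000.0]               # available flotation hours (illustrative)

# linprog minimizes, so negate the prices to maximize revenue
res = linprog(c=[-p for p in prices], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
x1, x2 = res.x
print(f"copper: {x1:,.0f} t, zinc: {x2:,.0f} t, revenue: {-res.fun:,.0f}")
```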

  19. Development of a PSA-based Loss of Large Area Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Mee Jeong; Jung, Woosik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Myungsu [Korea Hydro Nuclear Power, Central Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    As a result of the initial post-9/11 assessments, in 2002 the NRC issued an interim safeguards and security compensatory measures order, 'Interim Compensatory Measures for High Threat Environment'. Under Section B.5.b (not publicly available) of this order, current NPP licensees had to adopt measures to mitigate or restore reactor core cooling, containment, and spent fuel pool (SFP) cooling capabilities to cope with a loss of large area (LOLA) due to large fires and explosions from any cause, including beyond-design-basis threat (BDBT) aircraft impacts. In 2009, the NRC issued amendments to 10CFR Part 52 and Part 73 for power reactor security requirements for operating and new reactors. New U.S.-licensed commercial nuclear power plant operators are required to provide a LOLA (Loss of Large Area) analysis as per the U.S. Code of Federal Regulations, 10CFR50.54(hh)(2). Additionally, 10CFR52.80(d) specifies the submittal information required for an applicant for a combined operating license (COL) for a nuclear power plant to meet these requirements. It is necessary to prepare our own guidance for the development of LOLA strategies. In this paper, we propose a method to look for interesting combinations of rooms in certain targets by going through a VAI model, and we produce insights that could be used to influence LOLA strategies.

  20. Pathways to lean software development: An analysis of effective methods of change

    Science.gov (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  1. Developing Vulnerability Analysis Method for Climate Change Adaptation on Agropolitan Region in Malang District

    Science.gov (United States)

    Sugiarto, Y.; Perdinan; Atmaja, T.; Wibowo, A.

    2017-03-01

    Agriculture plays a strategic role in strengthening sustainable development. Based on the agropolitan concept, the village becomes the center of economic activities by combining agriculture, agro-industry, agribusiness and tourism in a way that creates a high value-added economy. The impact of climate change on agriculture and water resources may increase the pressure on agropolitan development. An assessment method is required to measure the vulnerability of area-based communities in the agropolitan region to climate change impacts. An analysis of agropolitan vulnerability was conducted in Malang district based on four aspects, considering the availability and distribution of water as the core problem. The indicators used comprised the vulnerability components, consisting of sensitivity and adaptive capacity, and the exposure component. The study obtained 21 indicators derived from 115 village-based data items. The results of the vulnerability assessment showed that most of the villages were categorized at a moderate level. Around 20% of the 388 villages were categorized at a high to very high level of vulnerability due to a low level of agricultural economy. In the agropolitan region within the sub-district of Poncokusumo, the vulnerability of the villages varies between very low and very high. Most villages were vulnerable due to lower adaptive capacity, even though the levels of sensitivity and exposure of all villages were relatively similar. The existence of water resources was the biggest contributor to the high exposure of villages in Malang district, while access to credit facilities and the source of family income were among the indicators leading to a high sensitivity component.
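    Indicator-based climate vulnerability assessments of this kind commonly normalize each indicator, aggregate the indicators into exposure, sensitivity and adaptive-capacity components, and then combine the components into a single index. The sketch below shows one common aggregation (vulnerability = exposure + sensitivity - adaptive capacity after min-max normalization) applied to made-up village data; it illustrates the general approach, not the scoring used in the cited study.

```python
import numpy as np

# Hypothetical indicator scores per village, grouped by component
villages = ["A", "B", "C", "D"]
exposure    = np.array([[0.2, 0.5], [0.8, 0.9], [0.4, 0.3], [0.7, 0.6]])  # e.g. water scarcity, drought days
sensitivity = np.array([[0.6, 0.4], [0.3, 0.5], [0.9, 0.8], [0.2, 0.3]])  # e.g. farm income share, credit access
capacity    = np.array([[0.7, 0.8], [0.2, 0.3], [0.4, 0.5], [0.9, 0.6]])  # e.g. education, infrastructure

def minmax(x):
    """Min-max normalize each indicator (column) to the 0-1 range."""
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))

E = minmax(exposure).mean(axis=1)
S = minmax(sensitivity).mean(axis=1)
AC = minmax(capacity).mean(axis=1)
vulnerability = E + S - AC

for name, v in zip(villages, vulnerability):
    level = "high" if v > 0.5 else "moderate" if v > 0.0 else "low"
    print(f"village {name}: index = {v:+.2f} ({level})")
```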

  2. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Full Text Available Dapagliflozin was the first of its class (inhibitors of the sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies, for the quality evaluation and assurance of tablets. The dissolution methodology was developed with apparatus II (paddle) in 900 mL of medium (simulated gastric fluid, pH 1.2), with the temperature set at 37±0.5°C and a stirring speed of 50 rpm. For the quantification, a spectrophotometric (λ=224 nm) method was developed and validated. In the validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r2=0.998). The precision results showed RSD values lower than 2%. Recoveries of 80.72, 98.47, and 119.41% proved the accuracy of the method. Through a systematic approach applying a 2³ factorial design, the robustness of the method was confirmed (p>0.05). The studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution efficiency analyses. The developed method can thus be used for the quality evaluation of dapagliflozin tablets and can be considered a scientific basis for future official pharmacopoeial methods.
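    The f1 (difference) and f2 (similarity) factors mentioned above are standard metrics for comparing two dissolution profiles; f2 of 50 or more is conventionally read as similarity. The sketch below computes both factors from two hypothetical percent-dissolved profiles measured at the same time points; the numbers are illustrative, not the paper's data.

```python
import math

def f1_f2(reference, test):
    """Difference factor f1 and similarity factor f2 for two dissolution profiles
    (percent dissolved at the same n time points)."""
    n = len(reference)
    diffs = [abs(r - t) for r, t in zip(reference, test)]
    f1 = 100.0 * sum(diffs) / sum(reference)
    mean_sq = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    f2 = 50.0 * math.log10(100.0 / math.sqrt(1.0 + mean_sq))
    return f1, f2

# Hypothetical percent-dissolved values at 5, 10, 15, 20, 30 min
ref  = [35.0, 58.0, 74.0, 85.0, 94.0]
test = [32.0, 55.0, 72.0, 84.0, 95.0]
f1, f2 = f1_f2(ref, test)
print(f"f1 = {f1:.1f} (similar if < 15), f2 = {f2:.1f} (similar if >= 50)")
```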

  3. Gap analysis: a method to assess core competency development in the curriculum.

    Science.gov (United States)

    Fater, Kerry H

    2013-01-01

    To determine the extent to which safety and quality improvement core competency development occurs in an undergraduate nursing program. The rapid change and increased complexity of health care environments demand that health care professionals are adequately prepared to provide high quality, safe care. A gap analysis compared the present state of competency development to a desirable (ideal) state. The core competencies, Nurse of the Future Nursing Core Competencies, reflect the ideal state and represent minimal expectations for entry into practice from pre-licensure programs. Findings from the gap analysis suggest significant strengths in numerous competency domains, deficiencies in two competency domains, and areas of redundancy in the curriculum. Gap analysis provides valuable data to direct curriculum revision. Opportunities for competency development were identified, and strategies were created jointly with the practice partner, thereby enhancing the relevant knowledge, attitudes, and skills nurses need for clinical practice currently and in the future.

  4. Development of CFD analysis method based on droplet tracking model for BWR fuel assemblies

    International Nuclear Information System (INIS)

    Onishi, Yoichi; Minato, Akihiko; Ichikawa, Ryoko; Mashara, Yasuhiro

    2011-01-01

    It is well known that the minimum critical power ratio (MCPR) of the boiling water reactor (BWR) fuel assembly depends on the spacer grid type. Recently, improvement of the critical power is being studied by using a spacer grid with mixing devices attaching various types of flow deflectors. In order to predict the critical power of the improved BWR fuel assembly, we have developed an analysis method based on the consideration of the detailed thermal-hydraulic mechanisms of the annular mist flow regime in the subchannels for an arbitrary spacer type. The proposed method is based on a computational fluid dynamics (CFD) model with a droplet tracking model for analyzing the vapor-phase turbulent flow in which droplets are transported in the subchannels of the BWR fuel assembly. We adopted the general-purpose CFD software Advance/FrontFlow/red (AFFr) as the base code, a commercial software package created as part of a Japanese national project. AFFr employs a three-dimensional (3D) unstructured grid system for application to complex geometries. In the present paper, AFFr was first applied to single-phase flows of gas. The calculated results were compared with experiments using a round cellular spacer in one subchannel to investigate the influence of the choice of turbulence model. Analyses using the large eddy simulation (LES) and re-normalisation group (RNG) k-ε models were carried out. The results of both the LES and RNG k-ε models show that the calculated velocity distribution and velocity fluctuation distribution downstream of the spacer reproduce the experimental results qualitatively. However, the velocity distribution analyzed by the LES model is better than that by the RNG k-ε model. The velocity fluctuation near the fuel rod, which is important for droplet deposition on the rod, is also simulated well by the LES model. Then, to examine the effect of the spacer shape on the analytical results, gas flow analyses with the RNG k-ε model were performed

  5. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    Kozlowski, T.; Carey, T.A.; Maguire, C.F.

    1995-01-01

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented Analysis and Recursive Design method as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method

  6. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  7. RESULTS OF ANALYSIS OF BENCHMARKING METHODS OF INNOVATION SYSTEMS ASSESSMENT IN ACCORDANCE WITH AIMS OF SUSTAINABLE DEVELOPMENT OF SOCIETY

    Directory of Open Access Journals (Sweden)

    A. Vylegzhanina

    2016-01-01

    Full Text Available In this work, we introduce the results of a comparative analysis of international innovation-system rating indexes with respect to their compliance with the purposes of sustainable development. The purpose of this research is to define requirements for benchmarking methods that assess national or regional innovation systems and to compare them, based on the assumption that an innovation system should be aligned with the sustainable development concept. Analysis of the goal sets and concepts underlying the observed international composite innovation indexes, together with a comparison of their metrics and calculation techniques, allowed us to reveal the opportunities and limitations of using these methods within the framework of the sustainable development concept. We formulated targets of innovation development on the basis of the innovation priorities of sustainable socio-economic development. Using a comparative analysis of the indexes against these targets, we identified two methods of assessing innovation systems that are maximally connected with the goals of sustainable development. Nevertheless, no benchmarking method today meets the need of assessing innovation systems in compliance with the sustainable development concept to a sufficient extent. We suggest practical directions for developing methods that assess innovation systems in compliance with the goals of societal sustainable development.

  8. Research for developing precise tsunami evaluation methods. Probabilistic tsunami hazard analysis/numerical simulation method with dispersion and wave breaking

    International Nuclear Information System (INIS)

    2007-01-01

    The present report introduces the main results of investigations on precise tsunami evaluation methods, which were carried out from the viewpoint of safety evaluation for nuclear power facilities and deliberated by the Tsunami Evaluation Subcommittee. A framework for probabilistic tsunami hazard analysis (PTHA) based on a logic tree is proposed, and a calculation for the Pacific side of northeastern Japan is performed as a case study. Tsunami motions with dispersion and wave breaking were investigated both experimentally and numerically. The numerical simulation method is verified for its practicability by application to a historical tsunami. Tsunami force is also investigated, and formulae for the tsunami pressure acting on breakwaters and on buildings due to an inundating tsunami are proposed. (author)
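    In a logic-tree PTHA of the kind proposed above, each complete branch of the tree carries a weight and yields its own hazard curve (annual exceedance frequency versus tsunami height), and the weighted family of curves then gives mean and fractile hazard estimates. The sketch below aggregates a few made-up branch curves in that way; it is a schematic of the logic-tree mechanics only, not the subcommittee's model.

```python
import numpy as np

heights = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # tsunami height levels, m

# Hypothetical logic-tree branches: (weight, annual exceedance frequency per height level)
branches = [
    (0.40, np.array([1e-2, 3e-3, 8e-4, 2e-4, 5e-5])),
    (0.35, np.array([2e-2, 6e-3, 1.5e-3, 4e-4, 1e-4])),
    (0.25, np.array([5e-3, 1e-3, 3e-4, 8e-5, 2e-5])),
]
weights = np.array([w for w, _ in branches])
curves = np.vstack([c for _, c in branches])

mean_curve = weights @ curves                      # weighted mean hazard curve

def weighted_fractile(values, weights, q):
    """Weighted fractile of the branch values at one height level."""
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, q)]

median_curve = np.array([weighted_fractile(curves[:, i], weights, 0.5)
                         for i in range(len(heights))])
for h, m, med in zip(heights, mean_curve, median_curve):
    print(f"h = {h:4.1f} m : mean = {m:.2e} /yr, median = {med:.2e} /yr")
```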

  9. Analysis of slippery droplet on tilted plate by development of optical correction method

    Science.gov (United States)

    Ko, Han Seo; Gim, Yeonghyeon; Choi, Sung Ho; Jang, Dong Kyu; Sohn, Dong Kee

    2017-11-01

    Because of distortion effects at the surface of a sessile droplet, the inner flow field of the droplet is measured by the PIV (particle image velocimetry) method with low reliability. To solve this problem, many researchers have studied and developed optical correction methods. However, these methods cannot be applied to various cases such as tilted droplets or other asymmetrically shaped droplets, since most methods were developed only for axisymmetric droplets. For the optical correction of an asymmetrically shaped droplet, the surface function was calculated by three-dimensional reconstruction using an ellipse curve fitting method. The optical correction using the surface function was verified by numerical simulation. The developed method was then applied to reconstruct the inner flow field of a droplet on a tilted plate. A colloidal water droplet on the tilted surface was used, and the distortion effect at the surface of the droplet was calculated. Using the obtained results and the PIV method, the corrected flow field for the inner and interface regions of the droplet was reconstructed. Consequently, the error in the velocity vectors located near the apex of the droplet caused by the distortion effect was removed. National Research Foundation (NRF) of Korea, (2016R1A2B4011087).
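    The ellipse curve fitting used above for the surface reconstruction can be illustrated with a generic least-squares conic fit: profile points are fitted to a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0, and the coefficients then define the surface shape used for the refraction correction. The sketch below fits synthetic noisy profile points; it is a generic algebraic fit, not the authors' three-dimensional reconstruction code.

```python
import numpy as np

def fit_conic(x, y):
    """Least-squares fit of a general conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to 2-D points, via the right singular vector of the design matrix."""
    D = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D, full_matrices=False)
    return vt[-1]          # coefficients (a, b, c, d, e, f), defined up to scale

# Synthetic droplet-profile points: a tilted ellipse plus measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, np.pi, 80)                      # upper half of the profile
a_true, b_true, tilt = 2.0, 1.2, np.deg2rad(15.0)
x = a_true*np.cos(t)*np.cos(tilt) - b_true*np.sin(t)*np.sin(tilt) + rng.normal(0, 0.01, t.size)
y = a_true*np.cos(t)*np.sin(tilt) + b_true*np.sin(t)*np.cos(tilt) + rng.normal(0, 0.01, t.size)

coeffs = fit_conic(x, y)
a, b, c, d, e, f = coeffs
# For an ellipse the discriminant b^2 - 4ac must be negative
print(f"coefficients = {np.round(coeffs, 4)}, b^2 - 4ac = {b**2 - 4*a*c:.4f}")
```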

  10. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H

    2006-03-01

    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of the averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
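    The region-independent chi-square analysis described above compares, for each residue, the observed number of mutations among functional variants with the number expected from the overall mutation frequency. The sketch below performs a generic per-residue goodness-of-fit test of that kind on made-up counts; it illustrates the statistical idea, not the authors' algorithm.

```python
from scipy.stats import chisquare

# Hypothetical data: observed mutation counts per residue among functional variants,
# with a uniform expectation derived from the overall mutation frequency of the screen.
observed = [0, 2, 12, 9, 0, 11, 8, 1, 3, 10]
total = sum(observed)
expected_per_residue = total / len(observed)

results = []
for i, obs in enumerate(observed):
    # Two categories: mutations at this residue vs. mutations everywhere else
    stat, p = chisquare([obs, total - obs],
                        [expected_per_residue, total - expected_per_residue])
    results.append((i, obs, p))

# Residues with fewer mutations than expected and small p-values are candidate hypomutable sites
for i, obs, p in results:
    flag = "hypomutable?" if obs < expected_per_residue and p < 0.05 else ""
    print(f"residue {i:2d}: observed = {obs:2d}, p = {p:.3f} {flag}")
```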

  11. A Product Analysis Method and Its Staging to Develop Redesign Competences

    Science.gov (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  12. Developing strategies to reduce the risk of hazardous materials transportation in iran using the method of fuzzy SWOT analysis

    Directory of Open Access Journals (Sweden)

    A. S. Kheirkhah

    2009-12-01

    Full Text Available An increase in hazardous materials transportation in Iran, along with industrial development and an increase in the resulting deadly accidents, necessitates the development and implementation of strategies to reduce these incidents. SWOT analysis is an efficient method for developing strategies; however, the article presents its structural problems, including a lack of prioritization of internal and external factors and an inability to consider two-sided factors, which reduce its performance in situations where the number of internal and external factors affecting the risk of hazardous materials transportation is relatively high and some factors are two-sided in nature. Fuzzy SWOT analysis is a method whose use helps to solve these problems, and the issue addressed here is employing it as an effective methodology. The article also compares the strategies resulting from the fuzzy method with the strategies developed following SWOT, in order to show the relative superiority of the new method.

  13. Development of conjugate methods with gas chromatography for inorganic compounds analysis

    International Nuclear Information System (INIS)

    Baccan, N.

    1975-01-01

    The application of gas chromatography combined with mass spectrometry or with nuclear methods to the analysis of inorganic compounds is studied. The advantages of the use of a gas chromatograph coupled with a quadrupole mass spectrometer or with a high resolution radiation detector are discussed. We also studied the formation and solvent extraction of metal chelates; an aliquot of the organic phase was directly injected into the gas chromatograph and the eluted compounds were detected by mass spectrometry or, when radioactive, by nuclear methods. (author)

  14. Personnel planning in general practices: development and testing of a skill mix analysis method.

    NARCIS (Netherlands)

    Eitzen-Strassel, J. von; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; Bakker, D.H. de

    2014-01-01

    Background: General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  15. Personnel planning in general practices : Development and testing of a skill mix analysis method

    NARCIS (Netherlands)

    von Eitzen-Strassel, J.; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; de Bakker, D.H.

    2014-01-01

    Background General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  16. An Observational Analysis of Coaching Behaviors for Career Development Event Teams: A Mixed Methods Study

    Science.gov (United States)

    Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.

    2016-01-01

    School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…

  17. Development of sampling method and chromatographic analysis of volatile organic compounds emitted from human skin.

    Science.gov (United States)

    Grabowska-Polanowska, Beata; Miarka, Przemysław; Skowron, Monika; Sułowicz, Joanna; Wojtyna, Katarzyna; Moskal, Karolina; Śliwka, Ireneusz

    2017-10-01

    Studies on volatile organic compounds emitted from skin are of interest to chemists, biologists and physicians due to their role in the development of different scientific areas, including medical diagnostics, forensic medicine and perfume design. This paper presents two proposed sampling methods for skin odor collection: the first uses a bag of cellulose film, the second cellulose sachets filled with activated carbon. Volatile organic compounds were adsorbed on the carbon sorbent, removed via thermal desorption and analyzed using a gas chromatograph with a mass spectrometer. The first sampling method allowed the identification of more compounds (52) compared to the second one (30). Quantitative analyses for acetone, butanal, pentanal and hexanal were performed. The skin odor sampling method using a bag of cellulose film allowed the identification of many more compounds when compared with the method using a sachet filled with activated carbon.

  18. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  19. Applied research and development of neutron activation analysis - Development of the precise analysis method for plastic materials by the use of NAA

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kil Yong; Sim, Sang Kwan; Yoon, Yoon Yeol; Chun, Sang Ki [Korea Institute of Geology, Mining and Materials, Taejon (Korea)

    2000-04-01

    The demand for inorganic analysis of plastics has significantly increased in the fields of microelectronics, environmental analysis, nuclear technology and resource recycling. The difficulties of chemical analysis methods have led to the application of NAA, which has the great advantages of non-destructiveness, freedom from blanks and high sensitivity. The goal of the present work is to optimize and develop NAA procedures for the inorganic analysis of plastics. Even though NAA has unique advantages, it poses two problems for plastics. One is contamination by metallic utensils during sample treatment, and the other is destruction of the sample ampoule due to pressure build-up by hydrogen and methane gas formed by reactions of the polymer under neutron irradiation. For the first problem, large plastics were cut into pieces after immersion in liquid nitrogen. The second problem was solved by making an aperture in the top side of the sample ampoule. These research results have been applied to the analysis of various plastic materials used in food and drug containers and in toys for children. Moreover, a Korean irradiation rabbit could be produced by applying these results, and plastic standard reference materials for use in XRF and ICP analysis could be produced. 36 refs., 6 figs., 37 tabs (Author)

  20. Development of flow network analysis code for block type VHTR core by linear theory method

    International Nuclear Information System (INIS)

    Lee, J. H.; Yoon, S. J.; Park, J. W.; Park, G. C.

    2012-01-01

    VHTR (Very High Temperature Reactor) is a high-efficiency nuclear reactor capable of generating hydrogen with its high coolant temperature. A PMR (Prismatic Modular Reactor) type core consists of hexagonal prismatic fuel blocks and reflector blocks. The flow paths in the prismatic VHTR core consist of coolant holes, bypass gaps and cross gaps. Complicated flow paths are formed in the core since the coolant holes and bypass gaps are connected by the cross gaps. The distributed coolant is mixed in the core through the cross gaps, so the flow characteristics cannot be modeled as a simple parallel pipe system. It requires a lot of effort and takes a very long time to analyze the core flow with CFD analysis. Hence, it is important to develop a code for VHTR core flow which can predict the core flow distribution quickly and accurately. In this study, a steady-state flow network analysis code was developed using a flow network algorithm. The developed flow network analysis code was named the FLASH code and was validated with experimental data and CFD simulation results. (authors)
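    The linear theory method used in such flow network codes replaces the nonlinear head-loss law h = K*Q^n with a conductance linearized about the previous flow estimate, solves the resulting linear nodal system, and iterates. The sketch below applies that idea to a small made-up network of five pipes between two fixed-head boundaries; it is a generic illustration, not the FLASH code.

```python
import numpy as np

# Illustrative network: node 0 is a fixed-head inlet, node 3 a fixed-head outlet.
# Each pipe obeys a head-loss law h = K * Q**n (n = 2 assumed here).
pipes = [(0, 1, 2.0), (0, 2, 3.0), (1, 2, 1.5), (1, 3, 4.0), (2, 3, 2.5)]  # (from, to, K)
fixed_heads = {0: 100.0, 3: 0.0}
unknown_nodes = [1, 2]
n_exp = 2.0

def solve_linear_theory(iters=50):
    heads = dict(fixed_heads)
    q = [1.0] * len(pipes)                      # initial flow guess for every pipe
    idx = {node: k for k, node in enumerate(unknown_nodes)}
    for _ in range(iters):
        # Linearized conductance of each pipe: Q = g * (H_from - H_to)
        g = [1.0 / (K * max(abs(q[i]), 1e-6) ** (n_exp - 1.0))
             for i, (_, _, K) in enumerate(pipes)]
        A = np.zeros((len(unknown_nodes), len(unknown_nodes)))
        b = np.zeros(len(unknown_nodes))
        for i, (u, v, _) in enumerate(pipes):
            for a, c in ((u, v), (v, u)):       # nodal continuity at each pipe end
                if a in idx:
                    A[idx[a], idx[a]] += g[i]
                    if c in idx:
                        A[idx[a], idx[c]] -= g[i]
                    else:
                        b[idx[a]] += g[i] * heads[c]
        sol = np.linalg.solve(A, b)
        heads.update({node: sol[idx[node]] for node in unknown_nodes})
        # Average with the previous estimate to damp oscillations (classical linear theory trick)
        q = [0.5 * (q[i] + g[i] * (heads[u] - heads[v]))
             for i, (u, v, _) in enumerate(pipes)]
    return heads, q

heads, flows = solve_linear_theory()
print("nodal heads:", {k: round(v, 2) for k, v in heads.items()})
print("pipe flows :", [round(f, 3) for f in flows])
```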

  1. Development of a method for the analysis of perfluoroalkylated compounds in whole blood

    Energy Technology Data Exchange (ETDEWEB)

    Kaerrman, A.; Bavel, B. van; Lindstroem, G. [Oerebro Univ. (Sweden). Man-Technology-Environmental Research Centre; Jaernberg, U. [Stockholm Univ. (Sweden). Inst. of Applied Environmental Research

    2004-09-15

    The commercialisation of interfaced high performance liquid chromatography mass spectrometry (HPLC-MS) facilitated, in a more convenient way than before, the selective and sensitive analysis of perfluoroalkylated (PFA) acids, a group of compounds frequently used, for example, as industrial surfactants and which are very persistent and biologically active. Since then, a number of reports on PFA compounds found in humans and wildlife have been published. The most used technique for the analysis of perfluoroalkylated compounds has been ion-pair extraction followed by high performance liquid chromatography (HPLC) and negative electrospray tandem mass spectrometry (MS/MS). The tetrabutylammonium ion as the counter ion in the ion-pair extraction has been used together with GC analysis, LC-fluorescence and LC-MS/MS. Recently, solid phase extraction (SPE) has been used instead of ion-pair extraction for the extraction of human serum. Previously reported studies on human exposure have mainly been on serum, probably because there are indications that PFA acids bind to plasma proteins. We here present a fast and simple method that involves SPE and is suitable for extracting whole blood samples. Furthermore, 13 PFAs were included in the method, which uses HPLC and single quadrupole mass spectrometry.

  2. Method Development for Pesticide Residue Analysis in Farmland Soil using High Performance Liquid Chromatography

    Science.gov (United States)

    Theresia Djue Tea, Marselina; Sabarudin, Akhmad; Sulistyarti, Hermin

    2018-01-01

    A method for the determination of diazinon and chlorantraniliprole in soil samples has been developed. The analytes were extracted with acetonitrile from a farmland soil sample. Determination and quantification of diazinon and chlorantraniliprole were performed by high performance liquid chromatography (HPLC) with a UV detector. Several parameters of the HPLC method were optimized with respect to sensitivity, high resolution of separation, and accurate determination of diazinon and chlorantraniliprole. The optimum conditions for the separation of the two pesticides were an eluent composition of acetonitrile:water at a ratio of 60:40, a flow rate of 0.4 mL/min, and a wavelength of 220 nm. Under the optimum conditions, diazinon linearity was in the range from 1-25 ppm with an R2 of 0.9976, a LOD of 1.19 mgL-1, and a LOQ of 3.98 mgL-1, while the linearity of chlorantraniliprole was in the range from 0.2-5 mgL-1 with an R2 of 0.9972, a LOD of 0.39 mgL-1, and a LOQ of 1.29 mgL-1. When the method was applied to the soil sample, both pesticides showed acceptable recoveries of more than 85% for the real sample; thus, the developed method meets the validation requirements. Under this developed method, the concentrations of both pesticides in the soil samples were below the LOD and LOQ (0.577 mgL-1 for diazinon and 0.007 mgL-1 for chlorantraniliprole). Therefore, it can be concluded that the soil samples used in this study contain neither diazinon nor chlorantraniliprole.
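    LOD and LOQ values such as those quoted above are commonly estimated from the calibration curve as 3.3*sigma/S and 10*sigma/S, where S is the slope and sigma the standard deviation of the regression residuals (the ICH approach). The sketch below applies that calculation to a made-up calibration data set; the numbers are illustrative, not the paper's.

```python
import numpy as np

# Hypothetical calibration data: concentration (mg/L) vs. peak area
conc = np.array([1.0, 5.0, 10.0, 15.0, 20.0, 25.0])
area = np.array([102.0, 498.0, 1010.0, 1485.0, 2003.0, 2490.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = np.std(residuals, ddof=2)          # residual standard deviation of the linear fit

r2 = 1.0 - np.sum(residuals**2) / np.sum((area - area.mean())**2)
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope = {slope:.2f}, R^2 = {r2:.4f}, LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")
```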

  3. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    None

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the method are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of 'routine' physical/chemical properties (Chapter 1); preliminary separation of >200{degrees} C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  4. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the method are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of 'routine' physical/chemical properties (Chapter 1); preliminary separation of >200{degree}C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  5. Development of advanced methods for analysis of experimental data in diffusion

    Science.gov (United States)

    Jaques, Alonso V.

    There are numerous experimental configurations and data analysis techniques for the characterization of diffusion phenomena. However, the mathematical methods for estimating diffusivities traditionally do not take into account the effects of experimental errors in the data, and often require smooth, noiseless data sets to perform the necessary analysis steps. The current methods used for data smoothing require strong assumptions which can introduce numerical "artifacts" into the data, affecting confidence in the estimated parameters. The Boltzmann-Matano method is used extensively in the determination of concentration-dependent diffusivities, D(C), in alloys. In the course of analyzing experimental data, numerical integrations and differentiations of the concentration profile are performed. These methods require smoothing of the data prior to analysis. We present here an approach to the Boltzmann-Matano method that is based on a regularization method to estimate a differentiation operation on the data, i.e., to estimate the concentration gradient term, which is important in the analysis process for determining the diffusivity. This approach, therefore, has the potential to be less subjective, and in numerical simulations shows an increased accuracy in the estimated diffusion coefficients. We present a regression approach to estimate linear multicomponent diffusion coefficients that eliminates the need to pre-treat or pre-condition the concentration profile. This approach fits the data to a functional form of the mathematical expression for the concentration profile, and allows us to determine the diffusivity matrix directly from the fitted parameters. Reformulation of the equation for the analytical solution is done in order to reduce the size of the problem and accelerate the convergence. The objective function for the regression can incorporate point estimations for error in the concentration, improving the statistical confidence in the estimated diffusivity matrix
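    The Boltzmann-Matano analysis discussed above extracts D(C) from a measured profile via the Matano plane, the concentration gradient and a running integral, which is why noise in dC/dx matters so much. The sketch below applies the classical relation to a synthetic error-function profile whose diffusivity is known, so the estimate can be checked against the input value; it illustrates the standard method, not the regularized estimator proposed in this work.

```python
import numpy as np
from scipy.special import erfc

def trapz(f, xs):
    """Simple trapezoidal integration."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(xs)))

# Synthetic diffusion-couple profile generated with a known constant diffusivity,
# so the Boltzmann-Matano estimate can be checked against the input value.
D_true, t = 1.0e-14, 86400.0                     # m^2/s and s (one-day anneal)
c_left, c_right = 1.0, 0.0                       # terminal concentrations
x = np.linspace(-2.0e-4, 2.0e-4, 401)            # m
c = c_right + 0.5 * (c_left - c_right) * erfc(x / (2.0 * np.sqrt(D_true * t)))

dcdx = np.gradient(c, x)
x_matano = trapz(x * dcdx, x) / (c[-1] - c[0])   # Matano plane position

def boltzmann_matano(c_star):
    """D(c_star) from the Boltzmann-Matano relation:
    D = [ integral_{x*}^{x_right} (x - x_M) (dC/dx) dx ] / [ 2 t (dC/dx)|_{x*} ]."""
    i = int(np.argmin(np.abs(c - c_star)))
    integral = trapz((x[i:] - x_matano) * dcdx[i:], x[i:])
    return integral / (2.0 * t * dcdx[i])

for c_star in (0.2, 0.5, 0.8):
    print(f"D({c_star}) = {boltzmann_matano(c_star):.2e} m^2/s  (true value {D_true:.1e})")
```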

  6. Development of a method for the analysis of horizontal-axis wind turbines using computational fluid dynamics (CFD)

    International Nuclear Information System (INIS)

    Farinnas Wong, E. Y.; Jauregui Rigo, S.; Betancourt Mena, J.

    2009-01-01

    In this paper we describe different approaches to solving computational fluid dynamics problems using the finite element method, giving a perspective on the different problems that must be addressed when choosing a path to develop a code that solves the boundary layer and turbulence problems needed to simulate transport and fluid-handling equipment. In principle, turbulent flow is governed by the equations of fluid dynamics. The nonlinearity of the Navier-Stokes equations makes an analytical solution possible only in a few very specific cases, and for high Reynolds numbers the flow equations become more complex, so it is necessary to use certain models that depend on some parameters, usually obtained experimentally. Powerful numerical techniques exist at present for the resolution of these equations, such as direct numerical simulation (DNS) and large eddy (vortex) simulation (LES), and their use in solving flow machine problems is discussed. (author)

  7. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette

    2008-01-01

    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call 'cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach, focusing on shop floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development

  8. Method development approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-04

    Analyses of complex cosmetic samples, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity: for instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar, as cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, keeping the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of the mobile phase composition, with isocratic elution conditions or, when

  9. Development of an evaluation method for the quality of NPP MCR operators' communication using Work Domain Analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, Inseok; Park, Jinkyun; Seong, Poonghyun

    2011-01-01

    Research highlights: → No evaluation method is available for operators' communication quality in NPPs. → To build this evaluation method, the Work Domain Analysis (WDA) method was adopted. → The proposed method was applied to NPP MCR operators. → The quality of operators' communication can be evaluated with the proposed method. - Abstract: The evolution of work demands has driven industry toward computerization, making systems more complex; this field is now known as the Complex Socio-Technical System. As communication failures are problems associated with Complex Socio-Technical Systems, it has been discovered that communication failures are the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the fact that there have been many studies on the severity of communication failures, there is no evaluation method for operators' communication quality in Nuclear Power Plants (NPPs). Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. To develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, including the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the important points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, to apply the proposed method, nine teams working in NPPs participated in a field simulation. The results of this evaluation reveal that operators' communication quality improved as a greater proportion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating the communication quality in any complex system.

  10. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network.

    Science.gov (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occurs in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited in generating highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development and that has excellent highway safety records in its database, is chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries of research. Copyright © 2016 Elsevier Ltd. All rights reserved.
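
    A minimal sketch of the Bayesian-network idea referred to above, with entirely hypothetical cause names and probabilities (the paper's actual network structure and Delphi-elicited values are not reproduced here): the accident probability is obtained by marginalizing over the cause states, and Bayes' rule gives the posterior weight of a cause once an accident is observed.

        import itertools

        # Delphi-elicited prior probabilities for two hypothetical accident causes
        # (illustrative numbers, not taken from the paper)
        p_cause = {"speeding": 0.30, "poor_road": 0.20}

        # conditional probability of an accident given each combination of causes
        p_accident = {
            (False, False): 0.01,
            (False, True):  0.05,
            (True,  False): 0.08,
            (True,  True):  0.20,
        }

        # marginal accident probability: P(A) = sum over parent states of P(A | s) * P(s)
        p_a = 0.0
        for speeding, poor_road in itertools.product([True, False], repeat=2):
            prior = (p_cause["speeding"] if speeding else 1 - p_cause["speeding"]) * \
                    (p_cause["poor_road"] if poor_road else 1 - p_cause["poor_road"])
            p_a += p_accident[(speeding, poor_road)] * prior

        # posterior probability of a cause given that an accident occurred (Bayes' rule)
        p_speeding_given_a = sum(
            p_accident[(True, pr)] *
            p_cause["speeding"] * (p_cause["poor_road"] if pr else 1 - p_cause["poor_road"])
            for pr in [True, False]
        ) / p_a
        print(f"P(accident) = {p_a:.4f}, P(speeding | accident) = {p_speeding_given_a:.3f}")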

  11. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of
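
    The Monte Carlo route to the CDF mentioned above can be illustrated with a deliberately tiny stand-in problem (a 2-DOF spring-mass system with randomly scattered stiffnesses, not the substructure models of the paper): each sample draws the random inputs, solves the eigenproblem for the lowest natural frequency, and the sorted samples form the empirical CDF from which probabilities of the response variable are read off.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples = 5000

        # illustrative 2-DOF system: masses fixed, stiffnesses treated as random variables
        m1, m2 = 1.0, 2.0
        freqs = np.empty(n_samples)
        for i in range(n_samples):
            k1 = rng.normal(1.0e4, 5.0e2)   # N/m, hypothetical mean and scatter
            k2 = rng.normal(2.0e4, 1.0e3)
            K = np.array([[k1 + k2, -k2],
                          [-k2,      k2]])
            M = np.diag([m1, m2])
            # generalized eigenproblem K v = w^2 M v, solved via M^-1 K
            w2 = np.linalg.eigvals(np.linalg.solve(M, K)).real
            freqs[i] = np.sqrt(w2.min()) / (2.0 * np.pi)   # lowest natural frequency, Hz

        # empirical CDF of the response variable and an example probability query
        f_sorted = np.sort(freqs)
        cdf = np.arange(1, n_samples + 1) / n_samples
        p_exceed_limit = 1.0 - np.interp(20.0, f_sorted, cdf)   # P(f1 > 20 Hz)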

  12. Development of a calculation method for one dimensional kinetic analysis in fission reactors, with feedback effects

    International Nuclear Information System (INIS)

    Paixao, S.B.

    1985-01-01

    The methodology used in the WIGLE3 computer code is studied. This methodology has been applied to the steady-state and transient solutions of the one-dimensional, two-group diffusion equations in slab geometry, for axial-type problem analysis. Representative reactor models, considering non-boiling heat transfer, are also studied based on the WIGLE3 computer code. A steady-state program for control rod bank position search, CITER 1D, has been developed. Some criticality research on the proposed system has been done using different control rod bank initial positions, time steps and convergence parameters. (E.G.) [pt
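
    For orientation, a finite-difference sketch of the kind of steady-state problem such codes solve is given below: a bare slab, two-group diffusion eigenvalue calculation solved by power iteration. The cross sections, slab width and boundary treatment are illustrative assumptions and are not taken from WIGLE3 or CITER 1D.

        import numpy as np

        # illustrative two-group cross sections for a bare slab
        D1, D2 = 1.4, 0.4            # cm
        Sa1, Sa2 = 0.010, 0.080      # 1/cm, absorption
        S12 = 0.017                  # 1/cm, down-scattering 1 -> 2
        nSf1, nSf2 = 0.005, 0.135    # 1/cm, nu * Sigma_f
        H, N = 200.0, 200            # slab width (cm), mesh cells
        h = H / N

        def diffusion_matrix(D, Sr):
            """Finite-difference operator -D d2/dx2 + Sr with zero-flux (vacuum-like) edges."""
            A = np.zeros((N, N))
            for i in range(N):
                A[i, i] = 2.0 * D / h**2 + Sr
                if i > 0:
                    A[i, i - 1] = -D / h**2
                if i < N - 1:
                    A[i, i + 1] = -D / h**2
            return A

        A1 = diffusion_matrix(D1, Sa1 + S12)   # fast-group removal includes down-scattering
        A2 = diffusion_matrix(D2, Sa2)

        phi1, phi2, k = np.ones(N), np.ones(N), 1.0
        for _ in range(200):                           # power iteration
            src = nSf1 * phi1 + nSf2 * phi2            # fission source
            phi1 = np.linalg.solve(A1, src / k)
            phi2 = np.linalg.solve(A2, S12 * phi1)
            src_new = nSf1 * phi1 + nSf2 * phi2
            k_new = k * src_new.sum() / src.sum()
            if abs(k_new - k) < 1e-7:
                k = k_new
                break
            k = k_new
        print(f"k_eff ~ {k:.5f}")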

  13. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    Directory of Open Access Journals (Sweden)

    Deborah Alcici Salomão

    2015-12-01

    Full Text Available This study examines consumer protection in arbitration, especially under the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper concludes by providing suggestions specific to consumer protection in arbitration based on this analysis.

  14. The development of trend and pattern analysis methods for incident data by CEC'S joint research at Ispra

    International Nuclear Information System (INIS)

    Amesz, J.; Kalfsbeek, H.W.

    1990-01-01

    The Abnormal Occurrences Reporting System of the Commission of the European Communities was developed by the Joint Research Centre at Ispra in the period 1982 through 1985. It collects in a unique format all safety relevant events from NPPs as recorded in the participating countries. The system has been set up with the specific objective of providing an advanced tool for a synoptic analysis of a large number of events, identifying patterns of sequences, trends, multiple dependencies between incident descriptors, precursors to severe incidents, performance indicators, etc. This paper gives an overview of the development of trend and pattern analysis techniques of two different types: event sequence analysis and statistical methods. Though these methods have been developed and applied in relation with the AORS data, they can be regarded as generic in the sense that they may be applied to any incident reporting system satisfying the necessary criteria as to homogeneity and completeness for rendering valid results

  15. Spectral Analysis of Dynamic PET Studies: A Review of 20 Years of Method Developments and Applications.

    Science.gov (United States)

    Veronese, Mattia; Rizzo, Gaia; Bertoldo, Alessandra; Turkheimer, Federico E

    2016-01-01

    In Positron Emission Tomography (PET), spectral analysis (SA) allows the quantification of dynamic data by relating the radioactivity measured by the scanner in time to the underlying physiological processes of the system under investigation. Among the different approaches for the quantification of PET data, SA is based on the linear solution of the Laplace transform inversion, whereby the measured arterial and tissue time-activity curves of a radiotracer are used to calculate the input response function of the tissue. In recent years SA has been used with a large number of PET tracers in brain and nonbrain applications, demonstrating that it is a very flexible and robust method for PET data analysis. Differently from the most common PET quantification approaches, which adopt standard nonlinear estimation of compartmental models or some linear simplifications, SA can be applied without defining any specific model configuration and has demonstrated very good sensitivity to the underlying kinetics. This characteristic makes it useful as an investigative tool, especially for the analysis of novel PET tracers. The purpose of this work is to offer an overview of SA, to discuss advantages and limitations of the methodology, and to inform about its applications in the PET field.
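
    The core computation of SA, expressing the tissue curve as a non-negative sum of the arterial input function convolved with exponentials drawn from a fixed grid of decay rates, can be sketched as follows. The synthetic curves, grid choices and noise level are assumptions for illustration; real implementations add frame weighting, decay and blood-volume corrections not shown here.

        import numpy as np
        from scipy.optimize import nnls

        # illustrative time grid (min), plasma input function and noisy tissue curve
        t = np.linspace(0.0, 60.0, 121)
        dt = t[1] - t[0]
        cp = 10.0 * t * np.exp(-t / 2.0)                 # hypothetical arterial input
        true_irf = 0.05 * np.exp(-0.03 * t) + 0.02 * np.exp(-0.3 * t)
        ct = dt * np.convolve(cp, true_irf)[: t.size]    # tissue TAC = Cp convolved with IRF
        ct += np.random.normal(0.0, 0.02 * ct.max(), ct.size)

        # fixed logarithmic grid of decay rates; column j of the design matrix is the
        # convolution of the input function with exp(-beta_j * t)
        betas = np.logspace(-3, 1, 60)
        A = np.column_stack([dt * np.convolve(cp, np.exp(-b * t))[: t.size] for b in betas])

        # non-negative least squares gives the "spectrum" of kinetic components
        alphas, resid = nnls(A, ct)
        fitted = A @ alphas
        spectrum_sum = alphas.sum()       # sum of spectral components (IRF value at t = 0)
        print("detected components at beta =", betas[alphas > 1e-6])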

  16. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation

    International Nuclear Information System (INIS)

    Fournier, D.

    2011-01-01

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th-generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutron flux, solution of this equation, depends on the energy, angle and space variables, which are discretized successively: the energy with a multigroup approach, considering the different quantities to be constant within each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting in subdividing a cell into sub-cells, or by order refinement (p-refinement), i.e. increasing the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are used to drive the refinement process. Whereas the first one requires strong regularity hypotheses (analytical solution), the second one assumes only the minimal hypotheses required for the solution to exist. The comparison of both estimators is done on benchmarks where the analytic solution is known by the method of manufactured solutions, so that the behaviour of the solution with regard to its regularity can be studied. This leads to an hp-refinement method using the two estimators. Then, a comparison is made with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. Actually, as the

  17. Development of a preparation and staining method for fetal erythroblasts in maternal blood : Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ

    1998-01-01

    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi

  18. APPLICATION OF SPECTRUM ANALYSIS USING THE BERG METHOD TO DEVELOP SPECIAL SOFTWARE TOOLS FOR AN OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    Full Text Available The objective of this paper is the development and experimental verification of special software for spectral analysis, intended for the analysis of the vibrations of monitored objects. The spectral analysis of the vibration signals is based on the maximum-entropy autoregressive method of spectral analysis using the Berg algorithm. A preliminary analysis based on regression analysis is applied to the measured signals; this step eliminates uninformative components such as noise and trend, and special software tools were developed for it. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely-reflecting surfaces is used in circumstances where the use of contact sensors is difficult or impossible for a number of reasons, including lack of access to the object, the small size of the controlled area, or a controlled portion that has a high temperature or is affected by strong electromagnetic fields. A laser measuring system is proposed for this monitoring task; it overcomes shortcomings of interferometric or Doppler optical measuring systems, such as the difficulty of measuring large-amplitude and inharmonic vibration. On the basis of the proposed methods, special software tools were developed for use with the laser measuring system, implemented in LabVIEW. Experimental verification of the proposed method of vibration signal processing was carried out in the analysis of the diagnostic information obtained by measuring the vibration of a system for grinding the hard tungsten-containing alloy TK8 with a diamond wheel. The result of the special software tools was a complex spectrum "purified" of non-informative parameters, i.e. a spectrum of the signal corresponding to the vibration of the observed object.
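
    A compact sketch of the maximum-entropy (Burg-type) autoregressive spectral estimate named in the abstract is shown below; it is a generic textbook implementation, not the authors' LabVIEW software, and the test signal, sampling rate and model order are assumptions.

        import numpy as np

        def burg_ar(x, order):
            """Burg (maximum-entropy) estimate of AR coefficients and noise variance.

            Returns (a, rho) with A(z) = 1 + a[1] z^-1 + ... + a[order] z^-order and
            power spectral density proportional to rho / |A(e^{jw})|^2.
            """
            x = np.asarray(x, dtype=float)
            rho = np.dot(x, x) / x.size
            a = np.zeros(0)
            ef, eb = x.copy(), x.copy()
            for _ in range(order):
                num = -2.0 * np.dot(eb[:-1], ef[1:])
                den = np.dot(ef[1:], ef[1:]) + np.dot(eb[:-1], eb[:-1])
                k = num / den                                   # reflection coefficient
                rho *= (1.0 - k * k)
                a = np.concatenate((a + k * a[::-1], [k]))      # Levinson update
                ef, eb = ef[1:] + k * eb[:-1], eb[:-1] + k * ef[1:]
            return np.concatenate(([1.0], a)), rho

        # synthetic vibration signal: two tones plus noise
        fs = 2000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        sig = np.sin(2 * np.pi * 180 * t) + 0.5 * np.sin(2 * np.pi * 420 * t)
        sig += 0.2 * np.random.randn(t.size)

        ar, rho = burg_ar(sig - sig.mean(), order=30)
        w = np.linspace(0, np.pi, 1024)
        A = np.exp(-1j * np.outer(w, np.arange(ar.size))) @ ar
        psd = rho / np.abs(A) ** 2                 # up to a scaling convention
        freqs = w * fs / (2 * np.pi)               # peaks should appear near 180 and 420 Hz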

  19. Development of analytical methods for the determination of some radiologically important elements in biological materials using neutron activation analysis

    International Nuclear Information System (INIS)

    Dang, H.S.; Jaiswal, D.D.; Pullat, V.R.; Krishnamony, S.

    1998-01-01

    This paper describes the analytical methods developed for the estimation of Cs, I, Sr, Th and U in biological materials such as food and human tissues. The methods employ both instrumental neutron activation analysis (INAA) and radiochemical neutron activation analysis (RNAA). The adequacy of these methods for determining the concentrations of the above elements in dietary and tissue materials was also studied. The study showed that the analytical methods described in this paper are adequate for the determination of Cs, Sr, Th and U in all kinds of biological samples. In the case of I, however, the method is adequate only for determining its concentration in thyroid, but needs to be modified to improve its sensitivity for the determination of I in diet samples. (author)

  20. The development of quantitative and qualitative analysis methods for suppositories with Maclura pomifera extract

    Directory of Open Access Journals (Sweden)

    V. A. Korotkov

    2014-08-01

    Full Text Available Chronic prostatitis and BPH are still very common diseases. In recent years, herbal preparations have been widely used in the treatment of prostate gland diseases. The effectiveness of herbal medicinal products derived from Maclura pomifera is associated with their content of phytosterols and terpenes. The oil extract derived from the fruit of Maclura pomifera (Osage orange, Moraceae) is a rich source of terpenes and phytosterols. Previous studies indicated the content of substances such as lupeol and β-sitosterol, which are known for their prostatoprotective properties, as well as the presence of isoflavones possessing anti-inflammatory and antioxidant properties. Aim of the work. The aim of this work is the development of methods that allow the qualitative and quantitative assessment of suppositories with Maclura pomifera extract. Materials and methods. The objects of this study are the suppositories with the oil extract of Maclura pomifera. Thin layer chromatography has been used for the identification of the suppository ingredients. Quantitative determination of the active compounds has been carried out by spectrophotometry in the ultraviolet and visible spectrum, using Thermo Scientific Evolution S60 (USA) and Apel PD303S (Japan) spectrophotometers. Determination of the phytosterol and triterpene content has been performed in the equivalent of a reference sample of lupeol ('Santa Cruz Biotechnology', USA; CAS: 545-47-1). Determination of the isoflavone content has been carried out in the equivalent of a reference sample of osajin ('BioBioPha Co., Ltd.', China; CAS: 482-53-1). Results and discussion. For the identification of phytosterols and isoflavones in the suppository composition, a TLC method for identifying their joint presence has been developed. In the alcoholic extract from the suppository mass, two purple spots with Rf 0.8 and 0.57 are observed at the level of the spots of the reference solution (lupeol and β-sitosterol), and two yellow spots with Rf 0.45 and 0.21 are observed at the level

  1. A development and integration of the concentration database for relative method, k0 method and absolute method in instrumental neutron activation analysis using Microsoft Access

    International Nuclear Information System (INIS)

    Hoh Siew Sin

    2012-01-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the concentration of an element in a sample at the National University of Malaysia, especially by students of the Nuclear Science Program. The lack of a database service leads users to take a longer time to calculate the concentration of an element in the sample, because they depend on software developed by foreign researchers, which is costly. To overcome this problem, a study has been carried out to build INAA database software. The objective of this study is to build database software that helps the users of INAA in the Relative Method and the Absolute Method for calculating the element concentration in the sample, using Microsoft Excel 2010 and Microsoft Access 2010. The study also integrates k0 data, k0 Concent and k0-Westcott to execute and complete the system. After the integration, a study was conducted to test the effectiveness of the database software by comparing the concentrations between the experiments and the database. The Triple Bare Monitors Zr-Au and Cr-Mo-Au were used in Abs-INAA as monitors to determine the thermal to epithermal neutron flux ratio (f). The calculations involved in determining the concentration use the net peak area (Np), the measurement time (tm), the irradiation time (tirr), the k-factor (k), the thermal to epithermal neutron flux ratio (f), the parameter of the epithermal neutron flux distribution (α) and the detection efficiency (εp). For the Com-INAA database, the reference material IAEA-375 Soil was used to calculate the concentration of elements in the sample; CRMs and SRMs are also used in this database. After the INAA database integration, a verification process to examine the effectiveness of Abs-INAA was carried out by comparing the sample concentration between the database and the experiment. The experimental concentration values from the INAA database software were obtained with high accuracy and precision. ICC
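
    The relative method referenced above reduces, in its simplest form, to comparing decay-corrected specific count rates of the sample and of a standard irradiated and counted under the same conditions; the sketch below uses entirely hypothetical peak areas, masses, timing and half-life, and omits flux-gradient, geometry and coincidence corrections.

        import numpy as np

        def specific_count_rate(net_peak_area, t_live, lam, t_decay):
            """Peak area per unit live time, corrected for decay between the end of
            irradiation and the start of counting."""
            return net_peak_area / t_live * np.exp(lam * t_decay)

        lam = np.log(2) / (15.0 * 3600.0)   # decay constant of the indicator nuclide, 1/s

        A_sample = specific_count_rate(net_peak_area=52000, t_live=1800, lam=lam, t_decay=7200)
        A_standard = specific_count_rate(net_peak_area=81000, t_live=1800, lam=lam, t_decay=3600)

        m_sample = 0.250          # g, mass of the irradiated sample
        m_element_std = 1.0e-4    # g of the element contained in the standard

        # nuclear data and detector efficiency cancel for identical irradiation and counting
        c_sample = (A_sample / A_standard) * m_element_std / m_sample   # g element per g sample
        print(f"concentration = {c_sample * 1.0e6:.1f} ppm")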

  2. Coupled Electro-Magneto-Mechanical-Acoustic Analysis Method Developed by Using 2D Finite Element Method for Flat Panel Speaker Driven by Magnetostrictive-Material-Based Actuator

    Science.gov (United States)

    Yoo, Byungjin; Hirata, Katsuhiro; Oonishi, Atsurou

    In this study, a coupled analysis method for flat panel speakers driven by a giant magnetostrictive material (GMM) based actuator was developed. The sound field produced by a flat panel speaker driven by a GMM actuator depends on the vibration of the flat panel, which results from the magnetostriction property of the GMM. In this case, to predict the sound pressure level (SPL) in the audio-frequency range, it is necessary to take into account not only the magnetostriction property of the GMM but also the effect of eddy currents and the vibration characteristics of the actuator and the flat panel. In this paper, a coupled electromagnetic-structural-acoustic analysis method is presented; this method was developed by using the finite element method (FEM). This analysis method is used to predict the performance of a flat panel speaker in the audio-frequency range. The validity of the analysis method is verified by comparison with the measurement results of a prototype speaker.

  3. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Cluckie, Alice Jane

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)
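
    A toy version of the voxel-by-voxel comparison described above is sketched below: a patient volume is compared with a control group, thresholded, cleaned with a morphological operator and clustered. The arrays, thresholds and structuring element are assumptions, and the Gaussian z-score threshold stands in for the Monte Carlo inference actually used in the thesis.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(1)

        # hypothetical data: registered, intensity-normalized control volumes and one patient scan
        shape = (32, 32, 16)
        controls = rng.normal(100.0, 5.0, size=(20,) + shape)
        patient = rng.normal(100.0, 5.0, size=shape)
        patient[10:14, 10:14, 6:9] -= 30.0          # simulated perfusion deficit

        # voxel-by-voxel z-scores of the patient against the control group
        mu = controls.mean(axis=0)
        sd = controls.std(axis=0, ddof=1)
        z = (patient - mu) / sd

        # threshold, reduce noise with a morphological opening, and label clusters of low uptake
        mask = z < -3.0
        mask = ndimage.binary_opening(mask, structure=np.ones((2, 2, 2)))
        labels, n_clusters = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, index=range(1, n_clusters + 1))
        mean_deficit = [z[labels == i].mean() for i in range(1, n_clusters + 1)]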

  4. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho

    2011-01-15

    During the last 2 years of the project, we have tried to develop the technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing health food safety for the public and the stability of the ginseng market. In this year, we collected ginseng samples cultivated in the northeastern provinces of the Chinese mainland, namely Liaoning province, Jilin province and the Baekdu mountain area within Jilin province; 10 ginseng samples were collected from each province. The elemental concentrations in the ginsengs were analyzed by using a neutron activation analysis technique at the HANARO research reactor. The distinction of production area was made by using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, though there was a limitation in that the number of samples analyzed was very small.

  5. METHODS AND MODELS FOR ANALYSIS OF THE ORGANIZATIONAL ECONOMICS ACTIVITY USED FOR DEVELOPMENT OF INFORMATICS SYSTEMS

    Directory of Open Access Journals (Sweden)

    TEODORA VĂTUIU

    2014-10-01

    Full Text Available The study of organizational activity and the highlighting of problem situations that require specific solutions call for a detailed analysis of the models defined for the real system of economic companies, regarded not as a sum of assets but as organizations in which activities are related into processes. In addition to the usual approach of using modeling languages in the development of information systems, in this paper we intend to present some examples that demonstrate the usefulness of a standard modeling language (UML) for analyzing organizational activities and for reporting problem situations that may occur in the management of data recorded on primary documents or in processes that bring together activities. Examples that have been focused on a travel agency can be extrapolated to any other organization, and the diagrams can be used in different contexts, depending on the complexity of the activities identified.

  6. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    International Nuclear Information System (INIS)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho

    2011-01-01

    During the last 2 years of the project, we have tried to develop the technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing health food safety for the public and the stability of the ginseng market. In this year, we collected ginseng samples cultivated in the northeastern provinces of the Chinese mainland, namely Liaoning province, Jilin province and the Baekdu mountain area within Jilin province; 10 ginseng samples were collected from each province. The elemental concentrations in the ginsengs were analyzed by using a neutron activation analysis technique at the HANARO research reactor. The distinction of production area was made by using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, though there was a limitation in that the number of samples analyzed was very small

  7. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  8. SYSTEM ANALYSIS OF MAJOR TRENDS IN DEVELOPMENT OF ADAPTIVE TRAFFIC FLOW MANAGEMENT METHODS

    Directory of Open Access Journals (Sweden)

    A. N. Klimovich

    2017-01-01

    Full Text Available Adaptive algorithms, on which current traffic systems are based, have existed for many decades. Information technologies have developed significantly over this period, which makes their application in the field of transport more relevant. This paper analyses modern trends in the development of adaptive traffic flow control methods. The most promising directions in the field of intelligent transport systems are reviewed, such as high-speed wireless communication between vehicles and road infrastructure based on technologies such as DSRC and WAVE; traffic jam prediction using features such as traffic flow information, congestion and vehicle velocity together with machine learning, fuzzy logic rules and genetic algorithms; and the application of driver assistance systems to increase vehicle autonomy. The advantages of such technologies for the safety, efficiency and usability of transport are shown. A multi-agent approach is described, which uses V2I communication between vehicles and an intersection controller to improve control efficiency thanks to more complete traffic flow information and the possibility of giving orders to individual vehicles. A number of algorithms are presented which use this approach to create a new generation of adaptive transport systems.

  9. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). So far, no measuring means were suited to assessing the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, together with the associated electronics and software. This new system is specific to the manoeuvre and to the user, and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented was performed on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactile perception. In spite of the inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of the displacements do not exceed 10 mm on either hand; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum pressure applied with the thoracic hand is about twice as high as with the abdominal hand; and the thrust of the manual compression lasts (590 ± 62) ms. Inter-operator measurements are in progress in order to generalize these results

  10. New developments in radiometrics and mass spectrometry methods for radionuclide analysis of environmental samples

    International Nuclear Information System (INIS)

    Povinec, P.P.; LaRosa, J.J.; Lee, S.H.; Wyse, E.

    2002-01-01

    The radionuclide levels observed at present in the environment are very low, therefore highly sensitive analytical systems are required for carrying out environmental investigations. One very important recent development in analytical techniques for low-level activity measurements is the production of large volume HPGe detectors (up to 200% relative efficiency with respect to 75 mm diameter x 75 mm long NaI(Tl) crystals). Their high efficiency and excellent energy resolution permit the analysis of various gamma-emitters in composite samples selectively and very often non-destructively (e.g. in sea sediments). However, this technique is restricted to gamma-emitters only (e.g. 7Be, 40K, 54Mn, 60Co, 137Cs, 210Pb, etc.). Other radionuclides frequently found in the marine environment are the pure beta-emitters, like 3H, 14C, 32Si, 32P, 90Sr, 241Pu, etc., where mainly liquid scintillation counting has made great improvements in recent years. However, for some of these radionuclides mass spectrometry methods represent a real breakthrough in low-level counting, e.g. 3He in-growth mass spectrometry for 3H, or accelerator mass spectrometry (AMS) for 14C

  11. Development and Analysis of Volume Multi-Sphere Method Model Generation using Electric Field Fitting

    Science.gov (United States)

    Ingram, G. J.

    Electrostatic modeling of spacecraft has wide-reaching applications such as detumbling space debris in the Geosynchronous Earth Orbit regime before docking, servicing and tugging space debris to graveyard orbits, and Lorentz augmented orbits. The viability of electrostatic actuation control applications relies on faster-than-realtime characterization of the electrostatic interaction. The Volume Multi-Sphere Method (VMSM) seeks the optimal placement and radii of a small number of equipotential spheres to accurately model the electrostatic force and torque on a conducting space object. Current VMSM models tuned using force and torque comparisons with commercially available finite element software are subject to the modeled probe size and numerical errors of the software. This work first investigates fitting of VMSM models to Surface-MSM (SMSM) generated electrical field data, removing modeling dependence on probe geometry while significantly increasing performance and speed. A proposed electric field matching cost function is compared to a force and torque cost function, the inclusion of a self-capacitance constraint is explored and 4 degree-of-freedom VMSM models generated using electric field matching are investigated. The resulting E-field based VMSM development framework is illustrated on a box-shaped hub with a single solar panel, and convergence properties of select models are qualitatively analyzed. Despite the complex non-symmetric spacecraft geometry, elegantly simple 2-sphere VMSM solutions provide force and torque fits within a few percent.
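
    A minimal sketch of the electric-field-matching idea is given below for a hypothetical 2-sphere model constrained to the body x-axis: the equipotential sphere charges follow from the elastance matrix, the field of the candidate model is evaluated at sampled probe points, and the sphere positions and radii are adjusted by least squares against reference field data (here generated synthetically as a stand-in for SMSM output).

        import numpy as np
        from scipy.optimize import least_squares

        K_C = 8.9875517923e9                      # Coulomb constant, 1/(4*pi*eps0)

        def msm_charges(centers, radii, phi=30.0e3):
            """Charges of an equipotential Multi-Sphere Method model held at potential phi."""
            n = len(radii)
            S = np.zeros((n, n))                  # elastance matrix
            for i in range(n):
                for j in range(n):
                    S[i, j] = K_C / radii[i] if i == j else \
                              K_C / np.linalg.norm(centers[i] - centers[j])
            return np.linalg.solve(S, np.full(n, phi))

        def e_field(points, centers, q):
            """Field at external points from the sphere model (point-charge superposition)."""
            E = np.zeros_like(points)
            for c, qi in zip(centers, q):
                r = points - c
                E += K_C * qi * r / np.linalg.norm(r, axis=1, keepdims=True) ** 3
            return E

        # synthetic reference field samples (stand-in for SMSM data), metres and volts
        rng = np.random.default_rng(2)
        pts = rng.uniform(-5.0, 5.0, size=(300, 3))
        pts = pts[np.linalg.norm(pts, axis=1) > 2.5]
        ref_centers = np.array([[-0.6, 0.0, 0.0], [0.9, 0.0, 0.0]])
        E_ref = e_field(pts, ref_centers, msm_charges(ref_centers, np.array([0.5, 0.7])))

        def residual(p):                          # p = [x1, x2, r1, r2] of the candidate model
            centers = np.array([[p[0], 0.0, 0.0], [p[1], 0.0, 0.0]])
            return (e_field(pts, centers, msm_charges(centers, p[2:4])) - E_ref).ravel()

        fit = least_squares(residual, x0=[-0.3, 0.5, 0.3, 0.3],
                            bounds=([-2.0, -2.0, 0.05, 0.05], [2.0, 2.0, 2.0, 2.0]))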

  12. Development of evaluation method for the quality of NPP MCR operators' communication using work domain analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, In Seok

    2010-02-01

    The evolution of work demands has driven industry toward computerization, which makes systems complex and complicated; this field is called Complex Socio-Technical Systems. As communication failure is one problem of Complex Socio-Technical Systems, it has been discovered that communication failure is the reason for many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the fact that there have been many studies on the severity of communication failure, there is no evaluation method for operators' communication quality in NPPs. Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. In order to develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, such as the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, in order to apply the proposed method, nine teams working in NPPs participated in the field simulation. The evaluation results reveal that operators' communication quality was higher when a larger portion of the components in the developed evaluation criteria was mentioned. Therefore, the proposed method could be a useful one for evaluating the communication quality in any complex system. In order to verify that the proposed method is meaningful for evaluating communication quality, the evaluation results were further investigated with objective performance measures. Further investigation of the evaluation results also supports the idea that the proposed method can be used in evaluating communication quality

  13. Development of soil-structure interaction analysis method (II) - Volume 1

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others

    1994-02-01

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strain; strong geometric nonlinearity due to the uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents, respectively: a state-of-the-art review of each item and construction of a database of past research; a theoretical review of soil-structure interaction analysis technology; proposal of a preferable technology and estimation of its domestic applicability; and proposal of guidelines for the evaluation of safety and of an analysis scheme

  14. GEM simulation methods development

    International Nuclear Information System (INIS)

    Tikhonov, V.; Veenhof, R.

    2002-01-01

    A review of methods used in the simulation of processes in gas electron multipliers (GEMs) and in the accurate calculation of detector characteristics is presented. Detector characteristics such as effective gas gain, transparency, charge collection and losses have been calculated and optimized for a number of GEM geometries and compared with experiment. A method and a new special program for the calculation of detector macro-characteristics, such as the signal response in a real detector readout structure and the spatial and time resolution of detectors, have been developed and used for detector optimization. A detailed treatment of signal induction on the readout electrodes and of the electronics characteristics is included in the new program. A method for the simulation of charging-up effects in GEM detectors is described. All methods show good agreement with experiment

  15. Analysis and development of the method for calculating calibration of the working plank in the cold tube roller rolling mills

    Directory of Open Access Journals (Sweden)

    S. V. Pilipenko

    2017-05-01

    Full Text Available The aim of this work is the analysis and development of the existing method for calculating the calibrated profile of the working planks of cold tube roller rolling (CTRR) mills, so as to ensure the required distribution of the energy-power parameters along the deformation cone. In this paper, which develops the existing method for calculating the profile of the calibrated working plank in cold tube roller rolling mills, an analysis was made and it was proposed to use Bézier curves when building the profile of the plank working surface. It was established that the use of a Bézier spline-curve for calculating the calibration of the supporting planks makes it possible to calculate the parameters proceeding from the reduction of the external diameter. The proposed method for calculating the deformation parameters in CTRR mills is a development of the existing method and as such constitutes its scientific novelty. A comparison of the plots of the distribution of the force parameters of the CTRR process along the deformation cone demonstrates the advantage of the proposed method. The decrease of the reduction value at the end of the deformation zone favors the manufacture of tubes with a smaller wall thickness deviation (especially the longitudinal one, caused by the waviness induced by the cold pilgering process). This continues the further development of the method of calculating the deformation parameters of CTRR: it is proposed to use a Bézier spline for the calculation of the calibration of the working surface of the support plank of CTRR mills. The practical significance of the proposed method consists in the fact that calculating all zones of the plank by means of one dependence simplifies the process of manufacturing the plank on machines with numerical control. In this case the change of the reduction parameters over the thickness of the wall will not exert a considerable influence on the character of the force parameters (the character and not the value distribution along the
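
    The role of the Bézier curve in such a calibration can be illustrated with a short sketch: a single cubic Bézier segment describes the reduction schedule along the working cone, so that the whole profile is given by one dependence and the reduction per step decreases toward the exit end. The control-point values below are made up for the illustration and are not the paper's calibration data.

        import numpy as np

        def cubic_bezier(p0, p1, p2, p3, n=101):
            """Evaluate a cubic Bezier curve defined by control points p0..p3 (2D arrays)."""
            t = np.linspace(0.0, 1.0, n)[:, None]
            return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
                    + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

        # hypothetical working-surface profile: x is the position along the working cone (mm),
        # y is the accumulated radial reduction (mm)
        p0 = np.array([0.0, 0.0])         # start of the reduction zone
        p1 = np.array([60.0, 0.5])        # shapes the initial, faster reduction
        p2 = np.array([140.0, 2.6])       # flattens the curve toward the exit end
        p3 = np.array([200.0, 3.0])       # total reduction at the end of the zone

        profile = cubic_bezier(p0, p1, p2, p3)
        reduction_per_step = np.diff(profile[:, 1])   # decreases toward the exit end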

  16. Development of systematic evaluation method on nonlinear behavior of the constructions using repeated finite element method analysis

    International Nuclear Information System (INIS)

    Kasahara, Naoto

    1997-01-01

    Supposing that the nuclear reactor stops for any reason, the temperature of the coolant flowing out of the reactor core will decrease, and the temperature of the components in contact with the coolant in the nuclear plant equipment also decreases in response. On the other hand, the temperature response of portions not in contact with the coolant is delayed, so that a thermal stress forms due to the temperature difference. In particular, a stress exceeding the yield value is generated at structural discontinuities due to stress concentration, which could possibly form a creep-fatigue crack if such thermal stress is repeated at high temperature. The Power Reactor and Nuclear Fuel Development Corporation has been developing, since FY 1994, a real-time transient thermal stress simulation code for calculating, quickly and in one pass, the thermal stress formed within a structure accompanying temperature changes of the coolant; since FY 1995, the development of an FEM simulation technique from the macroscopic to the microscopic region, covering regions from the structure level down to the material texture, has been promoted. In the future, the development of a total simulation technique connecting both, and of an optimum design technique based on its results, is planned. (G.K.)

  17. Computational Fluid Dynamics Analysis Method Developed for Rocket-Based Combined Cycle Engine Inlet

    Science.gov (United States)

    1997-01-01

    Renewed interest in hypersonic propulsion systems has led to research programs investigating combined cycle engines that are designed to operate efficiently across the flight regime. The Rocket-Based Combined Cycle Engine is a propulsion system under development at the NASA Lewis Research Center. This engine integrates a high specific impulse, low thrust-to-weight, airbreathing engine with a low-impulse, high thrust-to-weight rocket. From takeoff to Mach 2.5, the engine operates as an air-augmented rocket. At Mach 2.5, the engine becomes a dual-mode ramjet; and beyond Mach 8, the rocket is turned back on. One Rocket-Based Combined Cycle Engine variation known as the "Strut-Jet" concept is being investigated jointly by NASA Lewis, the U.S. Air Force, Gencorp Aerojet, General Applied Science Labs (GASL), and Lockheed Martin Corporation. Work thus far has included wind tunnel experiments and computational fluid dynamics (CFD) investigations with the NPARC code. The CFD method was initiated by modeling the geometry of the Strut-Jet with the GRIDGEN structured grid generator. Grids representing a subscale inlet model and the full-scale demonstrator geometry were constructed. These grids modeled one-half of the symmetric inlet flow path, including the precompression plate, diverter, center duct, side duct, and combustor. After the grid generation, full Navier-Stokes flow simulations were conducted with the NPARC Navier-Stokes code. The Chien low-Reynolds-number k-e turbulence model was employed to simulate the high-speed turbulent flow. Finally, the CFD solutions were postprocessed with a Fortran code. This code provided wall static pressure distributions, pitot pressure distributions, mass flow rates, and internal drag. These results were compared with experimental data from a subscale inlet test for code validation; then they were used to help evaluate the demonstrator engine net thrust.

  18. Development of chromatographic methods for analysis of sulfamethoxazole, trimethoprim, their degradation products and preservatives in syrup

    Directory of Open Access Journals (Sweden)

    Perović Ivana

    2014-01-01

    Full Text Available In this paper the experimental conditions for optimal reversed-phase liquid chromatographic (RP-HPLC determination of sulfamethoxazole, trimethoprim and preservatives, as well as degradation products of sulfamethoxazole and trimethoprim in syrup were defined. The determination of active compounds and preservatives was carried out on Zorbax Eclipse XDB-C18, 150 mm × 4.6 mm, 5 μm particle size column, mobile phase flow rate was 1.5 mL min-1, and detection at 235 nm for the active compounds and 254 nm for preservatives. Mobile phase A consisted of 150 mL of acetonitrile, 850 mL of water and 1 mL of triethanolamine (pH 5.90 adjusted with diluted acetic acid, while mobile phase B was acetonitrile. The mobile phase ratio was defined by the gradient program. For the determination of degradation products Zorbax Eclipse Plus C18, 100 mm x 4.6 mm, 3.5 μm particle size column was used, the mobile phase flow rate was 0.5 mL min-1 and detection at 210 nm for 3,4,5-trimethoxybenzoic acid and 254 nm for sulfanilic acid and sulfanilamide. Mobile phase A was 50 mM potassium dihydrogenphosphate (pH 5.60 adjusted with a 0.5 mol L-1 potassium hydroxide, while mobile phase B was acetonitrile. The mobile phase ratio was defined by the gradient program. Through the validation of the developed methods their efficiency and reliability is confirmed and consequently the adequacy for the routine control.

  19. Development of multielement neutron-capture prompt γ-rays activation analysis method

    International Nuclear Information System (INIS)

    Liu Yuren; Xie Yali; Zhao Yunzhi; Liu Jiping; Meng Bonian

    1998-01-01

    The relationship between the content of the measured elements and the area of the typical prompt γ-ray peaks is presented. The root-mean-square errors between the regression values of the instrumental analysis and the chemical analysis for some common elements are lower than 0.5 wt%. The function of the slowing-down body was established, and the analysis sensitivity was noticeably enhanced in the iron ore analysis. The FWHM of the spectrometer for the H prompt γ-ray peak (2.223 MeV) is 3 keV

  20. Development of a quantitative method for the analysis of cocaine analogue impregnated into textiles by Raman spectroscopy.

    Science.gov (United States)

    Xiao, Linda; Alder, Rhiannon; Mehta, Megha; Krayem, Nadine; Cavasinni, Bianca; Laracy, Sean; Cameron, Shane; Fu, Shanlin

    2018-04-01

    Cocaine trafficking in the form of textile impregnation is routinely encountered as a concealment method. Raman spectroscopy has been a popular and successful testing method used for in situ screening of cocaine in textiles and other matrices. Quantitative analysis of cocaine in these matrices using Raman spectroscopy has not been reported to date. This study aimed to develop a simple Raman method for quantifying cocaine using atropine as the model analogue in various types of textiles. Textiles were impregnated with solutions of atropine in methanol. The impregnated atropine was extracted using less hazardous acidified water with the addition of potassium thiocyanate (KSCN) as an internal standard for Raman analysis. Despite the presence of background matrix signals arising from the textiles, the cocaine analogue could easily be identified by its characteristic Raman bands. The successful use of KSCN normalised the variation in analyte signal response due to different textile matrix background interferences and thus removed the need for a matrix-matched calibration. The method was linear over a concentration range of 6.25-37.5 mg/cm2 with a coefficient of determination (R2) of 0.975 and acceptable precision and accuracy. A simple and accurate Raman spectroscopy method for the analysis and quantification of a cocaine analogue impregnated into textiles has been developed and validated for the first time. This proof-of-concept study has demonstrated that atropine can act as an ideal model compound to study the problem of cocaine impregnation in textiles. The method has the potential to be further developed and implemented in real-world forensic cases. Copyright © 2017 John Wiley & Sons, Ltd.
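
    A sketch of the internal-standard calibration underlying the quantification step is shown below, with made-up intensities and loadings (the real method uses the characteristic atropine band and the KSCN band measured on the instrument): normalizing the analyte band to the KSCN band removes matrix-dependent intensity variations, and a simple linear fit then converts a band ratio to a surface loading.

        import numpy as np

        # hypothetical calibration data: peak intensities of the analyte band and of the
        # KSCN internal-standard band for samples at known surface loadings (mg/cm2)
        loading = np.array([6.25, 12.5, 18.75, 25.0, 31.25, 37.5])
        i_analyte = np.array([410.0, 830.0, 1190.0, 1640.0, 2010.0, 2450.0])
        i_kscn = np.array([1000.0, 1020.0, 980.0, 1010.0, 990.0, 1005.0])

        # normalizing to the internal standard removes textile-matrix intensity effects
        ratio = i_analyte / i_kscn

        # linear calibration: ratio = slope * loading + intercept
        slope, intercept = np.polyfit(loading, ratio, 1)
        pred = slope * loading + intercept
        ss_res = np.sum((ratio - pred) ** 2)
        ss_tot = np.sum((ratio - ratio.mean()) ** 2)
        r_squared = 1.0 - ss_res / ss_tot

        # quantify an unknown sample from its measured band ratio
        unknown_ratio = 1.32
        unknown_loading = (unknown_ratio - intercept) / slope    # mg/cm2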

  1. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. The instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on the consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  2. Development of Optimized Core Design and Analysis Methods for High Power Density BWRs

    Science.gov (United States)

    Shirvan, Koroush

    temperature was kept the same for the BWR-HD and ABWR, which resulted in a 4 K cooler core inlet temperature for the BWR-HD, given that its feedwater makes up a larger fraction of the total core flow. The stability analysis using the STAB and S3K codes showed satisfactory results for the hot channel, coupled regional out-of-phase and coupled core-wide in-phase modes. A RELAP5 model of the ABWR system was constructed and applied to six transients for the BWR-HD and ABWR. The ΔMCPRs during all the transients were found to be equal or smaller for the new design and the core remained covered for both. The lower void coefficient along with the smaller core volume proved to be an advantage for the simulated transients. Helical Cruciform Fuel (HCF) rods were proposed in prior MIT studies to enhance the fuel surface to volume ratio. In this work, higher fidelity models (e.g. CFD instead of subchannel methods for the hydraulic behaviour) are used to investigate the resolution needed for an accurate assessment of the HCF design. For neutronics, conserving the fuel area of cylindrical rods results in a different reactivity level with a lower void coefficient for the HCF design. In single-phase flow, for which experimental results existed, the friction factor is found to be sensitive to the HCF geometry and cannot be calculated using current empirical models. A new approach for the analysis of flow crisis conditions for HCF rods, in the context of Departure from Nucleate Boiling (DNB) and dryout, using the two-phase interface tracking method was proposed and initial results are presented. It is shown that the twist of the HCF rods promotes detachment of a vapour bubble along the elbows, which indicates no possibility of an early DNB for the HCF rods and in fact a potential for a higher DNB heat flux. Under annular flow conditions, it was found that the twist suppressed the liquid film thickness on the HCF rods at the locations of the highest heat flux, which increases the possibility of reaching early dryout. It

  3. Development of an environment-insensitive PWR radial reflector model applicable to modern nodal reactor analysis method

    International Nuclear Information System (INIS)

    Mueller, E.M.

    1989-05-01

    This research is concerned with the development and analysis of methods for generating equivalent nodal diffusion parameters for the radial reflector of a PWR. The requirement that the equivalent reflector data be insensitive to changing core conditions is set as a principal objective. Hence, the environment dependence of the currently most reputable nodal reflector models, almost all of which are based on the nodal equivalence theory homogenization methods of Koebke and Smith, is investigated in detail. For this purpose, a special 1-D nodal equivalence theory reflector model, called the NGET model, is developed and used in 1-D and 2-D numerical experiments. The results demonstrate that these modern radial reflector models exhibit sufficient sensitivity to core conditions to warrant the development of alternative models. A new 1-D nodal reflector model, which is based on a novel combination of the nodal equivalence theory and the response matrix homogenization methods, is developed. Numerical results verify that this homogenized baffle/reflector model, which is called the NGET-RM model, is highly insensitive to changing core conditions. It is also shown that the NGET-RM model is not inferior to any of the existing 1-D nodal reflector models and that it has features which make it an attractive alternative model for multi-dimensional reactor analysis. 61 refs., 40 figs., 36 tabs

  4. Development of Distinction Method of Production Area of Ginsengs by Using a Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chung, Yong Sam; Sun, Gwang Min; Lee, Yu Na; Yoo, Sang Ho [KAERI, Daejeon (Korea, Republic of)

    2010-05-15

    Distinction of the production area of Korean ginsengs has been attempted by using neutron activation techniques such as instrumental neutron activation analysis (INAA) and prompt gamma activation analysis (PGAA). The distribution of elements varies according to the part of the plant due to differences in enrichment effects and the influence of the soil in which the plants have been grown, so the correlation between plants and soil has been an issue. In this study, the distribution of trace elements within a Korean ginseng was investigated by using an instrumental neutron activation analysis

  5. Comparison of critical methods developed for fatty acid analysis: A review.

    Science.gov (United States)

    Wu, Zhuona; Zhang, Qi; Li, Ning; Pu, Yiqiong; Wang, Bing; Zhang, Tong

    2017-01-01

    Fatty acids are important nutritional substances and metabolites in living organisms. These acids are abundant in Chinese herbs, such as Brucea javanica, Notopterygium forbesii, Isatis tinctoria, Astragalus membranaceus, and Aconitum szechenyianum. This review illustrates the types of fatty acids and their significant roles in the human body. Many analytical methods are used for the qualitative and quantitative evaluation of fatty acids. Some of the methods used to analyze fatty acids in more than 30 kinds of plants, drugs, and other samples are presented in this paper. These analytical methods include gas chromatography, liquid chromatography, near-infrared spectroscopy, and NMR spectroscopy. The advantages and disadvantages of these techniques are described and compared. This review provides a valuable reference for establishing methods for fatty acid determination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique

  7. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    Science.gov (United States)

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.

  8. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    Science.gov (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-05

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    OpenAIRE

    Gunawan, Hendra; Micheldiament, Micheldiament; Mikhailov, Valentin

    2008-01-01

    http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under an assumption of a free-air anomaly consisting ...

  10. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  11. The analysis of lipophilic marine toxins : development of an alternative method

    NARCIS (Netherlands)

    Gerssen, A.

    2010-01-01

    Lipophilic marine toxins are produced by certain algae species and can accumulate in filter feeding shellfish such as mussels, scallops and oysters. Consumption of contaminated shellfish can lead to severe intoxications such as diarrhea, abdominal cramps and vomiting. Methods described in

  12. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The purposes of this study are to develop the safety evaluation methods and analysis codes needed in the design and construction stage of a fast breeder reactor (FBR). In JFY 2012, the following results were obtained. As for the development of safety evaluation methods needed in the safety examination conducted for the reactor establishment permission, development of the analysis codes, such as the core damage analysis code, was carried out following the planned schedule. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique for the event tree using the Continuous Markov chain Monte Carlo method (CMMC method) was studied. (author)
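
    As a rough illustration of how an event tree can be quantified by random sampling, the sketch below uses a plain Monte Carlo stand-in for the CMMC technique named above; the branch names and success probabilities are hypothetical and the real method samples continuous-time Markov transitions of the plant state.

```python
# Illustrative sketch only: plain Monte Carlo quantification of a small event tree.
import random

# Hypothetical branch success probabilities following an initiating event.
BRANCHES = {"reactor_trip": 0.999, "emergency_cooling": 0.98, "heat_removal": 0.95}

def simulate_sequence(rng: random.Random) -> str:
    """Walk the event tree once and return the resulting end state."""
    for system, p_success in BRANCHES.items():
        if rng.random() > p_success:          # this branch fails
            return f"core_damage_via_{system}"
    return "safe_shutdown"

def quantify(n_trials: int = 1_000_000, seed: int = 1) -> dict:
    """Estimate end-state frequencies from repeated random walks of the tree."""
    rng = random.Random(seed)
    counts: dict = {}
    for _ in range(n_trials):
        state = simulate_sequence(rng)
        counts[state] = counts.get(state, 0) + 1
    return {state: c / n_trials for state, c in counts.items()}

if __name__ == "__main__":
    for state, freq in sorted(quantify().items()):
        print(f"{state:35s} {freq:.2e}")
```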

  13. Developments in Surrogating Methods

    Directory of Open Access Journals (Sweden)

    Hans van Dormolen

    2005-11-01

    Full Text Available In this paper, I would like to talk about the developments in surrogating methods for preservation. My main focus will be on the technical aspects of preservation surrogates. This means that I will tell you something about my job as Quality Manager Microfilming for the Netherlands’ national preservation program, Metamorfoze, which is coordinated by the National Library. I am responsible for the quality of the preservation microfilms, which are produced for Metamorfoze. Firstly, I will elaborate on developments in preservation methods in relation to the following subjects: · Preservation microfilms · Scanning of preservation microfilms · Preservation scanning · Computer Output Microfilm. In the closing paragraphs of this paper, I would like to tell you something about the methylene blue test. This is an important test for long-term storage of preservation microfilms. Also, I will give you a brief report on the Cellulose Acetate Microfilm Conference that was held in the British Library in London, May 2005.

  14. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model Originating

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  15. Quantitative analysis of concrete using portable x-ray fluorescence: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Washington, Aaron L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Narrows, William [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Christian, Jonathan H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Msgwood, Leroy [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-07-27

    During Decommissioning and Demolition (D&D) activities at SRS, it is important that the building be screened for radionuclides and heavy metals to ensure that the proper safety and disposal metrics are in place. A major source of contamination at DOE facilities is the accumulation of mercury from nuclear material processing and the Liquid Waste System (LWS). This buildup of mercury could harm a demolition crew or the environment should the material be released. The current standard method is to take core samples at various places in the facility and use X-ray fluorescence (XRF) to detect the contamination. This standard method comes at a high financial cost due to the security levels of these sampled facilities with unknown contamination levels. Herein we propose the use of portable XRF units to detect this contamination on-site. To validate this method, the instrument has to be calibrated to detect the heavy metal contamination, be both precise with respect to the known elemental concentrations and consistent in its results for a sample concrete and a pristine contaminant, and be able to detect changes in the sample concrete's composition. After receiving the various concrete samples with their compositions determined by a wavelength-dispersive XRF method, the linear regressions of the calibration factors were adjusted to give the baseline concentration of the concrete with no contamination. Samples of both concrete and concrete/flyash were evaluated; their standard deviations revealed that the measurements were consistent with the known composition. Finally, the samples were contaminated with different concentrations of sodium tungstate dihydrate, allowed to air dry, and measured. When the contaminated samples were analyzed, the heavy metal contamination was seen within the spectrum of the instrument, but there was not a trend of quantification based on the concentration of the solution.
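
    A minimal sketch of the calibration-adjustment step described above, assuming a simple linear regression between laboratory reference concentrations and portable-XRF readings; the numerical values are hypothetical, not the SRNL calibration data.

```python
# Hedged illustration: regress laboratory (WD-XRF) reference values against
# portable-XRF readings and use the fit to correct new readings.  All numbers
# below are made up for the example.
import numpy as np

# Tungsten concentration (ppm): laboratory reference vs. portable-XRF reading.
reference = np.array([0.0, 250.0, 500.0, 1000.0, 2000.0])
portable  = np.array([15.0, 230.0, 470.0, 940.0, 1890.0])

slope, intercept = np.polyfit(portable, reference, 1)   # calibration factors

def corrected(reading_ppm: float) -> float:
    """Apply the linear calibration to a raw portable-XRF reading."""
    return slope * reading_ppm + intercept

raw = 610.0
print(f"raw reading {raw:.0f} ppm -> calibrated {corrected(raw):.0f} ppm")
```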

  16. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  17. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country

    Science.gov (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, the introduction of the Japanese engineering education model in developing countries to improve the quality of education and research activity is increasing. A naive implementation of such a model in different cultures and educational systems may lead to several problems. In this paper, we evaluated the Project Based Learning (PBL) class developed at Waseda University in Japan and applied to the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as non-homogeneous student backgrounds, disconnection from the students' research, weak adaptation of learning styles, and irregular course conduction. To solve these difficulties at E-JUST, we proposed the introduction of groupware, project theme choice based on student motivation, and curriculum modification.

  18. Development of an Evaluation Method for Team Safety Culture Competencies using Social Network Analysis

    International Nuclear Information System (INIS)

    Han, Sang Min; Kim, Ar Ryum; Seong, Poong Hyun

    2016-01-01

    In this study, the team safety culture competency of a team was estimated through SNA as a team safety culture index. To overcome the limits of existing safety culture evaluation methods, the concepts of competency and SNA were adopted. To estimate team safety culture competency, we established the definition, range and goal of team safety culture competencies. Core team safety culture competencies were derived, and their behavioral characteristics were identified for each competency from the procedures used in NPPs and from existing criteria for assessing safety culture. Observation was then chosen as the method for providing the input data for the SNA matrix of team members versus insufficient team safety culture competencies. Through matrix operations, the matrix was converted into two meaningful values: the density of team members and the degree centrality of each team safety culture competency. The density of team members and the degree centrality of each team safety culture competency represent the team safety culture index and the priority of the team safety culture competencies to be improved
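
    A minimal numeric sketch of the matrix operations described above, with invented member and competency labels; it is not the authors' implementation, only an illustration of how a density index and degree centralities fall out of an incidence matrix.

```python
# Sketch: team-member-by-competency incidence matrix reduced to a network
# "density" and per-competency degree centralities (labels are hypothetical).
import numpy as np

members = ["RO", "SRO", "TO", "STA"]
competencies = ["communication", "questioning_attitude", "procedure_adherence"]

# 1 = an insufficient behaviour for that competency was observed for that member.
A = np.array([[1, 0, 1],
              [0, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])

density = A.sum() / A.size                       # share of possible member-competency ties observed
degree_centrality = A.sum(axis=0) / A.shape[0]   # fraction of members flagged per competency

print(f"team density index: {density:.2f}")
for name, c in zip(competencies, degree_centrality):
    print(f"{name:22s} centrality = {c:.2f}")
```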

  19. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under an assumption of a free-air anomaly consisting of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano-Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except for the edifice area where the average topography density estimate is 2.21 g/cm3 and where Bouguer density estimates from the previous gravity survey of 1975 are 2.67 g/cm3. The uncertainty of the Bouguer density at La Soufriere Volcano was estimated to be 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
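
    To make the minimum-correlation idea concrete, here is a hedged Python sketch of a Nettleton-style density scan on synthetic data; the station elevations, noise level and density grid are placeholders, not the La Soufriere survey.

```python
# Hedged illustration of the Nettleton idea: scan trial densities and keep the
# one whose simple-slab Bouguer anomaly is least correlated with topography.
import numpy as np

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e5                         # m/s^2 -> mGal

def bouguer_anomaly(free_air_mgal, elevation_m, density_g_cm3):
    rho = density_g_cm3 * 1000.0                     # g/cm^3 -> kg/m^3
    slab_correction = 2 * np.pi * G * rho * elevation_m * MGAL
    return free_air_mgal - slab_correction

def nettleton_density(free_air_mgal, elevation_m, trial_densities):
    """Return the trial density minimising |corr(Bouguer anomaly, topography)|."""
    corrs = [abs(np.corrcoef(bouguer_anomaly(free_air_mgal, elevation_m, d),
                             elevation_m)[0, 1])
             for d in trial_densities]
    return trial_densities[int(np.argmin(corrs))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h = 200.0 + 150.0 * rng.random(50)               # synthetic station elevations (m)
    true_rho = 2.67
    fa = 2 * np.pi * G * true_rho * 1000 * h * MGAL + rng.normal(0, 0.3, h.size)
    trial = np.round(np.arange(2.0, 3.01, 0.01), 2)
    print("estimated Bouguer density:", nettleton_density(fa, h, trial), "g/cm^3")
```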

  20. Development of an Evaluation Method for Team Safety Culture Competencies using Social Network Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Min; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, the team safety culture competency of a team was estimated through SNA as a team safety culture index. To overcome the limits of existing safety culture evaluation methods, the concepts of competency and SNA were adopted. To estimate team safety culture competency, we established the definition, range and goal of team safety culture competencies. Core team safety culture competencies were derived, and their behavioral characteristics were identified for each competency from the procedures used in NPPs and from existing criteria for assessing safety culture. Observation was then chosen as the method for providing the input data for the SNA matrix of team members versus insufficient team safety culture competencies. Through matrix operations, the matrix was converted into two meaningful values: the density of team members and the degree centrality of each team safety culture competency. The density of team members and the degree centrality of each team safety culture competency represent the team safety culture index and the priority of the team safety culture competencies to be improved.

  1. Developing a digital photography-based method for dietary analysis in self-serve dining settings.

    Science.gov (United States)

    Christoph, Mary J; Loman, Brett R; Ellison, Brenna

    2017-07-01

    Current population-based methods for assessing dietary intake, including food frequency questionnaires, food diaries, and 24-h dietary recall, are limited in their ability to objectively measure food intake. Digital photography has been identified as a promising addition to these techniques but has rarely been assessed in self-serve settings. We utilized digital photography to examine university students' food choices and consumption in a self-serve dining hall setting. Research assistants took pre- and post-photos of students' plates during lunch and dinner to assess selection (presence), servings, and consumption of MyPlate food groups. Four coders rated the same set of approximately 180 meals for inter-rater reliability analyses; approximately 50 additional meals were coded twice by each coder to assess intra-rater agreement. Inter-rater agreement on the selection, servings, and consumption of food groups was high at 93.5%; intra-rater agreement was similarly high with an average of 95.6% agreement. Coders achieved the highest rates of agreement in assessing if a food group was present on the plate (95-99% inter-rater agreement, depending on food group) and estimating the servings of food selected (81-98% inter-rater agreement). Estimating consumption, particularly for items such as beans and cheese that were often in mixed dishes, was more challenging (77-94% inter-rater agreement). Results suggest that the digital photography method presented is feasible for large studies in real-world environments and can provide an objective measure of food selection, servings, and consumption with a high degree of agreement between coders; however, to make accurate claims about the state of dietary intake in all-you-can-eat, self-serve settings, researchers will need to account for the possibility of diners taking multiple trips through the serving line. Copyright © 2017 Elsevier Ltd. All rights reserved.
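
    A small sketch of the pairwise percent-agreement statistic reported above, using invented coder ratings; the study's actual coding scheme and meal records are not reproduced.

```python
# Simple illustration of inter-rater percent agreement across coder pairs.
from itertools import combinations

# Each coder's rating per (meal, food group): servings selected (0 = absent).
coder_ratings = {
    "coder_A": {("meal1", "grains"): 1, ("meal1", "dairy"): 0, ("meal2", "protein"): 2},
    "coder_B": {("meal1", "grains"): 1, ("meal1", "dairy"): 0, ("meal2", "protein"): 1},
    "coder_C": {("meal1", "grains"): 1, ("meal1", "dairy"): 1, ("meal2", "protein"): 2},
}

def pairwise_percent_agreement(ratings: dict) -> float:
    """Average share of items on which each pair of coders gives the same code."""
    pair_scores = []
    for a, b in combinations(ratings, 2):
        items = ratings[a].keys() & ratings[b].keys()
        same = sum(ratings[a][i] == ratings[b][i] for i in items)
        pair_scores.append(same / len(items))
    return sum(pair_scores) / len(pair_scores)

print(f"inter-rater agreement: {pairwise_percent_agreement(coder_ratings):.1%}")
```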

  2. Development of partitioning method

    International Nuclear Information System (INIS)

    Kubota, Kazuo; Dojiri, Shigeru; Kubota, Masumitsu

    1988-10-01

    A literature survey was carried out on the amount of natural resources, the behavior in the reprocessing process, and the separation and recovery methods of the platinum group elements and technetium contained in spent fuel. The essential results are described below. (1) The platinum group elements contained in spent fuel are quantitatively limited compared with the total demand for them in Japan, and the estimated separation and recovery cost is rather high. In spite of that, development of these techniques is considered very important because the supply of these elements in Japan comes almost entirely from foreign resources. (2) For recovery of these elements, studies of recovery from undissolved residue and from high level liquid waste (HLLW) also seem to be required. (3) As separation and recovery methods, the following techniques are considered effective: lead extraction, liquid metal extraction, solvent extraction, ion exchange, adsorption, precipitation, distillation, electrolysis, or their combination. (4) However, each of these methods has both advantages and disadvantages, so the development of such processes largely depends on future work. (author) 94 refs

  3. Methodical Approaches To Analysis And Forecasting Of Development Fuel And Energy Complex And Gas Industry In The Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov

    2014-12-01

    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, in which the interests of all economic entities intertwine. To ensure the economic growth of a region, an internal balance of energy resources must be maintained, developed with account for the regional specifics of economic growth and energy security. The study examined the status of this equilibrium, as indicated by the fuel and energy balance of the region (TEB). The aim of the research is the development of the fuel and energy balance, which makes it possible to determine exactly how many and which resources are insufficient to support the regional development strategy and which resources need to be brought in. The energy balance brings into focus all issues of regional development, so the TEB is necessary both as a mechanism for analysing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, energy threats, and ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that the development of the regional fuel and energy balance has to go beyond the energy sector itself, involving the analysis of other sectors of the economy as well as systems such as the banking, budgetary, legislative, and tax systems. Due to the complexity of the problems discussed, there is an obvious need to develop an appropriate forecasting and analytical system allowing regional authorities to make evidence-based predictions of the consequences of management decisions. A multivariant scenario study of the development of the fuel and energy complex, and of the gas industry separately, should use the methods of project-based management and a harmonized application of state regulation of strategic and market mechanisms to the operational directions of development of the fuel and energy complex and the gas industry in the regional economy.

  4. RETROSPECTIVE ANALYSIS OF FREIGHT CARS REPAIR ORGANIZATION METHODS IN THE DEPOT AND THE WAYS OF THEIR FURTHER DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    V. V. Myamlin

    2010-05-01

    Full Text Available A critical analysis of existing methods for the repair of freight wagons is presented. It is concluded that, given the probabilistic nature of repair activities, the “classic” type of “rigid” production line with a regulated cycle step is inexpedient in the long term. The further development of production-line wagon repair is seen in the creation of advanced enterprises equipped with multi-object flexible asynchronous systems with a high level of mechanization and automation of technological processes.

  5. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for infants 1-2 years old was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the quantitative value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P < 0.05), consistent with normal brain development.
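
    For readers unfamiliar with how such fractal dimensions are typically computed, a hedged box-counting sketch on a placeholder binary image follows; the paper's exact algorithm is not specified, so this is only illustrative.

```python
# Sketch of a 2-D box-counting estimate of fractal dimension.
import numpy as np

def box_counting_dimension(binary_image: np.ndarray) -> float:
    """Estimate the box-counting dimension of a square binary image."""
    n = binary_image.shape[0]
    sizes = [s for s in (2, 4, 8, 16, 32) if s < n]
    counts = []
    for s in sizes:
        # Count boxes of side s that contain at least one foreground pixel.
        trimmed = binary_image[: n - n % s, : n - n % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    # Dimension = slope of log(box count) vs. log(1 / box size).
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    img = rng.random((128, 128)) > 0.6          # placeholder for a segmented CT slice
    print(f"estimated fractal dimension: {box_counting_dimension(img):.3f}")
```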

  6. Method development for the analysis of nitrotoluenes, nitramines and other organic compounds in ammunition waste water

    International Nuclear Information System (INIS)

    Mussmann, P.; Preiss, A.; Levsen, K.; Wuensch, G.

    1994-01-01

    Gas chromatography and high performance liquid chromatography were used to determine explosives and their by- and degradation products near the former ammunition plant Elsnig in Saxony. Previously developed enrichment procedures using liquid/liquid and solid-phase extraction were used to investigate ground and surface water samples. Mono-, di- and trinitrotoluenes as well as aminonitro- and chlorinated nitroaromatics were identified and quantified using GC/MS, the electron capture detector (ECD) and the nitrogen-phosphorus detector (NPD). In addition, some nitrophenols were identified in ground water. RDX, which can hardly be determined by GC, was quantified using high performance liquid chromatography; identification was performed from the UV spectra using a photodiode array detector. (orig.)

  7. Development of improved methods for remote access of DIII-D data and data analysis

    International Nuclear Information System (INIS)

    Greene, K.L.; McHarg, B.B. Jr.

    1997-11-01

    The DIII-D tokamak is a national fusion research facility. There is an increasing need to access data from remote sites in order to facilitate data analysis by collaborative researchers at remote locations, both nationally and internationally. In the past, this has usually been done by remotely logging into computers at the DIII-D site. With the advent of faster networking and powerful computers at remote sites, it is becoming possible to access and analyze data from anywhere in the world as if the remote user were actually at the DIII-D site. The general mechanism for accessing DIII-D data has always been via the PTDATA subroutine. Substantial enhancements are being made to that routine to make it more useful in a non-local environment. In particular, a caching mechanism is being built into PTDATA to make network data access more efficient. Studies are also being made of using Distributed File System (DFS) disk storage in a Distributed Computing Environment (DCE). A data server has been created that will migrate, on request, shot data from the DIII-D environment into the DFS environment
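
    The caching idea mentioned above can be illustrated with a generic sketch; PTDATA's real interface is not reproduced here, and fetch_remote(), get_signal() and the cache directory are hypothetical names introduced only for the example.

```python
# Generic local-disk cache around a slow remote data fetch (illustrative only).
import os
import pickle

CACHE_DIR = os.path.expanduser("~/.d3d_cache")   # hypothetical local cache location

def fetch_remote(shot: int, signal: str) -> list:
    """Placeholder for the slow network access to the experiment's data store."""
    return [shot, signal]                        # pretend payload

def get_signal(shot: int, signal: str) -> list:
    """Return cached data if present, otherwise fetch once and cache to disk."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{shot}_{signal}.pkl")
    if os.path.exists(path):
        with open(path, "rb") as fh:
            return pickle.load(fh)
    data = fetch_remote(shot, signal)
    with open(path, "wb") as fh:
        pickle.dump(data, fh)
    return data

if __name__ == "__main__":
    print(get_signal(123456, "ip"))   # first call fetches, later calls hit the cache
```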

  8. Development of a sensitive GC-C-IRMS method for the analysis of androgens.

    Science.gov (United States)

    Polet, Michael; Van Gansbeke, Wim; Deventer, Koen; Van Eenoo, Peter

    2013-02-01

    The administration of anabolic steroids is one of the most important issues in doping control and is detectable through a change in the carbon isotopic composition of testosterone and/or its metabolites. Gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS), however, remains a very laborious and expensive technique and substantial amounts of urine are needed to meet the sensitivity requirements of the IRMS. This can be problematic because only a limited amount of urine is available for anti-doping analysis on a broad spectrum of substances. In this work we introduce a new type of injection that increases the sensitivity of GC-C-IRMS by a factor of 13 and reduces the limit of detection, simply by using solvent vent injections instead of splitless injection. This drastically reduces the amount of urine required. On top of that, by only changing the injection technique, the detection parameters of the IRMS are not affected and there is no loss in linearity. Copyright © 2012 John Wiley & Sons, Ltd.

  9. Development of task analysis method for operator tasks in main control room of an advanced nuclear power plant

    International Nuclear Information System (INIS)

    Lin Chiuhsiangloe; Hsieh Tsungling

    2016-01-01

    Task analysis methods provide insight for quantitative and qualitative predictions of how people will use a proposed system, though the different versions have different emphases. Most of the methods can attest to the coverage of the functionality of a system, and all provide estimates of task performance time. However, most of the tasks that operators deal with in the digital work environment of the main control room of an advanced nuclear power plant require high mental activity. Such mental tasks overlap and must be dealt with at the same time; most of them can be assumed to be highly parallel in nature. Therefore, the primary aim addressed in this paper was to develop a method that adopts CPM-GOMS (Cognitive Perceptual Motor analysis of Goals, Operators, Methods, and Selection rules) as the basic pattern of mental task analysis for the advanced main control room. A within-subjects experimental design was used to examine the validity of the modified CPM-GOMS. Thirty participants took part in two task types, comprising high- and low-compatibility types. The results indicated that performance was significantly higher on the high-compatibility task type than on the low-compatibility task type; that is, the modified CPM-GOMS could distinguish between high- and low-compatibility mental tasks. (author)

  10. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays, there are many surface analysis methods, each having its own specificity, qualities, constraints (for instance, vacuum) and limits. Expensive in time and investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of the surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.)

  11. Development and in house validation of a new thermogravimetric method for water content analysis in soft brown sugar.

    Science.gov (United States)

    Ducat, Giseli; Felsner, Maria L; da Costa Neto, Pedro R; Quináia, Sueli P

    2015-06-15

    Recently, the use of brown sugar has increased due to its nutritional characteristics, thus requiring more rigorous quality control. The development of a method for water content analysis in soft brown sugar is carried out for the first time by TG/DTA with the application of different statistical tests. The results of the optimization study suggest that heating rates of 5 °C min(-1) and an alumina sample holder improve the efficiency of the drying process. The validation study showed that thermogravimetry presents good accuracy and precision for water content analysis in soft brown sugar samples. This technique offers advantages over other analytical methods as it does not use toxic and costly reagents or solvents, it does not need any sample preparation, and it allows the identification of the temperature at which water is completely eliminated relative to other volatile degradation products. This is an important advantage over the official method (loss on drying). Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  13. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
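
    A minimal sketch of the eigenvalue-based instability criterion combined with plain Monte Carlo sampling (not the fast probability integration or adaptive importance sampling methods of the paper); the 2-DOF rotor-like parameters and their distributions are invented for illustration.

```python
# Sketch: sample uncertain parameters, form the state matrix of the second-order
# system, and count the fraction of samples with an eigenvalue of positive real part.
import numpy as np

def state_matrix(m, c, k, cross):
    """2-DOF rotor-like system with cross-coupled stiffness 'cross'."""
    M = np.diag([m, m])
    C = np.diag([c, c])
    K = np.array([[k, cross], [-cross, k]])
    Minv = np.linalg.inv(M)
    Z, I = np.zeros((2, 2)), np.eye(2)
    return np.block([[Z, I], [-Minv @ K, -Minv @ C]])

def probability_of_instability(n_samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    unstable = 0
    for _ in range(n_samples):
        m = rng.normal(1.0, 0.05)
        c = rng.normal(0.3, 0.05)
        k = rng.normal(100.0, 5.0)
        cross = rng.normal(0.0, 20.0)            # destabilising cross-coupling
        eig = np.linalg.eigvals(state_matrix(m, c, k, cross))
        unstable += np.any(eig.real > 0.0)
    return unstable / n_samples

if __name__ == "__main__":
    print(f"P(instability) = {probability_of_instability():.3f}")
```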

  14. Development of achiral and chiral 2D HPLC methods for analysis of albendazole metabolites in microsomal fractions using multivariate analysis for the in vitro metabolism.

    Science.gov (United States)

    Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V

    2013-08-01

    In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for the direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied to the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. The development of two fully automated achiral-chiral and chiral-chiral high performance liquid chromatography (HPLC) methods for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2) and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions is described. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted access media bovine serum albumin column for sample clean-up, followed by an achiral phenyl column (15.0 × 0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0 × 0.46 cm I.D.). The chiral 2D HPLC method was applied to the development of a compromise condition for the in vitro metabolism of ABZ by means of experimental design involving multivariate analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. The reaction π-p → π-π-π+p: Development of the analysis methods and selected results

    Science.gov (United States)

    Ryabchikov, D.

    2016-01-01

    We present a description of the analysis methods and the results of applying them to the exclusive diffractive reaction π-p → π-π-π+p, based on 50 × 10⁶ events measured with the COMPASS detector. The large statistics of π-π-π+ events enables a two-dimensional partial-wave analysis performed independently in 100 bins of m(3π) with 0.5 < m(3π) < 2.5 GeV/c² and in 11 intervals of squared momentum transfer with 0.1 < t' < 1 GeV²/c². The partial-wave analysis sub-density matrix is subject to further mass-dependent fits describing the data in terms of resonances in the 3π system and coherent background contributions. A novel approach of extracting J^PC = 0⁺⁺ (π+π-)S isobar amplitudes as model-free functions, different for several J^PC 3π states, is used. It demonstrates the presence of the processes π(1800) → f0(980)π and π(1800) → f0(1500)π as well as π2(1880) → f0(980)π and a new narrow signal a1(1420) → f0(980)π, without any established shapes used for the (π+π-)S isobars. The presented analysis is subject to further development and refinement, which is currently under way.

  16. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    Ahmad, M.; Usman, M.; Madni, A.; Akhtar, N.; Khalid, N.; Asghar, W.

    2010-01-01

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, consisting of a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed-phase C18 column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL min⁻¹. The method showed linearity in the concentration range 0.15-20 µg mL⁻¹. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 µg mL⁻¹, respectively. The accuracy of the method was 98.68%. This method can be applied for bioequivalence studies and therapeutic drug monitoring as well as for the routine analysis of cephradine. (author)

  17. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems : Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well

  18. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lim, J.M.; Ji, Y.Y.; Lee, H.; Park, J.H.; Jang, M.; Chung, K.H.; Kang, M.J.; Choi, G.S. [Korea Atomic Energy Research Institute (Korea, Republic of)

    2014-07-01

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment. The concern with radioactivity from these materials has therefore been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., ²³⁸U, ²³⁵U, ²³²Th, ²²⁶Ra, and ⁴⁰K) should be developed and validated. Direct measuring methods such as alpha spectrometry, a liquid scintillation counter (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process consequently plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, while it has a prominent measurement capability at very low activity levels of ²³⁸U, ²³⁵U, ²³²Th, and ²²⁶Ra. Contrary to the α-spectrometry method, a measurement technique using ICP-MS allows the radioactivity in many samples to be measured in a short time period with a high degree of accuracy and precision. In this study, a method was developed for a rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO₂ fusion and Fe co-precipitation. A magnetic

  19. Development of a cause analysis system for a CPCS trip by using the rule-base deduction method.

    Science.gov (United States)

    Park, Je-Yun; Koo, In-Soo; Sohn, Chang-Ho; Kim, Jung-Seon; Cho, Gi-Ho; Park, Hee-Seok

    2009-07-01

    The Core Protection Calculator System (CPCS) was developed by Combustion Engineering to initiate a reactor trip under certain transient conditions. The major function of the Core Protection Calculator System is to generate contact outputs for the Departure from Nucleate Boiling Ratio (DNBR) trip and the Local Power Density (LPD) trip. However, in a Core Protection Calculator System a trip cause cannot be identified; only trip signals are transferred to the Plant Protection System (PPS) and only the trip status is displayed. It can therefore take a considerable amount of time and effort for a plant operator to analyze the trip causes of a Core Protection Calculator System. Accordingly, a Cause Analysis System for the Core Protection Calculator System (CASCPCS) has been developed using the rule-base deduction method to assist operators in a nuclear power plant. CASCPCS consists of three major parts. The inference engine controls the search of the knowledge base, executes the rules and tracks the inference process using the depth-first search method. The knowledge base consists of four major parts: rules, database constants, trip buffer variables and causes. The user interface is implemented using menu-driven and window display techniques. The advantages of CASCPCS are that it saves the time and effort needed to diagnose the trip causes of a Core Protection Calculator System, it increases the plant's availability and reliability, and it is easy to manage because only cursor control is used.
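
    A toy backward-chaining sketch in the spirit of the rule-base deduction described above; the rules, facts and cause names are invented and far simpler than the actual CASCPCS knowledge base.

```python
# Minimal rule-base deduction: depth-first backward chaining over simple rules.
RULES = [
    # (conclusion, list of conditions that must all hold)
    ("dnbr_trip_cause_low_flow", ["dnbr_trip", "rcp_speed_low"]),
    ("dnbr_trip_cause_asi",      ["dnbr_trip", "axial_shape_index_out_of_band"]),
    ("lpd_trip_cause_power",     ["lpd_trip", "local_power_high"]),
]

def prove(goal: str, facts: set) -> bool:
    """Depth-first attempt to establish 'goal' from the facts and the rules."""
    if goal in facts:
        return True
    return any(conclusion == goal and all(prove(c, facts) for c in conditions)
               for conclusion, conditions in RULES)

def diagnose(facts: set) -> list:
    """Return every trip cause whose rule can be proven from the trip buffer facts."""
    return [cause for cause, _ in RULES if prove(cause, facts)]

if __name__ == "__main__":
    trip_buffer = {"dnbr_trip", "rcp_speed_low"}
    print(diagnose(trip_buffer))     # -> ['dnbr_trip_cause_low_flow']
```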

  20. Development of an evaluation method for seismic isolation systems of nuclear power facilities. Seismic design analysis methods for crossover piping system

    International Nuclear Information System (INIS)

    Tai, Koichi; Sasajima, Keisuke; Fukushima, Shunsuke; Takamura, Noriyuki; Onishi, Shigenobu

    2014-01-01

    This paper provides seismic design analysis methods suitable for crossover piping systems, which connect the seismically isolated building and the non-isolated building in a seismically isolated nuclear power plant. Through a numerical study focused on the main steam crossover piping system, seismic response spectrum analysis applying the ISM (Independent Support Motion) method with SRSS combination, or the CCFS (Cross-oscillator, Cross-Floor response Spectrum) method, was found to be quite effective for the seismic design of multiply supported crossover piping systems. (author)
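
    A short numerical sketch of the SRSS combination used in the ISM approach: responses computed separately for each independent support group are combined as the square root of the sum of squares. The group names and values are hypothetical.

```python
# SRSS combination of per-support-group responses (illustrative numbers only).
import math

# Peak response of one piping stress point to each support-group excitation,
# e.g. isolated-building anchors vs. non-isolated-building anchors.
group_responses = {"isolated_building_supports": 42.0,       # MPa, hypothetical
                   "non_isolated_building_supports": 31.5}

def srss(values) -> float:
    """Square root of the sum of squares of the individual group responses."""
    return math.sqrt(sum(v * v for v in values))

combined = srss(group_responses.values())
print(f"SRSS-combined stress: {combined:.1f} MPa")            # 52.5 MPa here
```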

  1. Optically stimulated luminescence (OSL) dating of shallow marine sediments to develop an analysis method of late Quaternary geodynamics

    International Nuclear Information System (INIS)

    Hataya, Ryuta; Shirai, Masaaki

    2003-01-01

    To develop an analysis method for geodynamics, we have examined the applicability of OSL dating to marine terrace deposits. We carried out OSL dating, using the multiple-aliquot additive-dose technique, of shallow marine sediments from the upper part of the Kioroshi Formation in Ibaraki Prefecture, which are correlated to Marine Oxygen Isotope Stage (MIS) 5e-5c. Marine terrace deposits consist mainly of shallow marine sediment. OSL ages of the foreshore and foreshore-shoreface beds are 88-112 ka and are in good agreement with the geological/geomorphological data. On the other hand, OSL ages of the backshore bed are younger, and those of the shoreface bed are older, than the geologically estimated ages. These results show that the OSL dating method can date shallow marine sediment using samples from foreshore and foreshore-shoreface beds, and that this method can distinguish terrace deposits formed in MIS 5 from those formed in MIS 7 by taking geomorphological information into account. These results contribute to the characterization of long-term geological movement in coastal areas. (author)

  2. Development and validation of a rapid chromatographic method for the analysis of flunarizine and its main production impurities

    Directory of Open Access Journals (Sweden)

    Niamh O’Connor

    2013-06-01

    Full Text Available A rapid, selective method for the analysis of flunarizine and its associated impurities was developed and validated according to ICH guidelines. The separation was carried out using a Thermo Scientific Hypersil Gold C18 column (50 mm × 4.6 mm i.d., 1.9 μm particle size) with a gradient mobile phase of acetonitrile–ammonium acetate–tetrabutylammonium hydrogen sulfate buffer, at a flow rate of 1.8 mL/min and UV detection at 230 nm. Naturally aged samples were also tested to determine sample stability. A profile of sample and impurity breakdown is also presented. Keywords: Flunarizine, Sub-2 μm column, Active pharmaceutical ingredient, HPLC

  3. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. Since RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. For quantitative assessment of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput assessment of transcription start sites by two different methods. The methods presented here allow for the functional investigation of coding as well as noncoding RNA and contribute to future...

  4. The use of experimental design in the development of an HPLC-ECD method for the analysis of captopril.

    Science.gov (United States)

    Khamanga, Sandile M; Walker, Roderick B

    2011-01-15

    An accurate, sensitive and specific high performance liquid chromatography-electrochemical detection (HPLC-ECD) method that was developed and validated for captopril (CPT) is presented. Separation was achieved using a Phenomenex(®) Luna 5 μm (C(18)) column and a mobile phase comprised of phosphate buffer (adjusted to pH 3.0): acetonitrile in a ratio of 70:30 (v/v). Detection was accomplished using a full scan multi channel ESA Coulometric detector in the "oxidative-screen" mode with the upstream electrode (E(1)) set at +600 mV and the downstream (analytical) electrode (E(2)) set at +950 mV, while the potential of the guard cell was maintained at +1050 mV. The detector gain was set at 300. Experimental design using a central composite design (CCD) was used to facilitate method development. Mobile phase pH, molarity and concentration of acetonitrile (ACN) were considered the critical factors to be studied to establish the retention times of CPT and cyclizine (CYC), which was used as the internal standard. Twenty experiments including centre points were undertaken, and a quadratic model was derived for the retention time of CPT using the experimental data. The method was validated for linearity, accuracy, precision, limits of quantitation and detection, as per the ICH guidelines. The system was found to produce sharp and well-resolved peaks for CPT and CYC with retention times of 3.08 and 7.56 min, respectively. Linear regression analysis for the calibration curve showed a good linear relationship with a regression coefficient of 0.978 in the concentration range of 2-70 μg/mL. The linear regression equation was y=0.0131x+0.0275. The limits of quantitation (LOQ) and detection (LOD) were found to be 2.27 and 0.6 μg/mL, respectively. The method was used to analyze CPT in tablets. The wide range for linearity, accuracy, sensitivity, short retention time and composition of the mobile phase indicated that this method is better for the quantification of CPT than the
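
    A hedged sketch of how a rotatable central composite design for the three factors named above (pH, buffer molarity, %ACN) can be generated in coded units; the run count matches the twenty experiments mentioned, but the design table is illustrative, not the authors' actual design.

```python
# Coded central composite design: 2^k factorial + 2k axial points + centre points.
import itertools
import numpy as np

def central_composite(k, alpha=None, n_center=6):
    """Return coded CCD points: 2^k factorial, 2k axial (+/- alpha), centre runs."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25                 # rotatable design
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.vstack([sign * alpha * row
                       for row in np.eye(k) for sign in (-1.0, 1.0)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

design = central_composite(k=3)                  # 8 + 6 + 6 = 20 runs, as in the study
factors = ["pH", "buffer molarity", "% acetonitrile"]
print(factors)
print(design)
```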

  5. Identification of Hepatoprotective Constituents in Limonium tetragonum and Development of Simultaneous Analysis Method using High-performance Liquid Chromatography

    Science.gov (United States)

    Lee, Jae Sun; Kim, Yun Na; Kim, Na-Hyun; Heo, Jeong-Doo; Yang, Min Hye; Rho, Jung-Rae; Jeong, Eun Ju

    2017-01-01

    Background: Limonium tetragonum, a naturally salt-tolerant halophyte, has been studied recently and is of much interest to researchers due to its potent antioxidant and hepatoprotective activities. Objective: In the present study, we attempted to elucidate bioactive compounds from ethyl acetate (EtOAc) soluble fraction of L. tetragonum extract. Furthermore, the simultaneous analysis method of bioactive EtOAc fraction of L. tetragonum has been developed using high-performance liquid chromatography (HPLC). Materials and Methods: Thirteen compounds have been successfully isolated from EtOAc fraction of L. tetragonum, and the structures of 1–13 were elucidated by extensive one-dimensional and two-dimensional spectroscopic methods including 1H-NMR, 13C-NMR, 1H-1H COSY, heteronuclear single quantum coherence, heteronuclear multiple bond correlation, and nuclear Overhauser effect spectroscopy. Hepatoprotection of the isolated compounds against liver fibrosis was evaluated by measuring inhibition on hepatic stellate cells (HSCs) undergoing proliferation. Results: Compounds 1–13 were identified as gallincin (1), apigenin-3-O-β-D-galactopyranoside (2), quercetin (3), quercetin-3-O-β-D-galactopyranoside (4), (−)-epigallocatechin (5), (−)-epigallocatechin-3-gallate (6), (−)-epigallocatechin-3-(3″-O-methyl) gallate (7), myricetin-3-O-β-D-galactopyranoside (8), myricetin-3-O-(6″-O-galloyl)-β-D-galactopyranoside (9), myricetin-3-O-α-L-rhamnopyranoside (10), myricetin-3-O-(2″-O-galloyl)-α-L-rhamnopyranoside (11), myricetin-3-O-(3″-O-galloyl)-α-L-rhamnopyranoside (12), and myricetin-3-O-α-L-arabinopyranoside (13), respectively. All compounds except for 4, 8, and 10 are reported for the first time from this plant. Conclusion: Myricetin glycosides which possess galloyl substituent (9, 11, and 12) showed most potent inhibitory effects on the proliferation of HSCs. SUMMARY In the present study, we have successfully isolated 13 compounds from bioactive fraction

  6. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one

  7. Social Phenomenological Analysis as a Research Method in Art Education: Developing an Empirical Model for Understanding Gallery Talks

    Science.gov (United States)

    Hofmann, Fabian

    2016-01-01

    Social phenomenological analysis is presented as a research method to study gallery talks or guided tours in art museums. The research method is based on the philosophical considerations of Edmund Husserl and sociological/social science concepts put forward by Max Weber and Alfred Schuetz. Its starting point is the everyday lifeworld; the…

  8. Development of a novel rotary desiccant cooling cycle with isothermal dehumidification and regenerative evaporative cooling using thermodynamic analysis method

    International Nuclear Information System (INIS)

    La, D.; Li, Y.; Dai, Y.J.; Ge, T.S.; Wang, R.Z.

    2012-01-01

    A novel rotary desiccant cooling cycle is proposed and studied using thermodynamic analysis method. The proposed cycle integrates the technologies of isothermal dehumidification and regenerative evaporative cooling, which are beneficial for irreversibility reduction. Thermodynamic investigation on the basic rotary desiccant cooling cycle shows that the exergy efficiency of the basic cycle is only 8.6%. The processes of desiccant dehumidification and evaporative cooling, which are essentially the basis for rotary desiccant cooling, affect the exergy performance of the cycle greatly and account for about one third of the total exergy destruction. The proposed cycle has potential to improve rotary desiccant cooling technology. It is advantageous in terms of both heat source utilization rate and space cooling capacity. The exergy efficiency of the new cycle is enhanced significantly to 29.1%, which is about three times that of the ventilation cycle, and 60% higher than that of the two-stage rotary desiccant cooling cycle. Furthermore, the regeneration temperature is reduced from 80 °C to about 60 °C. The corresponding specific exergy of the supply air is increased by nearly 30% when compared with the conventional cycles. -- Highlights: ► A novel rotary desiccant cooling cycle is developed using thermodynamic analysis method. ► Isothermal dehumidification and regenerative evaporative cooling have been integrated. ► The cycle is advantageous in terms of both heat source utilization rate and space cooling capacity. ► Cascaded energy utilization is beneficial for cycle performance improvement. ► Upper limits, which will be helpful to practical design and optimization, are obtained.

  9. Development of a traceability analysis method based on case grammar for NPP requirement documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Seong, Poong Hyun; Kim, Man Cheol

    2004-01-01

    Software inspection is widely believed to be an effective method for software verification and validation (V and V). However, software inspection is labor-intensive and, since it uses little technology, it is often viewed as unsuitable for a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The KAIST Nuclear I and C and Information Engineering Laboratory (NICIEL) has developed software management and inspection support tools, collectively named 'SIS-RT.' SIS-RT is designed to partially automate the software inspection processes. SIS-RT supports the analyses of traceability between a given set of specification documents. To make SIS-RT compatible with documents written in Korean, certain techniques in natural language processing have been studied. Among the techniques considered, case grammar is the most suitable for analyses of the Korean language. In this paper, we propose a methodology that uses a case grammar approach to analyze the traceability between documents written in Korean. A discussion regarding some examples of such an analysis will follow

  10. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Science.gov (United States)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
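    The pluvial-hazard step above rests on a peak-over-threshold frequency estimation; a minimal sketch of that idea, using synthetic daily rainfall and a generalized Pareto fit from SciPy, is given below. The threshold choice and record length are assumptions for illustration only.

```python
# Peak-over-threshold (POT) frequency sketch for extreme rainfall, loosely following
# the pluvial-hazard step described above. Data are synthetic and the threshold choice
# is an illustrative assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
daily_rain = rng.gamma(shape=0.4, scale=12.0, size=30 * 365)  # ~30 yr of daily rain, mm

threshold = np.quantile(daily_rain, 0.99)          # POT threshold (assumed choice)
excesses = daily_rain[daily_rain > threshold] - threshold
rate = len(excesses) / 30.0                        # mean exceedances per year

# Fit a generalized Pareto distribution to the threshold excesses.
shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

def return_level(T_years):
    """Rainfall depth expected to be exceeded once every T_years on average."""
    p = 1.0 / (rate * T_years)                     # exceedance probability per event
    return threshold + stats.genpareto.ppf(1.0 - p, shape, loc=0.0, scale=scale)

for T in (2, 10, 50, 100):
    print(f"{T:>3}-year daily rainfall ~ {return_level(T):6.1f} mm")
```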

  11. DEVELOPMENT OF METHOD OF QUALITATIVE ANALYSIS OF BIRD CHERRY FRUIT FOR INCLUSION IN THE MONOGRAPH OF STATE PHARMACOPOEIA OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Lenchyk L.V.

    2016-06-01

    Introduction. Bird cherry, Padus avium Mill., Rosaceae, is widespread in Ukraine, especially in forest and forest-steppe areas. Bird cherry fruits have long been used in medicine and are a valuable medicinal raw material. They are stated to possess astringent, anti-inflammatory, and phytoncidal properties. Bird cherry fruits are included in the USSR Pharmacopoeia IX ed., the State Pharmacopoeia of the Russian Federation, and the State Pharmacopoeia of the Republic of Belarus. In Ukraine there are no contemporary normative documents for this medicinal plant material; therefore it is relevant to develop the draft national monographs "dry bird cherry fruit" and "fresh bird cherry fruit" for inclusion in the State Pharmacopoeia of Ukraine. According to European Pharmacopoeia recommendations, the method of thin-layer chromatography (TLC) is prescribed only for the identification of the herbal drug. The principles of thin-layer chromatography and the application of the technique in pharmaceutical analysis are described in the State Pharmacopoeia of Ukraine. As it is effective and easy to perform, and the equipment required is inexpensive, the technique is frequently used for evaluating medicinal plant materials and their preparations. The TLC is aimed at elucidating the chromatogram of the drug with respect to selected reference compounds that are described for inclusion as reagents. The aim of this study was to develop methods of qualitative analysis of bird cherry fruits for a monograph in the State Pharmacopoeia of Ukraine (SPU). Materials and Methods. The objects of our study were dried bird cherry fruits (7 samples) and fresh bird cherry fruits (7 samples) harvested in 2013-2015 in the Kharkiv, Poltava, Luhansk, Sumy, Lviv, and Mykolaiv regions and the city of Mariupol. Samples were registered in the department of the SPU State Enterprise "Pharmacopeia center". In accordance with the Ph. Eur. and SPU requirements, in "identification C" determination was performed by TLC. TLC was performed on

  12. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work carried out over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. Measurement of the underground stress condition of the mine disclosed the direct cause of the earthquakes, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of the maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  13. THE DEVELOPMENT OF METHOD FOR MINT AND TURMERIC ESSENTIAL OILS IDENTIFICATION AND QUANTITATIVE ANALYSIS IN COMPLEX DRUG

    Directory of Open Access Journals (Sweden)

    O. G. Smalyuh

    2015-04-01

    The aim of our study was to develop a method for the identification and assay of the essential oils of mint and turmeric in a complex medicinal product in capsule form. Materials and methods. The study used samples of turmeric and mint essential oils and of the complex drug, in the form of capsules containing oil of peppermint, oil of Curcuma longa, and a mixture of extracts of sandy everlasting (Helichrysum arenarium (L.) Moench), marigold (Calendula officinalis L.), wild carrot (Daucus carota) and turmeric (Curcuma longa). Results and discussion. The complex drug consists of a dry extract of sandy everlasting flowers, a thick extract of wild carrot, marigold flowers and fruits, a dry extract of Curcuma longa, and the essential oils of peppermint and turmeric. Based on the study of different samples of peppermint oil, and given the need for its identification and quantification in the finished medicinal product, menthol was chosen as the analytical marker. In order to establish the identity of the complex drug, its main components - ar-turmerone and α- and β-turmerone - and their total content must meet the quantitative indicator "turmerone content" in the specification for turmeric oil. Studies of the sample preparation conditions led us to propose 96% ethanol for extracting the oil components from the sample, with ultrasonic treatment and centrifugation to improve their recovery from the capsule mass. Chromatographic characteristics of the substances were obtained on an Agilent HP-Innowax column. It was established that the other active pharmaceutical ingredients of the capsule (placebo) did not affect the quantification of the components of the essential oils of mint and turmeric. Conclusions. 1. Chromatographic conditions for the identification and assay of the essential oils of mint and turmeric in the complex drug, as well as optimal conditions for sample preparation and analysis by gas chromatography, have been established. 2. Methods for the identification and assay of menthol and of α-, β- and ar-turmerone in the complex drug based on

  14. Development of partitioning method

    International Nuclear Information System (INIS)

    Kondo, Yasuo; Kubota, Masumitsu; Abe, Tadashi; Nagato, Kotaro.

    1991-09-01

    Spent fuels from nuclear power stations contain many useful elements, which can be utilized as heat and irradiation sources, radioisotopes, elemental resources, etc. Their recovery from spent fuel and their effective use have the advantage of not only converting radioactive waste into beneficial resources but also promoting rationalization of the management and disposal of radioactive wastes. In the present study, published literature related to the recovery and utilization of useful elements in spent fuel was surveyed, the present state and trends of the relevant research and development were analyzed, and their future prospects were conjectured. Research and development on the recovery and utilization of useful elements is being continued mainly in the USA, Europe and Japan. A transportable food irradiator with Cs-137 and an electric power source with Sr-90 for a remote weather station are typical examples of major past applications. However, research and development on recovery and utilization are currently not very active, and further efforts should be expected hereafter. The present study was conducted under the auspices of the Science and Technology Agency of Japan. (author)

  15. Development of the finite element method in the thermal field. TRIO-EF software for thermal and radiation analysis

    International Nuclear Information System (INIS)

    Casalotti, N.; Magnaud, J.P.

    1989-01-01

    The capabilities of the TRIO-EF software in the thermal field are presented. TRIO-EF is a computer program based on the finite element method and used for three-dimensional incompressible flow analysis. It enables the calculation of three-dimensional heat transfer and fluid/structure analysis. Geometrically complex radiative reactor systems are taken into account in the form factor calculation. The implemented algorithms are described [fr

  16. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    Science.gov (United States)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, and it has been experimentally verified. It was found that the frequency response of the two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained by utilizing the current density of the electrodes via FEM simulations. The two SAW structures were based on XY lithium niobate (LiNbO3) substrates and had two and four electrode finger pairs in their interdigital transducers, respectively. SAW devices were then fabricated in accordance with the simulated models, and their measured frequency responses were found to correlate well with the simulation results. The results indicated that a better match between calculated and measured frequency responses can be obtained when one of the input electrode finger pairs is set at zero volts and all the current density components are taken into account when calculating the frequency response of the simulated SAW device structures.

  17. Development for ultra-trace analysis method of U and Pu in safeguards environmental samples at the clean facility

    International Nuclear Information System (INIS)

    Takahashi, Masato; Magara, Masaaki; Sakurai, Satoshi

    2002-01-01

    Based on the strengthened safeguards program of the IAEA to detect undeclared nuclear activities and nuclear materials, a method for precise and accurate isotope ratio determination of uranium and plutonium in environmental samples (cotton swipes) has been developed at JAERI. The samples should be treated in a clean environment in order to secure analytical reliability by eliminating external contamination of samples containing trace amounts of uranium and plutonium. Since measurement by ICP-MS is favorable for bulk analysis from the viewpoints of analytical capacity and operational simplicity, we have studied sample preparation procedures for trace amounts of uranium and plutonium to be applied to ICP-MS. Up to the present, interfering factors introduced during the analytical processes and the ICP-MS measurement of uranium and plutonium have been examined. As a result, uranium isotope measurement on more than 100 pg became possible at the JAERI clean facility by diminishing the uranium blank introduced over the entire sample treatment procedure. In addition, the estimated plutonium recovery yield and uranium decontamination factor suggested the possibility of plutonium isotope measurement on more than 100 fg. (author)

  18. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and frequency time laws of a certain kind of geophysical signal is described here. This method is based upon the analytic signal idea of Gabor and Ville, which is constructed either in the time domain by adding an imaginary part to the real signal (in-quadrature signal), or in the frequency domain by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytic signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to an examination of the principle of the method, to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the properties of the analysis system than the usual representation, and which moreover has the advantage of being obtainable practically in real time [fr
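    The construction described above maps directly onto a few lines of signal-processing code. The sketch below, assuming SciPy's Hilbert transform, builds the analytic signal of a test chirp and recovers its envelope and instantaneous frequency as the modulus and the phase derivative, respectively.

```python
# Analytic-signal sketch: envelope and instantaneous frequency of a test signal,
# following the Gabor/Ville construction described above (SciPy assumed).
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                          # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
# Test signal: a chirp ramping from 50 Hz to 150 Hz under a decaying envelope.
signal = np.exp(-t) * np.cos(2 * np.pi * (50 * t + 25 * t**2))

analytic = hilbert(signal)           # real part = signal, imaginary part = in-quadrature signal
envelope = np.abs(analytic)          # instantaneous amplitude (modulus)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs   # time derivative of the phase, in Hz

print("envelope at t = 0.5 s:", round(envelope[int(0.5 * fs)], 3))
print("instantaneous frequency at t = 0.5 s (Hz):", round(inst_freq[int(0.5 * fs)], 1))
# For this chirp the true instantaneous frequency is 50 + 50*t, i.e. ~75 Hz at t = 0.5 s.
```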

  19. Development of partitioning method

    International Nuclear Information System (INIS)

    Kobayashi, Tsutomu; Shirahashi, Koichi; Kubota, Masumitsu

    1989-11-01

    The precipitation behavior of elements in a high-level liquid waste (HLW) was studied using a simulated liquid waste, with the transuranic element group being precipitated and separated as oxalates from the HLW generated from the reprocessing of spent nuclear fuel. The results showed that over 90% of the strontium and barium was precipitated when oxalic acid was added directly to the HLW to precipitate the transuranic element group, and the percentages of these elements precipitated were affected by molybdenum and/or zirconium. Therefore, a method of adding oxalic acid to the filtrate after first removing molybdenum and zirconium as a precipitate by denitrating the HLW was studied, and it was found that the precipitated fractions of strontium and barium could be suppressed to about 10%. Adding oxalic acid in the presence of ascorbic acid is effective for quantitative precipitation of neptunium in HLW. In this case, it was found that adding ascorbic acid had little influence on the precipitation behavior of the other elements except palladium. (author)

  20. A Contribution To The Development And Analysis Of A Combined Current-Voltage Instrument Transformer By Using Modern CAD Methods

    International Nuclear Information System (INIS)

    Chundeva-Blajer, Marija M.

    2004-01-01

    The principal aim and task of the thesis is the analysis and development of a 20 kV combined current-voltage instrument transformer (CCVIT) using modern CAD techniques. The CCVIT is a complex electromagnetic system comprising four windings and two magnetic cores in one insulation housing for the simultaneous transformation of high voltages and currents into signal values measurable by standard instruments. Analytical design methods can be applied to simple electromagnetic configurations, which is not the case with the CCVIT, since there is mutual electromagnetic influence between the voltage measurement core (VMC) and the current measurement core (CMC). After the analytical CCVIT design had been completed, exact determination of its metrological characteristics was accomplished using the numerical finite element method implemented in the FEM-3D program package. The FEM-3D calculation is made in 19 cross-sectional layers along the z-axis of the three-dimensional CCVIT domain. By applying FEM-3D, the three-dimensional CCVIT magnetic field distribution is derived. This is the basis for calculating the initial metrological characteristics of the CCVIT (VMC of accuracy class 3 and CMC of accuracy class 1). Using a stochastic optimization technique based on a genetic algorithm, the optimal CCVIT design is achieved. The objective function is the minimum of the metrological parameters (VMC voltage error and CMC current error). There are 11 independent input variables in the optimization process by which the optimal project is derived. The optimal project is adapted for the realization of a prototype, and the optimized project is derived. A full comparative analysis of the metrological and electromagnetic characteristics of the three projects is accomplished. By application of the program package MATLAB/SIMULINK, the CCVIT transient phenomena are analyzed for different regimes in the three design projects. In the Instrument Transformer Factory of EMO A. D.-Ohrid a CCVIT

  1. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Opp, Daniel; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2014-06-15

    Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion, previously demonstrated for VMAT, is studied for static-gantry IMRT. The method's accuracy is improved for the voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region-of-interest is propagated according to the predefined motion kernel throughout time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference between dose-to-moving-voxel and dose-to-static-voxel in the phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low-dose voxels into high-dose regions, causing an appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameter optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with a gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period
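    The core accumulation step, propagating a voxel through the time-resolved dose grids and differencing the moving and static doses, can be sketched in a few lines. The example below uses synthetic dose grids and an assumed elliptical trajectory; it is an illustration of the idea, not the authors' reconstruction code.

```python
# Sketch of the motion-perturbation idea: accumulate dose to a moving voxel from a
# time-resolved 4D dose series and compare with the static-voxel dose. Grids and the
# elliptical trajectory here are synthetic stand-ins, not the ARCCHECK reconstruction.
import numpy as np

n_t, nz, ny, nx = 60, 40, 64, 64          # time samples and grid size (assumed)
voxel_mm = 2.0                            # isotropic voxel size (assumed)
rng = np.random.default_rng(0)
dose_4d = rng.random((n_t, nz, ny, nx)) * 0.05   # per-time-step dose grids (Gy), synthetic

# Elliptical motion in the XY plane, here assuming 3 cm x 1.5 cm full axes.
phase = 2 * np.pi * np.arange(n_t) / n_t
dx = np.round(15.0 * np.cos(phase) / voxel_mm).astype(int)   # half-axis 1.5 cm
dy = np.round(7.5 * np.sin(phase) / voxel_mm).astype(int)    # half-axis 0.75 cm

def accumulated_dose(iz, iy, ix):
    """Sum dose over time for a voxel following the motion kernel and for the same voxel held fixed."""
    moving = 0.0
    static = 0.0
    for t in range(n_t):
        jy = np.clip(iy + dy[t], 0, ny - 1)
        jx = np.clip(ix + dx[t], 0, nx - 1)
        moving += dose_4d[t, iz, jy, jx]
        static += dose_4d[t, iz, iy, ix]
    return moving, static

moving, static = accumulated_dose(20, 32, 32)
perturbation = moving - static       # later applied as a correction to the patient-dose voxel
print(f"static {static:.3f} Gy, moving {moving:.3f} Gy, perturbation {perturbation:+.3f} Gy")
```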

  2. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks place emphasis on the equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle which puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of the existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory for resolving the gender conflict by using the concept of social and psychological capital.

  3. Development and interlaboratory validation of quantitative polymerase chain reaction method for screening analysis of genetically modified soybeans.

    Science.gov (United States)

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2013-01-01

    A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We therefore designed a screening method comprising the combination of the quantification of P35S and the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy-number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMOs.
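    The conversion from a measured copy-number ratio to an approximate GMO weight fraction via the conversion factor Cf can be written as a one-line calculation; a sketch with hypothetical plate readings follows (the exact convention used in the cited method may differ from this common one, in which Cf is the copy-number ratio measured for a 100% GM sample).

```python
# Converting a qPCR copy-number ratio into an approximate GMO weight percentage using an
# experimentally determined conversion factor (Cf). All numbers are hypothetical.

def gmo_weight_percent(event_copies, reference_copies, cf):
    """Copy-number ratio (event / taxon-specific reference) scaled by Cf, expressed in %."""
    copy_ratio = event_copies / reference_copies
    return copy_ratio / cf * 100.0

# Example: hypothetical readings for one soybean sample.
print(gmo_weight_percent(event_copies=1.2e3, reference_copies=4.0e4, cf=0.60))
```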

  4. Development of techniques using DNA analysis method for detection/analysis of radiation-induced mutation. Development of an useful probe/primer and improvement of detection efficacy

    International Nuclear Information System (INIS)

    Maekawa, Hideaki; Tsuchida, Kozo; Hashido, Kazuo; Takada, Naoko; Kameoka, Yosuke; Hirata, Makoto

    1999-01-01

    Previously, it was demonstrated that detection of centromeres became easy and reliable through fluorescent staining by the FISH method using a probe for the sequence preserved in α-satellite DNA. It was, however, found inappropriate to detect dicentrics based on the relative amount of DNA probe on each chromosome, so a probe which allows homogeneous detection of α-satellite DNA on each chromosome was constructed. A presumed kinetochore-specific sequence, the CENP-B box, was amplified by the PCR method and the product DNA was used as a probe. However, the variation in the amount of probe DNA among chromosomes was decreased by only about 20%. A program for image processing of the results obtained from FISH using α-satellite DNA as a centromere marker was then constructed. When compared with detection of abnormal chromosomes stained by the conventional method, the calculation efficiency for detection of centromeres alone was improved by the use of this program; however, the calculation needed to discriminate normal from abnormal chromosomes was still complicated, and the detection efficiency was little improved. Chromosomal abnormalities in lymphocytes were used to detect the effects of radiation. In this method, the cells need to be brought into metaphase, and mutations induced by radiation might often be repaired during this step. To exclude this possibility, DNA extraction was conducted at a low temperature immediately after exposure to 137Cs, and a rapid genome detection method was established using the genomic DNA. As model genomes, the following three were used: 1) long-chain repeated sequences widely dispersed over the chromosomes, 2) clustered genes, and 3) single-copy genes. The effects of radiation were detectable at 1-2 Gy for the long repeated sequences and at 7 Gy for the clustered genes, whereas no significant effects were observed at any dose tested for the single-copy genes. Amplification was marked in the cells exposed at 1-10 Gy (peak at 4 Gy), suggesting that these regions had

  5. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs were a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup module, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons

  6. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd, in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). The results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The heavy metal content of the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provide essential information on the possible sources of heavy metals, which will contribute to the monitoring and assessment of agricultural soils in regions worldwide.
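    A minimal sketch of this kind of source apportionment, assuming scikit-learn and pandas and using a small synthetic data set, is shown below: metals loading strongly on the same principal component are interpreted as sharing a source, and the correlation matrix complements that grouping.

```python
# Multivariate sketch: PCA + correlation analysis on a synthetic heavy-metal data set,
# mirroring the source-apportionment workflow described above (scikit-learn assumed).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 120
anthropogenic = rng.normal(size=n)            # hypothetical "human activity" factor
natural = rng.normal(size=n)                  # hypothetical "parent material" factor
data = pd.DataFrame({
    "Cu": 30 + 8 * anthropogenic + rng.normal(scale=2, size=n),
    "Ni": 28 + 6 * anthropogenic + rng.normal(scale=2, size=n),
    "Pb": 25 + 7 * anthropogenic + rng.normal(scale=2, size=n),
    "Cd": 0.2 + 0.05 * anthropogenic + rng.normal(scale=0.02, size=n),
    "Zn": 80 + 10 * natural + rng.normal(scale=3, size=n),
    "Cr": 60 + 9 * natural + rng.normal(scale=3, size=n),
})

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(data))
loadings = pd.DataFrame(pca.components_.T, index=data.columns, columns=["PC1", "PC2"])
print(loadings.round(2))          # metals loading on the same PC suggest a common source
print(data.corr().round(2))       # correlation analysis complements the PCA grouping
```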

  7. LC-MS/MS method development for quantitative analysis of acetaminophen uptake by the aquatic fungus Mucor hiemalis.

    Science.gov (United States)

    Esterhuizen-Londt, Maranda; Schwartz, Katrin; Balsano, Evelyn; Kühn, Sandra; Pflugmacher, Stephan

    2016-06-01

    Acetaminophen is a pharmaceutical frequently found in surface water as a contaminant. Bioremediation, in particular mycoremediation, of acetaminophen is a method to remove this compound from waters. Owing to the lack of a quantitative analytical method for acetaminophen in aquatic organisms, the present study aimed to develop an LC-MS/MS method for the determination of acetaminophen in the aquatic fungus Mucor hiemalis. The method was then applied to evaluate the uptake of acetaminophen by M. hiemalis cultured in pellet morphology. The method was robust, sensitive and reproducible, with a lower limit of quantification of 5 pg acetaminophen on column. It was found that M. hiemalis internalizes the pharmaceutical and bioaccumulates it with time. Therefore, M. hiemalis was deemed a suitable candidate for further studies to elucidate its pharmaceutical tolerance and longevity in mycoremediation applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Method development for liquid chromatographic/triple quadrupole mass spectrometric analysis of trace level perfluorocarboxylic acids in articles of commerce

    Science.gov (United States)

    An analytical method to identify and quantify trace levels of C5 to C12 perfluorocarboxylic acids (PFCAs) in articles of commerce (AOC) is developed and rigorously validated. Solid samples were extracted in methanol, and liquid samples were diluted with a solvent consisting of 60...

  9. Quality by Design in the development of hydrophilic interaction liquid chromatography method with gradient elution for the analysis of olanzapine.

    Science.gov (United States)

    Tumpa, Anja; Stajić, Ana; Jančić-Stojanović, Biljana; Medenica, Mirjana

    2017-02-05

    This paper deals with the development of a hydrophilic interaction liquid chromatography (HILIC) method with gradient elution, in accordance with Analytical Quality by Design (AQbD) methodology, for the first time. The method is developed for olanzapine and its seven related substances. Following the step-by-step AQbD methodology, the temperature, the starting content of the aqueous phase and the duration of the linear gradient are first identified as critical process parameters (CPPs), and the separation criterion S of the critical pairs of substances is investigated as the critical quality attribute (CQA). A Rechtschaffen design is used to create models that describe the dependence between the CPPs and the CQAs. The design space obtained at the end is used for choosing the optimal conditions (set point). Finally, the method is fully validated to verify the adequacy of the chosen optimal conditions and applied to real samples. Copyright © 2016 Elsevier B.V. All rights reserved.
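    The AQbD workflow above boils down to fitting a model that links the CPPs to the CQA and then mapping the region of factor settings that meets the quality target. The sketch below fits a quadratic response-surface model to synthetic data and scans a grid for such a design space; it is not the Rechtschaffen design or the authors' data.

```python
# Response-surface sketch in the AQbD spirit: fit a quadratic model linking three CPPs
# (temperature, starting aqueous content, gradient time, all in coded units) to one CQA
# (separation criterion s of a critical pair), then flag the region where s meets a target.
# Design points, coefficients and the target are synthetic assumptions.
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

def design_matrix(X):
    t, aq, g = X.T
    return np.column_stack([np.ones(len(X)), t, aq, g, t*aq, t*g, aq*g, t**2, aq**2, g**2])

# Coded factor levels (-1, 0, +1) for a three-factor experiment (assumed design points).
X = np.array(list(product([-1, 0, 1], repeat=3)), dtype=float)
true_beta = np.array([1.8, 0.3, -0.4, 0.2, 0.1, 0.0, -0.1, -0.2, -0.3, 0.05])
s = design_matrix(X) @ true_beta + rng.normal(scale=0.05, size=len(X))  # simulated CQA

beta, *_ = np.linalg.lstsq(design_matrix(X), s, rcond=None)             # quadratic fit

# Scan a fine grid of coded CPP settings and keep those predicted to satisfy s >= 1.5.
grid = np.array(list(product(np.linspace(-1, 1, 11), repeat=3)))
predicted = design_matrix(grid) @ beta
design_space = grid[predicted >= 1.5]
print(f"{len(design_space)} of {len(grid)} grid points fall inside the design space")
```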

  10. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    Science.gov (United States)

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT), an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements and also showed further substantial development among students thereafter. Conclusion. Applying an assessment-for-learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  11. Development of method for evaluating estimated inundation area by using river flood analysis based on multiple flood scenarios

    Science.gov (United States)

    Ono, T.; Takahashi, T.

    2017-12-01

    Non-structural mitigation measures such as flood hazard maps based on estimated inundation areas have become more important because heavy rains exceeding the design rainfall have occurred frequently in recent years. However, the conventional method may lead to an underestimation of the area because the assumed locations of dike breach in river flood analysis are limited to the cases in which the water level exceeds the high-water level. The objective of this study is to consider the uncertainty of the estimated inundation area arising from differences in the location of dike breach in river flood analysis. This study proposes a multiple-flood-scenario approach which automatically sets multiple locations of dike breach in the river flood analysis. The major premise of adopting this method is that the location of a dike breach cannot be predicted correctly. The proposed method utilizes the dike breach interval, which is the distance between adjacent assumed dike breaches; that is, multiple locations of dike breach are set at every interval along the dike. The 2D shallow water equations were adopted as the governing equations of the river flood analysis, and the leap-frog scheme on a staggered grid was used. The river flood analysis was verified by application to the 2015 Kinugawa River flooding, and the proposed multiple flood scenarios were applied to the Akutagawa River in Takatsuki City. The computations for the Akutagawa River showed, from a comparison of the computed maximum inundation depths for dike breaches placed next to each other, that the proposed method prevents underestimation of the estimated inundation area. Furthermore, the analyses of the spatial distribution of inundation class and of the maximum inundation depth at each measurement point identified the optimum dike breach interval, which can evaluate the maximum inundation area using the minimum number of assumed dike breach locations. In brief, this study found the optimum interval of dike breach in the Akutagawa River, which enabled estimated maximum inundation area

  12. Development of methods for theoretical analysis of nuclear reactors (Phase II), I-V, Part IV, Fuel depletion

    International Nuclear Information System (INIS)

    Pop-Jordanov, J.

    1962-10-01

    This report includes the analysis of the plutonium isotopes in the U-238 depletion chain. Two theoretical approaches for solving the fuel depletion problem are shown. The first results in a system of differential equations that can be solved only by using electronic computers; the second, the Machinari-Goto method, yields analytical equations for approximate values of particular nuclides. In addition, differential equations are given for different approximation levels in calculating Pu-239, as well as relations between the released energy and the irradiation [sr
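    The first approach, a system of depletion differential equations, is easy to illustrate numerically today. The sketch below integrates a simplified U-238 to Pu-241 capture chain under constant flux with SciPy; the one-group cross sections and flux are round illustrative numbers, and fission and the short-lived U-239/Np-239 intermediates are ignored, so this is not the report's model.

```python
# A minimal depletion-chain sketch: the "system of differential equations" approach for a
# simplified U-238 -> Pu-239 -> Pu-240 -> Pu-241 capture chain under constant flux.
# Cross sections and flux are illustrative; fission and the short-lived U-239/Np-239
# intermediates are ignored.
import numpy as np
from scipy.integrate import solve_ivp

barn = 1e-24                 # cm^2
phi = 3e13                   # neutron flux, n/cm^2/s (assumed)
sigma = np.array([2.7, 270.0, 290.0, 360.0]) * barn   # effective one-group capture cross sections

def chain(t, N):
    """dN_i/dt = production by capture in nuclide i-1 minus removal by capture in nuclide i."""
    dN = np.empty_like(N)
    dN[0] = -sigma[0] * phi * N[0]
    for i in range(1, len(N)):
        dN[i] = sigma[i - 1] * phi * N[i - 1] - sigma[i] * phi * N[i]
    return dN

N0 = np.array([1.0, 0.0, 0.0, 0.0])      # start from pure U-238 (atom fractions)
t_end = 3 * 365 * 24 * 3600.0            # three years of irradiation, in seconds
sol = solve_ivp(chain, (0.0, t_end), N0, rtol=1e-8)

for name, frac in zip(["U-238", "Pu-239", "Pu-240", "Pu-241"], sol.y[:, -1]):
    print(f"{name}: {frac:.4e}")
```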

  13. Analysis of combined conduction and radiation heat transfer in presence of participating medium by the development of hybrid method

    International Nuclear Information System (INIS)

    Mahapatra, S.K.; Dandapat, B.K.; Sarkar, A.

    2006-01-01

    The current study addresses the mathematical modeling of coupled conductive and radiative heat transfer in the presence of an absorbing, emitting and isotropically scattering gray medium within a two-dimensional square enclosure. A blended method, in which the concepts of the modified differential approximation are employed by combining the discrete ordinates method and the spherical harmonics method, has been developed for modeling the radiative transport equation. The gray participating medium is bounded by the isothermal walls of the two-dimensional enclosure, which are considered to be opaque, diffuse and gray. The effects of various influencing parameters, i.e., the radiation-conduction parameter, surface emissivity, single-scattering albedo and optical thickness, have been illustrated. The adaptability of the present method has also been addressed

  14. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect, and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems-level analysis is sought. In summary, our normalization approaches and experimental design for validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems-level analysis. PMID:18753689
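    Two of the normalization approaches compared above, a per-plate robust z-score (a form of scaling after background correction) and quantile normalization across plates, can be sketched directly in NumPy on synthetic 384-well data, as below.

```python
# Two of the normalization steps compared above, sketched with NumPy: per-plate robust
# z-scores (a scaling approach) and quantile normalization across plates. Plate readouts
# are synthetic 384-well viability values.
import numpy as np

rng = np.random.default_rng(7)
plates = rng.lognormal(mean=6.0, sigma=0.3, size=(10, 384))   # 10 plates x 384 wells

def robust_z(plate):
    """Center on the plate median and scale by the median absolute deviation."""
    med = np.median(plate)
    mad = np.median(np.abs(plate - med)) * 1.4826
    return (plate - med) / mad

def quantile_normalize(data):
    """Force every plate to share the same empirical distribution (mean of sorted values)."""
    order = np.argsort(data, axis=1)
    ranks = np.argsort(order, axis=1)
    mean_sorted = np.sort(data, axis=1).mean(axis=0)
    return mean_sorted[ranks]

z_scores = np.apply_along_axis(robust_z, 1, plates)
qn = quantile_normalize(plates)
print("per-plate medians after quantile normalization:", np.round(np.median(qn, axis=1), 2))
```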

  15. Analysis and development of adjoint-based h-adaptive direct discontinuous Galerkin method for the compressible Navier-Stokes equations

    Science.gov (United States)

    Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang

    2018-06-01

    In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)) and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) and SDDG methods can be adjoint consistent with appropriate treatment of the boundary conditions and correct modifications of the underlying output functionals. The performance of these three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in applications of adjoint-based adaptation for simulating compressible flows.

  16. Development of radiochemical method of analysis of binding of tritium labeled drotaverine hydrochloride with human blood serum albumin

    International Nuclear Information System (INIS)

    Kim, A.A.; Djuraeva, G.T.; Shukurov, B.V.; Mavlyanov, I.R.

    2004-01-01

    Albumin, the principal binding protein for numerous endogenous and exogenous substances, is the most important protein of blood plasma. In diseases associated with liver dysfunction, metabolites accumulating in the blood reduce the binding ability of albumin. The aim of the present research was the development of a radiochemical method for determining the ability of albumin to bind the tritium-labeled preparation drotaverine hydrochloride (No-Spa). We developed a micromethod for determining the binding ability of albumin which requires only 20 μl of blood serum. The method consists of incubating tritium-labeled drotaverine hydrochloride with blood serum in vitro, followed by fractionation of the serum proteins by gel filtration on a Sephadex G-25 microcolumn and direct measurement of the radioactivity bound to the serum protein fraction. The method has been tested on a series of blood sera from a control group of healthy people and on a series of blood sera from patients with hepatitis B. We obtained quantitative characteristics of the binding of drotaverine hydrochloride to the albumin of patients with hepatitis B. It was preliminarily established that the binding ability of the serum albumin of children with various forms of acute viral hepatitis tends to decrease in comparison with the control group. The advantages of the developed radiochemical method are its high precision and its high sensitivity in detecting impaired albumin binding ability. The use of tritium-labeled drotaverine hydrochloride allows direct measurement of the level of binding of the preparation to albumin

  17. Comparison of infrared spectroscopy techniques: developing an efficient method for high resolution analysis of sediment properties from long records

    Science.gov (United States)

    Hahn, Annette; Rosén, Peter; Kliem, Pierre; Ohlendorf, Christian; Persson, Per; Zolitschka, Bernd; Pasado Science Team

    2010-05-01

    the sample is necessary. This could not be accomplished, therefore absorbance in higher wavelengths was not recorded correctly. As a result of the poor spectral quality no calibration model was established for BSi using the Equinox device. Since this is by far the most time-consuming and elaborate conventional measurement, results give clear advantages for the Alpha device. Further calibration models were developed using spectra from the Visible Near Infrared Spectroscopy (VNIRS) region (400-2500 nm). Sample preparation for VNIRS analysis also is faster than for DRIFTS. However, FTIRS calibrations seem to perform better than those for VNIRS which show an R of 0.75 (BSi), 0.93 (TOC), 0.93 (TN), and 0.89 (TIC). NIRS primarily measures overtones of molecular vibrations and is typically used for quantitative measurement of organic functional groups. FTIRS is similar to NIRS, but uses longer wavelengths and directly monitors molecular vibrations. As a consequence, FTIRS allows more detailed structural and compositional analyses of both organic and inorganic compounds. Statistical analysis of the FTIRS-PLS models shows that the calibration depends on specific wave numbers, which compare well with spectra of pure compounds. The VNIRS technique gives rise to a spectrum with broad peaks and many overlapping signals which makes interpretation difficult without statistical analyses. In conclusion, the DRIFTS technique shows the best statistical performance for the analysis of biogeochemical properties. However, the VNIRS techniques and especially the ATR-FTIRS Alpha device show comparable results and can also be used as a rapid screening tool when time and costs are limiting factors. Kellner R, Mermet J-M, Otto M, Widmer HM (1998) Analytical chemistry. Wiley-VCH, Weinheim, etc. Rosén P, Vogel H, Cunnigham L, Reuss N, Conley DJ, Persson P (2009) Fourier transform infrared spectroscopy, a new method for rapid determination of total organic and inorganic carbon and biogenic silica
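    The calibration step described above, relating spectra to a measured sediment property by partial least squares, can be illustrated with scikit-learn; the sketch below uses synthetic spectra and an arbitrary component count, so the reported R has no connection to the values quoted in the abstract.

```python
# PLS calibration sketch in the spirit of the FTIRS/VNIRS models above: relate synthetic
# spectra to a sediment property such as TOC and report a cross-validated R.
# scikit-learn is assumed; band positions, component count and spectra are made up.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(11)
n_samples, n_channels = 120, 400
toc = rng.uniform(0.5, 15.0, size=n_samples)                 # "reference" TOC, %

# Synthetic spectra: two broad absorption bands whose intensity scales with TOC, plus noise.
wn = np.linspace(0, 1, n_channels)
band1 = np.exp(-((wn - 0.3) / 0.05) ** 2)
band2 = np.exp(-((wn - 0.7) / 0.08) ** 2)
spectra = np.outer(toc, band1 + 0.5 * band2) + rng.normal(scale=0.3, size=(n_samples, n_channels))

pls = PLSRegression(n_components=6)
predicted = cross_val_predict(pls, spectra, toc, cv=10).ravel()
r = np.corrcoef(toc, predicted)[0, 1]
print(f"cross-validated R for the synthetic TOC model: {r:.2f}")
```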

  18. Method Development in Forensic Toxicology.

    Science.gov (United States)

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. The present article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems, and establishing a versatile sample preparation. Method development is concluded by an optimization process, after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  19. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare these results with those of the MASTER code. We improved the parallel computing module and developed the error analysis module of the MCNAP code. We analyzed the mechanism of error propagation through depletion computations and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled a 3-D structure of the SMART core, considered the variation of material compositions caused by control rod operation, and performed depletion analysis for the SMART core. We computed the control-rod worths of assemblies and of the reactor core for the operation of individual control-rod groups. We computed the core reactivity coefficients (MTC and FTC) and compared these results with the computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through the depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  20. Development of an RGB color analysis method for controlling uniformity in a long-length GdBCO coated conductor

    International Nuclear Information System (INIS)

    Kim, Tae-Jin; Lee, Jae-Hun; Lee, Yu-Ri; Moon, Seung-Hyun

    2015-01-01

    Reactive co-evaporation-deposition and reaction (RCE-DR) is a very productive GdBa2Cu3O7−x (GdBCO) coated conductor (CC) fabrication process, which involves the fast phase conversion of an amorphous film formed by co-evaporation of three metal sources, Gd, Ba and Cu, and thus reduces the time and cost of fabricating a GdBCO CC. We routinely use a quartz crystal microbalance (QCM) to measure and control the evaporation rates of each metal source so as to keep a constant nominal composition of the superconducting (SC) layer. However, in the case of kilometre-long GdBCO CC fabrication, the evaporation rates measured by QCM do not exactly reflect the deposition rates onto the substrate as the source levels decrease, and thus an RGB color analysis method for quality control was designed. With this RGB color analysis method, it is possible to estimate the composition of the converted SC layer very close to the actual composition, even in real time. We set up the RGB color analysis program by establishing a database in which RGB color values are matched to the composition of the SC layer, and as a result of applying the program to the RCE-DR process, we could fabricate a high-quality GdBCO CC with an average critical current of 561 A cm−1 and 95% uniformity along a 1 km length. (paper)
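    The database idea described above can be reduced to a simple lookup: match a measured RGB triplet to the nearest entry of a calibration table and return the associated cation composition. The sketch below uses invented table values purely for illustration.

```python
# Sketch of the RGB-to-composition lookup described above: a calibration table maps
# measured film colors to (Gd : Ba : Cu) ratios, and a new reading is assigned the
# composition of its nearest RGB neighbor. All table entries are invented placeholders.
import numpy as np

# Calibration database: RGB -> cation ratio (Gd : Ba : Cu), hypothetical values.
rgb_db = np.array([
    [120,  80,  60],
    [135,  90,  70],
    [150, 100,  80],
    [165, 110,  95],
], dtype=float)
composition_db = np.array([
    [1.00, 1.90, 3.10],
    [1.00, 2.00, 3.00],
    [1.00, 2.10, 2.90],
    [1.00, 2.20, 2.80],
])

def estimate_composition(rgb):
    """Return the composition of the nearest calibration color (Euclidean distance in RGB)."""
    idx = np.argmin(np.linalg.norm(rgb_db - np.asarray(rgb, dtype=float), axis=1))
    return composition_db[idx]

measured = (148, 101, 78)     # RGB reading from an in-line camera frame (hypothetical)
print("estimated Gd:Ba:Cu =", estimate_composition(measured))
```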

  1. Development of a Direct Headspace Collection Method from Arabidopsis Seedlings Using HS-SPME-GC-TOF-MS Analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Saito

    2013-04-01

    Plants produce various volatile organic compounds (VOCs), which are thought to be a crucial factor in their interactions with harmful insects, plants and animals. The composition of VOCs may differ when plants are grown under different nutrient conditions, i.e., macronutrient-deficient conditions. However, in plants, the relationships between macronutrient assimilation and VOC composition remain unclear. In order to identify the kinds of VOCs that can be emitted when plants are grown under various environmental conditions, we established a conventional method for VOC profiling in Arabidopsis thaliana (Arabidopsis) involving headspace solid-phase microextraction-gas chromatography-time-of-flight-mass spectrometry (HS-SPME-GC-TOF-MS). We grew Arabidopsis seedlings in an HS vial to perform the HS analysis directly. To maximize the analytical performance for VOCs, we optimized the extraction method and the analytical conditions of HS-SPME-GC-TOF-MS. Using the optimized method, we conducted VOC profiling of Arabidopsis seedlings grown under two different nutritional conditions, nutrition-rich and nutrition-deficient. The VOC profiles clearly showed a distinct pattern with respect to each condition. This study suggests that HS-SPME-GC-TOF-MS analysis has immense potential to detect changes in the levels of VOCs not only in Arabidopsis, but also in other plants grown under various environmental conditions.

  2. Solid phase microextraction headspace sampling of chemical warfare agent contaminated samples : method development for GC-MS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson Lepage, C.R.; Hancock, J.R. [Defence Research and Development Canada, Medicine Hat, AB (Canada); Wyatt, H.D.M. [Regina Univ., SK (Canada)

    2004-07-01

    Defence R and D Canada-Suffield (DRDC-Suffield) is responsible for analyzing samples that are suspected to contain chemical warfare agents, either collected by the Canadian Forces or by first-responders in the event of a terrorist attack in Canada. The analytical techniques used to identify the composition of the samples include gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), Fourier-transform infrared spectroscopy (FT-IR) and nuclear magnetic resonance spectroscopy. GC-MS and LC-MS generally require solvent extraction and reconcentration, thereby increasing sample handling. The authors examined analytical techniques which reduce or eliminate sample manipulation. In particular, this paper presented a screening method based on solid phase microextraction (SPME) headspace sampling and GC-MS analysis for chemical warfare agents such as mustard, sarin, soman, and cyclohexyl methylphosphonofluoridate in contaminated soil samples. SPME is a method which uses small adsorbent polymer coated silica fibers that trap vaporous or liquid analytes for GC or LC analysis. Collection efficiency can be increased by adjusting sampling time and temperature. This method was tested on two real-world samples, one from excavated chemical munitions and the second from a caustic decontamination mixture. 7 refs., 2 tabs., 3 figs.

  3. Developing a New Sampling And Analysis Method For Hydrazine And Monomethyl Hydrazine: Using a Derivatizing Agent With Solid Phase Microextraction

    Science.gov (United States)

    Allen, John

    2001-01-01

    Solid phase microextraction (SPME) will be used to develop a method for detecting monomethyl hydrazine (MMH) and hydrazine (Hz). A derivatizing agent, pentafluorobenzoyl chloride (PFBCl), is known to react readily with MMH and Hz. The SPME fiber can either be coated with PFBCl and introduced into a gaseous stream containing MMH, or PFBCl and MMH can react first in a syringe barrel and, after a short equilibration period, an SPME fiber is used to sample the resulting solution. These methods were optimized and compared. Because Hz and MMH can degrade the SPME fiber, letting the reaction occur first gave better results. Only MMH could be detected using either of these methods. Future research will concentrate on constructing calibration curves and determining the detection limit.

  4. Current status of methods for shielding analysis

    International Nuclear Information System (INIS)

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed

  5. Development of quantification methods for the myocardial blood flow using ensemble independent component analysis for dynamic H215O PET

    International Nuclear Information System (INIS)

    Lee, Byeong Il; Lee, Jae Sung; Lee, Dong Soo; Kang, Won Jun; Lee, Jong Jin; Kim, Soo Jin; Chung, June Key; Lee, Myung Chul; Choi, Seung Jin

    2004-01-01

    Factor analysis and independent component analysis (ICA) have been used for handling dynamic image sequences. The theoretical advantages of a newly suggested ICA method, ensemble ICA, led us to apply this method to the analysis of dynamic myocardial H215O PET data. In this study, we quantified patients' myocardial blood flow using the ensemble ICA method. Twenty subjects underwent H215O PET scans using an ECAT EXACT 47 scanner and myocardial perfusion SPECT using a Vertex scanner. After transmission scanning, dynamic emission scans were initiated simultaneously with the injection of 555∼740 MBq H215O. Hidden independent components can be extracted from the observed mixed data (PET images) by means of ICA algorithms. Ensemble learning is a variational Bayesian method that provides an analytical approximation to the parameter posterior using a tractable distribution. The variational approximation forms a lower bound on the ensemble likelihood, and maximization of the lower bound is achieved by minimizing the Kullback-Leibler divergence between the true posterior and the variational posterior. In this study, the posterior pdf was approximated by a rectified Gaussian distribution to incorporate a non-negativity constraint, which is suitable for dynamic images in nuclear medicine. Blood flow was measured in 9 regions: the apex, four mid-wall areas, and four basal areas. Myocardial perfusion SPECT scores and angiography results were compared with the regional blood flow. Major cardiac components were separated successfully by the ensemble ICA method and blood flow could be estimated in 15 of the 20 patients. Mean myocardial blood flow was 1.2±0.40 ml/min/g at rest and 1.85±1.12 ml/min/g under stress. Blood flow values obtained by an operator on two different occasions were highly correlated (r=0.99). In the myocardial component image, the contrast between the left ventricle and the myocardium averaged 1:2.7. Perfusion reserve was significantly different between
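
    The ensemble (variational Bayesian) learning step described here rests on a standard decomposition of the log-evidence; a generic sketch (X the observed dynamic data, θ the sources and mixing parameters, q the variational posterior), not the authors' exact notation:

```latex
\log p(X)
  = \underbrace{\int q(\theta)\,\log\frac{p(X,\theta)}{q(\theta)}\,\mathrm{d}\theta}_{\mathcal{L}(q)\ \text{(lower bound)}}
  \;+\;
    \underbrace{\int q(\theta)\,\log\frac{q(\theta)}{p(\theta\mid X)}\,\mathrm{d}\theta}_{\mathrm{KL}\!\left(q \,\|\, p(\theta\mid X)\right)\;\ge\;0}
```

    Because log p(X) is fixed, maximizing the lower bound L(q) is equivalent to minimizing the KL divergence to the true posterior, which is the statement made in the record; restricting q to rectified Gaussians enforces the non-negativity mentioned above.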

  6. Method Development for the Analysis of Pharmaceuticals with Acetylcholinesterase Activity in Water Using HPLC-DAD and Solid Phase Extraction

    Directory of Open Access Journals (Sweden)

    Samuel Budi Wardhana

    2014-03-01

    Full Text Available An SPE followed by HPLC-DAD method with an ion-pair chromatography technique was developed to analyze pharmaceuticals with acetylcholinesterase activity, including pyridostigmine (PYR), galanthamine (GAL), neostigmine (NEO), eserine (ESE), and donepezil (DON), in water samples. Acetylcholinesterase (AChE) inhibitors have been used to treat less severe dementias such as Alzheimer's disease. Chromatographic separation was achieved on a reversed-phase SymmetryShield column using a gradient system, with mobile phase A consisting of H2O/ACN (99:1, v/v) containing 10 mM sodium 1-hexanesulfonate and 0.1% acetic acid (HAc). The HPLC/DAD method was linear between concentrations of 5 and 100 ng/μL. The IDL and IQL ranged from 0.50 to 1.25 ng/μL and from 1.5 to 3.0 ng/μL, respectively. SPE was used to extract and clean up the target substances in spiked pure water, tap water, and wastewater samples. The extraction of the 5 target substances from wastewater samples was divided into 2 parts: Oasis WCX (6 mL, 500 mg) for PYR and Oasis HLB (6 mL, 200 mg) for GAL, NEO, ESE and DON. The developed SPE and HPLC/DAD method is applicable for quantification of the 5 target substances in water samples at concentrations above 50 µg/L, and presumably lower for DON (above 25 µg/L).
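
    As a side note, one common (ICH-style) way to derive instrument detection and quantification limits from the calibration line is 3.3·s/slope and 10·s/slope, where s is the residual standard deviation; a minimal sketch with invented data, not the study's actual calibration, and possibly not how the authors computed their IDL/IQL:

```python
import numpy as np

# Invented calibration points (concentration in ng/uL vs. peak area);
# not the values from the study.
conc = np.array([5, 10, 25, 50, 75, 100], dtype=float)
area = np.array([410, 830, 2010, 4100, 6050, 8150], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)                  # least-squares line
residual_sd = (area - (slope * conc + intercept)).std(ddof=2)  # residual standard deviation

idl = 3.3 * residual_sd / slope   # instrument detection limit
iql = 10.0 * residual_sd / slope  # instrument quantification limit
print(f"IDL ~ {idl:.2f} ng/uL, IQL ~ {iql:.2f} ng/uL")
```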

  7. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward

    2011-01-01

    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable.

  8. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ward, A.M.; Collins, B.S.; Xu, Y.; Downar, Th.J.; Madariaga, M.

    2011-01-01

    In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the Gen PMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable

  9. Development and validation of RP-HPLC method for analysis of multicomponent cough-cold syrup formulation

    OpenAIRE

    Ivković, Branka; Marković, Bojan; Vladimirov, Sote

    2014-01-01

    In this study a reversed-phase HPLC method for rapid and simultaneous identification and quantification of doxylamine succinate, ephedrine sulfate, dextromethorphan hydrobromide, paracetamol and sodium benzoate in a cough-cold syrup formulation was described. Separation was carried out on an XTerraTM RP 18, Waters (150 mm x 4.6 mm column, 5 μm particle size). For the analysis of the investigated substances gradient elution was used employing water, pH adjusted to 2.5 with 85 % orthophosphoric acid as ...

  10. Methods of nonlinear analysis

    CERN Document Server

    Bellman, Richard Ernest

    1970-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  11. Analysis of Scientific and Methodical Approaches to Portfolio Investment as a Tool of Financial Provision of Sustainable Economic Development

    Directory of Open Access Journals (Sweden)

    Leus Daryna V.

    2013-12-01

    Full Text Available The article analyses scientific and methodical approaches to portfolio investment. It develops recommendations on refining the categorical apparatus of portfolio investment in the context of differentiating strategic (direct) and portfolio investments as alternative approaches to investment activity. It identifies the composition and functions of the objects and subjects of portfolio investment under conditions of globalisation of the world financial markets. It studies the main postulates of portfolio theory and justifies the need to identify the place, role and functions of the subjects of portfolio investment within it for ensuring sustainable development of the economy. As one way of further developing portfolio theory, it proposes a separate direction in the financial provision of the economy that takes account of ecological and social components: socially responsible investment.

  12. Development of a sample preparation method for the analysis of current-use pesticides in sediment using gas chromatography.

    Science.gov (United States)

    Wang, Dongli; Weston, Donald P; Ding, Yuping; Lydy, Michael J

    2010-02-01

    Pyrethroid insecticides have been implicated as the cause of sediment toxicity to Hyalella azteca in both agricultural and urban areas of California; however, for a subset of these toxic sediments (approximately 30%), the cause of toxicity remains unidentified. This article describes the analytical method development for seven additional pesticides that are being examined to determine if they might play a role in the unexplained toxicity. A pressurized liquid extraction method was optimized to simultaneously extract diazinon, methyl parathion, oxyfluorfen, dicofol, fenpropathrin, pyraclostrobin, and indoxacarb from sediment, and the extracts were cleaned using a two-step solid-phase extraction procedure. The final extract was analyzed for the target pesticides by gas chromatography/nitrogen-phosphorus detector (GC/NPD), and gas chromatography/electron capture detector (GC/ECD), after sulfur was removed by shaking with copper and cold crystallization. Three sediments were used as reference matrices to assess method accuracy and precision. Method detection limits were 0.23-1.8 ng/g dry sediment using seven replicates of sediment spiked at 1.0 ng/g dry sediment. Recoveries ranged from 61.6 to 118% with relative standard deviations of 2.1-17% when spiked at 5.0 and 50 ng/g dry sediment. The three reference sediments, spiked with 50 ng/g dry weight of the pesticide mixture, were aged for 0.25, 1, 4, 7, and 14 days. Recoveries of the pesticides in the sediments generally decreased with increased aging time, but the magnitude of the decline was pesticide and sediment dependent. The developed method was applied to field-collected sediments from the Central Valley of California.
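
    The detection limits above come from seven spiked replicates; the usual EPA-style calculation (which may or may not match the authors' exact procedure) multiplies the replicate standard deviation by the one-sided Student's t value at 99% confidence:

```python
import numpy as np
from scipy import stats

# Invented replicate results (ng/g dry weight) for a sediment spiked at 1.0 ng/g.
replicates = np.array([0.91, 1.08, 0.97, 1.12, 0.88, 1.04, 0.95])

n = len(replicates)
t99 = stats.t.ppf(0.99, df=n - 1)    # one-sided 99% t value, df = 6
mdl = t99 * replicates.std(ddof=1)   # method detection limit
print(f"t = {t99:.3f}, MDL = {mdl:.2f} ng/g dry weight")
```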

  13. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    IAEA defined safety culture as follows: 'Safety Culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'safety culture is that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis', with his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. Then we modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria to obtain nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea.

  14. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun

    2015-01-01

    IAEA defined safety culture as follows: 'Safety Culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'safety culture is that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis', with his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. Then we modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria to obtain nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea

  15. Development and application of an efficient method for performing modal analysis of steam generator tubes in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Huinam [Dept of Mechanical and Aerospace Engineering, Sunchon National University, Sunchon, 540-742 (Korea, Republic of); Boo, Myung-Hwan [Korea Hydro and Nuclear Power Company, Yuseong-Gu, Daejeon 305-343 (Korea, Republic of); Park, Chi-Yong [KEPCO Research Institute, Yuseong-Gu, Daejeon 305-380 (Korea, Republic of); Ryu, Ki-Wahn, E-mail: kwryu@chonbuk.ac.k [Department of Aerospace Engineering, Chonbuk National University, 664-14, Deogjin-Dong, Jeonju 561-756 (Korea, Republic of)

    2010-10-15

    A typical pressurized water reactor (PWR) steam generator has approximately 10,000 tubes. These tubes have different geometries, supporting conditions, and material properties due to the non-uniform temperature distribution throughout the steam generator. Even though some tubes may have the same geometry and boundary conditions, the non-uniform distribution of coolant densities adjacent to the tubes causes them to have different added-mass effects and dynamic characteristics. Therefore, for a reliable design of the steam generator, a separate modal analysis of each tube is necessary to perform the FIV (flow-induced vibration) analysis. However, the modal analysis of a tube, including the finite element modeling, is cumbersome and time-consuming. And when a commercial finite element code is used, interfacing the modal analysis results, such as natural frequencies and mode shapes, with the FIV analysis procedure requires a significant amount of additional time and can introduce inadvertent errors due to the complexity of the data processing. It is therefore practically impossible to perform the complete FIV analysis for ten thousand tubes when designing or maintaining a steam generator, although it is necessary. Rather, to verify the safety of the design against FIV, only a couple of tubes are chosen based on engineering judgment or past experience. In this paper, a computer program, PIAT-MODE, was developed which is able to perform modal analysis of all tubes of a PWR steam generator in a very efficient way. The geometries and boundary conditions of every tube were incorporated into PIAT-MODE using appropriate mathematical formulae. Material property data including the added-mass effect were also included in the program. Once a specific tube is selected, the program automatically constructs the finite element model and generates the modal data very quickly. Therefore, modal analysis can be performed for every single tube in a straightforward way. When PIAT-MODE is coupled

  16. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was the development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting YBa2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for the analysis of each buffer layer and the YBCO films. These results assisted in the understanding of a variety of physical processes occurring during the buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  17. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-01-01

    With visual assessment using three-dimensional (3D) brain image analysis methods based on a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Using quantitative local abnormality assessment by this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). For twenty-five cases with DAT (mean age, 68.9 years), all diagnosed as probable Alzheimer's disease based on the National Institute of Neurological and Communicative Disorders and Stroke-Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D stereotactic surface projections (SSP) program was compared with data from an age-matched control group of 20 cases. To study local abnormalities in the 3D images, we divided the whole brain into 24 segments based on anatomical classification. For each segment we assessed the extent of the abnormal region (the proportion of coordinates within the segment whose Z-value exceeds the threshold) and its severity (the mean Z-value of those supra-threshold coordinates). This method clarified the orientation and spread of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in their distribution. (author)
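
    The extent/severity definitions used in this record map directly onto array operations; a minimal sketch assuming a flat array of Z-scores and an integer anatomical label for each stereotactic coordinate (all names and the threshold value are hypothetical):

```python
import numpy as np

def extent_and_severity(z_scores, segment_labels, threshold=2.0):
    """Per-segment extent (fraction of coordinates with Z above the threshold)
    and severity (mean Z of those supra-threshold coordinates)."""
    results = {}
    for seg in np.unique(segment_labels):
        z_seg = z_scores[segment_labels == seg]
        above = z_seg[z_seg > threshold]
        extent = above.size / z_seg.size
        severity = above.mean() if above.size else 0.0
        results[int(seg)] = (extent, severity)
    return results

# Toy data standing in for 24 anatomical segments of a 3D-SSP map.
rng = np.random.default_rng(0)
z = rng.normal(size=10_000)
labels = rng.integers(1, 25, size=10_000)
print(extent_and_severity(z, labels)[1])
```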

  18. The development of human behavior analysis techniques - A study on knowledge representation methods for operator cognitive model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Young Tack [Soongsil University, Seoul (Korea, Republic of)

    1996-07-01

    The main objective of this project is the modeling of a human operator in the main control room of a nuclear power plant. For this purpose, we carried out research on knowledge representation and inference methods based on Rasmussen's decision ladder structure, and we have developed SACOM (Simulation Analyzer with a Cognitive Operator Model) using the G2 shell on Sun workstations. SACOM consists of an Operator Model, an Interaction Analyzer, and a Situation Generator. The cognitive model aims to model human operators in more detail in an effective way, and SACOM is designed to model the knowledge-based behavior of human operators more easily. The following are the main research topics carried out this year. First, in order to model the knowledge-based behavior of human operators, more detailed scenarios were constructed, and knowledge representation and inference methods were developed to support them. Second, meta-knowledge structures were studied to support the four types of diagnoses performed by human operators. This work includes a study on meta and scheduler knowledge structures for generate-and-test, topographic, decision tree and case-based approaches. Third, domain knowledge structures were improved to support the meta knowledge. In particular, domain knowledge structures were developed to support the topographic diagnosis model. Fourth, a more applicable interaction analyzer and situation generator were designed and implemented. The new version is implemented in G2 on Sun workstations. 35 refs., 49 figs. (author)

  19. Experimental design-based isotope-dilution SPME-GC/MS method development for the analysis of smoke flavouring products.

    Science.gov (United States)

    Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas

    2017-12-01

    For the implementation of Regulation (EC) No 2065/2003, related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase microextraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions for quantitative analysis of the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol and their derivatives, cyclic ketones, and several other heterocyclic compounds were identified. The proposed method was repeatable (RSD <5%) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used to improve quantification of analytes by compensating for matrix effects that might affect the headspace equilibrium and the extractability of compounds. The optimised isotope-dilution SPME-GC/MS analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.
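
    Quantification against isotopically labelled internal standards, as used here, typically reduces to a simple response-factor calculation; a minimal sketch with invented peak areas and concentrations (a generic isotope-dilution formula, not necessarily the authors' exact calibration model):

```python
# Generic isotope-dilution quantification; all numbers are invented.
def response_factor(area_native_std, conc_native_std, area_labelled_std, conc_labelled_std):
    """Relative response of the native compound versus its labelled analogue,
    determined from a calibration standard."""
    return (area_native_std / conc_native_std) / (area_labelled_std / conc_labelled_std)

def quantify(area_native, area_labelled, conc_labelled_spike, rf):
    """Analyte concentration from the native/labelled area ratio in the sample."""
    return (area_native / area_labelled) * conc_labelled_spike / rf

rf = response_factor(1.8e5, 2.0, 9.0e4, 1.0)
print(quantify(area_native=2.4e5, area_labelled=1.0e5, conc_labelled_spike=1.0, rf=rf))
```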

  20. Development of multi-dimensional analysis method for porous blockage in fuel subassembly. Numerical simulation for 4 subchannel geometry water test

    International Nuclear Information System (INIS)

    Tanaka, Masa-aki; Kamide, Hideki

    2001-02-01

    This investigation deals with porous blockages in a wire-spacer-type fuel subassembly of fast breeder reactors (FBRs). A multi-dimensional analysis method for a porous blockage in a fuel subassembly is developed using the standard k-ε turbulence model together with typical correlations from handbooks. The purpose of this analysis method is to evaluate the position and magnitude of the maximum temperature and to investigate the thermo-hydraulic phenomena inside the porous blockage. Verification of the analysis method was conducted based on the results of a 4-subchannel geometry water test. It was revealed that the evaluation of the porosity distribution and the particle diameter in a porous blockage is important for predicting the temperature distribution. The analysis method could simulate the spatial characteristics of the velocity and temperature distributions in the blockage and evaluate the pin surface temperature inside the porous blockage. Through this verification, it is shown that the multi-dimensional analysis method is useful for predicting the thermo-hydraulic field and the highest temperature in a porous blockage. (author)

  1. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids.

    Science.gov (United States)

    Lue, Bena-Marie; Guo, Zheng; Xu, Xuebing

    2008-07-11

    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with evaporative light scattering detection (ELSD) were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine-acetate) was found essential for separation of flavonoids from strongly polar RTILs, whereby RTILs were generally visible as two major peaks identified based on an ion-pairing/exchanging hypothesis. C8 and C12 stationary phases were optimal, while mobile phase pH (3-7) had only a minor influence on separation. The method developed was successfully applied for primary screening of more than 20 RTILs, with in-depth evaluation of substrates in 10 RTILs as reaction media.

  2. New analysis methods for skin fine-structure via optical image and development of 3D skin Cycloscan(™).

    Science.gov (United States)

    Han, J Y; Nam, G W; Lee, H K; Kim, M J; Kim, E J

    2015-11-01

    This study was conducted to develop methods for measuring skin fine structure from optical images, together with a photographic apparatus, to analyze anti-aging efficacy. We developed an apparatus named 3D Skin CycloScan(™) to evaluate the efficacy of cosmetics by imaging skin fine structure such as wrinkles, pores, and skin texture. The hemispherical device has 12 sequentially flashing light sources and captures the optical images within one second to exclude the influence of the subject's movement. The normal map extracted by a shape-from-shading method is composed of face-contour and skin fine-structure components. When the low-frequency component, obtained by applying a Gaussian filter, is eliminated, only the skin fine structure remains. From this normal map it is possible to extract a two-dimensional vector map called the direction map, and the intensity of the wrinkle, pore, and skin texture image can be adjusted after filtering the direction map. We performed a clinical study applying this new apparatus and these methods to evaluate the anti-aging efficacy of cosmetics visually, and validated them against other conventional methods. After using an anti-aging cream containing 2% adenosine for 8 weeks, the total amount of fine wrinkles around the eye area detected via 3D Skin CycloScan(™) was reduced by 12.1%. Also, wrinkles on crow's feet measured by PRIMOS COMPACT(®) (GFMesstechnik GmbH, Germany) were reduced by 11.7%. According to the present study, by changing the direction of the lights illuminating the subject's skin, we can obtain information about the fine structures present on the skin, such as wrinkles, pores, or skin texture, and represent it as an image. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
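
    The "remove the Gaussian-filtered low-frequency component" step described above is essentially a high-pass operation on the normal map; a minimal sketch with scipy (the sigma, array shapes and names are assumptions, not the device's actual parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fine_structure(normal_map, sigma=15.0):
    """Subtract the Gaussian-smoothed (face-contour) component from each channel
    of an H x W x 3 normal map, leaving wrinkles, pores and texture."""
    low = np.stack([gaussian_filter(normal_map[..., c], sigma) for c in range(3)], axis=-1)
    return normal_map - low

# Toy normal map standing in for the shape-from-shading output.
nm = np.random.rand(256, 256, 3)
detail = fine_structure(nm)
print(detail.shape)
```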

  3. A computer programme for use in the development of multi-element x-ray-fluorescence methods of analysis

    International Nuclear Information System (INIS)

    Wall, G.J.

    1985-01-01

    A computer programme (written in BASIC) is described for the evaluation of spectral-line intensities in X-ray-fluorescence spectrometry. The programme is designed to assist the analyst while he is developing new analytical methods, because it facilitates the selection of the following evaluation parameters: calculation models, spectral-line correction factors, calibration curves, calibration ranges, and point deletions. In addition, the programme enables the analyst to undertake routine calculations of data from multi-element analyses in which variable data-reduction parameters are used for each element

  4. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    OpenAIRE

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen,; Naseem, Huma

    2014-01-01

    RP-HPLC ultraviolet detection simultaneous quantification of piracetam and levetiracetam has been developed and validated. The chromatography was obtained on a Nucleosil C18 column of 25 cm × 0.46 cm, 10 μm, dimension. The mobile phase was a (70 : 30 v/v) mixture of 0.1 g/L of triethylamine and acetonitrile. Smooth flow of mobile phase at 1 mL/min was set and 205 nm wavelength was selected. Results were evaluated through statistical parameters which qualify the method reproducibility and sele...

  5. Implementation of the semi-aerobic landfill system (Fukuoka method) in developing countries: a Malaysia cost analysis.

    Science.gov (United States)

    Chong, Theng Lee; Matsufuji, Yasushi; Hassan, Mohd Nasir

    2005-01-01

    Most of the existing solid waste landfill sites in developing countries practice either open dumping or controlled dumping. Proper sanitary landfill concepts are not fully implemented due to technological and financial constraints. Implementation of a fully engineered sanitary landfill is necessary, and a more economically feasible landfill design is crucial, particularly for developing countries. This study focused on the economics of developing a new landfill site in a natural clay area (with no synthetic liner cost), using the Fukuoka method semi-aerobic landfill system, through to 10 years after its closure. The findings show that for the development of a 15-ha landfill site in Malaysia with an estimated volume of 2,000,000 m3, the capital investment required was about US$1,312,895, or about US$0.84/tonne of waste. Assuming that the lifespan of the landfill is 20 years, the total cost of operation was about US$11,132,536, or US$7.15/tonne of waste. The closure cost of the landfill was estimated to be US$1,385,526, or US$0.89/tonne of waste. Therefore, the total cost required to dispose of a tonne of waste at the semi-aerobic landfill was estimated to be US$8.89. Considering an average tipping fee of about US$7.89/tonne of waste in Malaysia in the first year, with an annual increase of 3% to about US$13.84 in year 20, the overall system recorded a positive revenue of US$1,734,749. This is important information for the effort to privatise landfill sites in Malaysia, as well as in other developing countries, in order to secure efficient and effective landfill development and management.
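
    The per-tonne figures quoted above are simple divisions by the total waste mass; a back-of-the-envelope check (the ~1.56 million tonnes is inferred from the reported unit costs, so treat it as an assumption):

```python
# Back-of-the-envelope check of the per-tonne costs quoted in the record.
capital_usd   = 1_312_895
operation_usd = 11_132_536
closure_usd   = 1_385_526
tonnes        = 1_560_000        # assumed total waste over the 20-year lifespan

total_per_tonne = (capital_usd + operation_usd + closure_usd) / tonnes
print(f"Total disposal cost ~ US$ {total_per_tonne:.2f}/tonne")  # close to the quoted US$8.89
```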

  6. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  7. Development of Extraction Methods for the Analysis of Perfluorinated Compounds in Leather with High Performance Liquid Chromatography Tandem Mass Spectrometry

    Science.gov (United States)

    Zhang, Yan; Wang, Youchao; Tang, Chuanjiang; Nie, Jingmei; Xu, Chengtao

    2018-01-01

    Perfluorinated compounds (PFCs), used to provide water, oil, grease, heat and stain repellency to a range of textile and other products, have been found to be persistent in the environment and are associated with adverse effects on humans and wildlife. This study presents the development and validation of an analytical method to determine the simultaneous presence of eleven PFCs in leather using solid-phase extraction followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The perfluorinated compounds were first extracted from the samples by an ultrasonic liquid extraction procedure, whose parameters were optimized. The subsequent solid-phase extraction (SPE) clean-up is one of the most important advantages of the developed methodology; the sample volume and elution conditions were optimized by means of an experimental design. The proposed method was applied to determine the PFCs in leather. The detection limits of the eleven compounds were 0.09-0.96 ng/L, and the recoveries of all compounds spiked at the 5 ng/L concentration level were in the range of 65-96%, with RSDs below 19% (n = 7).

  8. Biological Methods and Manual Development

    Science.gov (United States)

    EPA scientists conduct research to develop and evaluate analytical methods for the identification, enumeration, and evaluation of aquatic organisms exposed to environmental stressors, and to correlate exposures with effects on chemical and biological indicators

  9. DDOT MXD+ method development report.

    Science.gov (United States)

    2015-09-01

    Mixed-use development has become increasingly common across the country, including Washington, D.C. : However, a straightforward and empirically validated method for evaluating the traffic impacts of such : projects is still needed. The data presente...

  10. Development of new method for simultaneous analysis of piracetam and levetiracetam in pharmaceuticals and biological fluids: application in stability studies.

    Science.gov (United States)

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen; Naseem, Huma

    2014-01-01

    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatography was performed on a Nucleosil C18 column (25 cm×0.46 cm, 10 μm particle size). The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile. The mobile phase flow rate was set at 1 mL/min and a detection wavelength of 205 nm was selected. Results were evaluated through statistical parameters that confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, thereby demonstrating its stability-indicating properties. The proposed method is significant in that it permits the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was tested on bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent results were obtained.

  11. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    Directory of Open Access Journals (Sweden)

    Farhan Ahmed Siddiqui

    2014-01-01

    Full Text Available An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatography was performed on a Nucleosil C18 column (25 cm×0.46 cm, 10 μm particle size). The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile. The mobile phase flow rate was set at 1 mL/min and a detection wavelength of 205 nm was selected. Results were evaluated through statistical parameters that confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, thereby demonstrating its stability-indicating properties. The proposed method is significant in that it permits the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10000 ng/mL for both drugs. The proposed method was tested on bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent results were obtained.

  12. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies.

    Science.gov (United States)

    Torrens, George Edward

    2018-01-01

    Summative content analysis was used to define the methods and heuristics used in each case study. The review process was in two parts: (1) a literature review to identify conventional research methods, and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as having been applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used, under 10 within the 20 case studies, and the hundreds available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: •The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; •The review within the study suggests only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and, •Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  13. Negotiating a Systems Development Method

    Science.gov (United States)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods) are often applied in tailored versions to fit the actual situation. Method tailoring is, in most of the existing literature, viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): they show that the driver role rotates between the project members and that design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  14. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of existing HRA methods through an example application. To redress the bias of existing methods toward the quantitative aspect of HRA, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories of and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be readily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  15. Development of quantification analysis software for measuring regional cerebral blood flow by the modified split-dose method with 123I-IMP before and after acetazolamide loading

    International Nuclear Information System (INIS)

    Nagaki, Akio; Kobara, Kouichi; Matsutomo, Norikazu

    2003-01-01

    We developed a quantification analysis software program for measuring regional cerebral blood flow (rCBF) at rest and under acetazolamide (ACZ) stress by the modified split-dose (MSD) method with iodine-123 N-isopropyl-p-iodoamphetamine (123I-IMP) and compared the rCBF values measured by the MSD method and by the split-dose 123I-IMP SPECT (SD) method, which requires one continuous withdrawal of arterial blood. Since the MSD method allows the input of two arterial blood sampling parameter values, the background subtraction procedure for obtaining ACZ-induced images in the MSD method is not identical to the procedure in the SD method. With our software program for rCBF quantification, the resting rCBF values determined by the MSD method were closely correlated with the values measured by the SD method (r=0.94), and there was also a good correlation between the ACZ-induced rCBF values obtained by the MSD method and by the SD method (r=0.81). The increase in rCBF under ACZ stress was estimated to be approximately 26% by the SD method and 38% by the MSD method, suggesting that the MSD method tends to overestimate the increase in rCBF under ACZ stress in comparison with the SD method; however, the variability of the rCBF values at rest and during ACZ stress analyzed by the MSD method was smaller than the variability with the SD method. Further clinical studies are required to validate our rCBF quantification analysis program for the MSD method. (author)

  16. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    Pereira, Elaine; Silva, Ieda de S.; Gomide, Ricardo G.; Pires, Maria Aparecida F.

    2015-01-01

    This work presents a new, simple and low-cost methodology for the direct determination of uranium in different matrices: an organic phase (UO2(NO3)2·2TBP - uranyl nitrate-TBP complex) and an aqueous phase (UO2(NO3)2 - NTU - uranyl nitrate), based on Fourier transform infrared (FTIR) spectroscopy using the KBr pellet technique. Analytical validation is essential to establish whether a developed methodology is fully suited to its intended purpose and is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO2(NO3)2·2TBP in hexane, embedded in KBr) was linear (r=0.9989) over the range of 1.0 g/L to 14.3 g/L, with an LD of 92.1 mg/L and an LQ of 113.1 mg/L, precise (RSD < 1.6% and p-value < 0.05), and accurate (recovery of 100.1% - 102.9%). The method for uranium in the aqueous phase (UO2(NO3)2 embedded in KBr) was linear (r=0.9964) over the range of 5.4 g/L to 51.2 g/L, with an LD of 835 mg/L and an LQ of 958 mg/L, precise (RSD < 1.0% and p-value < 0.05), and accurate (recovery of 99.1% - 102.0%). The FTIR method is robust with respect to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results for all three methods. The statistical tests (Student's t and Fisher) showed that the techniques are equivalent. (author)

  17. SWOT analysis: The analytical method in the process of planning and its application in the development of orthopaedic hospital department

    Directory of Open Access Journals (Sweden)

    Terzić Zorica

    2010-01-01

    Full Text Available Introduction. SWOT analysis is a managerial tool used to evaluate the internal and external environment through strengths, weaknesses, opportunities and threats. Objective. The aim was to demonstrate the application of SWOT analysis on the example of the Department for Paediatric Orthopaedics and Traumatology at the Institute of Orthopaedic Surgery 'Banjica' in Belgrade. Methods. Qualitative research was conducted during December 2008 at the Department for Paediatric Orthopaedics and Traumatology of the Institute of Orthopaedic Surgery 'Banjica' by applying the focus group technique. Participants were members of the medical staff and patients. In the first phase of the focus group, brainstorming was applied to collect the factors of the internal and external environment and to identify strengths and weaknesses, opportunities and threats, respectively. In the second phase, the nominal group technique was applied in order to reduce the list of factors. The factors were assessed according to their influence on the Department and ranked on a three-point Likert scale from 3 (highest impact) to 1 (lowest impact). Results. The most important strengths of the Department are competent and skilled staff, high quality of services, average hospital bed utilization, the Department providing the educational basis of the School of Medicine, satisfied patients, pleasant setting, and additional working hours. The weaknesses are: poor spatial organization, personnel unmotivated to refresh knowledge, lack of specifically trained personnel, inadequate sanitary facilities, services not covered by the Insurance Fund, long average hospital stay, and low economic status of patients. The opportunities are: legislative regulations, the established paediatric traumatology service at the City level, the good regional position of the Institute, and extension of referral areas. The threats are: absent Department autonomy in the personnel policy of the Institute, competitions within

  18. Development and application of RP-HPLC methods for the analysis of transition metals and their radioactive isotopes in radioactive waste

    International Nuclear Information System (INIS)

    Seekamp, S.

    1999-07-01

    A major criterion in the final disposal of nuclear waste is to keep possible changes in the geosphere due to the introduction of radioactive waste as small as possible and to prevent any escape into the biosphere in the long term. The Federal Office for Radiation Protection (BfS) has therefore established limit values for a number of nuclides. Verifying these limits has to date involved laborious wet-chemical analysis. In order to accelerate quantification there is a need to develop rapid multielement methods, and HPLC methods represent a starting point for this development. Chemical separation is necessary to quantify β-emitters via their radiation, since they are characterized by a continuous energy spectrum. A method for quantifying transition metals and their radioactive isotopes from radioactive waste has been created by using a chelating agent to select the analytes and RP-HPLC to separate the complexes formed. In addition to separating the matrix, complexation on a precolumn has the advantage of enriching the analytes. The subject of this thesis is the development and application of this method, including studies of the mobile and stationary phases as well as the optimization of all parameters, such as pH value and sample volume, which influence separation, enrichment or detection. The method developed was successfully tested using cement samples. It was also used for investigations of ion exchange resins and for trace analysis in calcium fluoride. Furthermore, the transferability of the method to actinides was examined by using a different complexing agent. (orig.)

  19. DEVELOPMENT OF METHODS OF ESTIMATION, ANALYSIS, SUBSTANTIATION AND SELECTION OF ORGANIZATIONAL AND TECHNOLOGICAL DECISIONS FOR RECONSTRUCTION OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    SIEDIN V. L.

    2017-02-01

    Full Text Available Problem statement. Over the past decade, changes in the economy have led to the decline of many industrial enterprises, which in turn has led to the emergence of abandoned buildings and degraded areas that create social and environmental hazards. Accordingly, the buildings and structures of such enterprises no longer function and need reconstruction. Purpose of the article. To study the development of methods for assessing, analyzing, substantiating and selecting rational organizational and technological decisions for the reconstruction of industrial enterprises. Conclusion. In order to transform degraded and disordered territories into modern centers of vital activity, it is necessary to identify in each populated area the zones of priority renovation and reconstruction, and to concentrate budgetary funds and private investment on the implementation of such projects. By implementing the above measures, settlements will be systematically renewed in accordance with European standards.

  20. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

    Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple variables. The final goal is decision making. The paper describes the models and methods of multivariate analysis

  1. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are discussed, both from on-line trigger selection and from off-line analysis. Statistical training methods are also presented and some new applications are suggested

  2. Methods in algorithmic analysis

    CERN Document Server

    Dobrushkin, Vladimir A

    2009-01-01

    …helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010

  3. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  4. Formulation of an aloe-based product according to Iranian traditional medicine and development of its analysis method.

    Science.gov (United States)

    Moein, Elham; Hajimehdipoor, Homa; Toliyat, Tayebeh; Choopani, Rasool; Hamzeloo-Moghadam, Maryam

    2017-08-29

    Currently, people are increasingly interested in traditional medicine. Traditional formulations should be converted to modern drug delivery systems to be more acceptable to patients. In the present investigation, a polyherbal medicine, "Ayarij-e-Faiqra" (AF), based on Iranian traditional medicine (ITM) was formulated and its quality control parameters were developed. The main ingredients of AF, including the barks of Cinnamomum zeylanicum Blume and Cinnamomum cassia J. Presl, the rhizomes of Nardostachys jatamansi DC., the fruits of Piper cubeba L.f., the flowers of Rosa damascena Herrm., the oleo gum resin of Pistacia terebinthus L. and the dried juice of Aloe spp., were powdered and used to prepare seven tablet formulations of the herbal mixture. The flowability of the different formulated powders was examined and the best formulations were selected (F6 and F7). Tablets were prepared from the selected formulations and compared according to their physical characteristics; finally, F7 was selected and coated. The physicochemical characteristics of core and coated AF tablets were determined, and an HPLC method for the quantitation of aloin as a marker in the tablets was selected and verified for selectivity, linearity, precision, recovery, LOD and LOQ. The results showed that core and coated AF tablets met USP requirements for herbal drugs. They had acceptable appearance, disintegration time, friability, hardness, dissolution behavior, weight variation and content uniformity. The amount of aloin in the tablets was found to be 123.1 mg/tablet. The HPLC method for aloin determination in AF tablets was verified for selectivity, linearity (5-500 μg/ml, r2: 0.9999), precision (RSD: 1.62%), recovery (108.0%), and LOD and LOQ (0.0053 and 0.0161 μg/ml). The formulated tablets could be a good substitute for the powder and capsules of AF in ITM clinics, with a feasible and precise method for their quality control.

  5. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  6. Development of a simple method for classifying the degree of importance of components in nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2006-01-01

    In order to analyze large amounts of trouble information from overseas nuclear power plants, it is necessary to select the information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was adopted. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was adopted. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a method of quantitatively and simply judging the significance of trouble information from overseas nuclear power plants was developed. (author)
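
    A minimal sketch of the two-criterion screening idea (the thresholds and data layout are hypothetical illustrations, not the classification criteria developed in the report):

```python
# Hypothetical screening of components by their contribution to core damage
# frequency (CDF, safety) and automatic plant trip probability (APTP, reliability).
CDF_THRESHOLD = 1e-6    # illustrative CDF contribution per reactor-year
APTP_THRESHOLD = 1e-3   # illustrative APTP contribution per reactor-year

def classify(component):
    significant_for_safety = component["cdf_contribution"] >= CDF_THRESHOLD
    significant_for_reliability = component["aptp_contribution"] >= APTP_THRESHOLD
    if significant_for_safety and significant_for_reliability:
        return "high importance"
    if significant_for_safety or significant_for_reliability:
        return "medium importance"
    return "low importance"

pump = {"name": "auxiliary feedwater pump", "cdf_contribution": 3e-6, "aptp_contribution": 5e-4}
print(pump["name"], "->", classify(pump))
```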

  7. Development of on-line electrochemical sample pretreatment methods for the analysis of thallium and uranium by ICP-MS

    International Nuclear Information System (INIS)

    Zhou, F.; Van Berkel, G.J.; Morton, S.J.; Duckworth, D.C.; Adeniyi, W.K.; Keller, J.M.

    1995-01-01

    Anodic and adsorptive stripping voltammetry (ASV and AdSV, respectively) were performed on-line with a mercury thin-film electrode (MTFE) to effect the selective accumulation and detection of thallium and uranium, respectively. ASV-ICP-MS experiments using thallium as the test element were performed to characterize the behavior of the on-line system for low-level and quantitative determinations. Excellent linearity in response was demonstrated for thallium standards ranging from 0.25 ng/L to 50 microg/L. The 1.0 pg/L detection limit calculated from these data for thallium (3σ/sensitivity) was 400 times lower than that of conventional ICP-MS. The ability to overcome sample matrix effects in quantitative determinations was demonstrated by the analysis of an undiluted synthetic urine sample. AdSV-ICP-MS experiments were performed using uranium as the test element to demonstrate the utility of this method for the determination of radiologically important elements. A uranium(VI)-cupferron complex was used to effect adsorptive accumulation of uranium from a 10 microg/L standard solution onto the MTFE. The uranium was chemically stripped from the electrode for subsequent downstream detection by the ICP-MS. The quantitative nature of this method and a modest enhancement of signal levels (~10x) over those obtained with conventional ICP-MS for samples in the microgram-per-liter concentration range were demonstrated. Modifications to the current system to provide low flow rate operation will allow further optimization of the ASV-ICP-MS and AdSV-ICP-MS combinations.
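
    The quoted thallium detection limit is defined as 3σ of the blank divided by the calibration sensitivity. A minimal sketch of that calculation, using hypothetical blank replicates and an assumed sensitivity:

```python
import numpy as np

# Hypothetical blank signal replicates (counts/s) from repeated ASV-ICP-MS runs
blank_counts = np.array([120.0, 118.5, 121.2, 119.8, 120.6, 122.1, 119.1])

# Hypothetical calibration sensitivity: detector counts per ng/L of thallium
sensitivity = 3.5e4  # counts/s per ng/L

sigma_blank = blank_counts.std(ddof=1)
detection_limit = 3.0 * sigma_blank / sensitivity  # ng/L

print(f"3-sigma detection limit ~ {detection_limit:.2e} ng/L")
```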

  8. New developments for the analysis of archaeological and artistic artifacts by optical and ion beam methods at LAMFI

    International Nuclear Information System (INIS)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Barbosa, Marcel D. L.; Added, Nemitala; Curado, Jessica F.; Kajiya, Elizabet M.; Campos, Pedro H.O.V. de

    2011-01-01

    Full text: Since 2005, the analysis of artistic and cultural heritage objects at LAMFI-USP (Laboratorio de Analises de Materiais com Feixes Ionicos), initially restricted to ion beam methods, has been growing steadily. Since then, alternative methodologies and procedures have been incorporated to better characterize these objects, which possess distinctive physical characteristics and high cultural and monetary value. The examinations were expanded to other non-destructive analytical techniques such as portable XRF (X-ray fluorescence) analysis, X-ray radiography, and visible, UV (ultraviolet) and IR (infrared) light imaging, which are helping to better understand these art objects, particularly paintings, where the techniques help assess the conservation state and reveal underlying drawings that illuminate the creative process of the artist. The external beam arrangement at LAMFI was recently updated for simultaneous PIXE (Particle induced X-ray emission), RBS (Rutherford back scattering), PIGE (Particle induced gamma-ray emission) and IBL (Ion beam luminescence) analysis in open air. The new setup comprises a 2π star-like detector assembly with 7 collimated telescopes: two openings have laser beams for optical alignment of the target, 2 are used for X-ray detectors, 1 for a particle detector, 1 for an optical spectrometer, and 1 for an image. The particle and X-ray detector telescopes can be evacuated to reduce signal losses. The 2 telescopes with the X-ray detectors have absorbers to selectively filter low-energy X-rays, optimizing the PIXE detection limits. The beam exit window is made of an 8 μm aluminum foil, allowing the integrated beam charge to be monitored by measuring the Al gamma rays with a NaI detector. The geometry and materials of the assembly have been carefully designed to shield the X-ray detectors from the X-rays produced at the exit beam window and to reduce the detection of Ar Kα from the in-air beam path.

  9. New developments for the analysis of archaeological and artistic artifacts by optical and ion beam methods at LAMFI

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Barbosa, Marcel D. L.; Added, Nemitala; Curado, Jessica F.; Kajiya, Elizabet M.; Campos, Pedro H.O.V. de [Universidade de Sao Paulo (USP), SP (Brazil). Inst. de Fisica

    2011-07-01

    Full text: Since 2005, the analysis of artistic and cultural heritage objects at LAMFI-USP (Laboratorio de Analises de Materiais com Feixes Ionicos), initially restricted to ion beam methods, has been growing steadily. Since then, alternative methodologies and procedures have been incorporated to better characterize these objects, which possess distinctive physical characteristics and high cultural and monetary value. The examinations were expanded to other non-destructive analytical techniques such as portable XRF (X-ray fluorescence) analysis, X-ray radiography, and visible, UV (ultraviolet) and IR (infrared) light imaging, which are helping to better understand these art objects, particularly paintings, where the techniques help assess the conservation state and reveal underlying drawings that illuminate the creative process of the artist. The external beam arrangement at LAMFI was recently updated for simultaneous PIXE (Particle induced X-ray emission), RBS (Rutherford back scattering), PIGE (Particle induced gamma-ray emission) and IBL (Ion beam luminescence) analysis in open air. The new setup comprises a 2π star-like detector assembly with 7 collimated telescopes: two openings have laser beams for optical alignment of the target, 2 are used for X-ray detectors, 1 for a particle detector, 1 for an optical spectrometer, and 1 for an image. The particle and X-ray detector telescopes can be evacuated to reduce signal losses. The 2 telescopes with the X-ray detectors have absorbers to selectively filter low-energy X-rays, optimizing the PIXE detection limits. The beam exit window is made of an 8 μm aluminum foil, allowing the integrated beam charge to be monitored by measuring the Al gamma rays with a NaI detector. The geometry and materials of the assembly have been carefully designed to shield the X-ray detectors from the X-rays produced at the exit beam window and to reduce the detection of Ar Kα from the in-air beam path.

  10. A New Boron Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Weitman, J; Daaverhoeg, N; Farvolden, S

    1970-07-01

    In connection with fast neutron (n, α) cross section measurements, a novel boron analysis method has been developed. The boron concentration is inferred from the mass-spectrometrically determined number of helium atoms produced in the thermal and epithermal B-10(n, α) reaction. The relation between the amount of helium and the boron concentration is given, including corrections for self-shielding effects and background levels. Direct and diffusion losses of helium are calculated, and losses due to gettering, adsorption and HF ionization in the release stage are discussed. A series of boron determinations is described and the results are compared with those obtained by other methods, showing excellent agreement. The lower limit of boron concentration that can be measured varies with the type of sample; in steel, for example, concentrations below 10^-5 % boron in samples of 0.1-1 gram may be determined.
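
    As a rough illustration of the back-calculation described (not the authors' corrected procedure), the sketch below converts a counted number of helium atoms into a boron mass using the thin-sample relation N_He ≈ N_B10·σ·Φ, natural B-10 abundance and an approximate thermal cross section; self-shielding, epithermal contributions and helium losses are ignored here:

```python
# Hedged sketch: infer boron mass from He atoms produced by B-10(n,alpha),
# ignoring the self-shielding and helium-loss corrections the paper applies.
N_A = 6.022e23      # atoms/mol
M_B = 10.81         # g/mol, natural boron
F_B10 = 0.199       # natural B-10 atom fraction (assumption)
SIGMA = 3840e-24    # cm^2, approximate thermal B-10(n,alpha) cross section

def boron_mass_grams(n_helium: float, fluence: float) -> float:
    """Boron mass from counted He atoms and neutron fluence (n/cm^2)."""
    n_b10 = n_helium / (SIGMA * fluence)   # thin-sample approximation
    n_boron = n_b10 / F_B10
    return n_boron * M_B / N_A

# Example: 1e12 He atoms counted after a fluence of 1e18 n/cm^2
print(f"inferred boron mass ~ {boron_mass_grams(1.0e12, 1.0e18):.2e} g")
```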

  11. Analysis and development of methods for the recovery of tri-n-butylphosphate (TBP)-30%v/v-degraded dodecane

    International Nuclear Information System (INIS)

    Dalston, C.O.

    1984-01-01

    Tri-n-butyl phosphate associated with an inert hydrocarbon is the main solvent used in the reprocessing of irradiated nuclear fuel from pressurized water reactors. The combined action of radiation and nitric acid causes severe damage to the solvent during the reprocessing steps. The recovery of the solvent is thus of great importance, since it decreases the amount of waste and improves the process economy. A comparative analysis of several methods for the recovery of this solvent was carried out: alkaline washing, adsorption on resins, adsorption on aluminium oxide, adsorption on activated carbon and adsorption on vermiculite. Some modifications of the analytical 95Zr test were made, and two new parameters (degradation grade and recovery efficiency) were mathematically defined. Using this modified 95Zr test, the residence time and the degraded solvent to recuperator ratio were determined. After laboratory tests, vermiculite associated with activated carbon was employed for the treatment of 50 liters of tri-n-butyl phosphate (30% V/V)-dodecane degraded by hydrolysis. Other analyses were performed to check the potential of these solids for this solvent recovery. (Author) [pt

  12. The Analysis Methods Of 3-Monochloropropane-1,2-Diol and Glycydyl Esters in Foods, Mitigation Studies, and Current Developments About their Effects on Health

    Directory of Open Access Journals (Sweden)

    Aslı Yıldırım

    2017-12-01

    Chloropropanols are undesired food contaminants liberated during the processing of various food products. When the adverse effects of chloropropanols, especially 3-monochloropropane-1,2-diol (3-MCPD), 2-monochloropropane-1,3-diol (2-MCPD) and glycidol, along with their esters, were first understood, studies on the detection and mitigation of these compounds accelerated. 3-MCPD, which is detected in food products in higher amounts than the other chloropropanols, usually forms during the refining of vegetable oils, especially in the deodorisation step. Analytical methods for 3-MCPD and the other chloropropanols are continuously updated; today there are two basic approaches, direct and indirect methods. Direct methods can detect all of the esters individually, yet, because they require a large number of reference standards, indirect methods are currently preferred. The first essential step in reducing chloropropanols in food products is to choose the proper analysis method. In this review, general information, new developments in analysis methods, mitigation studies and toxicological data on various chloropropanols are summarized.

  13. Hydrophilic interaction liquid chromatography in analysis of granisetron HCl and its related substances. Retention mechanisms and method development.

    Science.gov (United States)

    Maksić, Jelena; Tumpa, Anja; Stajić, Ana; Jovanović, Marko; Rakić, Tijana; Jančić-Stojanović, Biljana

    2016-05-10

    In this paper the separation of granisetron and its two related substances in HILIC mode is presented. Separation was performed on a silica column derivatized with sulfoalkylbetaine groups (ZIC-HILIC). First, retention mechanisms were assessed by following the retention factors of the substances over a wide range of acetonitrile content (80-97%), at a constant aqueous buffer concentration (10 mM) and a constant pH of 3.0. Then, in order to develop an optimal HILIC method, Design of Experiments (DoE) methodology was applied. For optimization, a 3² full factorial design was employed. The influence of acetonitrile content and ammonium acetate concentration was investigated while the pH of the water phase was kept at 3.3. The adequacy of the obtained mathematical models was confirmed by ANOVA. The optimization goals (α > 1.15 and minimal run time) were accomplished with 94.7% acetonitrile in the mobile phase and 70 mM ammonium acetate in the water phase. The optimal point was in the middle of the defined Design Space. In the next phase, robustness was experimentally tested by a Rechtschaffen design. The investigated factors and their levels were: acetonitrile content (±1%), ammonium acetate molarity in the water phase (±2 mM), pH of the water phase (±0.2) and column temperature (±4 °C). The validation scope included selectivity, linearity, accuracy and precision, as well as determination of the limit of detection (LOD) and limit of quantification (LOQ) for the related substances. The validation acceptance criteria were met in all cases. Finally, the proposed method can be used for the estimation of granisetron HCl and its related substances in tablets and parenteral dosage forms, as well as for monitoring degradation under various stress conditions.
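
    The 3² full factorial design mentioned above crosses the two factors at three coded levels each and supports a quadratic response model. The sketch below builds such a design and fits the model to hypothetical responses (all numbers are placeholders, not the published results):

```python
import itertools
import numpy as np

# 3^2 full factorial in coded levels (-1, 0, +1) for acetonitrile content and
# ammonium acetate concentration.
levels = (-1, 0, 1)
design = np.array(list(itertools.product(levels, levels)), dtype=float)

# Hypothetical measured response (e.g. selectivity alpha) at the nine runs
y = np.array([1.10, 1.14, 1.17, 1.13, 1.18, 1.21, 1.15, 1.20, 1.24])

# Quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("model coefficients:", np.round(coeffs, 4))
```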

  14. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity concentrates on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. The goal of the approximate-methods effort is to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions that satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  15. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and therefore stochastic methods should be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
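
    As an illustration of the general approach (not the authors' actual FMECA model), the sketch below propagates uncertain severity, occurrence and detection scores through a risk priority number calculation by Monte Carlo sampling; the triangular distributions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 100_000  # Monte Carlo samples

# Hypothetical uncertain FMECA inputs for one failure mode, modelled as
# triangular distributions (min, mode, max) on the usual 1-10 scales.
severity   = rng.triangular(6, 7, 9, size=N)
occurrence = rng.triangular(2, 4, 6, size=N)
detection  = rng.triangular(3, 5, 8, size=N)

rpn = severity * occurrence * detection
print(f"RPN mean = {rpn.mean():.0f}, 5th-95th percentile = "
      f"{np.percentile(rpn, 5):.0f} - {np.percentile(rpn, 95):.0f}")
```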

  16. COINTOF mass spectrometry: design of a time-of-flight analyzer and development of the analysis method

    International Nuclear Information System (INIS)

    Teyssier, C.

    2012-01-01

    DIAM (Device for the irradiation of molecular clusters) is a newly designed experimental setup to investigate processes resulting from the irradiation of molecular nano-systems by 20-150 keV protons. One of its specificities is the original mass spectrometry technique named COINTOF (Correlated Ion and Neutral Time Of Flight), which consists of correlated measurements of the time of flight of the charged and neutral fragments produced by the dissociation of a single parent molecular ion. A strategy for the treatment and analysis of the detection signals was developed to distinguish two fragments arriving close in time, applied here to dissociation events producing H3O+ and two water molecules. The distribution of the time-of-flight difference between the two neutral fragments is measured, providing an estimate of the kinetic energy release of a few eV. In parallel, a second time-of-flight mass spectrometer was designed. It combines a linear time-of-flight and an orthogonal time-of-flight analyzer and integrates position detectors (delay-line anode). Simulations demonstrate the potential of the new analyzer. Finally, research was carried out at the laboratory R.-J. A. Levesque (Universite de Montreal) on the imaging capabilities of the multi-pixel detectors of the MPX-ATLAS collaboration. (author)

  17. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and it also allows users to design their own calculation methods. It is planned to use machine learning (neural networks) in the software; this will make the calculation of teleroentgenograms easier because the methodological points will be placed automatically.

  18. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    Meric, Nicolas

    2013-01-01

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes, from both the phenomenological and the experimental point of view, in the ATLAS experiment at the LHC, in order to understand the impact of Sudakov logarithms on the electroweak corrections and on the associated production of weak vector bosons and jets at the LHC. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets. This computation was done with the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. The study shows that, for the inclusive jet process, a partial cancellation of the virtual weak corrections (due to weak bosons in loops) by the real electroweak corrections occurs, meaning that the Bloch-Nordsieck violation is reduced for this process. We have then participated in the measurement of the differential cross-sections for these processes in the ATLAS experiment at 7 TeV, in particular in technical aspects of the measurement such as the study of the QCD background to the W+jets process in the muon channel. We then combined the different measurements in this channel to compare their behaviour. This tends to show that several effects give the electroweak corrections their relative importance, since the relative contribution of weak-boson-plus-jets processes to the inclusive jet process increases with the transverse momentum of the jets when electroweak bosons are explicitly required in the final state. This is currently a preliminary study, intended to show that such an approach can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter, which led to the development of a new way of detecting problematic events using well-known theorems from statistics. This new method is able to detect bursts of noise.

  19. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Keeping a company among the top performers in its market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by customers and competition, but also on its ability to protect its own strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, interconnect not only domestic companies but also domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible how to react to others' strategies in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal the interconnections among the factors governing the activity of a company.

  20. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered [ru

  1. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents a new approach to improving the timeliness of total petroleum hydrocarbon (TPH) analysis in soil by gas chromatography - flame ionization detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada; however, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to differences in gas chromatography (GC) conditions and in the other steps involved in the method, as well as to soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for faster quantitative analysis of TPH in contaminated soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the most significant impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a 2.78% difference in analysis time. This work successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied to fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up.
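
    The screening step described above can be illustrated with a two-level fractional factorial: the sketch below constructs a 2^(6-3) design for six factors from three base factors and three generators. The generator choice (D = AB, E = AC, F = BC) is an assumption for illustration, not necessarily the design used in the paper:

```python
import itertools

# 2^(6-3) fractional factorial: base factors A, B, C at +/-1, and generators
# D = AB, E = AC, F = BC (an assumed resolution-III choice for illustration).
runs = []
for a, b, c in itertools.product((-1, 1), repeat=3):
    runs.append({"A": a, "B": b, "C": c, "D": a * b, "E": a * c, "F": b * c})

header = "A B C D E F"
print(header)
for run in runs:
    print(" ".join(f"{run[k]:+d}" for k in header.split()))
```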

  2. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest-α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.

  3. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    International Nuclear Information System (INIS)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting; Feng, Tao; Xu, Xueming; Jin, Zhengyu; Tian, Yaoqi

    2012-01-01

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.
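
    The abstract does not reproduce the deduced formulas; the sketch below shows one plausible way a guest-to-host molar ratio could be obtained from TGA weight-loss steps, assuming the guest release step can be separated from the α-CD decomposition step (the masses used are hypothetical):

```python
# Hedged sketch: stoichiometric ratio (guest : alpha-CD) from TGA weight-loss steps.
M_ALPHA_CD = 972.84   # g/mol, alpha-cyclodextrin

def stoichiometric_ratio(mass_loss_guest_mg: float,
                         mass_cd_mg: float,
                         molar_mass_guest: float) -> float:
    """Moles of released guest per mole of alpha-CD in the complex."""
    n_guest = mass_loss_guest_mg / molar_mass_guest
    n_cd = mass_cd_mg / M_ALPHA_CD
    return n_guest / n_cd

# Example with hypothetical TGA steps for a benzyl alcohol complex (M = 108.14 g/mol)
print(round(stoichiometric_ratio(mass_loss_guest_mg=1.95,
                                 mass_cd_mg=8.80,
                                 molar_mass_guest=108.14), 2))
```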

  4. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Biological samples present a range of complexities, from homogeneous purified proteins to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25-5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is independent of amino acid sequence and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  5. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    Science.gov (United States)

    Khristova, R.; Vanmen, M.

    1986-01-01

    Based on considerations of principle and on experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from tenths of a milligram up to about one milligram, with a standard deviation of 0.04 mg. The specific procedure for the determination of molybdenum in molybdenite concentrates is presented.
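
    The back-complexometric titration reduces to a simple mole balance: molybdenum consumes part of a known EDTA excess and the remainder is titrated with FeCl3. A hypothetical numerical sketch, assuming 1:1 complexation for both metals:

```python
# Hedged sketch of the back-titration arithmetic (illustrative values only).
M_MO = 95.95  # g/mol molybdenum

def mo_mass_mg(v_edta_ml, c_edta, v_fe_ml, c_fe):
    """Mo (mg) = (total EDTA added - EDTA back-titrated by Fe(III)) * M_Mo,
    assuming 1:1 complexation for both Mo and Fe with EDTA."""
    n_edta_total = v_edta_ml / 1000.0 * c_edta   # mol
    n_edta_excess = v_fe_ml / 1000.0 * c_fe      # mol
    return (n_edta_total - n_edta_excess) * M_MO * 1000.0

# Example: 10.00 mL of 0.01 M EDTA added, excess back-titrated with 4.80 mL of 0.01 M FeCl3
print(f"{mo_mass_mg(10.00, 0.01, 4.80, 0.01):.2f} mg Mo")
```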

  6. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    International Nuclear Information System (INIS)

    Khristova, R.; Vanmen, M.

    1976-01-01

    Based on considerations of principle and on experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from tenths of a milligram up to about one milligram, with a standard deviation of 0.04 mg. The specific procedure for the determination of molybdenum in molybdenite concentrates is presented.

  7. Field sampling and data analysis methods for development of ecological land classifications: an application on the Manistee National Forest.

    Science.gov (United States)

    George E. Host; Carl W. Ramm; Eunice A. Padley; Kurt S. Pregitzer; James B. Hart; David T. Cleland

    1992-01-01

    Presents technical documentation for development of an Ecological Classification System for the Manistee National Forest in northwest Lower Michigan, and suggests procedures applicable to other ecological land classification projects. Includes discussion of sampling design, field data collection, data summarization and analyses, development of classification units,...

  8. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  9. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...

  10. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids

    DEFF Research Database (Denmark)

    Lue, Bena-Marie; Guo, Zheng; Xu, X.B.

    2008-01-01

    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with ELSD were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine acetate) was found ... essential for separation of flavonoids from strongly polar RTILs, whereby RTILs were generally visible as two major peaks identified based on an ion-pairing/exchanging hypothesis. C8 and C12 stationary phases were optimal, while mobile phase pH (3-7) had only a minor influence on separation. The method...

  11. Comparative urine analysis by liquid chromatography-mass spectrometry and multivariate statistics : Method development, evaluation, and application to proteinuria

    NARCIS (Netherlands)

    Kemperman, Ramses F. J.; Horvatovich, Peter L.; Hoekman, Berend; Reijmers, Theo H.; Muskiet, Frits A. J.; Bischoff, Rainer

    2007-01-01

    We describe a platform for the comparative profiling of urine using reversed-phase liquid chromatography-mass spectrometry (LC-MS) and multivariate statistical data analysis. Urinary compounds were separated by gradient elution and subsequently detected by electrospray Ion-Trap MS. The lower limit

  12. New design procedure development of future reactor critical power estimation. (1) Practical design-by-analysis method for BWR critical power design correlation

    International Nuclear Information System (INIS)

    Yamamoto, Yasushi; Mitsutake, Toru

    2007-01-01

    For present BWR fuels, full mock-up thermal-hydraulic tests, such as critical power measurements and pressure drop measurements, have been needed. However, full mock-up tests require high costs and a large-scale test facility, and at present there are only a few test facilities in the world that can perform them. Moreover, for future BWRs the bundle size tends to be larger, in order to reduce plant construction costs and minimize the routine check period. For instance, AB1600, an improved ABWR proposed by Toshiba, has a bundle size 1.2 times larger than the conventional BWR fuel size. It is too expensive, and far from realistic, to perform a full mock-up thermal-hydraulic test for such a large fuel bundle, so a new design procedure is required for large-bundle designs, especially for future reactors. Therefore, a new design procedure, the Practical Design-by-Analysis (PDBA) method, has been developed. This procedure consists of partial mock-up tests and numerical analysis; at present, a subchannel analysis method based on a three-fluid two-phase flow model is the only realistic choice. First, the partial mock-up test is performed, for instance with a 1/4 partial mock-up bundle. Then, first-step critical power correlation coefficients are evaluated from the measured data, and input data for the subchannel analysis, such as the spacer effect model coefficient, are also estimated from the data. Next, the radial power effect on the critical power of the full-size bundle is estimated with the subchannel analysis. Finally, the critical power correlation is modified using the subchannel analysis results. In the present study, a critical power correlation for conventional 8x8 BWR fuel was developed with the PDBA method from 4x4 partial mock-up tests and the subchannel analysis code; the accuracy of the estimated critical power was 3.8%. Several themes remain to

  13. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, the trace element concentrations of samples collected monthly at urban and rural sites were determined, and then statistical calculations and factor analysis were carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor and functionally tested for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capability of the analytical processes. (author). 103 refs., 61 tabs., 19 figs
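
    The factor analysis used for emission-source investigation can be illustrated by a principal-component decomposition of an element-by-sample concentration matrix; the data below are random placeholders rather than the measured airborne-particulate results:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical matrix: rows = monthly samples, columns = trace elements
# (e.g. Al, Fe, Zn, Pb, As, Se) determined by INAA.
X = rng.lognormal(mean=1.0, sigma=0.5, size=(24, 6))

# Autoscale (zero mean, unit variance per element), then PCA via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained by first three factors:", np.round(explained[:3], 3))
print("loadings of factor 1 on the six elements:", np.round(Vt[0], 2))
```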

  14. Analysis of human serum and whole blood for mineral content by ICP-MS and ICP-OES: development of a mineralomics method.

    Science.gov (United States)

    Harrington, James M; Young, Daniel J; Essader, Amal S; Sumner, Susan J; Levine, Keith E

    2014-07-01

    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This paper describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision for the mineral content of the blood (250 μL) and serum (250 μL) samples was measured for eight essential minerals--sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se)--by plasma spectrometric methods and ranged from 0.635 to 10.1% relative standard deviation (RSD) for serum and 0.348-5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples provided good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals and potentially of other minerals.
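
    The intraday and interday precision figures quoted above are relative standard deviations over replicate measurements. A minimal sketch of that calculation on hypothetical replicate results for a single mineral (here interday precision is computed simply on the pooled replicates):

```python
import numpy as np

# Hypothetical replicate serum calcium results (mg/L), three replicates per day
day1 = np.array([98.2, 99.1, 97.8])
day2 = np.array([100.4, 99.6, 101.0])
day3 = np.array([98.9, 99.8, 98.5])

def rsd(x):
    """Relative standard deviation in percent."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

intraday = [rsd(d) for d in (day1, day2, day3)]
interday = rsd(np.concatenate([day1, day2, day3]))

print("intraday RSD (%):", [round(v, 2) for v in intraday])
print("interday RSD (%):", round(interday, 2))
```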

  15. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  16. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    A practical guide to the methods in general use for the complete analysis of silicate rock material and for the determination of all those elements present in major, minor or trace amounts in silicate...

  17. Development and validation of AccuTOF-DART™ as a screening method for analysis of bank security device and pepper spray components.

    Science.gov (United States)

    Pfaff, Allison M; Steiner, Robert R

    2011-03-20

    Analysis of bank security devices, containing 1-methylaminoanthraquinone (MAAQ) and o-chlorobenzylidenemalononitrile (CS), and pepper sprays, containing capsaicin, is a lengthy process with no specific screening technique to aid in identifying samples of interest. Direct Analysis in Real Time (DART™) ionization coupled with an Accurate Time of Flight (AccuTOF) mass detector is a fast, ambient ionization source that could significantly reduce time spent on these cases and increase the specificity of the screening process. A new method for screening clothing for bank dye and pepper spray, using AccuTOF-DART™ analysis, has been developed. Detection of MAAQ, CS, and capsaicin was achieved via extraction of each compound onto cardstock paper, which was then sampled in the AccuTOF-DART™. All results were verified using gas chromatography coupled with electron impact mass spectrometry.

  18. Scenario development methods and practice

    International Nuclear Information System (INIS)

    2001-01-01

    The safe management of radioactive waste is an essential aspect of all nuclear power programmes. Although a general consensus has been reached in OECD countries on the use of geological repositories for the disposal of high-level radioactive waste, analysis of the long-term safety of these repositories, using performance assessment and other tools, is required prior to implementation. The initial stage in developing a repository safety assessment is the identification of all factors that may be relevant to the long-term safety of the repository and their combination to form scenarios. This must be done in a systematic and transparent way in order to assure the regulatory authorities that nothing important has been forgotten. Scenario development has become the general term used to describe the collection and organisation of the scientific and technical information necessary to assess the long-term performance or safety of radioactive waste disposal systems. This includes the identification of the relevant features, events and processes (FEPs), the synthesis of broad models of scientific understanding, and the selection of cases to be calculated. Scenario development provides the overall framework in which the cases and their calculated consequences can be discussed, including biases or shortcomings due to omissions or lack of knowledge. The NEA Workshop on Scenario Development was organised in Madrid, in May 1999, with the objective of reviewing developments in scenario methodologies and applications in safety assessments since 1992. The outcome of this workshop is the subject of this book. It is a review of developments in scenario methodologies based on a large body of practical experience in safety assessments. It will be of interest to radioactive waste management experts as well as to other specialists involved in the development of scenario methodologies. (author)

  19. [SWOT analysis: the analytical method in the process of planning and its application in the development of orthopaedic hospital department].

    Science.gov (United States)

    Terzić, Zorica; Vukasinović, Zoran; Bjegović-Mikanović, Vesna; Jovanović, Vesna; Janicić, Radmila

    2010-01-01

    SWOT analysis is a managerial tool used to evaluate the internal and external environment through strengths, weaknesses, opportunities and threats. The aim was to demonstrate the application of SWOT analysis on the example of the Department for Paediatric Orthopaedics and Traumatology at the Institute of Orthopaedic Surgery "Banjica" in Belgrade. Qualitative research was conducted during December 2008 at the Department for Paediatric Orthopaedics and Traumatology of the Institute of Orthopaedic Surgery "Banjica" by applying the focus group technique, with members of the medical staff and patients as participants. In the first phase of the focus group, brainstorming was applied to collect the factors of the internal and external environment and to identify strengths and weaknesses, opportunities and threats, respectively. In the second phase, the nominal group technique was applied in order to reduce the list of factors. The factors were assessed according to their influence on the Department and ranked on a three-point Likert scale from 3 (highest impact) to 1 (lowest impact). The most important strengths of the Department are competent and skilled staff, high quality of services, average hospital bed utilization, the Department being the educational base of the School of Medicine, satisfied patients, a pleasant setting, and additional working hours. The weaknesses are poor spatial organization, personnel unmotivated to refresh their knowledge, lack of specifically trained personnel, inadequate sanitary facilities, services not covered by the Insurance Fund, long average hospital stay, and the low economic status of patients. The opportunities are legislative regulations, the paediatric traumatology service formed at the City level, the good regional position of the Institute, and extension of referral areas. The threats are the absence of Department autonomy in the personnel policy of the Institute, competition within the Institute, and the impossibility of increasing the Department

  20. Development of a gas-liquid chromatographic method for the analysis of fatty acid tryptamides in cocoa products.

    Science.gov (United States)

    Hug, Bernadette; Golay, Pierre-Alain; Giuffrida, Francesca; Dionisi, Fabiola; Destaillats, Frédéric

    2006-05-03

    The determination of the occurrence and level of cocoa shells in cocoa products and chocolate is an important analytical issue. The recent European Union directive on cocoa and chocolate products (2000/36/EC) has not retained the former limit of a maximum amount of 5% of cocoa shells in cocoa nibs (based on fat-free dry matter), previously authorized for the elaboration of cocoa products such as cocoa mass. In the present study, we report a reliable gas-liquid chromatography procedure suitable for the determination of the occurrence of cocoa shells in cocoa products by detection of fatty acid tryptamides (FATs). The precision of the method was evaluated by analyzing nine different samples (cocoa liquors with different ranges of shells) six times (replicate repeatability). The variations of the robust coefficient of variation of the repeatability demonstrated that FAT(C22), FAT(C24), and total FATs are good markers for the detection of shells in cocoa products. The trueness of the method was evaluated by determining the FAT content in two spiked matrices (cocoa liquors and cocoa shells) at different levels (from 1 to 50 mg/100 g). A good relation was found between the results obtained and the spiking (recovery varied between 90 and 130%), and the linearity range was established between 1 and 50 mg/100 g in cocoa products. For total FAT contents of cocoa liquor containing 5% shells, the measurement uncertainty allows us to conclude that FAT is equal to 4.01 +/- 0.8 mg/100 g. This validated method is perfectly suitable to determine shell contents in cocoa products using FAT(C22), FAT(C24), and total FATs as markers. The results also confirmed that cocoa shells contain FAT(C24) and FAT(C22) in a constant ratio of nearly 2:1.

  1. Development of a low cost method to estimate the seismic signature of a geothermal field from ambient noise analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Tibuleac, Ileana [Univ. of Nevada, Reno, NV (United States)

    2016-06-30

    A new, cost effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. The material included in this report demonstrates that, with the advantage of initial S-velocity models estimated from ambient noise surface waves, the seismic reflection survey, although with lower resolution, reproduces the results of the active survey when the ambient seismic noise is not contaminated by strong cultural noise. Ambient noise resolution is less at depth (below 1000m) compared to the active survey. In general, the results are promising and useful information can be recovered from ambient seismic noise, including dipping features and fault locations.

  2. Analysis of water-soluble polysaccharides in an edible medicinal plant Epimedium: method development, validation, and application.

    Science.gov (United States)

    Zhang, Hua-Feng; Niu, Li-Li; Yang, Xiao-Hua; Li, Lu

    2014-01-01

    Water-soluble polysaccharides are important constituents of Epimedium with evident health benefits. The aim of this study was to establish a specific, accurate, reproducible, and sensitive phenol-sulfuric acid method for the quantitative assay of Epimedium polysaccharides and to determine polysaccharides in Epimedium samples from Chinese markets. Galactose was adopted as the standard monosaccharide, and 486 nm was chosen as the detection wavelength. The optimal conditions for the color reaction were obtained using single-factor experiments and an orthogonal test: temperature, 20 °C; amount of 5% phenol, 0.3 mL; amount of concentrated sulfuric acid, 3.5 mL; incubation time, 20 min; and addition sequence, phenol-sample-sulfuric acid. The colored sample solution after the chromogenic reaction exhibited high stability within 2 h. The calibration curve was linear within the range 5.00-60.00 μg/mL, and the correlation coefficient of the regression equation was 0.999. LOD and LOQ were 1.65 and 5.00 μg/mL, respectively. Recovery, intraday precision, interday precision, and accuracy were 97.43 to 103.80%, 0.73 to 3.48%, 1.21 to 2.75%, and 97.74 to 101.62%, respectively. Polysaccharides in 26 samples of Epimedium collected from different provinces of China were quantified by the proposed colorimetric method, and a large variation in polysaccharide content was observed among these samples.

  3. A Case Study Analysis of Clt Methods to Develop Grammar Competency for Academic Writing Purposes at Tertiary Level

    Directory of Open Access Journals (Sweden)

    Almodad Biduk Asmani

    2013-10-01

    The purpose of this research project is to find out how effectively grammar teaching and learning using the Principled CLT method can improve the ability of freshman Binus University students to understand and use grammar for academic writing purposes. The project is expected to result in a computer-animated format which can be used as one of the main tools in teaching and learning grammar at the tertiary level. The research applies a descriptive quantitative approach and thus uses numeric data. It involves two subject groups, experimental and control, which are pre-tested to establish their level of grammar competency through their academic writing. The experimental group receives grammar instruction using the Principled CLT approach, while the control group receives the standard CLT approach. The two groups then take a post-test and the results are compared. The statistics show no significant difference between the results of the two methods; consequently, each method has its own strengths and weaknesses, and whichever is implemented must be linked to the specific goals and purposes it entails.

  4. Development of computational dynamic properties analysis method and optimum design method of tissue construction under consideration of microstructure of materials for nuclear power

    International Nuclear Information System (INIS)

    Shiraishi, Haruki; Tabuchi, Masaaki; Nakasone, Yuji

    1999-01-01

    In a practical reactor core material, irradiation produces second-phase particles in the matrix. A second-phase particle lattice was defined and the effects of its basic parameters on the stress-strain curve were evaluated. The basic parameters used were a particle diameter of 0.01 to 0.95 μm, a grain spacing of 1.0 μm and a work-hardening exponent of 0.3. The effect of the particle diameter on the stress-strain curve was studied by the large-deformation finite element method. Strain concentration developed behind the second-phase particles, and this process determined the shape of the stress-strain curve. The calculation method did not assume breaking of the particles or separation of the particle-matrix interface, so the values obtained represent upper limits on the strength, ductility and fracture toughness of the composite materials. (S.Y.)

  5. Seismic design and analysis methods

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1993-01-01

    Seismic load is, in many areas of the world, the most important loading situation from the point of view of structural strength. Taking this into account, it is understandable that there has been a strong allocation of resources to seismic analysis during the past ten years. This study has three focal areas: (1) random vibrations; (2) soil-structure interaction; and (3) methods for determining the structural response. The solution of random vibration problems is clarified with the aid of applications, and for the mathematical treatment and formulations it is deemed sufficient to give the relevant sources. In the soil-structure interaction analysis the focus has been the significance of frequency-dependent impedance functions. It was found that describing the soil with frequency-dependent impedance functions decreases the structural response, and this is thus always the preferred approach compared to more conservative analysis types. Of the methods for determining the structural response, the following four were tested: (1) the time history method; (2) the complex frequency-response method; (3) the response spectrum method; and (4) the equivalent static force method. The time history method appeared to be the most accurate, and the complex frequency-response method had the widest area of application. (orig.). (14 refs., 35 figs.)

  6. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The boiling transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of boiling water reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without using the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of a drift-flux model, a film flow model, a cross-flow model, a thermal conductivity model and heat transfer correlations. These models were validated systematically against experimental data, and the accuracy of the predictions of the steady-state critical heat flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT was evaluated in the validations. Calculations of single-tube and bundle experiments were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the fuel rod surface temperature gradient agreed well with the experimental results, but rewetting was predicted late, so the modeling of post-BT heat transfer phenomena is being modified. (author)

  7. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Desai, Meera Jay [Iowa State Univ., Ames, IA (United States)

    2004-01-01

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of the pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of the reversed-phase and polar organic modes for chiral LC-API/MS method development. Reversed-phase methods containing a high proportion of water were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity, by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane (ENFB), a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS relative to heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI gave better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  8. Development of a volumetric analysis method to determine uranium in the loaded phosphoric acid and the loaded organic phase (DEHPA/TOPO)

    International Nuclear Information System (INIS)

    Shlewit, H.; Koudsi, Y.

    2003-01-01

    A rapid and reliable volumetric analysis method has been developed to determine uranium on line, in both the aqueous and organic phases, at a unit extracting uranium from wet-process phosphoric acid. The procedure enables up to 300 mg of uranium to be determined in the presence of nitric acid in a sample volume of up to 10 ml. The volume of the sample, the amounts of reagents added, the temperature of the reagents and the standing times of the various stages were investigated to ensure that the conditions selected for the final procedure were reasonably non-critical

  9. Developments of an Interactive Sail Design Method

    OpenAIRE

    S. M. Malpede; M. Vezza

    2000-01-01

    This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape, using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB(r). Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure dis...

  10. Developed generalised unified power flow controller model in the Newton–Raphson power-flow analysis using combined mismatches method

    DEFF Research Database (Denmark)

    Kamel, Salah; Jurado, Francisco; Chen, Zhe

    2016-01-01

    values are calculated during the iterative process based on the desired controlled values and the bus voltages at the terminals of the GUPFC. The parameters of the GUPFC can be calculated during the iterative process and the final values are updated after load flow convergence. Using the developed GUPFC model..., the original structure and symmetry of the admittance and Jacobian matrices can still be kept, and changes to the Jacobian matrix are eliminated. Consequently, the complexity of the computer load flow program code with GUPFC is reduced. The HPCIM load flow code with the proposed model is written in the C++ programming language, where the SuperLU library is utilised to handle the sparse Jacobian matrix. The proposed model has been validated using the standard IEEE test systems....

  11. Remote sensing analysis of depositional landforms in alluvial settings: Method development and application to the Taquari megafan, Pantanal (Brazil)

    Science.gov (United States)

    Zani, Hiran; Assine, Mario Luis; McGlue, Michael Matthew

    2012-08-01

    Traditional Shuttle Radar Topography Mission (SRTM) topographic datasets hold limited value in the geomorphic analysis of low-relief terrains. To address this shortcoming, this paper presents a series of techniques designed to enhance digital elevation models (DEMs) of environments dominated by low-amplitude landforms, such as a fluvial megafan system. These techniques were validated through the study of a wide depositional tract composed of several megafans located within the Brazilian Pantanal. The Taquari megafan is the most remarkable of these features, covering an area of approximately 49,000 km2. To enhance the SRTM-DEM, the megafan global topography was calculated and found to be accurately represented by a second order polynomial. Simple subtraction of the global topography from altitude produced a new DEM product, which greatly enhanced low amplitude landforms within the Taquari megafan. A field campaign and optical satellite images were used to ground-truth features on the enhanced DEM, which consisted of both depositional (constructional) and erosional features. The results demonstrate that depositional lobes are the dominant landforms on the megafan. A model linking baselevel change, avulsion, clastic sedimentation, and erosion is proposed to explain the microtopographic features on the Taquari megafan surface. The study confirms the potential promise of enhanced DEMs for geomorphological research in alluvial settings.
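    The detrending step described above — fitting a second-order polynomial to the megafan's global topography and subtracting it from the SRTM elevations to reveal low-amplitude landforms — can be sketched generically as follows. The synthetic grid, coefficients and amplitudes are invented for illustration and are not the authors' processing chain.

import numpy as np

def detrend_quadratic(x, y, z):
    """Fit z ~ second-order polynomial in (x, y) by least squares and return the residual DEM."""
    # Design matrix for a full quadratic surface: 1, x, y, x^2, x*y, y^2
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    trend = A @ coeffs
    return z - trend, coeffs

# Synthetic example: a gently sloping fan-like surface plus metre-scale depositional relief.
nx, ny = 200, 200
xx, yy = np.meshgrid(np.linspace(0, 100e3, nx), np.linspace(0, 100e3, ny))
global_topo = 190 - 1.2e-3 * xx - 2.0e-9 * xx**2        # smooth "global" fan surface (m)
lobes = 1.5 * np.sin(xx / 7e3) * np.cos(yy / 9e3)        # low-amplitude lobes (m)
dem = global_topo + lobes

residual, coeffs = detrend_quadratic(xx.ravel(), yy.ravel(), dem.ravel())
print("fitted polynomial coefficients:", np.round(coeffs, 8))
print("residual relief range [m]: %.2f to %.2f" % (residual.min(), residual.max()))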

  12. Method development for the analysis of organophosphate and pyrethroid insecticides at low parts per trillion levels in water.

    Science.gov (United States)

    Wang, Dongli; Weston, Donald P; Lydy, Michael J

    2009-06-15

    In the current study, organophosphate and pyrethroid insecticides including diazinon, chlorpyrifos, bifenthrin, fenpropathrin, permethrin, lambda-cyhalothrin, cyfluthrin, cypermethrin, esfenvalerate and deltamethrin were analyzed in laboratory and field-collected water samples. Water samples were extracted and analyzed by gas chromatography/electron capture detector (GC/ECD) and gas chromatography/nitrogen-phosphorus detector (GC/NPD). Comparison of results from liquid-liquid extraction with subsequent normal-phase solid-phase extraction cleanup (LLE-NPSPE) and from reversed-phase solid-phase extraction (RPSPE) showed that LLE-NPSPE was the better choice for extracting trace amounts of pesticides from water. Pesticide recoveries from four spiked water samples using LLE-NPSPE ranged from 63.2 to 148.8% at four spiking concentrations. Method detection limits were 0.72-1.69 ng/L using four different water sources. The stability of the target pesticides in lake water was investigated at 4 degrees C for 1 h, 1 d, 4 d, and 7 d under three conditions: (1) water samples only; (2) with 20 mL hexane added as a keeper solvent; and (3) with acidification to pH 2 with HCl. The results showed that water stored without treatment underwent slow degradation of some pesticides with storage time; acidification led to significant degradation and loss of diazinon and chlorpyrifos; and storage with hexane as a keeper solvent gave good stability for all of the target pesticides over the 7 d storage period.
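    The record does not state how the 0.72-1.69 ng/L method detection limits were computed; a common single-laboratory convention (the US EPA replicate-spike procedure) multiplies the standard deviation of low-level spike replicates by the one-sided Student's t value at 99% confidence. A minimal sketch with invented replicate results:

import numpy as np
from scipy import stats

def mdl_from_replicates(replicates, confidence=0.99):
    """Method detection limit: one-sided Student's t (n-1 df) times the replicate standard deviation."""
    reps = np.asarray(replicates, dtype=float)
    t_value = stats.t.ppf(confidence, df=len(reps) - 1)
    return t_value * reps.std(ddof=1)

# Hypothetical recoveries (ng/L) of a pyrethroid spiked at a low level into seven water aliquots.
bifenthrin_reps = [2.1, 1.8, 2.4, 2.0, 1.7, 2.3, 1.9]
print("MDL (ng/L): %.2f" % mdl_from_replicates(bifenthrin_reps))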

  13. Analysis of the mortality development of the population in the surroundings of Bohunice NPP using Fuzzy logic methods

    International Nuclear Information System (INIS)

    Letkovicova, M.; Durov, M.; Stehlikova, B.

    2001-01-01

    We study the vicinity of the Bohunice NPP, a circular area with a radius of 30 km, which represents approximately 2,800 km2. Surveillance of this area is required by the safety report of the Bohunice NPP. For the calculations we used the complete databases of the Register of Deaths, the Register of Municipalities and the Register of the Age Structure of the Inhabitants of the Slovak Republic from 1993 to 1999, compiled by the Statistical Office of the Slovak Republic. We work with databases that do not contain personal identification. We follow the evolution of mortality using the mortality indicators defined by the WHO. According to the literature and to our own experience, a sum of at least three years is necessary for the calculation of stable demographic and epidemiological parameters; we therefore work with the method of short time series. The basic observed unit, represented by one value of an indicator, is one municipality. All our analyses are calculated from three-year sums of all indicators, so we work with man-years. The present report is an adjusted extract from the Complex Report on the Situation of the Environment and the Health of the Inhabitants in the Vicinity of the Bohunice NPP in 1999, which was prepared by our society in March 2001. (authors)

  14. Prediction of fat-free mass by bioelectrical impedance analysis in older adults from developing countries: a cross-validation study using the deuterium dilution method

    International Nuclear Information System (INIS)

    Mateo, H. Aleman; Romero, J. Esparza; Valencia, M.E.

    2010-01-01

    Objective: Several limitations of published bioelectrical impedance analysis (BIA) equations have been reported. The aims were to develop in a multiethnic, elderly population a new prediction equation and cross-validate it along with some published BIA equations for estimating fat-free mass, using deuterium oxide dilution as the reference method. Design and setting: Cross-sectional study of elderly from five developing countries. Methods: Total body water (TBW) measured by deuterium dilution was used to determine fat-free mass (FFM) in 383 subjects. Anthropometric and BIA variables were also measured. Only 377 subjects were included in the analysis, randomly divided into development and cross-validation groups after stratification by gender. Stepwise model selection was used to generate the model and Bland-Altman analysis was used to test agreement. Results: FFM = 2.95 - 3.89 (Gender) + 0.514 (Ht2/Z) + 0.090 (Waist) + 0.156 (Body weight). The model fit parameters were an R2, total F-Ratio, and SEE of 0.88, 314.3, and 3.3, respectively. None of the published BIA equations met the criteria for agreement. The new BIA equation underestimated FFM by just 0.3 kg in the cross-validation sample. The means of FFM by TBW and by the new BIA equation were not significantly different; 95% of the differences were between the limits of agreement of -6.3 to 6.9 kg of FFM. There was no significant association between the differences and their averages (r = 0.008 and p = 0.2). Conclusions: This new BIA equation offers a valid option compared with some of the current published BIA equations to estimate FFM in elderly subjects from five developing countries. (Authors)
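    The reported prediction equation can be applied directly once the BIA and anthropometric inputs are available. The sketch below assumes height in cm, impedance Z in ohm, waist circumference in cm and body weight in kg, with a 0/1 gender indicator; the exact units and gender coding used by the authors are not given in this record, so they are assumptions here.

def predict_ffm(gender_code, height_cm, impedance_ohm, waist_cm, weight_kg):
    """Fat-free mass (kg) from the BIA equation reported in the record.

    gender_code: assumed 0/1 indicator (the coding convention used by the authors is not stated here).
    """
    resistance_index = height_cm**2 / impedance_ohm   # the Ht2/Z term
    return (2.95
            - 3.89 * gender_code
            + 0.514 * resistance_index
            + 0.090 * waist_cm
            + 0.156 * weight_kg)

# Illustrative input values only.
print("FFM estimate (kg): %.1f" % predict_ffm(1, 160.0, 550.0, 92.0, 68.0))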

  15. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  16. Evaluating public involvement in research design and grant development: Using a qualitative document analysis method to analyse an award scheme for researchers.

    Science.gov (United States)

    Baxter, Susan; Muir, Delia; Brereton, Louise; Allmark, Christine; Barber, Rosemary; Harris, Lydia; Hodges, Brian; Khan, Samaira; Baird, Wendy

    2016-01-01

    money was used, including a description of the aims and outcomes of the public involvement activities. The purpose of this study was to analyse the content of these reports. We aimed to find out what researcher views and experiences of public involvement activities were, and what lessons might be learned. Methods We used an innovative method of data analysis, drawing on group participatory approaches, qualitative content analysis, and Framework Analysis to sort and label the content of the reports. We developed a framework of categories and sub-categories (or themes and sub-themes) from this process. Results Twenty-five documents were analysed. Four main themes were identified in the data: the added value of public involvement; planning and designing involvement; the role of public members; and valuing public member contributions. Within these themes, sub-themes related to the timing of involvement (prior to the research study/intended during the research study), and also specific benefits of public involvement such as: validating ideas; ensuring appropriate outcomes; ensuring the acceptability of data collection methods/tools and advice regarding research processes. Other sub-themes related to: finding and approaching public members; timing of events; training/support; the format of sessions; setting up public involvement panels; use of public contributors in analysis and interpretation of data; and using public members to assist with dissemination and translation into practice. Conclusions The analysis of reports submitted by researchers following involvement events provides evidence of the value of public involvement during the development of applications for research funding, and details a method for involving members of the public in data analysis which could be of value to other researchers. The findings of the analysis indicate recognition amongst researchers of the variety in potential roles for public members in research, and also an acknowledgement of how

  17. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis.

    Science.gov (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet

    2016-02-01

    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over or under controlled eating behaviours in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    Mattsson, Soeren; Thomas, Brian J

    2006-01-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  19. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  20. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for the quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is the extension of a previously proposed univariate method so that the 24 PAS currently considered as allergens can be determined. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method performance. PARAFAC2 was shown to model the data adequately in the face of different instrumental and chemical issues, such as co-eluting profiles, overlapping spectra, unknown interfering compounds, retention time shifts and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R2 ≥ 0.94), as well as meaningful chromatographic and spectral profiles (r ≥ 0.97). Moreover, low prediction errors for external validation standards (below 15% in most cases) as well as acceptable quantification errors in real spiked samples (recoveries from 82 to 119%) confirmed the suitability of PARAFAC2 for the resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    Jaklevic, J.M.

    1976-01-01

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  2. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-05

    A platform analytical quality by design (AQbD) approach to method development is presented in this paper. The approach is not limited to developing a single method following the logical AQbD process; it is also exploited across a range of applications with commonality in equipment and procedures. As demonstrated by the development of three methods, the systematic strategy offers a thorough understanding of the scientific strength of each method. The knowledge gained from the UHPLC-UV peptide mapping method can be transferred readily to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method for the same protein. In addition, the platform AQbD strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes, while in the commercial phase high-quality data ensure timely data release, reduced regulatory risk and lower laboratory operating costs. Moreover, the large, reliable database and the knowledge gained during AQbD method development provide strong justification during regulatory filing for the selection of important parameters or for parameter changes needed in method validation, and help to justify the removal of unnecessary tests from product specifications. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. The death of the Job plot, transparency, open science and online tools, uncertainty estimation methods and other developments in supramolecular chemistry data analysis.

    Science.gov (United States)

    Brynn Hibbert, D; Thordarson, Pall

    2016-10-25

    Data analysis is central to understanding phenomena in host-guest chemistry. We describe here recent developments in this field, starting with the revelation that the popular Job plot method is inappropriate for most problems in host-guest chemistry and that the focus should instead be on systematically fitting data and testing all reasonable binding models. We then discuss approaches for estimating uncertainties in binding studies, using case studies and simulations to highlight key issues. Related to this is the need for ready access to data and for transparency in the methodology or software used, and we demonstrate an example of a web portal that aims to address this issue. We conclude with a list of best-practice protocols for data analysis in supramolecular chemistry that could easily be translated to other related problems in chemistry, including measuring rate constants or drug IC50 values.
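    As a concrete illustration of the "fit the binding model, don't use a Job plot" advice, the sketch below fits the exact 1:1 host-guest isotherm to a synthetic NMR-style titration with scipy; the host concentration, binding constant and noise level are invented for the example, and the snippet is not the authors' software.

import numpy as np
from scipy.optimize import curve_fit

H0 = 1.0e-3  # total host concentration (M), held constant during the titration

def shift_1to1(G0, Ka, dmax):
    """Observed shift for a 1:1 host-guest complex (exact quadratic solution for [HG])."""
    b = H0 + G0 + 1.0 / Ka
    HG = 0.5 * (b - np.sqrt(b**2 - 4.0 * H0 * G0))
    return dmax * HG / H0

# Synthetic titration: 0-5 guest equivalents, "true" Ka = 800 M^-1, dmax = 0.35 ppm, plus noise.
rng = np.random.default_rng(1)
G0 = np.linspace(0.0, 5.0, 12) * H0
obs = shift_1to1(G0, 800.0, 0.35) + rng.normal(0.0, 0.003, G0.size)

popt, pcov = curve_fit(shift_1to1, G0, obs, p0=[100.0, 0.1],
                       bounds=([1.0, 1e-3], [1e7, 10.0]))
perr = np.sqrt(np.diag(pcov))
print("Ka   = %.0f +/- %.0f M^-1" % (popt[0], perr[0]))
print("dmax = %.3f +/- %.3f ppm" % (popt[1], perr[1]))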

  4. Method development for the analysis of 1,4-dioxane in drinking water using solid-phase extraction and gas chromatography-mass spectrometry.

    Science.gov (United States)

    Grimmett, Paul E; Munch, Jean W

    2009-01-01

    1,4-Dioxane has been identified as a probable human carcinogen and an emerging contaminant in drinking water. The United States Environmental Protection Agency's (U.S. EPA) National Exposure Research Laboratory (NERL) has developed a method for the analysis of 1,4-dioxane in drinking water at ng/L concentrations. The method consists of an activated carbon solid-phase extraction of 500-mL or 100-mL water samples using dichloromethane as the elution solvent. The extracts are analyzed by gas chromatography-mass spectrometry (GC-MS) in selected ion monitoring (SIM) mode. In the NERL laboratory, recovery of 1,4-dioxane ranged from 94-110% in fortified laboratory reagent water and recoveries of 96-102% were demonstrated for fortified drinking water samples. The relative standard deviations for replicate analyses were less than 6% at concentrations exceeding the minimum reporting level.

  5. Development, validation and comparison of two stability-indicating RP-LC methods using charged aerosol and UV detectors for analysis of lisdexamfetamine dimesylate in capsules

    Directory of Open Access Journals (Sweden)

    Graciela Carlos

    2016-11-01

    Full Text Available Two new stability-indicating liquid chromatographic methods using two detectors, an ultraviolet (UV) detector and a charged aerosol detector (CAD) connected simultaneously in series, were validated for the assessment of lisdexamfetamine dimesylate (LDX) in capsules. The method was optimized and the influence of individual parameters on UV and CAD response and sensitivity was studied. Chromatography was performed on a Zorbax CN column (250 mm × 4.6 mm, 5 μm) in isocratic elution mode, using acetonitrile and 20 mM ammonium formate at pH 4.0 (50:50, v/v) as the mobile phase and UV detection at 207 nm. The developed method was validated according to ICH guidelines, and the parameters specificity, limit of detection, limit of quantitation, linearity, accuracy, precision and robustness were evaluated. CAD is considered a non-linear detector over a wide dynamic range; however, the method was linear over the concentration range of 70–130 μg mL−1 with both detectors. The method was precise and accurate. The robustness study was performed with a Plackett–Burman design, delivering results within the acceptable range. Neither the excipients nor the degradation products interfered with the method in the specificity studies or under stress conditions. The results of the LC-UV and LC-CAD methods were statistically compared through ANOVA and showed no significant difference (p > 0.05). Both proposed methods could be considered interchangeable and stability-indicating, and can be applied as an appropriate quality control tool for routine analysis of LDX in capsules.
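    The statistical comparison mentioned at the end of the abstract can be reproduced in outline with a one-way ANOVA on assay results obtained with the two detectors; the replicate values below are invented for illustration.

from scipy import stats

# Hypothetical assay results (% of label claim) for the same capsule batch.
lc_uv  = [99.1, 100.4, 98.7, 99.8, 100.1, 99.5]
lc_cad = [99.6, 100.0, 99.2, 100.5, 99.3, 99.9]

f_stat, p_value = stats.f_oneway(lc_uv, lc_cad)
print("F = %.3f, p = %.3f" % (f_stat, p_value))
print("No significant difference at the 5% level" if p_value > 0.05
      else "Methods differ significantly")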

  6. Microlocal methods in the analysis of the boundary element method

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1993-01-01

    The application of the boundary element method in numerical analysis is based upon the use of boundary integral operators stemming from multiple layer potentials. The regularity properties of these operators are vital in the development of boundary integral equations and error estimates. We show...

  7. Method development for speciation analysis of nanoparticle and ionic forms of gold in biological samples by high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry

    Science.gov (United States)

    Malejko, Julita; Świerżewska, Natalia; Bajguz, Andrzej; Godlewska-Żyłkiewicz, Beata

    2018-04-01

    A new method based on coupling high performance liquid chromatography (HPLC) to inductively coupled plasma mass spectrometry (ICP MS) has been developed for the speciation analysis of gold nanoparticles (AuNPs) and dissolved gold species (Au(III)) in biological samples. The column type and the composition and flow rate of the mobile phase were carefully investigated in order to optimize the separation conditions. The usefulness of two polymeric reversed-phase columns (PLRP-S with 100 nm and 400 nm pore size) for separating gold species was investigated for the first time. Under the optimal conditions (PLRP-S400 column, 10 mmol L-1 SDS and 5% methanol as the mobile phase, 0.5 mL min-1 flow rate), detection limits of 2.2 ng L-1 for Au(III), 2.8 ng L-1 for 10 nm AuNPs and 3.7 ng L-1 for 40 nm AuNPs were achieved. The accuracy of the method was verified by analysis of the reference material RM 8011 (NIST), gold nanoparticles with a nominal diameter of 10 nm. The HPLC-ICP MS method has been successfully applied to the detection and size characterization of gold species in lysates of the green alga Acutodesmus obliquus, a typical representative of the phytoplankton flora, incubated with 10 nm AuNPs or Au(III).
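    The record does not say how the ng/L detection limits were derived; one common convention estimates the LOD as three times the standard deviation about the calibration line divided by its slope. A generic sketch with invented calibration data:

import numpy as np

# Hypothetical HPLC-ICP MS calibration for Au(III): concentration (ng/L) vs peak area (counts).
conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
area = np.array([120.0, 1580.0, 3090.0, 7640.0, 15260.0, 30410.0])

slope, intercept = np.polyfit(conc, area, 1)
residual_sd = np.std(area - (slope * conc + intercept), ddof=2)  # sd about the regression line

lod = 3.0 * residual_sd / slope    # 3-sigma convention
loq = 10.0 * residual_sd / slope   # 10-sigma convention
print("slope = %.1f counts per ng/L" % slope)
print("LOD = %.2f ng/L, LOQ = %.2f ng/L" % (lod, loq))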

  8. Development of plant dynamic analysis code for integrated self-pressurized water reactor (ISPDYN), and comparative study of pressure control methods

    International Nuclear Information System (INIS)

    Kusunoki, Tsuyoshi; Yokomura, Takeyoshi; Nabeshima, Kunihiko; Shimazaki, Junya; Shinohara, Yoshikuni.

    1988-01-01

    This report describes the development of a plant dynamics analysis code (ISPDYN) for an integrated self-pressurized water reactor and a comparative study of pressure control methods carried out with this code. ISPDYN was developed for the integrated self-pressurized water reactor, one of the trial designs by JAERI. For transient responses, the results calculated by ISPDYN are in good agreement with the DRUCK calculations. In addition, this report presents some sensitivity studies for selected cases. The computing time of this code is very short, about one fifth of real time. The comparative study of the self-pressurized system with a forced-pressurized system using this code, for rapid load decrease and increase cases, has provided useful information. (author)

  9. Development of the Method of Bacterial Leaching of Metals out of Low-Grade Ores, Rocks, and Industrial Wastes Using Neutron Activation Analysis

    CERN Document Server

    Tsertsvadze, L A; Petriashvili, Sh G; Chutkerashvili, D G; Kirkesali, E I; Frontasyeva, M V; Pavlov, S S; Gundorina, S F

    2001-01-01

    The results of preliminary investigations aimed at the development of an economical and easy-to-apply technique for the bacterial leaching of rare and valuable metals out of low-grade ores, complex-composition ores, rocks, and industrial wastes in Georgia are discussed. The main groups of the microbiological community of the peat suspension used in the bacterial leaching experiments are investigated, and the activity of particular microorganisms in the leaching of samples with different mineral compositions is assessed. The element composition of the primary and processed samples was investigated by the epithermal neutron activation analysis method, and the enrichment/subtraction level is estimated for various elements. The efficiency of the developed technique to purify wastes, extract some scarce metals, and enrich ores or rocks in some elements, e.g. Au, U, Th, Cs, Sr, Rb, Sc, Zr, Hf, Ta, Gd, Er, Lu, Ce, etc., is demonstrated.

  10. Development of the HS-SPME-GC-MS/MS method for analysis of chemical warfare agents and their degradation products in environmental samples.

    Science.gov (United States)

    Nawała, Jakub; Czupryński, Krzysztof; Popiel, Stanisław; Dziedzic, Daniel; Bełdowski, Jacek

    2016-08-24

    After World War II, approximately 50,000 tons of chemical weapons were dumped in the Baltic Sea by the Soviet Union under the provisions of the Potsdam Conference on Disarmament. These dumped chemical warfare agents still pose a major threat to the marine environment and to human life, so continued monitoring of these munitions is essential. In this work, we present the application of new solid-phase microextraction fibers to the analysis of chemical warfare agents and their degradation products. It can be concluded that the best fiber for the analysis of sulfur mustard and its degradation products is butyl acrylate (BA), whereas for the analysis of organoarsenic compounds and chloroacetophenone the best fiber is a co-polymer of methyl acrylate and methyl methacrylate (MA/MMA). In order to achieve the lowest LOD and LOQ, the samples should be divided into two subsamples, one analyzed using a BA fiber and the second using a MA/MMA fiber. When fast analysis is required, the microextraction can be performed with a butyl acrylate fiber alone, because the extraction efficiency of organoarsenic compounds for this fiber is acceptable. We then elaborated the HS-SPME-GC-MS/MS method for the analysis of CWA degradation products in environmental samples using the laboratory-obtained fibers. The analytical method for the analysis of organosulfur and organoarsenic compounds was optimized and validated. The LODs for all target chemicals were between 0.03 and 0.65 ppb. The analytical method developed by us was then used for the analysis of sediment and pore-water samples from the Baltic Sea; 80 samples were analyzed in these studies. It was found that 25 sediment and 5 pore-water samples contained CWA degradation products such as 1,4-dithiane, 1,4-oxathiane or triphenylarsine, the latter being a component of arsine oil. The obtained data are evidence that the CWAs present in the Baltic Sea have leaked into the general marine environment. Copyright

  11. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    International Nuclear Information System (INIS)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio; Fukusaki, Eiichiro

    2009-01-01

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R 2 values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  12. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: fukusaki@bio.eng.osaka-u.ac.jp [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-08-26

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R2 values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  13. Quality by Design approach in the development of hydrophilic interaction liquid chromatographic method for the analysis of iohexol and its impurities.

    Science.gov (United States)

    Jovanović, Marko; Rakić, Tijana; Tumpa, Anja; Jančić Stojanović, Biljana

    2015-06-10

    This study presents the development of a hydrophilic interaction liquid chromatographic method for the analysis of iohexol, its endo-isomer and three impurities following the Quality by Design (QbD) approach. The main objective of the method was to identify conditions where adequate separation quality in minimal analysis time could be achieved within a robust region that guarantees stable method performance. The relationship between the critical process parameters (acetonitrile content in the mobile phase, pH of the water phase and ammonium acetate concentration in the water phase) and the critical quality attributes was established by applying design of experiments methodology. The defined mathematical models and Monte Carlo simulation are used to evaluate the risk arising from uncertainty in model prediction and incertitude in adjusting the process parameters, and to identify the design space. The borders of the design space were experimentally verified, confirming that the quality of the method is preserved in this region. Moreover, a Plackett-Burman design was applied for experimental robustness testing and the method was fully validated to verify the adequacy of the selected optimal conditions: analytical column ZIC HILIC (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-water phase (72 mM ammonium acetate, pH adjusted to 6.5 with glacial acetic acid) (86.7:13.3) v/v; column temperature 25 °C, mobile phase flow rate 1 mL min(-1), wavelength of detection 254 nm. Copyright © 2015 Elsevier B.V. All rights reserved.
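    The design-space step — propagating uncertainty in the fitted DoE model and incertitude in the set points through Monte Carlo simulation, then mapping where the probability of meeting the resolution criterion exceeds a chosen level — can be sketched generically as below. The quadratic model coefficients, variabilities and acceptance limit are invented placeholders, not the authors' fitted model.

import numpy as np

rng = np.random.default_rng(7)

def resolution_model(acn, ph, buffer_conc):
    """Placeholder quadratic DoE model for a critical resolution (factors in coded units)."""
    return (2.1 - 0.45 * acn + 0.30 * ph - 0.15 * buffer_conc
            - 0.20 * acn * ph - 0.10 * acn**2)

def prob_rs_ok(acn_set, ph_set, buffer_set, rs_limit=1.5, n=20_000):
    """Probability that Rs >= rs_limit, given set-point incertitude and model prediction error."""
    acn = rng.normal(acn_set, 0.05, n)        # assumed set-point variability (coded units)
    ph = rng.normal(ph_set, 0.05, n)
    buf = rng.normal(buffer_set, 0.05, n)
    model_err = rng.normal(0.0, 0.08, n)      # assumed prediction uncertainty
    rs = resolution_model(acn, ph, buf) + model_err
    return np.mean(rs >= rs_limit)

# Scan two factors at a fixed third factor; points with probability >= 0.95 would lie in the DS.
for acn_set in np.linspace(-1, 1, 5):
    row = [prob_rs_ok(acn_set, ph_set, 0.0) for ph_set in np.linspace(-1, 1, 5)]
    print("ACN %+.1f:" % acn_set, " ".join("%.2f" % p for p in row))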

  14. Nodal method for fast reactor analysis

    International Nuclear Information System (INIS)

    Shober, R.A.

    1979-01-01

    In this paper, a nodal method applicable to fast reactor diffusion theory analysis has been developed. This method has been shown to be accurate and efficient in comparison to highly optimized finite difference techniques. The use of an analytic solution to the diffusion equation as a means of determining accurate coupling relationships between nodes has been shown to be highly accurate and efficient in specific two-group applications, as well as in the current multigroup method

  15. Development of a new extraction technique and HPLC method for the analysis of non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp).

    Science.gov (United States)

    Brighenti, Virginia; Pellati, Federica; Steinbach, Marleen; Maran, Davide; Benvenuti, Stefania

    2017-09-05

    The present work was aimed at the development and validation of a new, efficient and reliable technique for the analysis of the main non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp) inflorescences belonging to different varieties. This study was designed to identify samples with a high content of bioactive compounds, with a view to underscoring the importance of quality control in derived products as well. Different extraction methods, including dynamic maceration (DM), ultrasound-assisted extraction (UAE), microwave-assisted extraction (MAE) and supercritical-fluid extraction (SFE) were applied and compared in order to obtain a high yield of the target analytes from hemp. Dynamic maceration for 45 min with ethanol (EtOH) at room temperature proved to be the most suitable technique for the extraction of cannabinoids in hemp samples. The analysis of the target analytes in hemp extracts was carried out by developing a new reversed-phase high-performance liquid chromatography (HPLC) method coupled with diode array (UV/DAD) and electrospray ionization-mass spectrometry (ESI-MS) detection, by using an ion trap mass analyser. An Ascentis Express C18 column (150 mm × 3.0 mm I.D., 2.7 μm) was selected for the HPLC analysis, with a mobile phase composed of 0.1% formic acid in both water and acetonitrile, under gradient elution. The application of the fused-core technology allowed us to obtain a significant improvement of the HPLC performance compared with that of conventional particulate stationary phases, with a shorter analysis time and a remarkable reduction of solvent usage. The analytical method optimized in this study was fully validated to show compliance with international requirements. Furthermore, it was applied to the characterization of nine hemp samples and six hemp-based pharmaceutical products. As such, it was demonstrated to be a very useful tool for the analysis of cannabinoids in both the plant material and its derivatives for

  16. SUBSURFACE CONSTRUCTION AND DEVELOPMENT ANALYSIS

    International Nuclear Information System (INIS)

    N.E. Kramer

    1998-01-01

    The purpose of this analysis is to identify appropriate construction methods and develop a feasible approach for construction and development of the repository subsurface facilities. The objective of this analysis is to support development of the subsurface repository layout for License Application (LA) design. The scope of the analysis for construction and development of the subsurface Repository facilities covers: (1) Excavation methods, including application of knowledge gained from construction of the Exploratory Studies Facility (ESF). (2) Muck removal from excavation headings to the surface. This task will examine ways of preventing interference with other subsurface construction activities. (3) The logistics and equipment for the construction and development rail haulage systems. (4) Impact of ground support installation on excavation and other construction activities. (5) Examination of how drift mapping will be accomplished. (6) Men and materials handling. (7) Installation and removal of construction utilities and ventilation systems. (8) Equipping and finishing of the emplacement drift mains and access ramps to fulfill waste emplacement operational needs. (9) Emplacement drift and access mains and ramps commissioning prior to handover for emplacement operations. (10) Examination of ways to structure the contracts for construction of the repository. (11) Discussion of different construction schemes and how to minimize the schedule risks implicit in those schemes. (12) Surface facilities needed for subsurface construction activities

  17. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.

    2013-07-15

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth' in order to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks that grow until a stable leak forms; calculating the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. Within this work PROST was also restructured, including the possibility of combining damage mechanisms in a single calculation, and several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating previously treated cases, and comparative analyses were performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that, with the flexible probabilistic analysis method provided, the quantitative determination of leak and break probabilities of a crack in a complex structural geometry under thermal-mechanical loading as
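    As a toy illustration of the kind of probabilistic treatment described (not the PROST models themselves), the sketch below estimates a through-wall ("leak") probability by Monte Carlo sampling of an initial crack depth and stress range and propagating the crack with a Paris-type fatigue law; all distributions and material constants are invented.

import numpy as np

rng = np.random.default_rng(42)

def leak_probability(n_samples=50_000, wall_mm=8.0, cycles=5.0e4):
    """Fraction of sampled cracks that grow through the wall within the given cycle count."""
    a0 = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n_samples)   # initial depth (mm)
    d_sigma = rng.normal(60.0, 8.0, n_samples)                        # stress range (MPa)
    C, m = 1.0e-11, 3.0                                               # Paris constants (mm, MPa*sqrt(mm) units)
    a = a0.copy()
    n_steps = 200
    dn = cycles / n_steps                                             # coarse cycle blocks
    for _ in range(n_steps):
        dk = d_sigma * np.sqrt(np.pi * np.minimum(a, wall_mm))        # simple edge-crack estimate, capped at the wall
        a = a + C * dk**m * dn
    return np.mean(a >= wall_mm)

print("estimated leak probability: %.4f" % leak_probability())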

  18. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    Science.gov (United States)

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
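    A Box-Behnken design for the four critical process parameters named above (voltage, temperature, buffer concentration, pH) is straightforward to generate without specialist software: each pair of factors is run at its ±1 corners while the remaining factors sit at the centre level, plus replicate centre points. The sketch below builds the coded design matrix; the factor order and the number of centre points are illustrative choices, not those of the study.

from itertools import combinations, product

def box_behnken(n_factors, center_points=3):
    """Coded (-1, 0, +1) Box-Behnken design matrix as a list of runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for level_i, level_j in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = level_i, level_j
            runs.append(run)
    runs.extend([[0] * n_factors for _ in range(center_points)])
    return runs

factors = ["voltage", "temperature", "buffer_conc", "pH"]
design = box_behnken(len(factors))
print(" ".join("%12s" % f for f in factors))
for run in design:
    print(" ".join("%12d" % level for level in run))
print("total runs:", len(design))   # 24 edge runs + 3 centre points = 27 for four factors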

  19. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  20. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and non-statistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users and mainly by auditors; otherwise, the risk of applying it incorrectly could lead to loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is fairly high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being largely used as an audit method and being a factor in forming a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  1. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    Science.gov (United States)

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal polarity microemulsion electrokinetic chromatography (MEEKC) as operative mode, which allowed a good selectivity to be achieved in a low analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH, on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in a MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. DS was identified by risk of failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets

  2. Method Development for Clinical Comprehensive Evaluation of Pediatric Drugs Based on Multi-Criteria Decision Analysis: Application to Inhaled Corticosteroids for Children with Asthma.

    Science.gov (United States)

    Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling

    2018-04-01

    Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remains unchanged, suggesting this model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful
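    The AHP weighting step (step 3 above) can be reproduced in a few lines: criterion weights are taken from the principal eigenvector of a reciprocal pairwise comparison matrix and checked with Saaty's consistency ratio. The 3 x 3 comparison matrix below is a made-up example, not the judgements used in the CCES-P evaluation.

import numpy as np

# Saaty's random consistency index by matrix size.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise):
    """Criterion weights from the principal eigenvector of a reciprocal comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)                    # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0   # consistency ratio
    return w, cr

# Hypothetical pairwise judgements for three top-level criteria (e.g. efficacy, safety, cost).
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
weights, cr = ahp_weights(pairwise)
print("weights:", np.round(weights, 3), " consistency ratio: %.3f" % cr)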

  3. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besilate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combination, and in current pharmaceuticals the combinations are typically found as ternary formulations: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved on an RP-CN column with acetonitrile-methanol-10 mM orthophosphoric acid pH 2.5 (7:13:80, v/v/v) as the mobile phase; the detector wavelength was set at 235 nm. The linear ranges were 0.1–18.5 μg/mL, 0.4–25.6 μg/mL, 0.3–15.5 μg/mL, and 0.3–22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate, as shown by RSD values ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean errors (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and of plasma from patients taking drugs that include these substances.

  4. Reverse phase high performance liquid chromatographic method development based on ultraviolet-visible detector for the analysis of 1-hydroxypyrene (PAH biomarker) in human urine.

    Science.gov (United States)

    Kamal, Atif; Gulfraz, Mohammad; Anwar, Mohammad Asad; Malik, Riffat Naseem

    2015-01-01

    1-Hydroxypyrene is an important biomarker of exposure to polycyclic aromatic hydrocarbons (PAHs), which appears in the urine of exposed human subjects. In developing countries, where advanced instruments are not always available, the importance of this biomarker demands convenient and sensitive methods for its determination. This study aimed at developing a method to quantify 1-hydroxypyrene (a biomarker of PAH exposure) using a UV-visible detector in reversed-phase high-performance liquid chromatography (HPLC). A 20 μL sample volume was injected manually into a Shimadzu HPLC system equipped with an SPD-20A UV-visible detector, an LC-20AT pump and a DGU-20A5 degasser. A C-18 column was used for the analysis. The method showed good linearity (R2 = 0.979-0.989) and detectability down to the nmol level. The average retention time was 6.37 min, the accuracy was within 2%, and the recovery was 108%. The overall performance of this method was comparable with (in terms of detection sensitivity), and in some respects better than, previously reported studies using HPLC systems equipped with UV detectors. The method is suitable and reliable for the detection and quantification of 1-OHP in human urine samples using a UV detector; it is, however, less sensitive than a fluorescence detector. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
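
    The linearity and detectability figures quoted above come from a standard calibration-curve treatment. A generic sketch of that treatment (with invented standard concentrations and peak areas, and LOD/LOQ estimated as 3.3 s/slope and 10 s/slope) is shown below.

```python
# Generic calibration-curve check: linear fit, R^2, and LOD/LOQ from the residual
# standard deviation of the regression.  All numbers are illustrative, not the
# authors' instrument data.
import numpy as np

conc = np.array([5, 10, 25, 50, 100, 200], dtype=float)         # nmol/L standards (assumed)
area = np.array([1.1e3, 2.2e3, 5.4e3, 10.9e3, 21.5e3, 43.8e3])  # peak areas (assumed)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
s_resid = np.sqrt(ss_res / (len(conc) - 2))

lod = 3.3 * s_resid / slope
loq = 10 * s_resid / slope
print(f"R^2 = {r2:.4f}, LOD = {lod:.1f} nmol/L, LOQ = {loq:.1f} nmol/L")
```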

  5. Scientific methods for developing ultrastable structures

    International Nuclear Information System (INIS)

    Gamble, M.; Thompson, T.; Miller, W.

    1990-01-01

    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 μrad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs

  6. Using of a combined approach by biochemical and image analysis to develop a new method to estimate seed maturity stage for Bordeaux area grapevine

    Directory of Open Access Journals (Sweden)

    Amélie RABOT

    2017-03-01

    Aim: The phenolic maturity of the grape (depending on tannins and anthocyanins) is crucial at harvest and determines the final quality of the wine. The work presented here aims to characterize the evolution of the phenolic maturity of seeds for 3 varieties by combining macroscopic analysis and biochemical analyses of their tannins at phenological stages of interest. Methods and results: Macroscopic analyses performed with the R software showed that seed color varies dramatically (from green to dark brown) in the two months between bunch closure and maturity. Biochemical analysis (HPLC measurements) showed that seed tannins increase from bunch closure to early veraison and then decrease until maturity. Conclusion: Taken together, these results show that color variation is correlated with the tannin content of the seeds. Significance of the study: At present, no easy way of predicting phenolic maturity is available. The aim of this work is to use these results (usually considered independently) to assess seed phenolic maturity without biochemical analysis, and thus to set a harvest date under the conditions most favorable for the extraction of the tannins required for the organoleptic quality of a wine. The originality of this work lies in combining the visual appearance of the seeds with their biochemical tannin composition (correlation established by PCA). In the future, these results will help to develop a decision-support tool based on a simple seed-image acquisition system that can easily be used by winemakers.

  7. Development of the high-order decoupled direct method in three dimensions for particulate matter: enabling advanced sensitivity analysis in air quality models

    Directory of Open Access Journals (Sweden)

    W. Zhang

    2012-03-01

    The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is the development of high-order DDM sensitivity analysis of ISORROPIA, the inorganic aerosol module of CMAQ. A case-specific approach has been applied, and the sensitivities of activity coefficients and water content are explicitly computed. Stand-alone tests were performed for ISORROPIA by comparing the first- and second-order sensitivities computed by HDDM with brute force (BF) approximations. A similar comparison was also carried out for CMAQ sensitivities simulated using a week-long winter episode for a continental US domain. Second-order sensitivities of aerosol species (e.g., sulfate, nitrate, and ammonium) with respect to domain-wide SO2, NOx, and NH3 emissions show agreement with BF results, yet exhibit less noise in locations where BF results are demonstrably inaccurate. Second-order sensitivity analysis elucidates poorly understood nonlinear responses of secondary inorganic aerosols to their precursors and competing species. Adding second-order sensitivity terms to the Taylor-series projection of nitrate concentrations under a 50% reduction in domain-wide NOx or SO2 emission rates improves the prediction with statistical significance.
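
    The second-order Taylor-series projection mentioned in the last sentence can be written down compactly. The sketch below uses placeholder sensitivity values rather than CMAQ output; it only demonstrates how the second-order term modifies a first-order projection for a 50% emission reduction.

```python
# Sketch of a Taylor-series projection of a concentration C under a fractional change
# of an emission rate E, using first- and second-order sensitivities (placeholder values).
def taylor_projection(c0, s1, s2, delta_frac):
    """Project concentration for a relative emission change delta_frac.

    s1 = dC/de and s2 = d2C/de2, where e = E/E0 is the normalized emission rate,
    so C(e0 + delta) is approximated by C0 + s1*delta + 0.5*s2*delta**2.
    """
    return c0 + s1 * delta_frac + 0.5 * s2 * delta_frac ** 2

c_nitrate = 2.4      # µg/m3, base-case nitrate concentration (illustrative)
s1_nox = 1.8         # first-order sensitivity to domain-wide NOx emissions (assumed)
s2_nox = -2.5        # second-order sensitivity capturing the nonlinear response (assumed)

first_order = taylor_projection(c_nitrate, s1_nox, 0.0, -0.5)     # 50 % NOx cut
second_order = taylor_projection(c_nitrate, s1_nox, s2_nox, -0.5)
print(f"first-order projection:  {first_order:.2f} µg/m3")
print(f"with second-order term:  {second_order:.2f} µg/m3")
```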

  8. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede

    2000-01-01

    This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of the sail shape using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously by virtue of cloth stretch and flexing. The sail shape determines the pressure distribution and, at the same time, the pressure distribution on the sail stretches and flexes the sail material, determining its shape. This characteristic non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. First, pressure loads for a known sail shape are computed (aerodynamic analysis). Second, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained by an iterative process that involves both the aerodynamic and the structural analysis; when the solution converges, it is possible to make design modifications.
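
    The iterative aeroelastic strategy described above is essentially a fixed-point loop between the two analyses. The toy model below replaces the Bezier/FE/aerodynamic solvers of the paper with one-line placeholder functions, but the alternation and convergence check follow the same pattern.

```python
# Toy fixed-point aeroelastic loop: alternate an "aerodynamic analysis" (load from shape)
# and a "structural analysis" (shape from load) until the shape parameter converges.
def aerodynamic_analysis(camber):
    # assumed relation: more camber -> more load (placeholder model)
    return 50.0 + 120.0 * camber

def structural_analysis(pressure):
    # assumed relation: load stretches the cloth and changes camber (placeholder model)
    return 0.08 + 2.0e-4 * pressure

camber = 0.10                       # initial guess for the sail shape parameter
for it in range(100):
    pressure = aerodynamic_analysis(camber)      # loads for the current shape
    new_camber = structural_analysis(pressure)   # shape under those loads
    if abs(new_camber - camber) < 1e-8:          # convergence check
        break
    camber = 0.5 * camber + 0.5 * new_camber     # under-relaxation for stability
print(f"converged after {it + 1} iterations: camber = {camber:.5f}, load = {pressure:.1f}")
```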

  9. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS

    International Nuclear Information System (INIS)

    Vestner, Jochen; Fritsch, Stefanie; Rauhut, Doris

    2010-01-01

    The aim of this research work was the replacement of the time-consuming soaking of cork stoppers, which is mainly used as a screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) in natural cork stoppers. Releasable TCA from whole cork stoppers was analysed by applying a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise the MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that MAE at 40 °C for 120 min with 90 min of cooling time provides suitable conditions to avoid an over-extraction of TCA from low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow almost the same amounts of releasable TCA to be measured as with the soaking procedure in the relevant range of releasable TCA per cork used to evaluate the TCA level of cork stoppers. Stable isotope dilution assay (SIDA) with deuterium-labelled TCA (TCA-d5) was applied to optimise quantification of the released TCA, using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows releasable TCA to be measured from the whole cork stopper under improved conditions, with a low use of solvent and a higher sample throughput.

  10. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS.

    Science.gov (United States)

    Vestner, Jochen; Fritsch, Stefanie; Rauhut, Doris

    2010-02-15

    The aim of this research work was the replacement of the time-consuming soaking of cork stoppers, which is mainly used as a screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) in natural cork stoppers. Releasable TCA from whole cork stoppers was analysed by applying a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise the MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that MAE at 40 °C for 120 min with 90 min of cooling time provides suitable conditions to avoid an over-extraction of TCA from low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow almost the same amounts of releasable TCA to be measured as with the soaking procedure in the relevant range of releasable TCA per cork used to evaluate the TCA level of cork stoppers. Stable isotope dilution assay (SIDA) with deuterium-labelled TCA (TCA-d5) was applied to optimise quantification of the released TCA, using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows releasable TCA to be measured from the whole cork stopper under improved conditions, with a low use of solvent and a higher sample throughput. Copyright 2009 Elsevier B.V. All rights reserved.

  11. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flows method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  12. Nondestructive analysis and development

    Science.gov (United States)

    Moslehy, Faissal A.

    1993-01-01

    This final report summarizes the achievements of project #4 of the NASA/UCF Cooperative Agreement from January 1990 to December 1992. The objectives of this project are to review NASA's NDE program at Kennedy Space Center (KSC) and recommend means for enhancing the present testing capabilities through the use of improved or new technologies. During the period of the project, extensive development of a reliable nondestructive, non-contact vibration technique to determine and quantify the bond condition of the thermal protection system (TPS) tiles of the Space Shuttle Orbiter was undertaken. Experimental modal analysis (EMA) is used as a non-destructive technique for the evaluation of Space Shuttle thermal protection system (TPS) tile bond integrity. Finite element (FE) models for tile systems were developed and were used to generate their vibration characteristics (i.e. natural frequencies and mode shapes). Various TPS tile assembly configurations as well as different bond conditions were analyzed. Results of finite element analyses demonstrated a drop in natural frequencies and a change in mode shapes which correlate with both size and location of disbond. Results of experimental testing of tile panels correlated with FE results and demonstrated the feasibility of EMA as a viable technique for tile bond verification. Finally, testing performed on the Space Shuttle Columbia using a laser doppler velocimeter demonstrated the application of EMA, when combined with FE modeling, as a non-contact, non-destructive bond evaluation technique.
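
    The underlying principle, that a softened bond lowers the natural frequencies obtained from the generalized eigenvalue problem, can be shown with a lumped two-degree-of-freedom stand-in for the tile FE model. All stiffness and mass values below are illustrative, not derived from the NASA/UCF work.

```python
# Toy illustration: natural frequencies of a small lumped mass-spring model drop when
# the "bond" stiffness is reduced, which is the signature the modal tests look for.
import numpy as np

def natural_frequencies(K, M):
    """Frequencies (Hz) from the generalized eigenvalue problem K v = w^2 M v."""
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sqrt(np.sort(eigvals.real)) / (2 * np.pi)

M = np.diag([0.5, 0.5])                    # kg, lumped masses (illustrative)

def stiffness(k_bond):
    k_tile = 4.0e5                          # N/m, tile stiffness (illustrative)
    return np.array([[k_tile + k_bond, -k_tile],
                     [-k_tile,          k_tile]])

print("healthy bond :", natural_frequencies(stiffness(2.0e5), M).round(1), "Hz")
print("degraded bond:", natural_frequencies(stiffness(0.5e5), M).round(1), "Hz")
```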

  13. Development of a multianalyte method based on micro-matrix-solid-phase dispersion for the analysis of fragrance allergens and preservatives in personal care products.

    Science.gov (United States)

    Celeiro, Maria; Guerra, Eugenia; Lamas, J Pablo; Lores, Marta; Garcia-Jares, Carmen; Llompart, Maria

    2014-05-30

    An effective, simple and low-cost sample preparation method based on matrix solid-phase dispersion (MSPD) followed by gas chromatography-mass spectrometry (GC-MS) or gas chromatography-triple quadrupole-mass spectrometry (GC-MS/MS) has been developed for the rapid simultaneous determination of 38 cosmetic ingredients: 25 fragrance allergens and 13 preservatives. All target substances are frequently used in cosmetics and personal care products and are subject to use restrictions or labeling requirements according to the EU Cosmetic Directive. The extraction procedure was optimized on real, non-spiked rinse-off and leave-on cosmetic products by means of experimental designs. The final miniaturized process required only 0.1 g of sample and 1 mL of organic solvent, giving a final extract ready for analysis. The micro-MSPD method was validated, showing satisfactory performance for both GC-MS and GC-MS/MS analysis. The use of GC coupled to triple quadrupole mass detection allowed very low detection limits (low ng g⁻¹) to be reached while improving method selectivity. In an attempt to improve the chromatographic analysis of the preservatives, the inclusion of a derivatization step was also assessed. The proposed method was applied to a broad range of cosmetics and personal care products (shampoos, body milk, moisturizing milk, toothpaste, hand creams, gloss lipstick, sunblock, deodorants and liquid soaps, among others), demonstrating the widespread use of these substances. The concentration levels ranged from sub parts per million to parts per thousand. The number of target fragrance allergens per sample was quite high (up to 16). Several fragrances (linalool, farnesol, hexylcinnamal, and benzyl benzoate) were detected at levels >0.1% (1,000 μg g⁻¹). As regards preservatives, phenoxyethanol was the most frequently found additive, reaching quite high concentrations (>1,500 μg g⁻¹) in five cosmetic products, and BHT was detected in eight samples.

  14. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters, all chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicology

  15. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  16. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда

    2015-06-01

    In this paper, physico-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of the authors' own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are also presented.

  17. METHODS TO DEVELOP A TOROIDAL SURFACE

    Directory of Open Access Journals (Sweden)

    DANAILA Ligia

    2017-05-01

    The paper presents two practical methods for drawing the development of the toroidal surface, a surface that cannot be developed by the classical methods of Descriptive Geometry yet is frequently met in technical practice. The described methods are approximate: the development is obtained with the help of points, and its accuracy depends on the number of points used in the drawing. As with any other approximate method, the development may need to be adjusted on site when the part is actually manufactured.

  18. Development of a detailed BWR core thermal-hydraulic analysis method based on the Japanese post-BT standard using a best-estimate code

    International Nuclear Information System (INIS)

    Ono, H.; Mototani, A.; Kawamura, S.; Abe, N.; Takeuchi, Y.

    2004-01-01

    The post-BT standard is a new fuel integrity standard of the Atomic Energy Society of Japan that allows a temporary boiling transition condition in the evaluation of BWR anticipated operational occurrences. For application of the post-BT standard to the evaluation of BWR anticipated operational occurrences, it is important to identify which fuel assemblies, and which axial and radial positions of fuel rods, have temporarily experienced the post-BT condition, and to evaluate how high the fuel cladding temperature rise was and how long the dryout lasted. Therefore, whole-bundle simulation, in which each fuel assembly is represented independently by one thermal-hydraulic component, is considered to be an effective analytical approach. In the present study, a best-estimate thermal-hydraulic code, TRACG02, has been modified to extend its predictive capability by implementing post-BT evaluation models, such as the post-BT heat transfer correlation and rewetting correlation, and by enlarging the number of components used for BWR plant simulation. Based on the new evaluation method, BWR core thermal-hydraulic behavior has been analyzed for typical anticipated operational occurrence conditions. The locations where boiling transition occurs and the severity of the boiling transition conditions for the fuel assembly, such as the fuel cladding temperature, which are important factors in determining whether reuse of the fuel assembly can be permitted, were well predicted by the proposed evaluation method. In summary, a new evaluation method for detailed BWR core thermal-hydraulic analysis based on the post-BT standard of the Atomic Energy Society of Japan has been developed and applied to the evaluation of the post-BT standard during actual BWR plant anticipated operational occurrences. (author)

  19. Development of an enantiomer-specific stable carbon isotope analysis (ESIA) method for assessing the fate of α-hexachlorocyclohexane in the environment.

    Science.gov (United States)

    Badea, Silviu-Laurentiu; Vogt, Carsten; Gehre, Matthias; Fischer, Anko; Danet, Andrei-Florin; Richnow, Hans-Hermann

    2011-05-30

    α-Hexachlorocyclohexane (α-HCH) is the only chiral isomer of the eight 1,2,3,4,5,6-HCHs, and we have developed an enantiomer-specific stable carbon isotope analysis (ESIA) method for the evaluation of its fate in the environment. The carbon isotope ratios of the α-HCH enantiomers were determined for a commercially available α-HCH sample using a gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS) system equipped with a chiral column. The GC-C-IRMS measurements revealed δ-values of -32.5 ± 0.8‰ and -32.3 ± 0.5‰ for (-) α-HCH and (+) α-HCH, respectively. The isotope ratio of bulk α-HCH was estimated to be -32.4 ± 0.6‰, which was in accordance with the δ-values obtained by GC-C-IRMS (-32.7 ± 0.2‰) and elemental analyzer-isotope ratio mass spectrometry (EA-IRMS) of the bulk α-HCH (-32.1 ± 0.1‰). The similarity of the isotope ratio measurements of bulk α-HCH by EA-IRMS and GC-C-IRMS indicates the accuracy of the chiral GC-C-IRMS method. The linearity of the α-HCH ESIA method shows that carbon isotope ratios can be obtained for a signal size above 100 mV. The ESIA measurements exhibited low standard deviations (2σ). Using the chiral GC-C-IRMS method, the isotope compositions of individual enantiomers were determined in biodegradation experiments of α-HCH with Clostridium pasteurianum and in samples from a contaminated field site. The isotopic compositions of the α-HCH enantiomers show a range of enantiomeric and isotope patterns, suggesting that enantiomeric and isotope fractionation can serve as indicators for biodegradation and source characterization of α-HCH in the environment. Copyright © 2011 John Wiley & Sons, Ltd.
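
    Isotope fractionation is typically linked to the extent of biodegradation through the Rayleigh equation. The sketch below shows that calculation; the enrichment factor and the field δ-value are assumed for illustration and are not values reported in the study.

```python
# Rayleigh-model sketch: translate a measured carbon-isotope shift into an estimated
# fraction of compound degraded.  Enrichment factor and field value are assumed.
def degraded_fraction(delta0_permil, delta_permil, epsilon_permil):
    """Fraction degraded from the Rayleigh model R/R0 = f^(epsilon/1000)."""
    r_ratio = (1000.0 + delta_permil) / (1000.0 + delta0_permil)
    f_remaining = r_ratio ** (1000.0 / epsilon_permil)
    return 1.0 - f_remaining

delta0 = -32.4          # ‰, initial bulk alpha-HCH signature reported above
delta_sample = -28.0    # ‰, a hypothetical field measurement
epsilon = -3.0          # ‰, assumed carbon enrichment factor for alpha-HCH degradation

print(f"estimated fraction degraded: {degraded_fraction(delta0, delta_sample, epsilon):.0%}")
```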

  20. Liquid chromatography-tandem mass spectrometry multiresidue method for the analysis of quaternary ammonium compounds in cheese and milk products: Development and validation using the total error approach.

    Science.gov (United States)

    Slimani, Kahina; Féret, Aurélie; Pirotais, Yvette; Maris, Pierre; Abjean, Jean-Pierre; Hurtaud-Pessel, Dominique

    2017-09-29

    Quaternary ammonium compounds (QACs) are both cationic surfactants and biocidal substances widely used as disinfectants in the food industry. A sensitive and reliable method for the analysis of benzalkonium chlorides (BACs) and dialkyldimethylammonium chlorides (DDACs) has been developed that enables the simultaneous quantitative determination of ten quaternary ammonium residues in dairy products below the provisional maximum residue level (MRL), set at 0.1 mg kg⁻¹. To the best of our knowledge, this may be the first method applicable to milk and to the three major processed milk products selected, namely processed or hard pressed cheeses, and whole milk powder. The method comprises solvent extraction using a mixture of acetonitrile and ethyl acetate, without any further clean-up. Analyses were performed by liquid chromatography coupled with electrospray tandem mass spectrometry detection (LC-ESI-MS/MS) operating in positive mode. A C18 analytical column was used for chromatographic separation, with a mobile phase composed of acetonitrile and water, both containing 0.3% formic acid, and methanol in gradient mode. Five deuterated internal standards were added to obtain the most accurate quantification. Extraction recoveries were satisfactory and no matrix effects were observed. The method was validated using the total error approach in accordance with the NF V03-110 standard in order to characterize the trueness, repeatability, intermediate precision and analytical limits within the range of 5-150 μg kg⁻¹ for all matrices. These performance criteria, calculated by e.noval® 3.0 software, were satisfactory and in full accordance with the proposed provisional MRL and with the recommendations in the European Union SANTE/11945/2015 regulatory guidelines. The limit of detection (LOD) was low, making the method suitable for the determination of quaternary ammonium compounds in foodstuffs from the dairy industry at residue levels; it could be used in biocide residue monitoring plans and to measure consumer exposure to biocide products.

  1. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  2. Development of Reverse-Phase HPLC Method for Simultaneous ...

    African Journals Online (AJOL)

    Erah

    Purpose: To develop a simple, sensitive and rapid reverse phase HPLC method for the simultaneous analysis of metoprolol succinate and hydrochlorothiazide in a solid dosage form. Methods: The .... Extraction was carried out three times with.

  3. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high pressure liquid ..... equipment. DECLARATIONS ... high-performance liquid chromatography. J Chromatogr.

  4. Development of a magnetic solid-phase extraction coupled with high-performance liquid chromatography method for the analysis of polyaromatic hydrocarbons.

    Science.gov (United States)

    Ma, Yan; Xie, Jiawen; Jin, Jing; Wang, Wei; Yao, Zhijian; Zhou, Qing; Li, Aimin; Liang, Ying

    2015-07-01

    A novel magnetic solid-phase extraction coupled with high-performance liquid chromatography method was established to analyze polyaromatic hydrocarbons in environmental water samples. The extraction conditions, including the amount of extraction agent, extraction time, pH and the surface structure of the magnetic extraction agent, were optimized. The results showed that the amount of extraction agent and the extraction time significantly influenced the extraction performance. An increase in the specific surface area, an enlargement of the pore size, and a reduction of the particle size could enhance the extraction performance of the magnetic microsphere. The optimized magnetic extraction agent possessed a high surface area of 1311 m²/g, a large pore size of 6-9 nm, and a small particle size of 6-9 μm. The limits of detection for phenanthrene and benzo[g,h,i]perylene in the developed analysis method were 3.2 and 10.5 ng/L, respectively. When applied to river water samples, the spiked recoveries of phenanthrene and benzo[g,h,i]perylene ranged from 89.5-98.6% and 82.9-89.1%, respectively. Phenanthrene was detected over a concentration range of 89-117 ng/L in three water samples withdrawn from the midstream of the Huai River, and benzo[g,h,i]perylene was below the detection limit. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    The work presented in this thesis is the result of research carried out during a three-year PhD at the Centre for GeoGenetics, Natural History Museum of Denmark, University of Copenhagen, under the supervision of Professor Tom Gilbert. The PhD was funded by the Danish National Research Foundation (Danmarks Grundforskningsfond) 'Centre of Excellence in GeoGenetics' grant, with additional funding provided by the Danish Council for Independent Research 'Sapere Aude' programme. The thesis comprises five chapters, all of which represent different projects that involved the analysis of massive amounts of sequencing data, made possible by the introduction of NGS and the implementation of data analysis methods specific to each project. Chapters 1 to 3 have been published in peer-reviewed journals, Chapter 4 is currently in review, and Chapter 5 consists of a manuscript describing initial results of an ongoing research project

  6. Developing the Model of "Pedagogical Art Communication" Using Social Phenomenological Analysis: An Introduction to a Research Method and an Example for Its Outcome

    Science.gov (United States)

    Hofmann, Fabian

    2016-01-01

    Social phenomenological analysis is presented as a research method for museum and art education. After explaining its methodological background, it is shown how this method has been applied in a study of gallery talks or guided tours in art museums: Analyzing the situation by description and interpretation, a model for understanding gallery talks…

  7. Developments in gamma-ray spectrometry: systems, software, and methods-I. 5. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    International Nuclear Information System (INIS)

    Lasche, G.P.; Coldwell, R.L.

    2001-01-01

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques has been recently developed into a package suitable for public use. The methodology behind this approach was originally made available to the public as the RobFit command-line code, but it was extremely slow and difficult to use. Recent advances in microprocessor power and the development of a graphical user interface to make its use more intuitive have made this approach, which is quite computationally intensive, feasible for more routine applications. A brief description of some of the fundamental differences in the approach used by RobFit from the more common methods of nuclear spectral analysis involving local peak searches is presented here. Popular nuclear spectral analysis applications generally perform a peak search at their heart. The continuum in the neighborhood of each peak is estimated from local data and is subtracted from the data to yield the area and the energy of the peak. These are matched to a user-selected library of radionuclides containing the energies and areas of the most significant peaks, after accounting for the effects of detector efficiency and attenuation. With these codes, the energy-to-channel calibration, the peak width as a function of energy (or 'resolution calibration'), the detector intrinsic efficiency, and attenuation effects must usually be predetermined and provided as static input for the analysis. Most of these codes focus on regions of interest that represent many small pieces of the sample spectrum. In contrast, the RobFit approach works with an entire continuous spectrum to simultaneously determine the coefficients of all of the user-selected free variables that yield the best fit to the data. Peak searches are generally used only in interim steps to help suggest new radionuclides to include in the search library. Rather than first concentrate on the location of peaks, RobFit first concentrates on the determination of the continuum
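
    The whole-spectrum fitting idea, a smooth continuum plus peaks fitted simultaneously with a robust loss, can be sketched with a generic least-squares example. This is not the RobFit code; the synthetic spectrum, model form and starting values are illustrative only.

```python
# Fit a quadratic continuum plus two Gaussian peaks to an entire synthetic spectrum at
# once with a robust loss, instead of subtracting a locally estimated background.
import numpy as np
from scipy.optimize import least_squares

channels = np.arange(1024, dtype=float)
rng = np.random.default_rng(1)

def model(p, x):
    # quadratic continuum + two Gaussian peaks
    c0, c1, c2, a1, mu1, s1, a2, mu2, s2 = p
    cont = c0 + c1 * x + c2 * x ** 2
    peaks = a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2) + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2)
    return cont + peaks

true = [50, -0.02, 1e-5, 400, 300, 4.0, 250, 700, 5.0]
data = rng.poisson(model(true, channels)).astype(float)   # synthetic counting data

def residuals(p):
    return (model(p, channels) - data) / np.sqrt(np.maximum(data, 1.0))  # counting-statistics weights

p0 = [40, 0.0, 0.0, 300, 305, 5.0, 200, 695, 5.0]    # rough starting values
fit = least_squares(residuals, p0, loss="soft_l1")    # robust loss downweights outliers
print("fitted peak positions:", fit.x[4], fit.x[7])
```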

  8. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey.

    Science.gov (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose

    2015-01-01

    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method and resulted in no loss of sensitivity or reduction in chromatographic peak shape. Isocratic elution allows for significantly shorter run times (6 min), eliminates the requirement for column equilibration periods and, thus, has the advantage of facilitating high-throughput analysis, which is particularly important for regulatory testing laboratories. In excess of two hundred injections are possible with this new isocratic methodology within a 24-h period, which is more than a 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg L⁻¹; R² ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with % RSD values (n = 4) less than 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were determined with spiked PA and PANO samples, giving an average LOD of 1.6 µg kg⁻¹ and LOQ of 5.4 µg kg⁻¹. This method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with the mean total sum of PAs being 153 µg kg⁻¹. Echimidine and lycopsamine were predominant and found in 76% and 88%, respectively, of the positive samples. The average daily exposures, based on the results presented in this study, were 0.051 µg kg⁻¹ bw day⁻¹ for adults and 0.204 µg kg⁻¹ bw day⁻¹ for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR - Federal Institute of Risk Assessment, Germany) maximum
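
    The exposure figures quoted above follow from a simple intake calculation (concentration × consumption / body weight). The honey-consumption and body-weight values in the sketch below are assumed for illustration.

```python
# Back-of-the-envelope sketch of a daily pyrrolizidine-alkaloid exposure estimate from
# honey consumption; consumption figures and body weights are assumed.
def daily_exposure(pa_conc_ug_per_kg, honey_g_per_day, body_weight_kg):
    """Exposure in µg per kg body weight per day."""
    return pa_conc_ug_per_kg * (honey_g_per_day / 1000.0) / body_weight_kg

mean_pa = 153.0   # µg/kg, mean total PA content reported for the positive samples
print(f"adult: {daily_exposure(mean_pa, 23.5, 70):.3f} µg/kg bw/day")  # ~23.5 g honey, 70 kg adult (assumed)
print(f"child: {daily_exposure(mean_pa, 20.0, 15):.3f} µg/kg bw/day")  # ~20 g honey, 15 kg child (assumed)
```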

  9. Ethnographic Contributions to Method Development

    DEFF Research Database (Denmark)

    Leander, Anna

    2016-01-01

    Ethnographic research works with what has been termed a "strong" understanding of objectivity. When this understanding is taken seriously, it must lead to a refashioning of the processes of gathering, analyzing, and presenting data in ways that reverse many standard assumptions and instructions pertaining to "sound methods." Both in the context of observation and in that of justification, working with "strong objectivity" requires a flexibility and willingness to shift research strategies that is at odds with the usual emphasis on stringency, consistency, and carefully … of research in the ethnographic tradition. However, it would also require rethinking standard methods instructions and the judgments they inform. … of IR—Critical Security Studies.

  10. SIGNIFICANCE OF TARGETED EXOME SEQUENCING AND METHODS OF DATA ANALYSIS IN THE DIAGNOSIS OF GENETIC DISORDERS LEADING TO THE DEVELOPMENT OF EPILEPTIC ENCEPHALOPATHY

    Directory of Open Access Journals (Sweden)

    Tatyana Victorovna Kozhanova

    2017-08-01

    Epilepsy is the most common serious neurological disorder, and there is a genetic basis in almost 50% of people with epilepsy. The diagnosis of genetic epilepsies makes it possible to establish the cause of seizures in the patient. The last decade has shown tremendous growth in gene sequencing technologies, which have made genetic tests widely available. The aim is to show the significance of targeted exome sequencing and methods of data analysis in the diagnosis of hereditary syndromes leading to the development of epileptic encephalopathy. We examined 27 patients with early EE (resistant to antiepileptic drugs, with psychomotor and speech development delay) in the psycho-neurological department. Targeted exome sequencing was performed for patients without a previously identified molecular diagnosis using a 454 GS Junior sequencer (Roche) and the Illumina NextSeq 500 platform. As a result of the analysis, specific epilepsy-associated genetic variants were identified in the 27 patients. The greatest number of cases was due to mutations in the SCN1A gene (7/27). The distribution of mutations in the other genes (mutations with a minor allele frequency of less than 0.5%) was as follows: ALDH7A1 (n=1), CACNA1C (n=1), CDKL5 (n=1), CNTNAP2 (n=2), DLGAP2 (n=2), DOCK7 (n=2), GRIN2B (n=2), HCN1 (n=1), NRXN1 (n=3), PCDH19 (n=1), RNASEH2B (n=2), SLC2A1 (n=1), UBE3A (n=1). The use of exome sequencing in genetic practice significantly improves the effectiveness of medical genetic counseling, as it makes it possible to diagnose particular variants of genetically heterogeneous groups of diseases with similar clinical manifestations.

  11. Developing Scoring Algorithms (Earlier Methods)

    Science.gov (United States)

    We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.

  12. Taxonomic Dimensions for Studying Situational Method Development

    NARCIS (Netherlands)

    Aydin, Mehmet N.; Harmsen, Frank; van Hillegersberg, Jos; Ralyté, Jolita; Brinkkemper, Sjaak; Henderson-Sellers, Brian

    2007-01-01

    This paper is concerned with fragmented literature on situational method development, which is one of fundamental topics related to information systems development (ISD) methods. As the topic has attracted many scholars from various and possibly complementary schools of thought, different

  13. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  14. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    Foutes, C.E.

    1987-01-01

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were developed and input into the analysis. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. Total costs of each level of a standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio was calculated for each alternative standard. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis

  15. Developments in geophysical exploration methods

    CERN Document Server

    1982-01-01

    One of the themes in current geophysical development is the bringing together of the results of observations made on the surface and those made in the subsurface. Several benefits result from this association. The detailed geological knowledge obtained in the subsurface can be extrapolated for short distances with more confidence when the geological detail has been related to well-integrated subsurface and surface geophysical data. This is of value when assessing the characteristics of a partially developed petroleum reservoir. Interpretation of geophysical data is generally improved by the experience of seeing the surface and subsurface geophysical expression of a known geological configuration. On the theoretical side, the understanding of the geophysical processes themselves is furthered by the study of the phenomena in depth. As an example, the study of the progress of seismic wave trains downwards and upwards within the earth has proved most instructive. This set of original papers deals with some of ...

  16. COMPUTER-ASSISTED HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY METHOD DEVELOPMENT WITH APPLICATIONS TO THE ISOLATION AND ANALYSIS OF PHYTOPLANKTON PIGMENTS. (R826944)

    Science.gov (United States)

    We used chromatography modeling software to assist in HPLC method development, with the goalof enhancing separations through the exclusive use of gradient time and column temperature. Wesurveyed nine stationary phases for their utility in pigment purification and natur...

  17. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal, and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmonic analysis

  18. Nuclear methods in national development

    International Nuclear Information System (INIS)

    1993-01-01

    This volume of the proceedings of the First National Conference on Nuclear Methods held at Kongo Conference Hotel Zaria from 2-4 September 1993, contains the full text of about 30 technical papers and speeches of invited dignitaries presented at the conference. The technical papers are original or review articles containing results and experiences in nuclear and related analytical techniques. Topics treated include neutron generator operation and control, nuclear data, application of nuclear techniques in environment, geochemistry, medicine, biology, agriculture, material science and industries. General topics in nuclear laboratory organization and research experiences were also covered. The papers were fully discussed during the conference and authors were requested to make changes in the manuscripts where necessary. However, they were further edited. The organizing committee wishes to thank all authors for their presentation and cooperation in submitting their manuscripts promptly and the participants for their excellent contribution during the conference

  19. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    Moulin, D.

    1987-01-01

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  20. Gravimetric and titrimetric methods of analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    Gravimetric and titrimetric methods of analysis are considered. Methods of complexometric titration are mentioned, as well as methods of increasing sensitivity in titrimetry. Gravimetry and titrimetry are applied during analysis for traces of geological materials

  1. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity-the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth.
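
    The two metrics can be illustrated with toy pathway sets. The interpretation below (recall as the overlap between full-dataset and sub-dataset results, discrimination as one minus the overlap between results from unrelated experiments) is a simplified reading of the strategy, with made-up pathway names.

```python
# Toy illustration of the dual-metric idea using set overlaps.
def overlap_fraction(reference, other):
    """Fraction of the reference set that is recovered in the other set."""
    if not reference:
        return 0.0
    return len(set(reference) & set(other)) / len(set(reference))

# hypothetical significant-pathway lists returned by one PA method
full_dataset     = {"apoptosis", "p53 signaling", "cell cycle", "DNA repair"}
sub_dataset      = {"apoptosis", "p53 signaling", "cell cycle"}
other_experiment = {"oxidative phosphorylation", "cell cycle"}

recall = overlap_fraction(full_dataset, sub_dataset)
discrimination = 1.0 - overlap_fraction(full_dataset, other_experiment)
print(f"recall = {recall:.2f}, discrimination = {discrimination:.2f}")
```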

  2. Development of phased mission analysis program with Monte Carlo method. Improvement of the variance reduction technique with biasing towards top event

    International Nuclear Information System (INIS)

    Yang Jinan; Mihara, Takatsugu

    1998-12-01

    This report presents a variance reduction technique to estimate the reliability and availability of highly complex systems over a phased mission time using Monte Carlo simulation. In this study, we introduced a variance reduction technique based on the concept of the distance between the present system state and the cut set configurations. Using this technique, it becomes possible to bias the transition of components from operating states to failed states towards the closest cut set, so that a component failure drives the system towards a cut set configuration more effectively. JNC developed the PHAMMON (Phased Mission Analysis Program with Monte Carlo Method) code, which involved two kinds of variance reduction techniques: (1) forced transition, and (2) failure biasing. However, these techniques did not guarantee an effective reduction in variance. For further improvement, a variance reduction technique incorporating the distance concept was introduced into the PHAMMON code and numerical calculations were carried out for different design cases of the decay heat removal system in a large fast breeder reactor. Our results indicate that the addition of this distance-based technique is an effective means of further reducing the variance. (author)
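
    The effect of failure biasing can be seen in a much simpler setting than PHAMMON: estimating the probability that a redundant pair (the cut set) fails within the mission time, with biased failure rates and likelihood-ratio weights keeping the estimator unbiased. All rates below are illustrative.

```python
# Simplified failure-biasing sketch (not the PHAMMON implementation): sample failure
# times from biased exponentials and re-weight each sample with the likelihood ratio.
import numpy as np

rng = np.random.default_rng(2)
lam = np.array([1e-3, 1e-3])          # true failure rates (per hour), illustrative
lam_biased = np.array([5e-2, 5e-2])   # biased rates pushing the system toward the cut set
T = 100.0                             # mission time (hours)
n = 50_000

t = rng.exponential(1.0 / lam_biased, size=(n, 2))       # biased failure times
# likelihood ratio of the true density to the biasing density, per component
w = np.prod((lam / lam_biased) * np.exp(-(lam - lam_biased) * t), axis=1)
system_failed = np.all(t <= T, axis=1)                    # cut set: both components failed

p_is = np.mean(w * system_failed)
analytic = np.prod(1.0 - np.exp(-lam * T))                # exact answer for comparison
print(f"importance-sampling estimate: {p_is:.3e}, analytic: {analytic:.3e}")
```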

  3. Chemometrically assisted development and validation of LC-MS/MS method for the analysis of potential genotoxic impurities in meropenem active pharmaceutical ingredient.

    Science.gov (United States)

    Grigori, Katerina; Loukas, Yannis L; Malenović, Anđelija; Samara, Vicky; Kalaskani, Anastasia; Dimovasili, Efi; Kalovidouri, Magda; Dotsikas, Yannis

    2017-10-25

    A sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) method was developed and validated for the quantitative analysis of three potential genotoxic impurities (318BP, M9, S5) in meropenem Active Pharmaceutical Ingredient (API). Because LOD values in the ppb range were required, a high concentration of meropenem API (30 mg/mL) had to be injected. Therefore, efficient separation of meropenem from its impurities became a critical aim of this study, in order to divert meropenem to waste via a switching valve. After selection of the important factors affecting the analytes' elution, a Box-Behnken design was used to set the plan of experiments conducted with a UV detector. As responses, the separation factor s between the last-eluting impurity and meropenem, as well as the meropenem retention factor k, were used. A grid point search methodology was implemented to obtain the optimal conditions that simultaneously comply with the conflicting criteria. The optimal mobile phase consisted of ACN, methanol and 0.09% HCOOH at a ratio of 71/3.5/15.5 v/v/v. All impurities and the internal standard omeprazole were eluted before 7.5 min, and at 8.0 min the eluent was directed to waste. The protocol was transferred to LC-MS/MS and validated according to ICH guidelines. Copyright © 2017 Elsevier B.V. All rights reserved.
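
    The design-of-experiments step can be sketched generically: build a three-factor Box-Behnken design in coded units and fit a quadratic response-surface model by least squares. The simulated response below stands in for the measured separation and retention factors.

```python
# Generic Box-Behnken design and quadratic response-surface fit (synthetic response).
import itertools
import numpy as np

def box_behnken(k=3):
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * k] * 3          # three center points
    return np.array(runs, dtype=float)

X = box_behnken(3)                  # coded levels of, e.g., three mobile-phase factors

def quadratic_terms(X):
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]                                            # linear
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(X.shape[1]), 2)]    # interactions
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]                                       # squared
    return np.column_stack(cols)

rng = np.random.default_rng(3)
# synthetic response standing in for a measured separation factor
y = 2.0 + 0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 0] * X[:, 2] + rng.normal(0, 0.05, len(X))

beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print("fitted response-surface coefficients:", beta.round(2))
```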

  4. Development of evaluation method for heat removal design of dry storage facilities. Pt. 4. Numerical analysis on vault storage system of cross flow type

    International Nuclear Information System (INIS)

    Sakamoto, Kazuaki; Hattori, Yasuo; Koga, Tomonari; Wataru, Masumi

    1999-01-01

    On the basis of the results of the heat removal test on a vault storage system of the cross-flow type using a 1/5 scale model, an evaluation method for the heat removal design was established. It is composed of a numerical analysis of the convection of the air flow inside the whole facility and an analysis of the natural convection and the detailed turbulence mechanism near the surface of the storage tube. In the former analysis, the air temperature distribution in the storage area obtained by the calculation agreed within ±3 °C with the test result. Fine turbulence models were introduced in the latter analysis to predict the separation flow in the boundary layer near the surface of the storage tube and the buoyant flow generated by the heat from the storage tube. Furthermore, the heat removal characteristics of a designed full-scale storage facility, such as the flow pattern in the storage area and the temperature and heat transfer rate of the storage tubes, were evaluated using each of three methods: the established numerical analysis method, the experimental formula demonstrated in the heat removal test, and the conventional evaluation method applied in past heat removal designs. As a result, the safety margins and issues associated with the methods were identified, and measures to make the design more rational were proposed. (author)

  5. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    The report on the long-term evaluation of active faults was published by the Headquarters for Earthquake Research Promotion in November 2010. After the occurrence of the 2011 Tohoku-oki earthquake, the safety review guide with regard to the geology and ground of a site was revised by the Nuclear Safety Commission in March 2012 to incorporate scientific knowledge of the earthquake. The Nuclear Regulation Authority, established in September 2012, is newly planning the New Safety Design Standard related to Earthquakes and Tsunamis of Light Water Nuclear Power Reactor Facilities. With respect to those guides and standards, our investigations for developing methods of evaluating active faults are as follows: (1) For better evaluation of offshore fault activity, we proposed a workflow to date marine terraces (indicators of offshore fault activity) during the last 400,000 years. We also developed fault-related fold analysis for the evaluation of blind faults. (2) To clarify the activity of faults without overlying strata, we carried out color analysis of fault gouge and classified the activities into thousands of years and tens of thousands of years. (3) To reduce uncertainties in fault activities and earthquake frequencies, we compiled the survey data and possible errors. (4) For improving seismic hazard analysis, we compiled the fault activities of the Yunotake and Itozawa faults, induced by the 2011 Tohoku-oki earthquake. (author)

  6. Shielding methods development in the United States

    International Nuclear Information System (INIS)

    Mynatt, F.R.

    1977-01-01

    A generalized shielding methodology has been developed in the U.S.A. that is adaptable to the shielding analyses of all reactor types. Thus far used primarily for liquid-metal fast breeder reactors, the methodology includes several component activities: (1) developing methods for calculating radiation transport through reactor-shield systems; (2) processing cross-section libraries; (3) performing design calculations for specific systems; (4) performing and analyzing pertinent integral experiments; (5) performing sensitivity studies on both the design calculations and the experimental analyses; and, finally, (6) calculating shield design parameters and their uncertainties. The criteria for the methodology are a 5 to 10 percent accuracy for responses at locations near the core and a factor of 2 accuracy for responses at distant locations. The methodology has been successfully adapted to most in-vessel and ex-vessel problems encountered in the shield analyses of the Fast Flux Test Facility and the Clinch River Breeder Reactor; however, improved techniques are needed for calculating regions in which radiation streaming is dominant. Areas of the methodology in which significant progress has recently been made are those involving the development of cross-section libraries, sensitivity analysis methods, and transport codes

  7. Reactor physics methods development at Westinghouse

    International Nuclear Information System (INIS)

    Mueller, E.; Mayhue, L.; Zhang, B.

    2007-01-01

    The current state of reactor physics methods development at Westinghouse is discussed. The focus is on the methods that have been or are under development within the NEXUS project which was launched a few years ago. The aim of this project is to merge and modernize the methods employed in the PWR and BWR steady-state reactor physics codes of Westinghouse. (author)

  8. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  9. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  10. Development and Optimization of Voltammetric Methods for Real Time Analysis of Electrorefiner Salt with High Concentrations of Actinides and Fission Products

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Michael F.; Phongikaroon, Supathorn; Zhang, Jinsuo

    2018-03-30

    This project addresses the problem of achieving accurate material control and accountability (MC&A) around pyroprocessing electrorefiner systems. Spent nuclear fuel pyroprocessing poses a unique challenge with respect to reprocessing technology in that the fuel is never fully dissolved in the process fluid. In this case, the process fluid is molten, anhydrous LiCl-KCl salt. Therefore, there is no traditional input accountability tank. However, electrorefiners (ER) accumulate very large quantities of fissile nuclear material (including plutonium) and should be well safeguarded in a commercial facility. Idaho National Laboratory (INL) currently operates a pyroprocessing facility for treatment of spent fuel from Experimental Breeder Reactor-II with two such ER systems. INL implements MC&A via a mass tracking model in combination with periodic sampling of the salt and other materials followed by destructive analysis. This approach is projected to be insufficient to meet international safeguards timeliness requirements. A real time or near real time monitoring method is, thus, direly needed to support commercialization of pyroprocessing. A variety of approaches to achieving real time monitoring for ER salt have been proposed and studied to date—including a potentiometric actinide sensor for concentration measurements, a double bubbler for salt depth and density measurements, and laser induced breakdown spectroscopy (LIBS) for concentration measurements. While each of these methods shows some promise, each also involves substantial technical complexity that may ultimately limit their implementation. Yet another alternative is voltammetry—a very simple method in theory that has previously been tested for this application to a limited extent. The equipment for a voltammetry system consists of off-the-shelf components (three electrodes and a potentiostat), which results in substantial benefits relative to cost and robustness. Based on prior knowledge of electrochemical
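
    The abstract stops short of the electrochemical details, but the core of any voltammetric concentration measurement is the proportionality between peak current and analyte concentration. The sketch below evaluates the textbook Randles-Sevcik expression for a reversible couple; the electrode area, diffusion coefficient, scan rate and temperature are illustrative placeholders, not parameters of the INL electrorefiner system.

```python
import math

F = 96485.332      # Faraday constant, C/mol
R = 8.314462       # gas constant, J/(mol K)

def randles_sevcik_peak_current(n, area_cm2, conc_mol_per_cm3, d_cm2_s, scan_v_s, temp_k):
    """Peak current (A) for a reversible redox couple (Randles-Sevcik equation)."""
    return 0.4463 * n * F * area_cm2 * conc_mol_per_cm3 * math.sqrt(
        n * F * scan_v_s * d_cm2_s / (R * temp_k))

# Hypothetical values for an actinide couple in LiCl-KCl at 500 deg C (illustrative only)
i_p = randles_sevcik_peak_current(
    n=3, area_cm2=0.5, conc_mol_per_cm3=5e-5,   # ~0.05 mol/L
    d_cm2_s=1e-5, scan_v_s=0.1, temp_k=773.15)
print(f"predicted peak current ~ {i_p * 1000:.1f} mA")
```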

  11. Machine Learning Methods for Production Cases Analysis

    Science.gov (United States)

    Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.

    2018-03-01

    An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing of the applied models. The k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
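
    The abstract names the classifiers and the evaluation metrics but gives no implementation; a minimal sketch of that workflow with scikit-learn is shown below, using synthetic data in place of the internal production-network descriptors, which are not public.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for descriptors of production events (hazard vs. normal)
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("Random Forest", RandomForestClassifier(n_estimators=200,
                                                           random_state=0))]:
    y_pred = clf.fit(X_train, y_train).predict(X_test)
    print(f"{name:13s} accuracy={accuracy_score(y_test, y_pred):.3f} "
          f"precision={precision_score(y_test, y_pred):.3f} "
          f"recall={recall_score(y_test, y_pred):.3f}")
```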

  12. IoT System Development Methods

    NARCIS (Netherlands)

    Giray, G.; Tekinerdogan, B.; Tüzün, E.

    2018-01-01

    It is generally believed that the application of methods plays an important role in developing quality systems. A development method is mainly necessary for structuring the process of producing large-scale and complex systems that involve high costs. Similar to the development of other systems, it is

  13. Development of a versatile sample preparation method and its application for rare-earth pattern and Nd isotope ratio analysis in nuclear forensics

    International Nuclear Information System (INIS)

    Krajko, J.

    2015-01-01

    An improved sample preparation procedure for trace levels of lanthanides in uranium-bearing samples was developed. The method involves a simple co-precipitation using an Fe(III) carrier in ammonium carbonate medium to remove the uranium matrix. The procedure is an effective initial pre-concentration step for the subsequent extraction chromatographic separations. The applicability of the method was demonstrated by the measurement of the REE pattern and the 143Nd/144Nd isotope ratio in uranium ore concentrate samples. (author)

  14. Development of Dissolution Test Method for Drotaverine ...

    African Journals Online (AJOL)

    Development of Dissolution Test Method for Drotaverine ... Methods: Sink conditions, drug stability and specificity in different dissolution media were tested to optimize a dissolution test .... test by Prism 4.0 software, and differences between ...

  15. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-03

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSD ...); the analysis of furfurals will contribute to characterising and quantifying their presence in the human diet.

  16. Moral counselling: a method in development.

    Science.gov (United States)

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  17. Analysis methods (from 301 to 351)

    International Nuclear Information System (INIS)

    Analysis methods of materials used in the nuclear field (uranium, plutonium and their compounds, zirconium, magnesium, water...) and determination of impurities. Only reliable methods are selected [fr

  18. Optimization and development of the instrumental parameters for a method of multielemental analysis through atomic emission spectroscopy, for the determination of Mg, Fe, Mn and Cr

    International Nuclear Information System (INIS)

    Lanzoni Vindas, E.

    1998-01-01

    This study optimized the instrumental parameters of a method of multielemental (sequential) analysis by atomic emission for the determination of Mg, Fe, Mn and Cr. A two-level factorial design and the simplex optimization method were used, which permitted the determination of the four cations under the same instrumental conditions. The author studied an analytical system in which the relationship between instrumental response and concentration was not linear, requiring the calibration curves to be fitted under both homoscedastic and heteroscedastic conditions. (S. Grainger)
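
    The simplex optimization referred to here is commonly implemented as the Nelder-Mead algorithm; the sketch below shows how instrumental settings could be tuned against a measured response with SciPy. The response function and the parameter names (nebulizer flow, viewing height) are hypothetical stand-ins, not the conditions reported by the author.

```python
import numpy as np
from scipy.optimize import minimize

def negative_signal_to_background(params):
    """Hypothetical smooth response surface standing in for the measured
    signal-to-background ratio of the emission lines (to be maximized)."""
    flow, height = params          # nebulizer flow (L/min), viewing height (mm)
    return -np.exp(-((flow - 0.85) ** 2 / 0.02 + (height - 12.0) ** 2 / 8.0))

result = minimize(negative_signal_to_background,
                  x0=[0.6, 8.0],              # starting point of the simplex
                  method="Nelder-Mead")
print("optimum settings:", result.x, "response:", -result.fun)
```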

  19. PIXE - a new method for elemental analysis

    International Nuclear Information System (INIS)

    Johansson, S.A.E.

    1983-01-01

    By elemental analysis we mean the determination of which chemical elements are present in a sample and of their concentration. This is an old and important problem in chemistry. The earliest methods were purely chemical, and many such methods are still used. However, various methods based on physical principles have gradually become more and more important. One such method is neutron activation. When the sample is bombarded with neutrons it becomes radioactive, and the various radioactive isotopes produced can be identified by the radiation they emit. From the measured intensity of the radiation one can calculate how much of a certain element is present in the sample. Another possibility is to study the light emitted when the sample is excited in various ways. A spectroscopic investigation of the light can identify the chemical elements and also allows a determination of their concentration in the sample. In the same way, if a sample can be brought to emit X-rays, this radiation is also characteristic of the elements present and can be used to determine the elemental concentrations. One such X-ray method which has been developed recently is PIXE. The name is an acronym for Particle Induced X-ray Emission and indicates the principle of the method. Particles in this context means heavy, charged particles such as protons and α-particles of rather high energy. Hence, in PIXE analysis the sample is irradiated in the beam of an accelerator and the emitted X-rays are studied. (author)

  20. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  1. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Contents: Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  2. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  3. Substoichiometric method in the simple radiometric analysis

    International Nuclear Information System (INIS)

    Ikeda, N.; Noguchi, K.

    1979-01-01

    The substoichiometric method is applied to simple radiometric analysis. Two methods - the standard reagent method and the standard sample method - are proposed. The validity of the principle of the methods is verified experimentally in the determination of silver by the precipitation method, or of zinc by the ion-exchange or solvent-extraction method. The proposed methods are simple and rapid compared with the conventional superstoichiometric method. (author)

  4. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes a methodology for developing new test methods and forming solutions for their development. The basis of the methodology is individual elements of the system and process approaches, which contribute to the development of an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and their mutual influence; this makes it possible to solve the assigned tasks and achieve the stated goal. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method on which the solution-forming model is based is also considered. The methodology provides for recording a model of a new test method as a finite set of objects that represent characteristics significant for the test method; a causal relationship is then established between the objects, and the values of the fitness indicators, the observability of the method and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
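
    Since the paper builds its decision model on fuzzy cognitive maps, a minimal sketch of the standard FCM state iteration is given below. The concept names and the weight matrix are invented for illustration only; they are not taken from the paper.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Hypothetical concepts: [fitness indicator, observability, metrological tolerance, test quality]
W = np.array([[0.0, 0.4, 0.0, 0.6],    # W[i, j]: influence of concept i on concept j
              [0.0, 0.0, 0.3, 0.5],
              [0.0, 0.0, 0.0, 0.4],
              [0.0, 0.0, 0.0, 0.0]])

state = np.array([0.7, 0.5, 0.6, 0.0])          # initial activation levels
for _ in range(30):                              # iterate towards a quasi-steady state
    state = sigmoid(state + state @ W)           # standard FCM update rule
print("steady-state activations:", np.round(state, 3))
```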

  5. Further developments in the study of harmonic analysis by the correlation and spectral density methods, and its application to the adult rabbit EEG

    International Nuclear Information System (INIS)

    Meilleurat, Michele

    1973-07-01

    The application of harmonic analysis to the spontaneous electrical activity of the brain has been studied theoretically and practically in 30 adult rabbits chronically implanted with electrodes. Theoretically, an accurate energetic study of the signal can only be achieved by calculating the autocorrelation function and its Fourier transform, the power density spectrum. Secondly, a comparative study was made of the analogue methods, using analogue or hybrid devices, and the digital method, with an analysis and computing program (the sampling rate, the delay, the period of integration and the problems raised by the amplification of the biological signals and by sampling). Data handling is discussed, the approach mainly comprising the study of the variance, the calculation of the total energy carried by the signal and of the energies carried in each frequency band ΔF, their percentage of the total energy, and the relationships of these various values for various electroencephalographic states. Experimentally, the general aspect of the spontaneous electrical activity of the dorsal hippocampus and the visual cortex during variations of vigilance is accurately described by the calculation of the variance and by the study of the position, on the frequency axis, of the maxima of the peaks of the power density spectra, as well as by the calculation of the energies carried in the 0-4, 4-8 and 8-12 Hz frequency bands. With the same theoretical bases, both the analogue and digital methods lead to similar results, the former being easier to operate, the latter more accurate. (author) [fr
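
    As a modern digital counterpart to the analysis described, the sketch below estimates the power density spectrum of a sampled signal (using Welch's periodogram rather than the Fourier transform of the autocorrelation function, though the two estimates agree in expectation) and the fractions of total power in the 0-4, 4-8 and 8-12 Hz bands; the synthetic signal and the sampling rate are placeholders for the rabbit EEG recordings.

```python
import numpy as np
from scipy.signal import welch

fs = 128.0                       # sampling rate, Hz (placeholder)
t = np.arange(0, 30, 1 / fs)     # 30 s epoch
# Synthetic "EEG": theta-dominant rhythm plus an alpha component and noise
x = 20 * np.sin(2 * np.pi * 6.0 * t) + 8 * np.sin(2 * np.pi * 10.0 * t) \
    + 5 * np.random.default_rng(0).normal(size=t.size)

variance = np.var(x)                       # total-energy measure used in the study
freqs, psd = welch(x, fs=fs, nperseg=512)  # power density spectrum

def band_power(f_lo, f_hi):
    sel = (freqs >= f_lo) & (freqs < f_hi)
    return np.trapz(psd[sel], freqs[sel])

total = np.trapz(psd, freqs)
for lo, hi in [(0, 4), (4, 8), (8, 12)]:
    print(f"{lo}-{hi} Hz: {100 * band_power(lo, hi) / total:.1f}% of total power")
print("variance:", round(variance, 1))
```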

  6. Development and validation of confirmatory method for analysis of nitrofuran metabolites in milk, honey, poultry meat and fish by liquid chromatography-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Fatih Alkan

    2016-03-01

    Full Text Available In this study we have developed and validated a confirmatory analysis method for nitrofuran metabolites, in accordance with European Commission Decision 2002/657/EC requirements. Nitrofuran metabolites in honey, milk, poultry meat and fish samples were hydrolysed under acidic conditions, derivatised with nitrobenzaldehyde and extracted by liquid-liquid extraction with ethyl acetate. The quantitative and confirmative determination of nitrofuran metabolites was performed by liquid chromatography/electrospray ionisation tandem mass spectrometry (LC/ESI-MS/MS) in the positive ion mode. In-house method validation was performed and the validation data (specificity, linearity, recovery, CCα and CCβ) are reported. The advantage of this method is that it avoids clean-up by Solid-Phase Extraction (SPE). Furthermore, low levels of nitrofuran metabolites are detectable and quantitatively confirmed at a rapid rate in all samples.

  7. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics – or researchers seeking a no-nonsense approach –, it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singular perturbed reaction advection diffusion equations in one and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  8. Development of Analytical Method for Detection of Some ...

    African Journals Online (AJOL)

    All rights reserved. ... 3Centre for Water Research and Analysis (ALIR), Faculty of Science and Technology, Universiti Kebangsaan (UKM), ... Purpose: To develop and validate a simple method using solid – phase extraction along with liquid.

  9. The SIESTA method; developments and applicability

    International Nuclear Information System (INIS)

    Artacho, Emilio; Anglada, E; Dieguez, O; Gale, J D; Garcia, A; Junquera, J; Martin, R M; Ordejon, P; Pruneda, J M; Sanchez-Portal, D; Soler, J M

    2008-01-01

    Recent developments in and around the SIESTA method of first-principles simulation of condensed matter are described and reviewed, with emphasis on (i) the applicability of the method to large and varied systems, (ii) efficient basis sets for the standards of accuracy of density-functional methods, (iii) new implementations, and (iv) extensions beyond ground-state calculations

  10. Some problems on Monte Carlo method development

    International Nuclear Information System (INIS)

    Pei Lucheng

    1992-01-01

    This is a short paper on some problems of Monte Carlo method development. The content consists of deep-penetration problems, unbounded estimate problems, limitations of Metropolis' method, the dependency problem in Metropolis' method, random error interference problems and random equations, and intellectualisation and vectorization problems of general software

  11. Developing methods of controlling quality costs

    OpenAIRE

    Gorbunova A. V.; Maximova O. N.; Ekova V. A.

    2017-01-01

    The article examines issues of managing quality costs, problems of applying economic methods of quality control, implementation of progressive methods of quality costs management in enterprises with the view of improving the efficiency of their evaluation and analysis. With the aim of increasing the effectiveness of the cost management mechanism, authors introduce controlling as a tool of deviation analysis from the standpoint of the process approach. A list of processes and corresponding eva...

  12. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    .... Such methods include those based upon spectrophotometry, flame emission spectrometry and atomic absorption spectroscopy, as well as gravimetry, titrimetry and the use of ion-selective electrodes...

  13. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort, and testing of two candidate methods are the subjects of this paper

  14. Development of a chromatographic separation method hyphenated to electro-spray ionization mass spectrometry (ESI-MS) and inductively coupled plasma mass spectrometry (ICP-MS): application to the lanthanides speciation analysis

    International Nuclear Information System (INIS)

    Beuvier, Ludovic

    2015-01-01

    This work focuses on the development of a chromatographic separation method coupled to both ESI-MS and ICP-MS in order to achieve comprehensive speciation analysis of lanthanides in aqueous phases representative of the back-extraction phases of advanced spent nuclear fuel treatment processes. This analytical method allowed the separation, characterization and quantitation of lanthanide complexes with poly-aminocarboxylic ligands, such as DTPA and EDTA, used as complexing agents in these processes. A HILIC separation method for lanthanide complexes was developed with an amide-bonded stationary phase. A screening of a wide range of mobile phase compositions demonstrated that the adsorption mechanism was predominant; this screening also yielded optimized separation conditions. Faster analysis conditions with a shorter amide column packed with sub-2 μm particles reduced the analysis time by a factor of 2.5 and the solvent consumption by 25%. Isotopic and structural characterization by HILIC ESI-MS was performed, as well as the development of an external calibration quantitation method, whose analytical performance was determined. Finally, the coupling of HILIC to ESI-MS and ICP-MS was achieved, and a simultaneous quantitation method by ESI-MS and ICP-MS was implemented to determine the quantitative distribution of the species in solution; the analytical performance of this quantitation method was also determined. (author) [fr

  15. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and libraries have to know the profiles of their patrons in order to fulfil such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these analyses. Our research is a first step towards a future in which library marketing is an indispensable tool.

  16. Development of an efficient fungal DNA extraction method to be used in random amplified polymorphic DNA-PCR analysis to differentiate cyclopiazonic acid mold producers.

    Science.gov (United States)

    Sánchez, Beatriz; Rodríguez, Mar; Casado, Eva M; Martín, Alberto; Córdoba, Juan J

    2008-12-01

    A variety of previously established mechanical and chemical treatments to achieve fungal cell lysis, combined with a semiautomatic system operated by a vacuum pump, were tested to obtain DNA extracts to be used directly in randomly amplified polymorphic DNA (RAPD)-PCR to differentiate cyclopiazonic acid-producing and -nonproducing mold strains. A DNA extraction method that includes digestion with proteinase K and lyticase prior to grinding with a mortar and pestle and use of a semiautomatic vacuum system yielded DNA of high quality in all the fungal strains and species tested, at concentrations ranging from 17 to 89 ng/microl in 150 microl of the final DNA extract. Two microliters of DNA extracted with this method was used directly for RAPD-PCR using primer (GACA)4. Reproducible RAPD fingerprints showing large differences between producer and nonproducer strains were observed. These differences in the RAPD patterns did not group all the strains tested into clusters by cyclopiazonic acid production, but may be very useful for distinguishing cyclopiazonic acid producer strains from nonproducer strains by a simple RAPD analysis. Thus, the DNA extracts obtained could be used directly, without previous purification and quantification, for RAPD analysis to differentiate cyclopiazonic acid producer from nonproducer mold strains. This combined analysis could be adaptable to other toxigenic fungal species to enable differentiation of toxigenic and non-toxigenic molds, a procedure of great interest in food safety.

  17. Development of a Radial Deconsolidation Method

    Energy Technology Data Exchange (ETDEWEB)

    Helmreich, Grant W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Montgomery, Fred C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hunn, John D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  18. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    International Nuclear Information System (INIS)

    Nakanaga, Kazue; Maeda Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko

    1999-01-01

    This study aimed to develop a specific method for the detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. The mRNA of the 65 kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by the RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled anti-sense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5; the anti-sense RNA was synthesized from template DNA containing about 580 bp (positions 194 to 762) of the hsp65 gene. Compared with the NBH method, the amount of probe required for detection by the RPA method was 1/30 or less, and the detection sensitivity of RPA was also 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, radioactive decay of the label in the probe used for the RPA method might affect the results; therefore, 33P or 35P, whose decay energy is lower than that of 32P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective method for detecting these mRNAs, but it seems necessary to further improve the detection sensitivity for small amounts of test material. (M.N.)

  19. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    Energy Technology Data Exchange (ETDEWEB)

    Nakanaga, Kazue; Maeda Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko [National Inst. of Infectious Diseases, Tokyo (Japan)

    1999-02-01

    This study aimed to develop a specific method for the detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. The mRNA of the 65 kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by the RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled anti-sense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5; the anti-sense RNA was synthesized from template DNA containing about 580 bp (positions 194 to 762) of the hsp65 gene. Compared with the NBH method, the amount of probe required for detection by the RPA method was 1/30 or less, and the detection sensitivity of RPA was also 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, radioactive decay of the label in the probe used for the RPA method might affect the results; therefore, 33P or 35P, whose decay energy is lower than that of 32P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective method for detecting these mRNAs, but it seems necessary to further improve the detection sensitivity for small amounts of test material. (M.N.)

  20. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    Bruin, M. de.

    1983-01-01

    This thesis describes the way in which at IRI instrumental neutron activation analysis (INAA) has been developed into an automated system for routine analysis. The basis of this work are 20 publications describing the development of INAA since 1968. (Auth.)

  1. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  2. Report of Research Cooperation Sub-Committee 46 on research and development of methods for inelastic (EPICC: Elastic-PlastIC-Creep) structural analysis

    International Nuclear Information System (INIS)

    Yamada, Yoshiaki

    1977-05-01

    This report succeeds the preceding one on ''Verification and Qualification of Nonlinear Structural Analysis Computer Program''. PNC (Power Reactor and Nuclear Fuel Development Corporation) decided to sponsor an extended research project on inelastic structural analysis for a period spanning September 1976 to May 1978. Responding to the PNC proposal, RC Sub-Committee 46 was formed in the Japan Society of Mechanical Engineers and began the cooperative work in October 1976. Besides the verification and/or qualification of available general-purpose computer programs, which was the major objective of the previous contract, the Committee carried out research on topics categorized into the following three fields of interest: 1. material data for use in inelastic analysis; 2. inelastic analysis procedures and computer program verification; 3. design codes and processing of computer solutions. This report summarizes the efforts during the first year of the Sub-Committee and consists of three parts, each corresponding to one of the research topics stated above: Part I, Inelastic constitutive equations for materials under high-temperature service conditions; Part II, EPICC standard benchmark test problem and solutions; Part III, Examination of postprocessors and development. Although the research is still at an intermediate stage, the main activities under way are: 1. evaluative review and nationwide collection of material data, and recommendation of tentative constitutive equations for elastic-plastic and creep analyses of the benchmark test problem; 2. revision and augmentation of the EPICC standard benchmark test problem and competitive and/or cooperative execution of solutions; 3. review of existing prototypical postprocessors, and development of a processor for piping design. (author)

  3. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera, the most modern types of which include a frame-grabber converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits, and eight bits are summarised in one byte; grey values can therefore take 256 (2^8) different values, between 0 and 255. The human eye seems to be quite content with a display of 5-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination over the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image, represented by its grey values, can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image; however, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
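
    A minimal NumPy sketch of the two corrections described, shading correction by division with a background (white) image and contrast stretching of the grey-value histogram to the full 8-bit range, is given below; the images are synthetic placeholders rather than microscope data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 8-bit image with non-uniform illumination and noise
yy, xx = np.mgrid[0:256, 0:256]
illumination = 0.6 + 0.4 * xx / 255.0                 # brighter towards the right
scene = np.where((xx - 128) ** 2 + (yy - 128) ** 2 < 60 ** 2, 90.0, 170.0)
image = scene * illumination + rng.normal(0, 4, scene.shape)
background = 200.0 * illumination                      # "white" reference image

# Shading correction: divide the original image by the background image, pixel per pixel
corrected = image / background

# Contrast stretch: spread the grey-value histogram over the full 0-255 range
lo, hi = corrected.min(), corrected.max()
stretched = np.clip(255 * (corrected - lo) / (hi - lo), 0, 255).astype(np.uint8)

print("grey-value range after stretching:", stretched.min(), "-", stretched.max())
```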

  4. Development and validation of a GC-C-IRMS method for the confirmation analysis of pseudo-endogenous glucocorticoids in doping control.

    Science.gov (United States)

    de la Torre, Xavier; Curcio, Davide; Colamonici, Cristiana; Molaioni, Francesco; Cilia, Marta; Botrè, Francesco

    2015-01-01

    Glucocorticoids are included in section S9 of the World Anti-Doping Agency (WADA) Prohibited List International Standard. Some of them are pseudo-endogenous steroids, like cortisol and cortisone, which have the same chemical structure as endogenously produced steroids. We propose an analytical method based on gas chromatography coupled to isotope ratio mass spectrometry (GC-C-IRMS) which allows discrimination between the endogenous and synthetic origin of the urinary metabolites of the pseudo-endogenous glucocorticoids. A preliminary purification of the target compounds (TC) (i.e., cortisol, tetrahydrocortisone (THE), 5α-tetrahydrocortisone (aTHE), tetrahydrocortisol (THF), and 5α-tetrahydrocortisol (aTHF)) by high-performance liquid chromatography (HPLC) allows collection of extracts with adequate purity for the subsequent analysis by IRMS. A population of 40 urine samples was analyzed for the TCs and for the endogenous reference compounds (ERC: i.e., 11-desoxy-tetrahydrocortisol (THS) or pregnanediol). For each sample, the differences between the delta values of the ERCs and TCs (Δδ values) were calculated, and on this basis decision limits for atypical findings are proposed. The limits are below 3‰ units except for cortisol. The fitness for purpose of the method has been confirmed by the analysis of urine samples collected from two patients under treatment with 25 mg of cortisone acetate (p.o.). The samples showed Δδ values higher than 3 for at least 24 h following administration, depending on the TC considered. The method can easily be integrated into existing procedures already used for the HPLC purification and IRMS analysis of pseudo-endogenous steroids with androgenic/anabolic activity. Copyright © 2015 John Wiley & Sons, Ltd.
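
    A small numerical illustration of the reported decision rule, with assumed δ13C values rather than measurement data from the paper: the Δδ value is the difference between the δ13C of the endogenous reference compound and that of the target compound, and values above a decision limit of roughly 3‰ are flagged as atypical.

```python
# Hypothetical delta-13C values in per mil (not data from the study)
delta_erc = -21.5          # endogenous reference compound, e.g. pregnanediol
targets = {"THE": -22.3, "aTHE": -24.9, "THF": -26.1}

DECISION_LIMIT = 3.0       # per mil, order of magnitude quoted in the abstract

for name, delta_tc in targets.items():
    delta_delta = delta_erc - delta_tc
    verdict = "atypical" if delta_delta > DECISION_LIMIT else "consistent with endogenous"
    print(f"{name}: ddelta = {delta_delta:+.1f} per mil -> {verdict}")
```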

  5. Development of a method to determine the specific environment as a starting point for the strategic analysis and the approach to competitors' Knowledge: presentation and applications

    Directory of Open Access Journals (Sweden)

    Emilio García Vega

    2015-09-01

    Full Text Available The determination of the specific environment is important for the formulation of efficient enterprise strategies on the basis of a properly focused strategic analysis. This paper proposes a method to help delimit and identify it. It aims to offer a simple and practical tool that allows a more accurate identification of the industry to be analysed, as well as a clearer specification of the direct and substitute competition. With this tool, those managing a business idea, whether in a new or an established organization, will have an approach to these themes, which are of strategic importance in any type of management. Likewise, two applications of the proposed method are presented: the first oriented to a business idea and the second to supermarkets with a strong service component in Lima, Peru.

  6. Development of a photometric measuring method for soot analysis in flames. Final report; Entwicklung eines photometrischen Messverfahrens zur Russanalyse in Flammen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Weichert, R.; Niemann, J.

    1995-12-31

    The photometric measuring method developed for soot analysis in flames meets the following specifications: determination of the volume concentration of soot particles from 2 x 10{sup -7} upwards by means of extinction measurements at three different wavelengths; determination of the particle size distribution of the soot particles by means of nephelometry in the range between 20 and 400 nm; contactless measurement on the particle collective in the flame; and no need for calibration of the photometric measuring method with particles of known size and concentration. (orig./SR)
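
    A minimal sketch of the extinction step described: in the Rayleigh (small-particle) limit, the soot volume fraction follows from the measured transmittance at each wavelength via Beer-Lambert absorption. The path length, refractive-index function E(m) and transmittance values below are illustrative assumptions, not results of the project.

```python
import math

def soot_volume_fraction(transmittance, wavelength_m, path_m, e_m=0.26):
    """Rayleigh-limit light-extinction estimate: f_v = -ln(tau) * lambda / (6*pi*E(m)*L)."""
    return -math.log(transmittance) * wavelength_m / (6 * math.pi * e_m * path_m)

path = 0.05                                   # optical path through the flame, m (assumed)
for wl_nm, tau in [(450, 0.62), (633, 0.71), (1064, 0.82)]:   # assumed transmittances
    fv = soot_volume_fraction(tau, wl_nm * 1e-9, path)
    print(f"lambda = {wl_nm} nm: f_v ~ {fv:.2e}")
```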

  7. Development and validation of bioanalytical UHPLC-UV method for simultaneous analysis of unchanged fenofibrate and its metabolite fenofibric acid in rat plasma: Application to pharmacokinetics

    Directory of Open Access Journals (Sweden)

    Rayan G. Alamri

    2017-01-01

    Full Text Available A simple, precise, selective and fast ultra-high performance liquid chromatography (UHPLC-UV) method has been developed and validated for the simultaneous determination of the lipid-regulating agent fenofibrate and its metabolite fenofibric acid in rat plasma. The chromatographic separation was carried out on a reversed-phase Acquity® BEH C18 column using methanol–water (65:35, v/v) as the mobile phase. The isocratic flow was 0.3 ml/min with a rapid run time of 2.5 min, and UV detection was at 284 nm. The method was validated over a concentration range of 100–10000 ng/ml (r2 ⩾ 0.9993). The selectivity, specificity, recovery, accuracy and precision were validated for the determination of fenofibrate/fenofibric acid in rat plasma. The lower limits of detection and quantitation of the method were 30 and 90 ng/ml for fenofibrate and 40 and 100 ng/ml for fenofibric acid, respectively. The within- and between-day coefficients of variation were less than 5%. The validated method has been successfully applied to measure plasma concentrations in a pharmacokinetic study of fenofibrate in an animal model, illustrating the scope and application of the method.
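
    A minimal sketch of the calibration arithmetic behind such a validation: a least-squares line over the working range and ICH-style detection and quantitation limits (3.3σ/S and 10σ/S). The peak-area values are invented placeholders; only the concentration range matches the abstract.

```python
import numpy as np

# Calibration standards over the validated range (ng/ml) with hypothetical peak areas
conc = np.array([100, 250, 500, 1000, 2500, 5000, 10000], dtype=float)
area = np.array([1.02, 2.55, 5.08, 10.2, 25.3, 50.9, 101.5])   # arbitrary units

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

sigma = np.std(area - pred, ddof=2)        # residual standard deviation
lod = 3.3 * sigma / slope                  # limit of detection, ng/ml
loq = 10.0 * sigma / slope                 # limit of quantitation, ng/ml
print(f"r2 = {r2:.4f}, LOD ~ {lod:.0f} ng/ml, LOQ ~ {loq:.0f} ng/ml")
```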

  8. A development of the direct Lyapunov method for the analysis of transient stability of a system of synchronous generators based on the determination of non- stable equilibria on a multidimensional sphere

    Directory of Open Access Journals (Sweden)

    A. V. Stepanov

    2014-01-01

    Full Text Available We consider the problem of transient stability analysis for a system of synchronous generators under the action of strong perturbations. The aim of our work is to develop methods for analyzing the transient stability of a system of synchronous generators that give trustworthy results on the transient stability reserve under different perturbations. For the analysis of transient stability we use the direct Lyapunov method. One problem in applying this method is to find a Lyapunov function that reflects well the properties of a parallel system of synchronous generators; the most reliable results were obtained when the analysis of transient stability was performed with a Lyapunov function of energy type. Another problem in applying the direct Lyapunov method is to determine the critical value of the Lyapunov function, which requires finding the non-stable equilibria of the system. Determination of the non-stable equilibria requires studying the Lyapunov function in a multidimensional space in a neighborhood of a stable equilibrium of the post-breakdown system; this is a complicated non-linear problem. In the paper, we propose a method for the determination of non-stable equilibria on a multidimensional sphere. The method is based on a search for a minimum of the Lyapunov function on a multidimensional sphere whose center is a stable equilibrium. Compared with other methods, e.g. gradient methods, our method allows reliable determination of a non-stable equilibrium and calculation of the critical value. The reliability of our method is proved by numerical experiments. The developed methods and a program realized in a MATLAB package can be recommended for the design of a post-breakdown control system of synchronous generators or as a
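
    The paper's Lyapunov function is not given in the abstract, so the sketch below only illustrates the geometric idea: search for the minimum of a (here invented, energy-like) function on a sphere of given radius centred at the stable equilibrium, using a constrained SciPy minimizer with several random starts. The function, dimension and radius are placeholders, and the paper's own algorithm is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def lyapunov_like(x):
    """Stand-in for an energy-type Lyapunov function of the rotor angles/speeds."""
    return 0.5 * x @ np.diag([1.0, 2.0, 4.0]) @ x + 0.3 * np.sin(x[0]) * x[1]

x_stable = np.zeros(3)          # stable post-fault equilibrium (placeholder)
radius = 1.5                    # radius of the search sphere around it
on_sphere = {"type": "eq",
             "fun": lambda x: np.linalg.norm(x - x_stable) - radius}

best = None
for seed in range(8):           # several random starts to avoid poor local minima
    direction = np.random.default_rng(seed).normal(size=3)
    x0 = x_stable + radius * direction / np.linalg.norm(direction)
    res = minimize(lyapunov_like, x0, method="SLSQP", constraints=[on_sphere])
    if res.success and (best is None or res.fun < best.fun):
        best = res
print("approximate critical value of V on the sphere:", round(best.fun, 4))
```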

  9. Prognostic aspects of imaging method development

    International Nuclear Information System (INIS)

    Steinhart, L.

    1987-01-01

    A survey is presented of X-ray diagnostic methods and techniques and possibilities of their further development. Promising methods include direct imaging using digital radiography. In connection with computer technology these methods achieve higher resolution. The storage of obtained images in the computer memory will allow automated processing and evaluation and the use of expert systems. Development is expected to take place especially in computerized tomography using magnetic resonance, and positron computed tomography and other non-radioactive diagnostic methods. (J.B.). 5 figs., 1 tab., 1 ref

  10. Development of a spatial analysis method using ground-based repeat photography to detect changes in the alpine treeline ecotone, Glacier National Park, Montana, U.S.A.

    Science.gov (United States)

    Roush, W.; Munroe, Jeffrey S.; Fagre, D.B.

    2007-01-01

    Repeat photography is a powerful tool for detection of landscape change over decadal timescales. Here a novel method is presented that applies spatial analysis software to digital photo-pairs, allowing vegetation change to be categorized and quantified. This method is applied to 12 sites within the alpine treeline ecotone of Glacier National Park, Montana, and is used to examine vegetation changes over timescales ranging from 71 to 93 years. Tree cover at the treeline ecotone increased in 10 out of the 12 photo-pairs (mean increase of 60%). Establishment occurred at all sites, infilling occurred at 11 sites. To demonstrate the utility of this method, patterns of tree establishment at treeline are described and the possible causes of changes within the treeline ecotone are discussed. Local factors undoubtedly affect the magnitude and type of the observed changes, however the ubiquity of the increase in tree cover implies a common forcing mechanism. Mean minimum summer temperatures have increased by 1.5°C over the past century and, coupled with variations in the amount of early spring snow water equivalent, likely account for much of the increase in tree cover at the treeline ecotone. Lastly, shortcomings of this method are presented along with possible solutions and areas for future research. © 2007 Regents of the University of Colorado.
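
    A toy NumPy version of the quantification step, assumed rather than taken from the paper: given co-registered binary tree-cover masks classified from an historical and a modern photograph, tabulate establishment, loss and the percentage change in cover.

```python
import numpy as np

rng = np.random.default_rng(1)
# Placeholder co-registered binary masks (True = tree cover) for a photo-pair
old_mask = rng.random((400, 600)) < 0.20
new_mask = old_mask | (rng.random((400, 600)) < 0.10)   # mostly establishment/infilling

establishment = np.sum(new_mask & ~old_mask)   # pixels gained
loss = np.sum(old_mask & ~new_mask)            # pixels lost
old_cover, new_cover = old_mask.sum(), new_mask.sum()
pct_change = 100.0 * (new_cover - old_cover) / old_cover

print(f"cover change: {pct_change:+.1f}%  "
      f"(establishment px: {establishment}, loss px: {loss})")
```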

  11. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  12. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition of the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment
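
    The coupling of deterministic thermal models with parameter uncertainty that the first part describes is, in essence, a Monte Carlo propagation. The sketch below shows that pattern with an invented radiant-flux model, an invented ignition criterion and invented parameter distributions, purely to illustrate how a frequency of ignition is obtained; none of the numbers come from the work itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Invented uncertain parameters of a simplistic heat-transport model
heater_power = rng.normal(1500.0, 200.0, n)            # W
separation = rng.lognormal(mean=-2.3, sigma=0.6, size=n)  # m, roughly 0.03-0.3 m
ignition_flux = rng.normal(18e3, 2e3, n)               # W/m^2, material threshold

# Radiative point-source flux standing in for the deterministic thermal code
flux = 0.5 * heater_power / (4 * np.pi * separation ** 2)

p_ignition_given_exposure = np.mean(flux > ignition_flux)
exposure_frequency = 0.2          # exposures per year (assumed scenario rate)
print(f"P(ignition | exposure) ~ {p_ignition_given_exposure:.3f}, "
      f"ignition frequency ~ {exposure_frequency * p_ignition_given_exposure:.3f} /yr")
```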

  13. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of the literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of the literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  14. Homotopy analysis method for neutron diffusion calculations

    International Nuclear Information System (INIS)

    Cavdar, S.

    2009-01-01

    The Homotopy Analysis Method (HAM), proposed in 1992 by Shi Jun Liao and developed since then, is based on a fundamental concept in differential geometry and topology, the homotopy. It has proved useful for problems involving algebraic, linear/non-linear, ordinary/partial differential and differential-integral equations, being an analytic, recursive method that provides a series-sum solution. It has the advantage of offering a certain freedom in the choice of its arguments, such as the initial guess, the auxiliary linear operator and the convergence-control parameter, and it allows us to effectively control the rate and region of convergence of the series solution. In this work, HAM is applied to the fixed-source neutron diffusion equation. Our research is motivated by the question of whether there exist methods for solving the neutron diffusion equation that yield straightforward expressions yet provide a solution of reasonable accuracy, so that we could avoid both the widely used analytic methods, which either fail to solve the problem or provide solutions through many intricate expressions that are likely to contain mistakes, and numerical methods, which require powerful computational resources and advanced programming skills due to their very nature or intricate mathematical fundamentals. A Fourier basis is employed for expressing the initial guess due to the structure of the problem and its boundary conditions. We present the results in comparison with the widely used Adomian decomposition and separation-of-variables methods.
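
    The construction the abstract alludes to is Liao's deformation-equation framework; a standard statement of it (not a formula taken from this particular paper) is, with embedding parameter q, auxiliary linear operator L, convergence-control parameter ħ, auxiliary function H(x) and governing equation N[u] = 0 (here the fixed-source diffusion equation):

```latex
% Zeroth-order deformation equation of HAM (standard form, q in [0,1]):
(1-q)\,\mathcal{L}\bigl[\phi(x;q)-u_0(x)\bigr] \;=\; q\,\hbar\,H(x)\,N\bigl[\phi(x;q)\bigr],
% so that phi(x;0)=u_0(x) (initial guess) and phi(x;1)=u(x) (solution of N[u]=0).
% Expanding phi in powers of q,
\phi(x;q) \;=\; u_0(x) + \sum_{m\ge 1} u_m(x)\,q^m ,
% the mth-order deformation equations give the series terms recursively:
\mathcal{L}\bigl[u_m(x)-\chi_m u_{m-1}(x)\bigr] \;=\; \hbar\,H(x)\,R_m(u_0,\dots,u_{m-1}),
\qquad \chi_m = \begin{cases} 0, & m \le 1,\\ 1, & m > 1. \end{cases}
```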

  15. Development of Tsunami PSA method for Korean NPP site

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In Kil; Park, Jin Hee

    2010-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, the evaluation of the tsunami return period is the major task; for this evaluation, numerical analysis and empirical methods can be applied. The methodology was applied to a nuclear power plant, the Ulchin 5/6 NPP, which is located on the east coast of the Korean peninsula. Through this study, the whole tsunami PSA working procedure was established and an example calculation was performed for an actual nuclear power plant in Korea
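
    The link between an evaluated tsunami return period and the hazard input of a PSA is the usual Poisson exceedance probability, sketched below with assumed return periods rather than the Ulchin site values.

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """P(at least one exceedance during the exposure time), Poisson occurrence model."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

for rp in (100, 1_000, 10_000):           # assumed return periods of a design wave height
    p40 = exceedance_probability(rp, 40)  # over a 40-year plant life
    print(f"return period {rp:>6} yr: annual freq {1 / rp:.1e}, "
          f"P(exceedance in 40 yr) = {p40:.3f}")
```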

  16. Safety relief valve alternate analysis method

    International Nuclear Information System (INIS)

    Adams, R.H.; Javid, A.; Khatua, T.P.

    1981-01-01

    An experimental test program was started in the United States in 1976 to define and quantify Safety Relief Valve (SRV) phenomena in General Electric Mark I suppression chambers. The testing considered several discharge devices and was used to correlate SRV load prediction models. The program was funded by utilities with Mark I containments and has resulted in a detailed SRV load definition as a portion of the Mark I containment program Load Definition Report (LDR). The US Nuclear Regulatory Commission (USNRC) has reviewed and approved the LDR SRV load definition. In addition, the USNRC has permitted calibration of structural models used for predicting torus response to SRV loads; model calibration is subject to confirmatory in-plant testing. The SRV methodology given in the LDR requires that transient dynamic pressures be applied to a torus structural model that includes a fluid added-mass matrix. Preliminary evaluations of torus response have indicated order-of-magnitude conservatisms with respect to test results, which could result in unrealistic containment modifications. In addition, structural response trends observed in full-scale tests between cold-pipe, first-valve-actuation and hot-pipe, subsequent-valve-actuation conditions have not been duplicated using current analysis methods. It was suggested by others that an energy approach using current fluid models be utilized to define loads. An alternate SRV analysis method is defined to correct suppression chamber structural response to a level that permits economical but conservative design. Simple analogs are developed for the purpose of correcting the analytical response obtained from LDR analysis methods. The analogs evaluated considered forced-vibration and free-vibration structural response. The corrected response correlated well with in-plant test response. The correlation of the analytical model at test conditions permits application of the alternate analysis method at design conditions. (orig./HP)

  17. Uni-dimensional double development HPTLC-densitometry method for simultaneous analysis of mangiferin and lupeol content in mango (Mangifera indica) pulp and peel during storage.

    Science.gov (United States)

    Jyotshna; Srivastava, Pooja; Killadi, Bharti; Shanker, Karuna

    2015-06-01

    Mango (Mangifera indica) fruit is one of the important commercial fruit crops of India. Like other tropical fruits, it is highly perishable. During storage and ripening, changes in its physico-chemical quality parameters, viz. firmness, titratable acidity, total soluble solid content (TSSC), carotenoid content and other biochemical constituents, are inevitable. A uni-dimensional double-development high-performance thin-layer chromatography (UDDD-HPTLC) method was developed for the real-time monitoring of mangiferin and lupeol in mango pulp and peel during storage. The quantitative determination of both compounds, which belong to different classes, was achieved by densitometric HPTLC. Silica gel 60F254 HPTLC plates and two solvent systems, toluene/EtOAc/MeOH and EtOAc/MeOH, respectively, were used for optimum separation and selective evaluation. Densitometric quantitation of mangiferin was performed at 390 nm, while lupeol was quantified at 610 nm after post-chromatographic derivatization. The validated method was used for real-time monitoring of mangiferin and lupeol content during storage in four Indian cultivars, namely Bombay green (Bgreen), Dashehari, Langra and Chausa. Significant correlations of titratable acidity and TSSC with mangiferin and lupeol in pulp and peel during storage were also observed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Development of analysis method of material flow cost accounting using lean technique in food production: A case study of Universal Food Public Co., Ltd. (UFC)

    Directory of Open Access Journals (Sweden)

    Wichai Chattinnawat

    2015-06-01

    Full Text Available This research aims to apply the Lean technique in conjunction with Material Flow Cost Accounting (MFCA) analysis to the production process of canned sweet corn in order to increase process efficiency, eliminate waste and reduce production cost. The research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, the integrated cost allocation based on activity types results in a higher proportion of negative product cost than that computed from conventional MFCA cost allocation. Thus, considering the types of activities and the process efficiency has a great impact on the cost structure, especially on the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases with a higher proportion of positive product cost.
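    A minimal sketch of the cost split that MFCA performs is given below: input cost is allocated between positive (product) and negative (material loss) cost in proportion to the mass that ends up in product versus waste, and an assumed value-added weight stands in for the activity-based refinement described in the record. All figures and the weighting rule are illustrative assumptions, not the study's data or exact allocation model.

```python
def mfca_split(input_cost: float, mass_in: float, mass_product: float,
               value_added_share: float = 1.0):
    """Split an input cost into positive (product) and negative (material loss) cost.

    Conventional MFCA allocates purely by mass balance; value_added_share < 1 mimics
    an activity-based refinement that shifts more cost to the loss side when part of
    the work is non-value-added (illustrative assumption, not the paper's model).
    """
    mass_share_product = mass_product / mass_in
    positive = input_cost * mass_share_product * value_added_share
    negative = input_cost - positive
    return positive, negative

# Hypothetical canned-sweet-corn batch: 1000 kg of raw corn in, 820 kg into the cans.
for label, weight in [("conventional MFCA", 1.0), ("activity-weighted", 0.9)]:
    pos, neg = mfca_split(input_cost=50_000.0, mass_in=1000.0,
                          mass_product=820.0, value_added_share=weight)
    print(f"{label}: positive cost = {pos:,.0f}, negative cost = {neg:,.0f}")
```

    The lower value-added weight pushes a larger share of the input cost into the negative (loss) column, mirroring the effect on the cost structure reported in the abstract.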

  19. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.
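    The arithmetic behind a pathways-derived guideline can be sketched generically: a code such as RESRAD computes a dose-to-source ratio (dose per unit soil concentration) for each radionuclide over all pathways, the single-radionuclide guideline is the dose limit divided by that ratio, and mixtures are checked with a sum-of-fractions rule. The ratios, dose limit and soil concentrations below are placeholders, not RESRAD output.

```python
def soil_guideline(dose_limit_mrem_yr: float, dsr_mrem_yr_per_pci_g: float) -> float:
    """Single-radionuclide soil guideline (pCi/g) from a dose-to-source ratio (DSR)."""
    return dose_limit_mrem_yr / dsr_mrem_yr_per_pci_g

def sum_of_fractions(concentrations: dict, guidelines: dict) -> float:
    """Mixture criterion: the residual activity is acceptable if the sum <= 1."""
    return sum(concentrations[nuc] / guidelines[nuc] for nuc in concentrations)

# Hypothetical dose-to-source ratios (mrem/yr per pCi/g) and measured soil levels (pCi/g)
dsr = {"Ra-226": 5.0, "Th-232": 7.5, "U-238": 0.4}
guidelines = {nuc: soil_guideline(25.0, value) for nuc, value in dsr.items()}  # assumed 25 mrem/yr limit
measured = {"Ra-226": 2.0, "Th-232": 1.0, "U-238": 10.0}

print({nuc: round(g, 1) for nuc, g in guidelines.items()})
print("sum of fractions =", round(sum_of_fractions(measured, guidelines), 2))
```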

  20. Development and Validation of a GC-MS Method for the Analysis of Homogentisic Acid in Strawberry Tree (Arbutus unedo L.) Honey.

    Science.gov (United States)

    Brčić Karačonji, Irena; Jurica, Karlo

    2017-07-01

    To confirm the botanical origin of strawberry tree (Arbutus unedo L.) honey, a method based on liquid-liquid extraction followed by GC-MS was developed for the quantitative determination of homogentisic acid (HGA), the main phenolic compound in this honey. Different parameters affecting extraction, such as the type and volume of the extraction solvents, the pH of the solution and the amount of salt, were optimized. The method showed good linearity (r² = 0.9990) over the tested concentration range (50-500 mg/kg) and a low LOD (0.3 mg/kg). Precision expressed as RSD was <7%, and the average accuracy was 95%. The optimized method was applied to determine the HGA content in strawberry tree honey samples from Croatia. The HGA content in the analyzed samples (n = 7) ranged from 245.1 to 485.9 mg/kg. The proposed method provided reliable performance and can easily be implemented for the routine monitoring of HGA in strawberry tree honey in order to assure honey QC.
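    The linearity and LOD figures quoted above follow from the calibration data in a standard way. The sketch below fits a least-squares calibration line with numpy and estimates LOD/LOQ from the residual standard deviation using the common 3.3·σ/slope and 10·σ/slope conventions; the calibration points are made up for illustration and the convention is an assumption, not necessarily the one used in the paper.

```python
import numpy as np

# Hypothetical HGA calibration standards (mg/kg) and instrument responses (peak area)
conc = np.array([50, 100, 200, 300, 400, 500], dtype=float)
area = np.array([10.2, 20.5, 40.3, 61.0, 80.9, 101.5])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1.0 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

resid_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))  # residual standard deviation
lod = 3.3 * resid_sd / slope
loq = 10.0 * resid_sd / slope

print(f"slope = {slope:.4f}, r^2 = {r2:.4f}, LOD = {lod:.1f} mg/kg, LOQ = {loq:.1f} mg/kg")
```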

  1. Development of seismic design method for piping system supported by elastoplastic damper. 3. Vibration test of three-dimensional piping model and its response analysis

    International Nuclear Information System (INIS)

    Namita, Yoshio; Kawahata, Jun-ichi; Ichihashi, Ichiro; Fukuda, Toshihiko.

    1995-01-01

    Component and piping systems in current nuclear power plants and chemical plants are designed with many supports to maintain safety and reliability against earthquakes. However, these supports are rigid and have only a slight energy-dissipating effect. It is well known that applying high-damping supports to a piping system is very effective for reducing the seismic response. In this study, we investigated the design method of the elastoplastic damper [energy absorber (EAB)] and the seismic design method for a piping system supported by the EAB. Our final goal is to develop the technology for applying the EAB to the piping systems of actual plants. In this paper, the vibration test results of the three-dimensional piping model are presented. From the test results, it is confirmed that the EAB has a large energy-dissipating effect and is effective in reducing the seismic response of the piping system, and that the seismic design method for the piping system, namely the response spectrum mode superposition method using modal damping, which requires iterative calculation of the EAB displacement, is applicable to the three-dimensional piping model. (author)

  2. Development and application of mass spectrometric isotope dilution analysis using negative thermal ionisation for the determination of boron traces

    International Nuclear Information System (INIS)

    Zeininger, H.

    1984-01-01

    A mass spectrometric trace boron determination using negative thermal ionisation was developed. It is based on the determination of the ¹⁰B/¹¹B ratio measured on BO₂⁻ ions. The high stability and constant intensity of the BO₂⁻ ion currents at a given temperature allow a computer-controlled measurement with programmed heating. The reproducibility is around 0.004-0.08%. Boron determination by potentiometry with a BF₄⁻ ion-selective electrode was used as an analytical comparison method. MS-IDA was first applied to metal samples such as Al, Zr and steel. Later, the boron content of reagents, biological material (milk powder, spinach, water plants) and water was determined. For this purpose, material-dependent hydrolysis and separation procedures were worked out. Compared with all other analytical methods used by other collaborators, MS-IDA offers the greatest accuracy. (RB)
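    The core of any isotope dilution determination is the blend-ratio equation: the amount of analyte follows from the amount of spike and the ¹⁰B/¹¹B ratios of the spike, the sample and the spiked blend. The sketch below implements that generic relation with illustrative numbers; it is not the calibration actually used in the work described above.

```python
def idms_amount(n_spike_mol: float, r_spike: float, r_sample: float, r_blend: float) -> float:
    """Moles of boron in the sample from measured 10B/11B isotope ratios.

    n_spike_mol -- moles of boron added as spike
    r_spike     -- 10B/11B ratio of the enriched spike
    r_sample    -- 10B/11B ratio of the unspiked sample (natural boron is roughly 0.25)
    r_blend     -- measured 10B/11B ratio of the spiked sample
    """
    return n_spike_mol * (r_spike - r_blend) / (r_blend - r_sample)

# Illustrative (hypothetical) values: 1 umol of a 95 % 10B spike added to a natural sample
n_boron = idms_amount(n_spike_mol=1.0e-6, r_spike=19.0, r_sample=0.248, r_blend=2.0)
print(f"boron in sample: {n_boron * 1e6:.2f} umol")
```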

  3. Development of spectrophotometric fingerprinting method for ...

    African Journals Online (AJOL)

    Selective and efficient analytical methods are required not only for quality assurance but also for authentication of herbal formulations. A simple, rapid and validated fingerprint method has developed for estimation of piperine in 'Talisadi churna', a well known herbal formulation in India. The estimation was carried out in two ...

  4. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    Validated modified lycopodium spore method has been developed for simple and rapid quantification of herbal powdered drugs. Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  5. Development and Application of Kinetic Spectrophotometric Method ...

    African Journals Online (AJOL)

    Purpose: To develop an improved kinetic-spectrophotometric procedure for the determination of metronidazole (MNZ) in pharmaceutical formulations. Methods: The method is based on oxidation reaction of MNZ by hydrogen peroxide in the presence of Fe(II) ions at pH 4.5 (acetate buffer). The reaction was monitored ...

  6. Microparticle analysis system and method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  7. METHOD DEVELOPMENT FOR THE ANALYSIS OF N-NITROSODIMETHYLAMINE AND OTHER N-NITROSAMINES IN DRINKING WATER AT LOW NANOGRAM/LITER CONCENTRATIONS USING SOLID PHASE EXTRACTION AND GAS CHROMATOGRAPHY WITH CHEMICAL IONIZATION TANDEM MASS SPECTROMETRY

    Science.gov (United States)

    N-Nitrosodimethylamine (NDMA) is a probable human carcinogen that has been identified as a drinking water contaminant of concern. United States Environmental Protection Agency (USEPA) Method 521 has been developed for the analysis of NDMA and six additional N-nitrosamines in dri...

  8. The comparative analysis of English and Lithuanian transport terms and some methods of developing effective science writing strategies by non-native speakers of English

    Directory of Open Access Journals (Sweden)

    V. Marina

    2009-09-01

    Full Text Available The paper addresses the problem of developing more effective strategies and skills for writing scientific and technical texts by non-native speakers of English. The causes of poor writing are identified and general guidelines for developing effective science writing strategies are outlined. The difficulties faced by non-native speakers of English in writing research papers are analysed by examining transport terms and international words that are based on different nomination principles in English and Lithuanian. A case study of the various names given to a small vehicle used for passenger transportation in many countries is provided, illustrating the alternative ways of naming the same object of reality in different languages. The analysis is based on the theory of linguistic relativity. Differences in the use of similar international terms in English and Lithuanian, which often cause errors and misunderstanding, are also demonstrated. Recommendations helping non-native speakers of English to avoid errors and improve their skills in writing scientific and technical texts are given.

  9. Development of methods for evaluating active faults

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-08-15

    The HERP report for long-term evaluation of active faults and the NSC safety review guide with regard to the geology and ground of a site were published in November 2010 and December 2010, respectively. With respect to those reports, our investigation is as follows: (1) For the assessment of seismic hazard, we estimated seismic sources around NPPs based on information from tectonic geomorphology, earthquake distribution and subsurface geology. (2) For the evaluation of blind-fault activity, we calculated the slip rate on the 2008 Iwate-Miyagi Nairiku earthquake fault using information on late Quaternary fluvial terraces. (3) To evaluate the magnitude of earthquakes whose sources are difficult to identify, we proposed a new method for calculating the thickness of the seismogenic layer. (4) To clarify the activity of active faults without overlying strata, we carried out colour analysis of fault gouge and classified the fault activity into ranges of thousands of years and tens of thousands of years. (5) To improve the chronology of sediments, we detected new widespread cryptotephras using mineral chemistry and developed a late Quaternary cryptotephrostratigraphy around NPPs. (author)

  10. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    International Nuclear Information System (INIS)

    Mesquita, Raquel B.R.; Ferreira, M. Teresa S.O.B.; Toth, Ildiko V.; Bordalo, Adriano A.; McKelvie, Ian D.; Rangel, Antonio O.S.S.

    2011-01-01

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of the schlieren effect was assessed. → The proposed method can cope with wide salinity ranges. → The multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz. a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirements of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved with both detection systems.

  11. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)

    2011-09-02

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of the schlieren effect was assessed. → The proposed method can cope with wide salinity ranges. → The multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz. a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirements of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved with both detection systems.

  12. A simple LC-MS/MS method for quantitative analysis of underivatized neurotransmitters in rat urine: assay development, validation and application in the CUMS rat model.

    Science.gov (United States)

    Zhai, Xue-jia; Chen, Fen; Zhu, Chao-ran; Lu, Yong-ning

    2015-11-01

    Many amino acid neurotransmitters in urine are associated with chronic stress as well as major depressive disorders. To better understand depression, an analytical LC-MS/MS method for the simultaneous determination of 11 underivatized neurotransmitters (4-aminohippurate, 5-HIAA, glutamate, glutamine, hippurate, pimelate, proline, tryptophan, tyramine, tyrosine and valine) in a single analytical run was developed. The advantage of this method is the simple preparation in that there is no need to deconjugate the urine samples. The quantification range was 25-12,800 ng mL⁻¹ with >85.8% recovery for all analytes. The nocturnal urine concentrations of the 11 neurotransmitters in chronic unpredictable mild stress (CUMS) model rats and control group (n = 12) were analyzed. A series of significant changes in urinary excretion of neurotransmitters could be detected: the urinary glutamate, glutamine, hippurate and tyramine concentrations were significantly lower in the CUMS group. In addition, the urinary concentrations of tryptophan as well as tyrosine were significantly higher in chronically stressed rats. This method allows the assessment of the neurotransmitters associated with CUMS in rat urine in a single analytical run, making it suitable for implementation as a routine technique in depression research. Copyright © 2015 John Wiley & Sons, Ltd.

  13. UHPLC/MS-MS Analysis of Six Neonicotinoids in Honey by Modified QuEChERS: Method Development, Validation, and Uncertainty Measurement

    Directory of Open Access Journals (Sweden)

    Michele Proietto Galeano

    2013-01-01

    Full Text Available Rapid and reliable multiresidue analytical methods were developed and validated for the determination of 6 neonicotinoid pesticides (acetamiprid, clothianidin, imidacloprid, nitenpyram, thiacloprid, and thiamethoxam) in honey. A modified QuEChERS method allowed a very rapid and efficient single-step extraction, while the detection was performed by UHPLC/MS-MS. The recovery studies were carried out by spiking the samples at two concentration levels (10 and 40 μg/kg). The methods were subjected to a thorough validation procedure. The mean recovery was in the range of 75 to 114% with repeatability below 20%. The limits of detection were below 2.5 μg/kg, while the limits of quantification did not exceed 4.0 μg/kg. The total uncertainty was evaluated taking the main independent uncertainty sources into consideration. The expanded uncertainty did not exceed 49% for the 10 μg/kg concentration level and was in the range of 16-19% for the 40 μg/kg fortification level.
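    Combining the main uncertainty sources into an expanded uncertainty is usually done with the root-sum-of-squares rule and a coverage factor k = 2. The snippet below shows that generic calculation with hypothetical relative contributions; the actual components evaluated in the paper are not reproduced here.

```python
import math

def expanded_uncertainty(relative_components, k=2.0):
    """Relative expanded uncertainty (%) from independent relative standard uncertainties (%)."""
    combined = math.sqrt(sum(u ** 2 for u in relative_components))
    return k * combined

# Hypothetical contributions at the low fortification level: repeatability,
# recovery/bias, calibration and spiking/volume preparation (all in %)
components = [15.0, 12.0, 8.0, 5.0]
print(f"expanded uncertainty (k=2): {expanded_uncertainty(components):.0f}%")
```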

  14. Development of a perfusion reversed-phase high performance liquid chromatography method for the characterisation of maize products using multivariate analysis.

    Science.gov (United States)

    Rodriguez-Nogales, J M; Garcia, M C; Marina, M L

    2006-02-03

    A perfusion reversed-phase high-performance liquid chromatography (RP-HPLC) method has been designed to allow rapid (3.4 min) separations of maize proteins with high resolution. Several factors, such as extraction conditions, temperature, detection wavelength and the type and concentration of the ion-pairing agent, were optimised. A fine optimisation of the gradient elution was also performed by applying experimental design. Commercial maize products for human consumption (flours, precooked flours, fried snacks and extruded snacks) were characterised for the first time by perfusion RP-HPLC, and their chromatographic profiles allowed differentiation among products according to the technological process used for their preparation. Furthermore, applying discriminant analysis made it possible to group the samples according to the technological process applied to the maize products, with correct prediction for 92% of the samples.
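    The discriminant step can be reproduced generically with scikit-learn: a linear discriminant model is fitted to chromatographic features labelled by product type and its cross-validated accuracy is reported. The feature matrix below is randomly generated as a stand-in for the perfusion RP-HPLC profiles, so the accuracy it prints has nothing to do with the 92% reported in the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
classes = ["flour", "precooked flour", "fried snack", "extruded snack"]

# Hypothetical data: 12 samples per product type, 8 integrated peak areas per chromatogram,
# with a class-dependent offset so that the groups are separable.
X = np.vstack([rng.normal(loc=i, scale=0.7, size=(12, 8)) for i in range(len(classes))])
y = np.repeat(classes, 12)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # 5-fold cross-validated classification accuracy
print(f"mean classification accuracy: {scores.mean():.2%}")
```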

  15. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In applying them, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used here as the reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas possibly vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and it can reduce the risk of natural disasters such as earthquakes in Indonesia.

  16. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new analytical method for the quantitative analysis of miconazole ... a simple, reliable and robust method for the characterization of a mixture of the drugs in a dosage form. ...

  17. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  18. Development of inelastic design method for liquid metal reactor plants

    International Nuclear Information System (INIS)

    Takahashi, Yukio; Take, Kohji; Kaguchi, Hitoshi; Fukuda, Yoshio; Uno, Tetsuro.

    1991-01-01

    Effective utilization of inelastic analysis in structural design assessment is expected to play an important role in avoiding overly conservative design of liquid metal reactor plants. Studies have been conducted by the authors to develop a guideline for the application of detailed inelastic analysis in design assessment. Both fundamental material characteristics tests and structural failure tests were conducted. Fundamental investigations were made of the inelastic analysis method and the creep-fatigue life prediction method based on the results of the material characteristics tests. It was demonstrated through the structural failure tests that the design method constructed on the basis of these fundamental investigations can predict failure lives of structures subjected to cyclic thermal loading with sufficient accuracy. (author)

  19. Development of a technique using MCNPX code for determination of nitrogen content of explosive materials using prompt gamma neutron activation analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N., E-mail: mnnasrabadi@ast.ui.ac.ir [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of); Bakhshi, F.; Jalali, M.; Mohammadi, A. [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)

    2011-12-11

    Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used to detect the prompt 10.8 MeV gamma ray following radiative neutron capture by ¹⁴N nuclei. We aimed to study the feasibility of using field-portable prompt gamma neutron activation analysis (PGNAA) along with improved nuclear equipment to detect and identify explosives, illicit substances or landmines. A ²⁵²Cf radioisotope source was embedded in a cylinder made of high-density polyethylene (HDPE), and the cylinder was then placed in another cylindrical container filled with water. Measurements were performed on high-nitrogen-content compounds such as melamine (C₃H₆N₆). Melamine powder in a HDPE bottle was placed underneath the vessel containing the water and the neutron source. Gamma rays were detected using two NaI(Tl) crystals. The results were simulated with MCNP4c code calculations. The theoretical calculations and experimental measurements were in good agreement, indicating that this method can be used for the detection of explosives and illicit drugs.

  20. Strategic Options Development and Analysis

    Science.gov (United States)

    Ackermann, Fran; Eden, Colin

    Strategic Options Development and Analysis (SODA) enables a group or individual to construct a graphical representation of a problematic situation, and thus explore options and their ramifications with respect to a complex system of goals or objectives. In addition, the method aims to help groups arrive at a negotiated agreement about how to act to resolve the situation. It is based upon the use of causal mapping - a formally constructed means-ends network - as the representation form. Because the picture has been constructed using the natural language of the problem owners, it becomes a model of the situation that is 'owned' by those who define the problem. The use of formalities for the construction of the model makes it amenable to a range of analyses as well as encouraging reflection and a deeper understanding. These analyses can be used in a 'rough and ready' manner by visual inspection or through the use of specialist causal mapping software (Decision Explorer). Each of the analyses helps a group or individual discover important features of the problem situation, and these features facilitate agreeing a good solution. The SODA process is aimed at helping a group learn about the situation they face before they reach agreements. Most significantly, the exploration through the causal map leads to a higher probability of more creative solutions and promotes solutions that are more likely to be implemented, because the problem construction process is wider and more likely to include richer social dimensions about the blockages to action and organizational change. The basic theories that inform SODA derive from cognitive psychology and social negotiation, where the model acts as a continuously changing representation of the problematic situation - changing as the views of a person or group shift through learning and exploration. This chapter, jointly written by two leading practitioner academics and the original developers of SODA, Colin Eden and Fran Ackermann

  1. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    VTT's know-how in reliability and safety design and analysis techniques has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and further developed for use in the process industry, the conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects.

  2. 1998 Annual Study Report. Standards development of chemical analysis and non-destructive inspection methods for pure titanium metals; 1998 nendo seika hokokusho. Jun chitan no shiken hyoka hoho no hyojunka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This study was conducted to standardize the chemical analysis and non-destructive inspection methods for pure titanium metals of industrial grade. These methods are among those serving as bases for the international standardization of products. The chemical analysis work aims at the quantitative analysis of trace impurities present in pure titanium metals of industrial grade, by developing and standardizing inductively coupled plasma atomic emission spectroscopy, known for its low detection limits, together with spark and glow discharge atomic emission spectrometry as improved routine analysis methods. These methods, although used by, e.g., steel makers, have not been standardized because the effects of the titanium-specific matrix have not been elucidated. The non-destructive testing work aims at standardizing techniques useful for automatic production lines. More concretely, these include optical methods aided by a laser or CCD camera for plate surface defect inspection, ultrasonic methods for plate internal defect inspection, and pressure-differential methods for the air-tightness of welded pipes. They have not yet been applied to automatic production lines. (NEDO)

  3. Review of various dynamic modeling methods and development of an intuitive modeling method for dynamic systems

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2008-01-01

    Conventional static reliability analysis methods are inadequate for modeling dynamic interactions between the components of a system. Various techniques such as dynamic fault trees, dynamic Bayesian networks, and dynamic reliability block diagrams have been proposed for modeling dynamic systems based on improvements of the conventional modeling methods. In this paper, we review these methods briefly and introduce dynamic nodes into the existing Reliability Graph with General Gates (RGGG) as an intuitive way to model dynamic systems. For quantitative analysis, we use a discrete-time method to convert an RGGG into an equivalent Bayesian network and develop a software tool for the generation of probability tables.
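    The discrete-time idea can be illustrated independently of the RGGG tool: time is cut into steps, per-step failure probabilities are derived from failure rates, and the state distribution is propagated step by step, which naturally captures dynamic dependencies such as a cold-standby backup that can only fail after it takes over. The two-component model and rates below are a toy example of this general approach, not the method or software of the paper.

```python
import numpy as np

dt, mission_time = 1.0, 1000.0          # time step and mission time (hours)
lam_primary, lam_backup = 1e-3, 2e-3    # assumed constant failure rates (1/h)

# Per-step failure probabilities derived from the rates
p_fail_primary = 1.0 - np.exp(-lam_primary * dt)
p_fail_backup = 1.0 - np.exp(-lam_backup * dt)

# State probabilities: [primary running, backup running, system failed]
state = np.array([1.0, 0.0, 0.0])
for _ in range(int(mission_time / dt)):
    p_up, p_bak, p_down = state
    state = np.array([
        p_up * (1.0 - p_fail_primary),                          # primary still running
        p_bak * (1.0 - p_fail_backup) + p_up * p_fail_primary,  # backup keeps running or takes over
        p_down + p_bak * p_fail_backup,                         # system fails only once the backup is active
    ])

print(f"system unreliability at {mission_time:.0f} h: {state[2]:.4e}")
```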

  4. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  5. Development of self-awareness after severe traumatic brain injury through participation in occupation-based rehabilitation: mixed-methods analysis of a case series.

    Science.gov (United States)

    Doig, Emmah; Kuipers, Pim; Prescott, Sarah; Cornwell, Petrea; Fleming, Jennifer

    2014-01-01

    OBJECTIVE. We examined participation in goal planning and development of self-awareness for people with impaired self-awareness after traumatic brain injury. METHOD. We performed a mixed-methods study of 8 participants recently discharged from inpatient rehabilitation. Self-awareness was measured using discrepancy between self and significant other ratings on the Mayo-Portland Adaptability Index (MPAI-4) at four time points. We calculated effect size to evaluate the change in MPAI-4 discrepancy over time. RESULTS. Seven participants identified their own goals. We found a large reduction in mean MPAI-4 discrepancy (M = 8.57, SD = 6.59, N = 7, d = 1.08) in the first 6 wk and a further small reduction (M = 5.33, SD = 9.09, N = 6, d = 0.45) in the second 6 wk of intervention. Case data indicated that 7 participants demonstrated some growth in self-awareness. CONCLUSION. Engagement in occupation-based, goal-directed rehabilitation appeared to foster awareness of injury-related changes to varying extents. Copyright © 2014 by the American Occupational Therapy Association, Inc.

  6. Development and validation of a simple thin-layer chromatographic method for the analysis of p-chlorophenol in treated wastewater

    Directory of Open Access Journals (Sweden)

    Tešić Živoslav

    2012-01-01

    Full Text Available A thin-layer chromatographic method with densitometric detection was established for quantification of p-chlorophenol in waste water. D