WorldWideScience

Sample records for analysis methods developed

  1. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analyses were performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the task analysis results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, as well as analyses for the design of information and interaction structures, will be necessary.
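
    The procedure-based hierarchical task analysis described above lends itself to a simple tree-structured record format. Below is a minimal, hypothetical Python sketch of such a structure; the class names, fields and timing values are illustrative only and are not the report's actual database schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Task:
        """One node in a procedure-based hierarchical task analysis (illustrative)."""
        task_id: str
        description: str
        performance_time_s: float = 0.0   # later usable for workload estimation
        subtasks: List["Task"] = field(default_factory=list)

        def total_time(self) -> float:
            """Sum estimated performance time over this task and all subtasks."""
            return self.performance_time_s + sum(t.total_time() for t in self.subtasks)

    # Example decomposition of a single operating-procedure step (made-up values)
    step = Task("P1", "Verify reactor trip", subtasks=[
        Task("P1.1", "Check trip status indication", 5.0),
        Task("P1.2", "Confirm rod position display", 8.0),
    ])
    print(step.total_time())  # 13.0 s for the decomposed step
    ```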

  2. Method development for trace and ultratrace analysis

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Method development, that is, selection of a mode of chromatography and the right column and mobile-phase combination for trace and ultratrace analysis, requires several main considerations. The method should be useful for resolving various trace and ultratrace components present in the sample. If the nature of these components is known, the choice of method may be straightforward, that is, a selection can be made from the following modes of HPLC: (1) adsorption chromatography; (2) normal-phase chromatography; (3) reversed-phase chromatography; (4) ion-pair chromatography; (5) ion-exchange chromatography; (6) ion chromatography. Unfortunately, the nature of all of the components is frequently unknown. However, several intelligent judgments can be made on the nature of impurities. This chapter deals with some basic approaches to mobile-phase selection and optimization. More detailed information may be found in basic texts. Techniques for separation of high-molecular-weight compounds (macromolecules) and chiral compounds may be found elsewhere. Mainly compounds with molecular weight lower than 2,000 are discussed here. 123 refs

  3. Development of Ultraviolet Spectrophotometric Method for Analysis ...

    African Journals Online (AJOL)

    Purpose: An ultraviolet spectrophotometric system was developed and validated for the quantitative determination of lornoxicam in solid dosage forms. Methods: Lornoxicam was dissolved in 0.01M NaOH and analysed using ultraviolet (UV) spectrophotometry. Various analytical parameters such as linearity, precision, ...

  4. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    Yoo, Bong; Lee, Jae-Han; Koo, Gyeng-Hoi

    2002-01-01

    KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out under the IAEA CRP on the intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures during 1996-1999, are briefly described. The work aimed to develop numerical analysis methods and to compare the analysis results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by the participating countries. Certain progress in the analysis procedures for isolation bearings and isolated nuclear structures has been made throughout the IAEA CRPs, and the analysis methods developed can be improved for future nuclear facility applications. (author)

  5. Development of Ultraviolet Spectrophotometric Method for Analysis ...

    African Journals Online (AJOL)

    HP

    oxicams such as piroxicam and tenoxicam. [1]. It is used for the treatment of rheumatoid arthritis and other rheumatic diseases and is available in the market as tablet and injection. A literature survey showed that the analysis of lornoxicam in pharmaceutical preparations is usually by high performance liquid chromatography ...

  6. Development of the statistical analysis methods for managing setpoint drifts

    International Nuclear Information System (INIS)

    Jang, S. C.; Lim, T. J.

    2004-08-01

    The nuclear industry has generally accepted the statistical approach suggested in ISA-67.04 (1994) for ensuring that the setpoints for safety-related instrumentation are established and maintained within the technical specification limits (NRC RG 1.105, 1999). However, the current methodologies of the document may be insufficient to manage the setpoint drift of instrumentation devices, because they are basically designed to determine an instrument setpoint using statistical prediction techniques. In this report, we propose a new statistical analysis procedure composed of six steps for the management of setpoint drift using the plant-specific as-found/as-left data of the instrumentation devices. To show the applicability of the proposed method, an example is illustrated based on practical as-found/as-left data obtained from the channel functional test of a bistable at a one-month surveillance interval in a Korean Standard Nuclear Power Plant. All analyses were performed using the SeDA (Setpoint Drift Analysis) program, which has been developed in accordance with the new six-step procedure. The use of additional statistical graphic tools facilitates the setpoint drift evaluation process, and the adoption of non-parametric statistical methods enlarges the scope of the drift analysis from a methodological viewpoint. Several statistical process control techniques will provide the plant staff with more efficient means for managing the instrumentation devices.
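
    As a rough illustration of the kind of drift statistics such a procedure works with, the sketch below computes drift as the difference between as-found and as-left values from successive surveillances and checks it against an assumed acceptance band. The readings, the allowance, and the six-step procedure itself are not taken from the report; they are placeholders.

    ```python
    import numpy as np

    # Hypothetical as-found / as-left readings (% of span) from successive
    # channel functional tests of one bistable; values are illustrative only.
    as_left  = np.array([50.02, 50.01, 50.03, 50.00, 50.02])
    as_found = np.array([50.05, 50.08, 50.01, 50.07, 50.10])

    # Drift over one surveillance interval: as-found (this test) minus
    # as-left (previous test).
    drift = as_found[1:] - as_left[:-1]

    mean_drift = drift.mean()
    sd_drift = drift.std(ddof=1)

    # Simple check against an assumed drift allowance of +/- 0.15 % of span.
    allowance = 0.15
    print(f"mean drift = {mean_drift:.3f}, s = {sd_drift:.3f}")
    print("within allowance:", np.all(np.abs(drift) <= allowance))
    ```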

  7. Developing Methods of praxeology to Perform Document-analysis

    DEFF Research Database (Denmark)

    Frederiksen, Jesper

    2016-01-01

    This paper provides a contribution to the methodological development on praxeologic document analysis of neoliberal welfare state policies. Different institutions related to the Danish Healthcare area, transform international health policies and these institutions produce a range of strategies. T...

  8. Developing Methods of praxeology to Perform Document-analysis

    DEFF Research Database (Denmark)

    Frederiksen, Jesper

    2016-01-01

    This paper provides a contribution to the methodological development on praxeologic document analysis of neoliberal welfare state policies. Different institutions related to the Danish Healthcare area transform international health policies, and these institutions produce a range of strategies... The affiliations of the different institutional and professional logics affect the strategies. Based on three empirical studies from welfare state documents of Inter-professional collaboration, Coherence in healthcare and Patient-safety by incident report, a summative description on the methodological work...

  9. Developing the UIC 406 Method for Capacity Analysis

    DEFF Research Database (Denmark)

    Khadem Sameni, Melody; Landex, Alex; Preston, John

    2011-01-01

    This paper applies an improvement cycle for analysing and enhancing capacity utilisation of an existing timetable. Macro and micro capacity utilisation are defined based on the discrete nature of capacity utilisation and different capacity metrics are analysed. In the category of macro asset util...... the spare capacity (Danish case study). Some suggestions are made to develop meso indices by using the UIC 406 method to decide between the alternatives for adding or removing trains....

  10. Analysis of regional development with use of multivariate statistical methods

    Directory of Open Access Journals (Sweden)

    Libuše Svatošová

    2006-01-01

    The paper deals with differentiation of regional entities within the Czech Republic based on a study of human potential. The human factor has been defined by 22 variables from three domains: population density, demographic indicators and economic activities of inhabitants. The variables have been recorded for the regions and selected districts of the Czech Republic in 1994–2004; for computation purposes they have been represented by their averages and standardized. The principal component method has been employed, making it possible to reduce the number of variables without any considerable loss of information, to select the most significant factors for a given area and to aggregate the variables into larger groups (principal components). Two extensive analyses have been constructed: the first based on the regions of the Czech Republic, the second on data from the districts of the Vysočina Region. The results demonstrate the different roles of the separate aggregate variables in regional development. While in the Czech Republic as a whole the most pressing problems are population ageing, growth of the urban population and unemployment, in the Vysočina Region they are the development of small villages and of the countryside as a whole, and unemployment. The method used is generally suitable for the study and assessment of regional development and it provides much objective information for the decision-making process.
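
    A minimal sketch of the principal component reduction described above, assuming standardized regional indicators in a plain array; the 22 actual variables and the real regional data are not reproduced here, so random numbers stand in for them.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical matrix: rows = regions, columns = demographic/economic indicators.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(14, 22))               # 14 regions x 22 indicators (illustrative)

    X_std = StandardScaler().fit_transform(X)   # standardize, as in the study
    pca = PCA(n_components=4).fit(X_std)

    print(pca.explained_variance_ratio_)        # share of variance per component
    scores = pca.transform(X_std)               # region scores on the principal components
    ```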

  11. Development of a virtual metrology method using plasma harmonics analysis

    Science.gov (United States)

    Jun, H.; Shin, J.; Kim, S.; Choi, H.

    2017-07-01

    A virtual metrology technique based on plasma harmonics is developed for predicting semiconductor processes. From a plasma process performed by 300 mm photoresist stripper equipment, a strong correlation is found between optical plasma harmonics intensities and the process results, such as the photoresist strip rate and strip non-uniformity. Based on this finding, a general process prediction model is developed. The developed virtual metrology model shows that the R-squared (R2) values between the measured and predicted process results are 95% and 64% for the photoresist strip rate and photoresist strip non-uniformity, respectively. This is the first research on process prediction based on optical plasma harmonics analysis, and the results can be applied to semiconductor processes such as dry etching and plasma enhanced chemical vapor deposition.
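
    The reported R-squared values compare measured process results against model predictions. The sketch below shows such a prediction/evaluation step in outline only, with synthetic data standing in for the optical harmonics intensities and the strip-rate measurements; it is not the paper's actual model.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(1)
    harmonics = rng.normal(size=(40, 5))          # illustrative harmonic intensities
    strip_rate = (harmonics @ np.array([2.0, -1.0, 0.5, 0.0, 0.3])
                  + rng.normal(scale=0.2, size=40))

    model = LinearRegression().fit(harmonics, strip_rate)
    predicted = model.predict(harmonics)
    print("R^2 =", r2_score(strip_rate, predicted))   # analogous to the quoted 95% figure
    ```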

  12. Development of a virtual metrology method using plasma harmonics analysis

    Directory of Open Access Journals (Sweden)

    H. Jun

    2017-07-01

    A virtual metrology technique based on plasma harmonics is developed for predicting semiconductor processes. From a plasma process performed by 300 mm photoresist stripper equipment, a strong correlation is found between optical plasma harmonics intensities and the process results, such as the photoresist strip rate and strip non-uniformity. Based on this finding, a general process prediction model is developed. The developed virtual metrology model shows that the R-squared (R2) values between the measured and predicted process results are 95% and 64% for the photoresist strip rate and photoresist strip non-uniformity, respectively. This is the first research on process prediction based on optical plasma harmonics analysis, and the results can be applied to semiconductor processes such as dry etching and plasma enhanced chemical vapor deposition.

  13. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves are based on ray theory. These methods therefore have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases such as the PP reflection phase, (2) impossibility of computing the so-called "W phase", the long-period phase arriving before the S wave, and (3) implausibility of the hypothesis that the distance from the observation points to the hypocenter is large compared to the fault length. To solve the above-mentioned problems, we have developed a new method which uses synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green's functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) to compute the Green's functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green's functions computed by DSM were accurate at frequencies up to 1 Hz. Next we applied the new method to the rupture process analysis of the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Sinbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that from the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and reflection phases near the source, and also accurately computes later phases such as the PP wave.
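
    Regardless of how the Green's functions are computed (ray theory or DSM), the slip inversion itself reduces to a linear least-squares problem relating observed waveforms to subfault slip parameters. The sketch below is schematic only: a random matrix stands in for the DSM-computed Green's functions, and the damping and dimensions are arbitrary; it is not the Kikuchi et al. implementation.

    ```python
    import numpy as np

    # d = G m : stacked observed waveforms = Green's functions x slip parameters.
    rng = np.random.default_rng(2)
    n_samples, n_subfaults = 600, 24
    G = rng.normal(size=(n_samples, n_subfaults))       # placeholder Green's functions
    m_true = np.abs(rng.normal(size=n_subfaults))       # non-negative slip (illustrative)
    d = G @ m_true + rng.normal(scale=0.05, size=n_samples)

    # Damped least-squares solution for the slip distribution.
    damping = 0.1
    m_est, *_ = np.linalg.lstsq(
        np.vstack([G, damping * np.eye(n_subfaults)]),
        np.concatenate([d, np.zeros(n_subfaults)]),
        rcond=None,
    )
    print(np.round(m_est, 2))
    ```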

  14. Method Development in the Area of Multi-Block Analysis Focused on Food Analysis

    DEFF Research Database (Denmark)

    Biancolillo, Alessandra

    In data analysis one could be interested in the relations among a number of data sets (data blocks) having different origin. In food science, this can be particularly relevant. For instance, developing a new product, one may need to understand the relation between physical/chemical data, sensory...... could have not only different origin, but measurements could be taken at different time points or by multi-channel instruments. It has been demonstrated, that it is more convenient to extract information from multi-block data sets handling all the blocks at the same time. Namely, performing data fusion......-development and method-testing in the multi-block analysis field, with a specific focus on food analysis. Novel approaches will be compared with other well-known methods used in the same field and they will be applied both in regression and in classification. The new methodologies will be tested on simulations...

  15. Development of Rotor Diagnosis Method via Motor Current Signature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Seok; Huh, Hyung; Kim, Min Hwan; Jeong, Kyeong Hoon; Lee, Gyu Mhan; Park, Jin Ho; Park, Keun Bae; Lee, Cheol Kwon; Hur, S

    2006-01-15

    A study on motor current signature analysis has been performed to monitor a journal bearing fault due to increasing clearance. It is known that journal bearing clearance produces sideband frequencies in the motor current, at the supply current frequency plus and minus the rotational rotor frequency. However, the mere existence of the sideband frequencies is not sufficient to diagnose whether the journal bearing is sound or not. Four journal bearing sets with different clearances were used to measure the sideband frequency amplitude and the rotor vibration amplitude versus the journal bearing clearance. Both the sideband frequency amplitude and the rotor vibration amplitude increase as the journal bearing clearance increases. This trend suggests that the ASME OM vibration guideline can be applied to estimate the journal bearing clearance size. In this research, 2.5 times the reference sideband amplitude is suggested as an indicator of a journal bearing fault. Further study is necessary to establish more specific quantitative relations between the sideband frequency amplitude and the journal bearing clearance of a motor.
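
    A sketch of extracting the sideband amplitudes at the supply frequency plus/minus the rotor rotational frequency from a sampled motor current. The sampling rate, frequencies and sideband levels are assumed values for a synthetic signal, not the test-rig parameters of the study.

    ```python
    import numpy as np

    fs = 5000.0                      # sampling rate, Hz (assumed)
    f_supply, f_rotor = 60.0, 29.5   # supply and rotor rotational frequencies (assumed)
    t = np.arange(0, 10, 1 / fs)

    # Synthetic current: supply component plus small sidebands from bearing clearance.
    i = (np.sin(2 * np.pi * f_supply * t)
         + 0.02 * np.sin(2 * np.pi * (f_supply - f_rotor) * t)
         + 0.02 * np.sin(2 * np.pi * (f_supply + f_rotor) * t))

    spec = np.abs(np.fft.rfft(i)) / len(t) * 2          # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(t), 1 / fs)

    def amplitude_at(f):
        return spec[np.argmin(np.abs(freqs - f))]

    ref = amplitude_at(f_supply)
    for f in (f_supply - f_rotor, f_supply + f_rotor):
        print(f"{f:.1f} Hz sideband / supply = {amplitude_at(f) / ref:.3f}")
    ```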

  16. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has always puzzled engineers and designers. At present, several computational methods of design by analysis have been developed and applied for calculating and categorizing the stress field of pressure vessel components, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, the GLOSS R-Node method and so on. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results obtained with the different calculation and analysis methods mentioned above. This is the main reason limiting the wide application of the design by analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized to help select the proper computational method when designing a pressure vessel component by analysis. (authors)

  17. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecturek Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of solid residue after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Due to the different fuels used, the different compositions and the locations where the solid residue is created, it was necessary to develop two methods. The first method is used for identifying the composition of solid residue after fuel oil combustion (Method 1), while the second is used for identifying the composition of solid residue after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  18. A Product Analysis Method and its Staging to Develop Redesign Competences

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e. the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product encompassing both a user-oriented and a technical perspective, as well as to synthesise solution proposals for the upgraded variant. In the course module Product Analysis and Redesign we have developed a product analysis method and a staging of it, which seems to be very productive. In this paper we present the product analysis method and its staging and we reflect on the students' application of it. We conclude that the method is a valid contribution to develop the students' redesign competences.

  19. Development and application of an ELISA method for the analysis of protein-based binding media of artworks

    DEFF Research Database (Denmark)

    Lee, Hae Young; Atlasevich, Natalya; Granzotto, Clara

    2015-01-01

    Development and application of an ELISA method for the analysis of protein-based binding media of artworks.

  20. Rodgers' evolutionary concept analysis--a valid method for developing knowledge in nursing science.

    Science.gov (United States)

    Tofthagen, Randi; Fagerstrøm, Lisbeth M

    2010-12-01

    In nursing science, concept development is a necessary prerequisite for meaningful basic research. Rodgers' evolutionary concept analysis is a method for developing knowledge in nursing science. The purpose of this article is to present Rodgers' evolutionary concept analysis as a valid scientific method. A brief description of the evolutionary process, from data collection to data analysis, with the concept's context, surrogate and related terms, antecedents, attributes, examples and consequences, is presented. The phases used in evolutionary concept analysis are illustrated with eight actual studies (1999-2009) from nursing research. The strength of the method is that it is systematic, with a focus on clear-cut phases during the analysis process, and that it can contribute to clarifying, describing and explaining concepts central to nursing science by analysing how a chosen concept has been used both within the discipline itself and in other health sciences. While an interdisciplinary perspective which stresses the similarities and dissimilarities of how a concept is used in various disciplines can increase knowledge of a concept, it is important to clarify its specific use within the discipline. Nursing research should focus on the unambiguous use of concepts, for which Rodgers' method is one possible approach. The importance of using quality criteria to determine the inclusion of material should, however, be emphasised in the continued development of the method. © 2010 The Authors. Scandinavian Journal of Caring Sciences © 2010 Nordic College of Caring Science.

  1. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.

  2. Development and validation of a multiresidue method for pesticide analysis in honey by UFLC-MS

    Directory of Open Access Journals (Sweden)

    Adriana M. Zamudio S.

    2017-05-01

    A method for the determination of pesticide residues in honey by ultra fast liquid chromatography coupled with mass spectrometry was developed. For this purpose, different variations of the QuEChERS method were evaluated: (i) amount of sample, (ii) type of salt used to control pH, (iii) buffer pH, and (iv) different mixtures for clean-up. In addition, to demonstrate that the method is reliable, different validation parameters were studied: accuracy, limits of detection and quantification, linearity and selectivity. The results showed that the changes introduced made it possible to obtain a more selective method that improves the accuracy for about 19 of the pesticides selected from the original method. The method was found to be suitable for the analysis of 50 out of 56 pesticides. Furthermore, with the developed method, recoveries between 70 and 120% and relative standard deviations below 15% were obtained.
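
    The acceptance criteria quoted (recovery 70-120 %, RSD below 15 %) follow from the usual computations on replicate spiked samples. A small illustrative sketch with made-up replicate results and an assumed spike level:

    ```python
    import numpy as np

    spiked_level = 10.0                                    # ug/kg, assumed spike concentration
    measured = np.array([9.1, 9.8, 10.4, 9.5, 10.1])       # replicate results (illustrative)

    recovery = measured.mean() / spiked_level * 100        # %
    rsd = measured.std(ddof=1) / measured.mean() * 100     # relative standard deviation, %

    print(f"recovery = {recovery:.1f} %, RSD = {rsd:.1f} %")
    print("acceptable:", 70 <= recovery <= 120 and rsd < 15)
    ```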

  3. Development of pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Ishii, Kazuya; Hino, Tetsushi; Mitsuyasu, Takeshi; Aoyama, Motoo

    2009-01-01

    A three-dimensional direct response matrix method using a Monte Carlo calculation has been developed. The direct response matrix is formalized by four subresponse matrices in order to respond to a core eigenvalue k and thus can be recomposed at each outer iteration in core analysis. The subresponse matrices can be evaluated by ordinary single fuel assembly calculations with the Monte Carlo method in three dimensions. Since these subresponse matrices are calculated for the actual geometry of the fuel assembly, the effects of intra- and inter-assembly heterogeneities can be reflected on global partial neutron current balance calculations in core analysis. To verify this method, calculations for heterogeneous systems were performed. The results obtained using this method agreed well with those obtained using direct calculations with a Monte Carlo method. This means that this method accurately reflects the effects of intra- and inter-assembly heterogeneities and can be used for core analysis. A core analysis method, in which neutronic calculations using this direct response matrix method are coupled with thermal-hydraulic calculations, has also been developed. As it requires neither diffusion approximation nor a homogenization process of lattice constants, a precise representation of the effects of neutronic heterogeneities is possible. Moreover, the fuel rod power distribution can be directly evaluated, which enables accurate evaluations of core thermal margins. A method of reconstructing the response matrices according to the condition of each node in the core has been developed. The test revealed that the neutron multiplication factors and the fuel rod neutron production rates could be reproduced by interpolating the elements of the response matrix. A coupled analysis of neutronic calculations using the direct response matrix method and thermal-hydraulic calculations for an ABWR quarter core was performed, and it was found that the thermal power and coolant
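
    The outer iteration on the core eigenvalue k described above follows the familiar power-iteration pattern. The sketch below is only schematic: a generic positive operator stands in for the global neutron balance assembled from the response matrices, and no actual partial-current subresponse formulation is reproduced.

    ```python
    import numpy as np

    # Schematic k-eigenvalue outer iteration on a placeholder balance operator M.
    rng = np.random.default_rng(3)
    M = np.abs(rng.normal(size=(20, 20)))      # placeholder operator, not a real core model

    source = np.ones(20)
    k = 1.0
    for _ in range(200):
        fission = M @ source                   # next-generation source
        k_new = fission.sum() / source.sum()   # eigenvalue estimate
        source = fission / fission.sum()       # renormalize the source
        if abs(k_new - k) < 1e-8:
            break
        k = k_new

    print("converged k of the placeholder operator:", round(k, 6))
    ```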

  4. History and Development of the Schmidt-Hunter Meta-Analysis Methods

    Science.gov (United States)

    Schmidt, Frank L.

    2015-01-01

    In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…

  5. The development of methods of analysis of documents on the basis of the methods of Raman spectroscopy and fluorescence analysis

    Science.gov (United States)

    Gorshkova, Kseniia O.; Tumkin, Ilya I.; Kirillova, Elizaveta O.; Panov, Maxim S.; Kochemirovsky, Vladimir A.

    2017-05-01

    The natural aging of writing inks on paper was investigated using Raman spectroscopy. Based on the obtained dependencies of the ratios of Raman peak intensities on the exposure time, a dye degradation model was proposed. It was suggested that several competing bond-breaking and bond-forming reactions, corresponding to the characteristic vibration frequencies of the dye molecule, occur simultaneously during the ink aging process. We also propose a methodology based on the study of the optical properties of paper, particularly changes in the fluorescence of the optical brighteners included in its composition, as well as the paper reflectivity, using spectrophotometric methods. These results can be used to develop a novel and promising method for criminology.

  6. Adjoint-based Mesh Optimization Method: The Development and Application for Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    Son, Seongmin; Lee, Jeong Ik

    2016-01-01

    In this research, a method for optimizing the mesh distribution is proposed. The proposed method uses an adjoint-based optimization method (adjoint method). The optimized result is obtained by applying this meshing technique to the existing code input deck and is compared to the results produced with the uniform meshing method. Numerical solutions are calculated with an in-house 1D Finite Difference Method code, neglecting axial conduction. The fuel radial nodes were first optimized to best match the Fuel Centerline Temperature (FCT). This was followed by optimizing the axial nodes so that the Peak Cladding Temperature (PCT) is matched best. After obtaining the optimized radial and axial nodes, the nodalization was implemented into the system analysis code and transient analyses were performed to observe the performance of the optimized nodalization. The adjoint-based mesh optimization method developed in this study is applied to MARS-KS, a nuclear system analysis code. Results show that the newly established method yields better results than the uniform meshing method from the numerical point of view. It is again stressed that the mesh optimized for the steady state can also give better numerical results even during a transient analysis.

  7. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. Principally, it starts with defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future were discussed. Second, the importance of task analysis methods in rapid e-learning was considered, with learning technologies as asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies were defined and clarified with examples; that is, all steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches, such as m-learning and other delivery systems, were discussed. As a result, the concept of task analysis, rapid e-learning development strategies and the essentials of online course design were discussed, alongside learner interface design features for learners and designers.

  8. A new method of analysis enabled a better understanding of clinical practice guideline development processes.

    Science.gov (United States)

    Moreira, Tiago; May, Carl; Mason, James; Eccles, Martin

    2006-11-01

    To describe the process by which various forms of evidence are discussed, valued, and interpreted within the process of developing evidence-based clinical practice guidelines and, in so doing, to develop a method for such studies. An observational study. Two guideline development groups were observed by a nonparticipant observer. The 21 meetings were recorded, transcribed, and analyzed using grounded theory and frame analysis. Qualitative analysis was complemented with descriptive statistics. The groups organized their discussion around four domains--'science', 'practice', 'politics', and 'process'--and used boundary work to mediate between these domains. Both groups spent most time discussing 'science', followed by 'practice' or its relation with 'science'. Our analysis offers an innovative, replicable method of analysis of guideline development that permits the identification of the proportions and interrelations between the knowledge domains deployed by guideline groups. This analysis also suggests that the participation hierarchy observed here and by others might be an effect of the imbalanced use of knowledge domains in the construction of clinical guidance. This constitutes an important framework to understand the interplay of participants and knowledge in guideline development.

  9. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  10. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    Science.gov (United States)

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Development and validation of an extraction method for the analysis of perfluoroalkyl substances in human hair.

    Science.gov (United States)

    Kim, Da-Hye; Oh, Jeong-Eun

    2017-05-01

    Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Factor analysis methods and validity evidence: a review of instrument development across the medical education continuum.

    Science.gov (United States)

    Wetzel, Angela P

    2012-08-01

    Instrument development consistent with best practices is necessary for effective assessment and evaluation of learners and programs across the medical education continuum. The author explored the extent to which current factor analytic methods and other techniques for establishing validity are consistent with best practices. The author conducted electronic and hand searches of the English-language medical education literature published January 2006 through December 2010. To describe and assess current practices, she systematically abstracted reliability and validity evidence as well as factor analysis methods, data analysis, and reported evidence from instrument development articles reporting the application of exploratory factor analysis and principal component analysis. Sixty-two articles met eligibility criteria. They described 64 instruments and 95 factor analyses. Most studies provided at least one source of evidence based on test content. Almost all reported internal consistency, providing evidence based on internal structure. Evidence based on response process and relationships with other variables was reported less often, and evidence based on consequences of testing was not identified. Factor analysis findings suggest common method selection errors and critical omissions in reporting. Given the limited reliability and validity evidence provided for the reviewed instruments, educators should carefully consider the available supporting evidence before adopting and applying published instruments. Researchers should design for, test, and report additional evidence to strengthen the argument for reliability and validity of these measures for research and practice.

  13. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Lattice composite fuselage structures are developed as an alternative to conventional composite structures based on a laminated skin and stiffeners. The structural layout of lattice structures makes it possible to exploit the advantages of current composite materials to a maximal extent while minimizing their main shortcomings, which provides higher weight efficiency for these structures in comparison with conventional analogues. The development and creation of lattice composite structures requires novel methods of strength analysis, as conventional methods, as a rule, are aimed at the strength analysis of thin-walled elements and do not give a confident estimation of the local strength of highly loaded unidirectional composite ribs. In the present work a method for the operative strength analysis of lattice composite structures is presented, based on specialized FE models of unidirectional composite ribs and their intersections. In the framework of the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition that the stiffness characteristics of the real rib and of the model are equal. This method allows local strength analysis of the highly loaded ribs of a lattice structure without the use of three-dimensional finite elements, which shortens the calculation time and considerably simplifies the analysis of the calculation results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype of the shell of a lattice composite fuselage section have been used. The prototype of the lattice section was manufactured at CRISM and tested at TsAGI within the framework of a number of Russian and international scientific projects. The validation results have shown that the suggested method provides high operability of strength analysis, keeping

  14. Development of Hydrophilic Interaction Liquid Chromatography Method for the Analysis of Moxonidine and Its Impurities

    Directory of Open Access Journals (Sweden)

    Slavica Filipic

    2016-01-01

    A fast and simple hydrophilic interaction liquid chromatography (HILIC) method was developed and validated for the analysis of moxonidine and its four impurities (A, B, C, and D) in a pharmaceutical dosage form. All experiments were performed on the Agilent Technologies 1200 high-performance liquid chromatography (HPLC) system using a Zorbax RX-SIL, 250 mm × 4.6 mm, 5 μm column as stationary phase (T = 25°C, F = 1 mL/min, and λ = 255 nm) and a mixture of acetonitrile and 40 mM ammonium formate buffer (pH 2.8) 80:20 (v/v) as mobile phase. Under the optimal chromatographic conditions, selected by central composite design, separation and analysis of moxonidine and its four impurities are achieved within 12 minutes. Validation of the method was conducted in accordance with ICH guidelines. Based on the obtained results, selectivity, linearity (r ≥ 0.9976), accuracy (recovery: 93.66%–114.08%), precision (RSD: 0.56%–2.55%), and robustness of the method were confirmed. The obtained values of the limit of detection and quantification revealed that the method can be used for determination of impurity levels below 0.1%. The validated method was applied for determination of moxonidine and its impurities in a commercially available tablet formulation. The obtained results confirmed that the validated method is fast, simple, and reliable for analysis of moxonidine and its impurities in tablets.

  15. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    International Nuclear Information System (INIS)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs

  16. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs.

  17. An Analysis of Air Force Management Career Development Based on Timing of Skills Needs and Effectiveness of Development Methods

    Science.gov (United States)

    1991-09-01

    method(s) for the Controlling skill. The histogram of the preferred development methods is presented in the figure below. [Figure: Controlling Skill, preferred development methods] ...development method(s) for the Motivation skill. The histogram of the preferred development methods is presented in the figure below. [Figure: Motivation Skill, preferred development methods]

  18. Development of a Probabilistic Tsunami Hazard Analysis Method and Application to an NPP in Korea

    International Nuclear Information System (INIS)

    Kim, M. K.; Choi, Ik

    2012-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In the tsunami hazard analysis, evaluation of the tsunami return period is a major task; here the tsunami return period was evaluated with an empirical method using historical tsunami records and tidal gauge records. For the tsunami fragility analysis, a procedure was established and the target equipment and structures for the tsunami fragility assessment were selected. A sample fragility calculation was performed for equipment in a nuclear power plant. For the system analysis, the accident sequence of a tsunami event was developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) was determined. For application to a real nuclear power plant, the Ulchin 5 and 6 NPP, located on the east coast of the Korean peninsula, was selected. Through this study, the whole tsunami PSA (Probabilistic Safety Assessment) working procedure was established and an example calculation was performed for one nuclear power plant in Korea.
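
    A hedged sketch of the empirical return-period estimate mentioned above, using a made-up catalogue of historical run-up heights and an assumed observation window; the actual Korean records are not reproduced here.

    ```python
    import numpy as np

    # Hypothetical catalogue: tsunami run-up heights (m) observed at a site.
    runups = np.array([0.4, 0.7, 1.1, 0.3, 2.5, 0.9, 1.8, 0.5])
    observation_years = 120.0                    # assumed length of the historical record

    def return_period(threshold_m: float) -> float:
        """Empirical return period of run-up exceeding threshold_m."""
        exceedances = np.sum(runups >= threshold_m)
        if exceedances == 0:
            return float("inf")
        annual_rate = exceedances / observation_years
        return 1.0 / annual_rate

    for h in (0.5, 1.0, 2.0):
        print(f"run-up >= {h} m: return period ~ {return_period(h):.0f} years")
    ```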

  19. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. Online digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several online measured system parameters. SCOMS calculates the variables of the limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology which statistically combines the uncertainty components of the SMART core protection and monitoring systems. By applying the overall uncertainty factors in the online SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative with a 95/95 probability/confidence level. In this paper, the uncertainty analysis method for the SMART core protection and monitoring systems is described.
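
    How the individual uncertainty components feed a single 95/95 factor is not spelled out in the abstract; the sketch below shows one common convention (root-sum-square combination of independent one-sided components), labeled clearly as an assumption rather than the SMART methodology itself, with made-up component values.

    ```python
    import math

    # Hypothetical independent uncertainty components (fractions of nominal DNBR),
    # each assumed to be already expressed at a 95% one-sided level.
    components = {
        "measurement": 0.021,
        "code_model": 0.034,
        "manufacturing": 0.015,
    }

    # Root-sum-square combination of independent components (assumed convention).
    combined = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined 95/95 uncertainty factor ~ {1 + combined:.3f} x nominal")
    ```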

  20. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek

    2017-12-01

    The paper deals with the evaluation of a 3D scanning method elaborated by the authors, applied to the analysis of the wear of forging tools. The 3D scanning method consists primarily in applying scanning to the analysis of changes in the geometry of a forging tool by comparing images of the worn tool with a CAD model or an image of the new tool. The method was evaluated in the context of the significant measurement problems resulting from the extreme conditions present in industrial hot forging processes. The method was used to evaluate the wear of tools with an increasing degree of wear, which made it possible to determine the wear characteristics as a function of the number of produced forgings. The following stage was to use it for direct control of the quality and geometry changes of forging tools (without their disassembly) by way of a direct measurement of the geometry of periodically collected forgings (an indirect method based on forgings). The final part of the study points to the advantages and disadvantages of the elaborated method as well as the potential directions of its further development.
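
    Comparing a scan of the worn tool with the CAD model amounts to computing point-to-reference deviations. A minimal sketch using nearest-neighbour distances between two point clouds; the point sets, offset and units are synthetic stand-ins, whereas the real workflow uses dense scanner output and a meshed CAD reference.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(4)

    # Reference points sampled from the CAD model of the new tool (illustrative).
    reference = rng.uniform(0, 50, size=(5000, 3))

    # Worn-tool scan: same points with material loss simulated as a small offset plus noise.
    worn = reference + np.array([0.0, 0.0, -0.12]) + rng.normal(scale=0.01, size=reference.shape)

    tree = cKDTree(reference)
    deviation, _ = tree.query(worn)          # distance of each scanned point to the reference

    print(f"mean deviation {deviation.mean():.3f} mm, max {deviation.max():.3f} mm")
    ```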

  1. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh

    2012-05-01

    Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method for quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution using a Knauer C18, 5 μm, 4.6 × 150 mm column and a mobile phase consisting of acetonitrile and ammonium acetate buffer at a flow rate of 1.5 mL/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 125–2000 ng/mL. For all quality control (QC) standards in the intraday and interday assays, the accuracy and precision ranges were -0.96 to 6.30 and 0.86 to 13.74, respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedure. Conclusion: The rapid and sensitive method developed can therefore be used for routine analysis of sirolimus, such as dissolution and stability assays of pre- and post-marketed dosage forms.

  2. Method development for trace analysis of heteroaromatic compounds in contaminated groundwater

    DEFF Research Database (Denmark)

    Johansen, Sys Stybe; Hansen, Asger B.; Mosbæk, Hans

    1996-01-01

    Keywords: water analysis, environmental analysis, extraction methods, aromatic compounds, heteroaromatic compounds, creosote, dichloromethane, diethyl ether, pentane

  3. k0-neutron activation analysis based method at CDTN: history, development and main achievements

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Ângela de B.C.; Jacimovic, Radojko; Dalmazio, Ilza, E-mail: menezes@cdtn.br, E-mail: id@cdtn.br, E-mail: radojko.jacimovic@ijs.si [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte - MG (Brazil); Jožef Stefan Institute, Department of Environmental Sciences, Ljubljana (Slovenia)

    2017-11-01

    Neutron Activation Analysis (NAA) is an analytical technique to assay the elemental chemical composition in samples of several matrices. It has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear (Nuclear Technology Development Centre)/Comissao Nacional de Energia Nuclear (Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the starting up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of this technique, the k0-standardization method, which was established at CDTN in 1995, is the most efficient, and in 2003 it was re-established and optimized. This paper is about the history and the main achievements since then. (author)

  4. k0-neutron activation analysis based method at CDTN: history, development and main achievements

    International Nuclear Information System (INIS)

    Menezes, Maria Ângela de B.C.; Jacimovic, Radojko; Dalmazio, Ilza

    2017-01-01

    Neutron Activation Analysis (NAA) is an analytical technique to assay the elemental chemical composition in samples of several matrices. It has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear (Nuclear Technology Development Centre)/Comissao Nacional de Energia Nuclear (Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the starting up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of this technique, the k0-standardization method, which was established at CDTN in 1995, is the most efficient, and in 2003 it was re-established and optimized. This paper is about the history and the main achievements since then. (author)

  5. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples using solid phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to achieve the enrichment of trace, low-abundance hydrocarbons and coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature, and fiber type, were systematically optimized. The results not only demonstrated a high extraction yield but also showed that hydrogen isotopic fractionation was not observed during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The applicability of the SPME-GC/IRMS method was evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for replicate measurements, and hydrogen isotope values from C1 to C9 could be obtained with satisfactory repeatability. The SPME-GC/IRMS method fitted with the PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.
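
    The hydrogen isotopic composition reported by GC/IRMS is conventionally expressed in the standard delta notation relative to VSMOW; the formula below is the general definition, not something specific to this paper.

    ```latex
    \delta^{2}\mathrm{H}\ [\text{per mil}] =
      \left( \frac{R_{\text{sample}}}{R_{\text{VSMOW}}} - 1 \right) \times 1000,
    \qquad R = {}^{2}\mathrm{H} \,/\, {}^{1}\mathrm{H}
    ```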

  6. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering; Behravesh, M.M. [Electric Power Research Institute, Palo Alto, CA (United States); Henry, G. [EPRI NDE Center, Charlotte, NC (United States)

    1999-09-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)
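
    As a toy illustration of fuzzy-logic decision-making of the kind described (not the EDDYAI rule base itself), the sketch below maps an eddy current signal amplitude and phase to a flaw membership value and thresholds it; all membership parameters are invented for the example.

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership function rising a->b, flat b->c, falling c->d."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def flaw_membership(amplitude_v, phase_deg):
        # Illustrative membership functions; real rule bases are tuned to calibration data.
        large_amp = trapezoid(amplitude_v, 0.5, 1.5, 10.0, 12.0)
        flaw_like_phase = trapezoid(phase_deg, 20.0, 40.0, 140.0, 160.0)
        return min(large_amp, flaw_like_phase)      # fuzzy AND (minimum)

    signal = {"amplitude_v": 2.3, "phase_deg": 95.0}
    mu = flaw_membership(**signal)
    print(f"flaw membership = {mu:.2f} ->", "report indication" if mu > 0.5 else "no call")
    ```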

  7. Method development for trace analysis of heteroaromatic compounds in contaminated groundwater

    International Nuclear Information System (INIS)

    Johansen, S.S.; Hansen, A.B.; Mosbaek, H.

    1996-01-01

    An analytical method providing high sensitivity (limit of quantitation of 50 ng/l) with acceptable reproducibility (mean R.S.D. 19%) has been developed for determining heteroaromatic compounds in creosote-contaminated groundwater. Among the techniques compared, liquid-liquid extraction using either dichloromethane, diethyl ether or pentane and solid-phase extraction with reversed-phase bonded columns, the highest recovery and reproducibility were obtained with classical liquid-liquid extraction with dichloromethane from weakly basic solutions, followed by GC-MS (selected ion monitoring) analysis of the concentrated extracts.

  8. Fast analysis of glibenclamide and its impurities: quality by design framework in capillary electrophoresis method development.

    Science.gov (United States)

    Furlanetto, Sandra; Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Mura, Paola; Pinzauti, Sergio

    2015-10-01

    A fast capillary zone electrophoresis method for the simultaneous analysis of glibenclamide and its impurities (I(A) and I(B)) in pharmaceutical dosage forms was fully developed within a quality by design framework. Critical quality attributes were represented by I(A) peak efficiency, critical resolution between glibenclamide and I(B), and analysis time. Experimental design was used for rapid and systematic method optimization. A 3^5//16 symmetric screening matrix was chosen for investigating the five selected critical process parameters throughout the knowledge space, and the results obtained were the basis for planning the subsequent response surface study. A Box-Behnken design for three factors allowed the contour plots to be drawn and the design space to be identified by introducing the concept of probability. The design space corresponded to the multidimensional region where all the critical quality attributes reached the desired values with a probability π ≥ 90%. Under the selected working conditions, full separation of the analytes was obtained in less than 2 min. A full factorial design simultaneously allowed the design space to be validated and method robustness to be tested. A control strategy was finally implemented by means of a system suitability test. The method was fully validated and applied to real samples of glibenclamide tablets.
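
    The probability-based design space described above lends itself to a brief Monte Carlo illustration. The sketch below uses hypothetical response models, uncertainties and acceptance limits (none of them taken from the paper) to estimate, over a grid of two coded process parameters, the probability that all critical quality attributes simultaneously meet their limits.

        import numpy as np

        # Hypothetical response models for two critical quality attributes
        # as functions of two coded process parameters (assumptions, not the
        # authors' fitted models).
        def resolution(x1, x2):
            return 2.0 + 0.8 * x1 - 0.5 * x2 - 0.3 * x1 * x2

        def analysis_time(x1, x2):  # minutes
            return 2.5 - 0.6 * x1 + 0.4 * x2

        rng = np.random.default_rng(0)
        N = 5000                              # Monte Carlo draws per grid point
        SIGMA_RES, SIGMA_TIME = 0.1, 0.15     # assumed predictive uncertainties

        grid = np.linspace(-1, 1, 21)
        design_space = []
        for x1 in grid:
            for x2 in grid:
                res = resolution(x1, x2) + rng.normal(0, SIGMA_RES, N)
                t = analysis_time(x1, x2) + rng.normal(0, SIGMA_TIME, N)
                # probability that every attribute meets its limit simultaneously
                p = np.mean((res >= 1.5) & (t <= 2.0))
                if p >= 0.90:                 # the pi >= 90% criterion
                    design_space.append((x1, x2, p))

        print(len(design_space), "grid points fall inside the design space")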

  9. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    Science.gov (United States)

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

    Most analytical methods for butyltins are based on high resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using High Performance Liquid Chromatography (HPLC) with UV detection. The developed method was used to determine tributyltin (TBT), dibutyltin (DBT) and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed against standard GC/MS techniques and verified by a statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) of TBT and DBT were 0.70 and 0.50 microg/mL, respectively. The optimized extraction method for butyltins in water and sediment samples used hexane containing 0.05-0.5% tropolone and 0.2% sodium chloride in water at pH 1.7. Quantitative extraction of butyltin compounds from a certified reference material (BCR-646) and naturally contaminated samples was achieved, with recoveries ranging from 95 to 108% and %RSD of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the contamination level of butyltins in environmental samples collected from the Forth and Clyde canal, Scotland, UK. The values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high resolution methods are used extensively for this type of research, the developed method is cheaper in terms of both equipment and running costs, faster in analysis time, and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.

  10. Development and validation of a reversed phase liquid chromatographic method for analysis of griseofulvin and impurities.

    Science.gov (United States)

    Kahsay, Getu; Adegoke, Aremu Olajire; Van Schepdael, Ann; Adams, Erwin

    2013-06-01

    A simple and robust reversed phase liquid chromatographic method was developed and validated for the quantitative determination of griseofulvin (GF) and its impurities in drug substances and drug products (tablets). Chromatographic separation was achieved on a Discovery C18 (250 mm × 4.6 mm, 5 μm) column kept at 30 °C. The mobile phase consisted of a gradient mixture of mobile phase A (water-0.1% formic acid pH 4.5, 80:20, v/v) and B (ACN-water-0.1% formic acid pH 4.5, 65:15:20, v/v/v) pumped at a flow rate of 1.0 mL/min. UV detection was performed at 290 nm. The method was validated for its robustness, sensitivity, precision, accuracy and linearity based on ICH guidelines. The robustness study was performed by means of an experimental design and multivariate analysis. Satisfactory results were obtained from the validation studies. The use of volatile mobile phases allowed for the identification of three main impurities present above the identification threshold using mass spectrometry (MS). The developed LC method has been applied for the assay and impurity determination of GF drug substances and tablets. The method could be very useful for the quality control of GF and its impurities in bulk and formulated dosage forms. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Development of a correction method for the time-of-flight prompt gamma-ray analysis

    Science.gov (United States)

    Huang, M.; Toh, Y.; Ebihara, M.; Kimura, A.; Nakamura, S.

    2017-03-01

    A new analytical technique, time-of-flight prompt gamma-ray analysis, has been developed at the Japan Proton Accelerator Research Complex. In order to apply it to accurate elemental analysis, a set of Fe and Au reference samples were measured to examine the several factors which affect the number of detected events. It was found that major contributing factors included attenuations of neutrons and gamma rays in the sample, live-time fraction and signal pile-up correction. A simulation model was built for the estimation of neutron and gamma-ray attenuations. A simple empirical formula was proposed to calculate the signal pile-up correction factor. The whole correction method has proven to be accurate and reliable.
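
    A hedged sketch of how the individual corrections mentioned above might be combined multiplicatively into a corrected peak area; the factor names and the illustrative numbers are assumptions, not the authors' exact expressions.

        def corrected_counts(raw_counts, f_neutron_att, f_gamma_att,
                             live_time_fraction, pileup_survival):
            """Apply multiplicative corrections to a detected peak area.

            f_neutron_att, f_gamma_att : fractions of neutrons / gamma rays that
                survive attenuation in the sample (0 < f <= 1), e.g. taken from a
                simulation model of the sample geometry.
            live_time_fraction : fraction of real time the acquisition was live.
            pileup_survival    : fraction of events not lost to signal pile-up.
            """
            return raw_counts / (f_neutron_att * f_gamma_att
                                 * live_time_fraction * pileup_survival)

        # Illustrative numbers only
        print(corrected_counts(1.2e5, 0.95, 0.90, 0.88, 0.97))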

  12. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  13. Development of Quality Control Method for Glucofarmaka Antidiabetic Jamu by HPLC Fingerprint Analysis

    Directory of Open Access Journals (Sweden)

    Hanifullah Habibie

    2017-04-01

    Full Text Available Herbal medicines are becoming increasingly popular all over the world for preventive and therapeutic purposes. Quality control of herbal medicines is important to ensure their safety and efficacy. Chromatographic fingerprinting has been accepted by the World Health Organization as a reliable strategy for the quality control of herbal medicines. In this study, a high-performance liquid chromatography fingerprint analysis was developed as a quality control method for glucofarmaka antidiabetic jamu. The optimum fingerprint chromatogram was obtained using C18 as the stationary phase, linear gradient elution with 10-95% acetonitrile:water as the mobile phase within 60 minutes of elution, and detection at 210 nm. About 20 peaks were detected and could be used as the fingerprint of glucofarmaka jamu. To evaluate the analytical performance of the method, we determined the precision, reproducibility, and stability, which gave reliable results. The proposed method could be used as a quality control method for glucofarmaka antidiabetic jamu and also for its raw materials.

  14. Methods and considerations for longitudinal structural brain imaging analysis across development

    Directory of Open Access Journals (Sweden)

    Kathryn L. Mills

    2014-07-01

    Full Text Available Magnetic resonance imaging (MRI) has allowed the unprecedented capability to measure the human brain in vivo. This technique has paved the way for longitudinal studies exploring brain changes across the entire life span. Results from these studies have given us a glimpse into the remarkably extended and multifaceted development of our brain, converging with evidence from anatomical and histological studies. Ever-evolving techniques and analytical methods provide new avenues to explore and questions to consider, requiring researchers to balance excitement with caution. This review addresses what MRI studies of structural brain development in children and adolescents typically measure, and how. We focus on measurements of brain morphometry (e.g., volume, cortical thickness, surface area, folding patterns), as well as measurements derived from diffusion tensor imaging (DTI). By integrating findings from multiple longitudinal investigations, we give an update on current knowledge of structural brain development and how it relates to other aspects of biological development and possible underlying physiological mechanisms. Further, we review and discuss current strategies in image processing, analysis techniques and modeling of brain development. We hope this review will aid current and future longitudinal investigations of brain development, as well as evoke a discussion amongst researchers regarding best practices.

  15. Analysis and development of methods of correcting for heterogeneities to cobalt-60: computing application

    International Nuclear Information System (INIS)

    Kappas, K.

    1982-11-01

    The purpose of this work is the analysis of the influence of inhomogeneities of the human body on the determination of the dose in Cobalt-60 radiation therapy. The first part is dedicated to the physical characteristics of inhomogeneities and to the conventional methods of correction. New correction methods based on an analysis of the scattered radiation are proposed: 'the differential TAR method' and 'the beam subtraction method'. This analysis makes it possible to account more accurately for the physical characteristics of the inhomogeneities and for the corresponding modifications of the dose. The second part is dedicated to the computer implementation of the second correction method for routine application in hospital.

  16. Development of a Thiolysis HPLC Method for the Analysis of Procyanidins in Cranberry Products.

    Science.gov (United States)

    Gao, Chi; Cunningham, David G; Liu, Haiyan; Khoo, Christina; Gu, Liwei

    2018-03-07

    The objective of this study was to develop a thiolysis HPLC method to quantify total procyanidins, the ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Cysteamine was utilized as a low-odor substitute of toluene-α-thiol for thiolysis depolymerization. A reaction temperature of 70 °C and reaction time of 20 min, in 0.3 M of HCl, were determined to be optimum depolymerization conditions. Thiolytic products of cranberry procyanidins were separated by RP-HPLC and identified using high-resolution mass spectrometry. Standard curves of good linearity were obtained for thiolyzed procyanidin dimer A2 and B2 external standards. The detection and quantification limits, recovery, and precision of this method were validated. The new method was applied to quantitate total procyanidins, average degree of polymerization, ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Results showed that the method was suitable for quantitative and qualitative analysis of procyanidins in cranberry products.

  17. Development and Analysis of Train Brake Curve Calculation Methods with Complex Simulation

    Directory of Open Access Journals (Sweden)

    Bela Vincze

    2006-01-01

    Full Text Available This paper describes an efficient method using simulation for developing and analyzing train brake curve calculation methods for the on-board computer of the ETCS system. An application example with actual measurements is also presented.

  18. Development of methods for analysis of trace rare earths and uranium using high resolution ICP-AES

    International Nuclear Information System (INIS)

    Anitha, M.; Kotekar, M.K.; Ambare, D.N.; Singh, H.

    2008-01-01

    The matrix components present in the analyte influence the results of determination by ICP-AES. This paper reports the results of method development for trace element analysis using a high resolution ICP-AES, JY Ultima 2. Methods were developed for the analysis of rare earths (RE) in Dy2O3 for the Advanced Heavy Water Reactor (AHWR), rare earths (RE) in phosphate medium, and trace uranium analysis.

  19. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the frame of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing as far as possible realistic plant specific fire risk analyses (fire PSA). The present Technical Report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the context of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study, it was possible for the first time to determine the probabilities of specified fire effects for a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering effects of stochastics (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided for use within the PSA. (orig.)

  20. Compare the user interface of digital libraries\\' websites between the developing and developed countries in content analysis method

    Directory of Open Access Journals (Sweden)

    Gholam Abbas Mousavi

    2017-03-01

    Full Text Available Purpose: This study was performed with the goals of determining the items involved in designing and developing the user interfaces of digital libraries' websites, determining the best digital libraries' websites and discussing their advantages and disadvantages, and analyzing and comparing digital libraries' websites in developing countries with those in developed countries. Methodology: To do so, 50 digital libraries' websites were selected by a purposive sampling method. By analyzing the level of development of the countries in the sample, 12 websites were classified as belonging to developing countries and 38 to developed countries. Their content was then studied using qualitative content analysis. The study was conducted using a researcher-constructed checklist containing 12 main categories and 44 items, whose validity was established by the content validity method. The data were analyzed in SPSS (version 16). Findings: The results showed that in terms of "online resources", "library collection", and "navigation", there is a significant relationship between the digital library user interface designs in both types of countries. Results: The items "online public access catalogue (OPAC)" and "visits statistics" were observed in more developing countries' digital libraries' websites, whereas the item "menu and submenus to introduce library sections" was present in more developed countries' websites. Moreover, by analyzing the number of items in the selected websites, "American Memory" with 44 items, "International Children Digital Library" with 40 items, and "California" with 39 items were the best, and "Berkeley Sun Site" with 10 items was the worst website. Despite the greater number and better quality of digital libraries in developed countries, the quality of digital libraries' websites in developing countries is considerable. In general, some of the newly established

  1. CZECHOSLOVAK FOOTPRINTS IN THE DEVELOPMENT OF METHODS OF THERMOMETRY, CALORIMETRY AND THERMAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pavel Holba

    2012-07-01

    Full Text Available A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in underpinning the special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as that of Vladimir Šatava in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) published by noteworthy postwar Czechoslovak scholars and scientists and by their disciples in 1950-1980. The 227 itemized references with titles show a rich scientific productivity, revealing that many of these works were ahead of their time even in an international context.

  2. Development of an HPLC, GC/MS method for analysis of HYGAS oil samples

    Energy Technology Data Exchange (ETDEWEB)

    Raphaelian, L A

    1979-06-01

    Direct analysis of a HYGAS oil sample by gas chromatography/mass spectrometry (GC/MS) or capillary column GC/MS is difficult for at least two reasons: (1) due to the large number (probably over 400) of compounds present in the mixture, many overlapping peaks occur, resulting in mass spectra that are often confusing, and (2) moderately to highly polar compounds are not easily chromatographable. In Part 1 of this study, high performance liquid chromatographic (HPLC) methods for separating the complex HYGAS oil samples into fractions were investigated. A satisfactory separation of a HYGAS oil sample into seven fractions was achieved on a µBondapak CN column with a complex gradient of hexane to THF. In Part 2, derivatization as a means for making polar compounds more amenable to identification by capillary column GC/MS was explored. With the use of standard phenols, carboxylic acids, amines, and alcohols, it was found that BSA (a silylating agent) was most effective in derivatizing phenols and alcohols and Methyl-8 Concentrate (an alkylating agent) was most effective in derivatizing carboxylic acids and amines. In Part 3, a preliminary study of the methods developed in Parts 1 and 2, namely HPLC separation into fractions and derivatization of the polar fractions, was undertaken on an authentic HYGAS oil sample to determine whether the methods would make the sample more amenable to analysis by capillary column GC/MS. It was found that, with HPLC, the complex mixtures of HYGAS oil samples are separated into simpler complex mixtures and, with derivatization of the polar fractions, the identification by capillary column GC/MS of polar compounds not normally chromatographable was enhanced.

  3. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The basic statistical methods used in the genetic analysis of human traits are considered, including segregation analysis, linkage analysis and allelic association analysis. Software supporting the implementation of these methods was developed.

  4. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    Science.gov (United States)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    References: "...Calculation Method Based on a Hierarchical Simulation", Journal of Disaster Research, Vol. 11, No. 4; T. Arikawa, K. Hamaguchi, K. Kitagawa, T. Suzuki (2009): "Development of Numerical Wave Tank Coupled with Structure Analysis Based on FEM", Journal of J.S.C.E., Ser. B2 (Coastal Engineering), Vol. 65, No. 1; T. Arikawa et al. (2012): "Failure Mechanism of Kamaishi Breakwaters due to the Great East Japan Earthquake Tsunami", 33rd International Conference on Coastal Engineering, No. 1191.

  5. Analysis and development of stochastic multigrid methods in lattice field theory

    International Nuclear Information System (INIS)

    Grabenstein, M.

    1994-01-01

    We study the relation between the dynamical critical behavior and the kinematics of stochastic multigrid algorithms. The scale dependence of acceptance rates for nonlocal Metropolis updates is analyzed with the help of an approximation formula. A quantitative study of the kinematics of multigrid algorithms in several interacting models is performed. We find that for a critical model with Hamiltonian H(Φ), absence of critical slowing down can only be expected if the expansion of H(Φ+ψ) in terms of the shift ψ contains no relevant term (mass term). The prediction of this rule was verified in a multigrid Monte Carlo simulation of the Sine Gordon model in two dimensions. Our analysis can serve as a guideline for the development of new algorithms: we propose a new multigrid method for nonabelian lattice gauge theory, the time slice blocking. For SU(2) gauge fields in two dimensions, critical slowing down is almost completely eliminated by this method, in accordance with the theoretical prediction. The generalization of the time slice blocking to SU(2) in four dimensions is investigated analytically and by numerical simulations. Compared to two dimensions, the local disorder in the four dimensional gauge field leads to kinematical problems. (orig.)
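
    As a worked illustration of this rule (for a generic lattice phi^4-type Hamiltonian, not one taken from the paper), expanding H(Φ+ψ) in a constant shift ψ applied on a block B makes the relevant term explicit:

        \[
        H(\Phi+\psi) = H(\Phi)
          + \psi \sum_{x \in B} \Bigl( m^2 \Phi_x + \tfrac{\lambda}{3!}\,\Phi_x^3 \Bigr)
          + \tfrac{\psi^2}{2} \sum_{x \in B} \Bigl( m^2 + \tfrac{\lambda}{2}\,\Phi_x^2 \Bigr)
          + \mathcal{O}(\psi^3) + \text{kinetic surface terms},
        \]

    so the quadratic coefficient contains the mass term (1/2) m^2 ψ^2 |B|; by the rule quoted above, its presence signals that critical slowing down cannot be completely eliminated for such a model.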

  6. Enantiomeric separation of the antiuremic drug colchicine by electrokinetic chromatography. Method development and quantitative analysis.

    Science.gov (United States)

    Menéndez-López, Nuria; Valimaña-Traverso, Jesús; Castro-Puyana, María; Salgado, Antonio; García, María Ángeles; Marina, María Luisa

    2017-05-10

    Two analytical methodologies were developed by CE enabling the enantiomeric separation of colchicine, an antiuremic drug commercialized as a pure enantiomer. Succinyl-γ-CD and Sulfated-γ-CD were selected as chiral selectors after a screening of different anionic CDs. Under the optimized conditions, chiral resolutions of 5.6 in 12 min and 3.2 in 8 min were obtained for colchicine with Succinyl-γ-CD and Sulfated-γ-CD, respectively. An opposite enantiomeric migration order was observed with these two CDs, S-colchicine being the first-migrating enantiomer with Succinyl-γ-CD and the second-migrating enantiomer with Sulfated-γ-CD. 1H NMR experiments showed a 1:1 stoichiometry for the enantiomer-CD complexes in both cases. However, the apparent and averaged equilibrium constants for the enantiomer-CD complexes could be calculated only for Succinyl-γ-CD. The developed methods were applied to the analysis of pharmaceutical formulations, but only the use of Succinyl-γ-CD made it possible to detect a 0.1% enantiomeric impurity in colchicine formulations. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A Roadmap of Risk Diagnostic Methods: Developing an Integrated View of Risk Identification and Analysis Techniques

    National Research Council Canada - National Science Library

    Williams, Ray; Ambrose, Kate; Bentrem, Laura

    2004-01-01

    ...), which is envisioned to be a comprehensive reference tool for risk identification and analysis (RI&A) techniques. Program Managers (PMs) responsible for developing or acquiring software-intensive systems typically identify risks in different ways...

  8. Coupling Neumann development and component mode synthesis methods for stochastic analysis of random structures

    Directory of Open Access Journals (Sweden)

    Driss Sarsri

    2014-05-01

    Full Text Available In this paper, we propose a method to calculate the first two moments (mean and variance) of the structural dynamic response of a structure with uncertain variables subjected to random excitation. For this, the Newmark method is used to transform the equation of motion of the structure into a quasi-static equilibrium equation in the time domain. The Neumann expansion method is coupled with Monte Carlo simulations to calculate the statistical values of the random response. The use of modal synthesis methods can reduce the dimensions of the model before integration of the equation of motion. Numerical applications have been developed to highlight the effectiveness of the method in analyzing the stochastic response of large structures.
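
    A schematic of the Neumann expansion step, under the usual assumptions for this class of methods and in generic notation (not taken verbatim from the paper): the random operator of the quasi-static equilibrium equation is split into its mean part and a random fluctuation, and its inverse is expanded as a geometric series,

        \[
        \mathbf{K}(\theta) = \mathbf{K}_0 + \Delta\mathbf{K}(\theta), \qquad
        \mathbf{K}(\theta)^{-1} = \sum_{k=0}^{\infty} (-1)^k
          \bigl( \mathbf{K}_0^{-1}\,\Delta\mathbf{K}(\theta) \bigr)^k \mathbf{K}_0^{-1},
        \]

    so that each Monte Carlo sample of the response u(θ) = K(θ)^(-1) F is approximated by a truncated series that reuses a single factorization of K_0; the mean and variance of the response are then estimated over the samples.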

  9. Development of the complex of nuclear-physical methods of analysis for geology and technology tasks in Kazakhstan

    International Nuclear Information System (INIS)

    Solodukhin, V.; Silachyov, I.; Poznyak, V.; Gorlachev, I.

    2016-01-01

    The paper describes the development of nuclear-physical methods of analysis and their applications in Kazakhstan for geological and technological tasks. The basic methods of this complex include instrumental neutron activation analysis, X-ray fluorescence analysis and instrumental γ-spectrometry. The following aspects are discussed: applications of the developed and adopted analytical techniques for the assessment and calculation of rare-earth metal reserves at various deposits in Kazakhstan, for the development of technology for mining and extraction from uranium-phosphorous ore and wastes, for radioactive coal gasification technology, and for studies of rare metal contents in chromite, bauxites, black shales and their processing products. (author)

  10. Analysis of factors affecting the development of food crop varieties bred by mutation method in China

    International Nuclear Information System (INIS)

    Wang Zhidong; Hu Ruifa

    2002-01-01

    The research developed a production function for crop varieties bred by the mutation method in order to explore the factors affecting the development of new varieties. It was found that research investment, human capital and radiation facilities were the most important factors affecting the development and cultivated area of new varieties bred through the mutation method. It is concluded that not all institutions involved in breeding activities using the mutation method must have radiation facilities, and that the national government needs to invest only in key research institutes with strong research capacities. The research budgets saved can be used to entrust institutes with stronger research capacities with irradiating breeding materials developed by institutes with weaker research capacities, thereby creating more opportunities to breed better varieties.

  11. Development of a segmentation method for analysis of Campos basin typical reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Rego, Eneida Arendt; Bueno, Andre Duarte [Universidade Estadual do Norte Fluminense Darcy Ribeiro (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao de Petroleo (LENEP)]. E-mails: eneida@lenep.uenf.br; bueno@lenep.uenf.br

    2008-07-01

    This paper presents a master's thesis proposal in Exploration and Reservoir Engineering whose objective is to develop a specific segmentation method for digital images of reservoir rocks that produces better results than the global methods available in the literature for the determination of rock physical properties such as porosity and permeability. (author)

  12. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    Science.gov (United States)

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  13. Quantification of visual characteristics of whipped cream by image analysis and machine vision: method development.

    Science.gov (United States)

    Liu, Peng; Balaban, Murat O

    2015-04-01

    The appearance (color, shine, surface roughness, bumpiness, view area, height) of whipped cream samples changes with time. Quantitative methods based on image analysis were developed to measure these parameters in cream samples stored at three temperatures (room temperature, 10 °C and 2 °C) for up to 24 h. There was a small decrease in L* and a* values over time, and a significant increase in b* values. ΔE values suggest that these color changes are significant and observable (2.96 for room temperature, 9.46 for 10 °C, 12.6 for 2 °C). The difference between polarized and nonpolarized images was used to quantify shine. The length of a laser line over the cream sample was measured to quantify the change in bumpiness of the surface (33.8%, 41.78%, and 33.27% reduction in laser line length for room temperature, 10 °C, and 2 °C, respectively). For samples stored at room temperature, most changes occurred during the first 5 min; for samples stored at 10 °C and 2 °C, most changes occurred during the first 30 min. Turn angle data did not provide useful information. The changes in view area depended on temperature: at room temperature the area increased over time, at 10 °C it increased first and then decreased, and at 2 °C the view area decreased over time. Correlation of the results of these methods with sensory evaluation can make the evaluation of the appearance of whipped cream more objective, repeatable, and quantitative. © 2015 Institute of Food Technologists®
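
    The ΔE colour differences quoted above can be illustrated with the standard CIE76 formula; a minimal sketch with illustrative L*a*b* values (not the measured ones):

        import math

        def delta_e_cie76(lab1, lab2):
            """CIE76 colour difference between two (L*, a*, b*) triplets."""
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

        # Illustrative values only: fresh cream vs. cream stored 24 h at 2 deg C
        fresh = (92.0, -1.5, 8.0)
        aged = (90.5, -1.0, 20.5)
        print(round(delta_e_cie76(fresh, aged), 2))   # ~12.6, an observable change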

  14. Development of Evaluation Methods for Lower Limb Function between Aged and Young Using Principal Component Analysis

    Science.gov (United States)

    Nomoto, Yohei; Yamashita, Kazuhiko; Ohya, Tetsuya; Koyama, Hironori; Kawasumi, Masashi

    There is increasing concern in society about preventing falls among the aged. Improvement of lower-limb muscular strength, postural control and walking ability in aged people is important for quality of life and fall prevention. The aim of this study was to develop multiple evaluation methods to guide the improvement and maintenance of lower limb function in aged and young subjects. The subjects were 16 healthy young volunteers (mean ± S.D.: 19.9 ± 0.6 years) and 10 healthy aged volunteers (mean ± S.D.: 80.6 ± 6.1 years). Measurement items related to lower limb function were selected from items we have used previously: distance of extroversion of the toe, angle of flexion of the toe, maximum width of step, knee elevation, moving distance of the greater trochanter, walking balance, toe-gap force and rotation range of the ankle joint. The measurement items were summarized by principal component analysis into lower limb evaluation measures covering walking ability, muscle strength of the lower limb and flexibility of the ankle. The young group scored a factor of 1.6 higher than the aged group on walking ability, a factor of 1.4 higher on muscle strength of the lower limb, and a factor of 1.2 higher on flexibility of the ankle. The results suggest that it is possible to assess the lower limb function of aged and young subjects numerically and to advise on their foot function.
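
    A minimal sketch of the kind of principal component summarization described above, using synthetic data (the real study used its eight measured items and 26 subjects; nothing below is taken from the paper):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Synthetic data: 26 subjects x 8 lower-limb measurement items
        rng = np.random.default_rng(1)
        X = rng.normal(size=(26, 8))

        # Standardize the items, then extract components that summarize them
        X_std = StandardScaler().fit_transform(X)
        pca = PCA(n_components=3)
        scores = pca.fit_transform(X_std)      # subject scores on each component

        print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
        print("item loadings on component 1:", np.round(pca.components_[0], 2))
        # Items loading strongly on a component indicate which measurements it
        # summarizes (e.g. a 'walking ability' component); group means of the
        # scores can then be compared between aged and young subjects.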

  15. Perfection Of Methods Of Mathematical Analysis For Increasing The Completeness Of Subsoil Development

    Science.gov (United States)

    Fokina, Mariya

    2017-11-01

    The economy of Russia is to a large degree based on the mineral raw-material complex, and the mining industry is a prioritized and important area. Given the high competitiveness of businesses in this sector, increasing the efficiency of completed work and manufactured products becomes a central issue. Improvement of planning and management in this sector should be based on multivariant studies and the optimization of planning decisions, appraising their immediate and long-term results and taking the dynamics of economic development into account. All of this requires the use of economic-mathematical models and methods. Applying an economic-mathematical model to determine the optimal ore mine production capacity, we obtain a figure of 4,712,000 tons. The production capacity of the Uchalinsky ore mine is 1,560 thousand tons, and that of the Uzelginsky ore mine 3,650 thousand tons. From a corresponding analysis of the production of OAO "Uchalinsky Gok", an optimal production plan was obtained: optimal production of copper, 77,961.4 rubles; optimal production of zinc, 17,975.66 rubles. The residual production volume of the two main ore mines of OAO "UGOK" is 160 million tons of ore.

  16. Method development and analysis of retail foods for annatto food colouring material.

    Science.gov (United States)

    Scotter, M J; Castle, L; Honeybone, C A; Nelson, C

    2002-03-01

    Analytical methods for the determination of the permitted food colouring annatto (E160b) have been developed or refined to encompass the wide range of food commodity types permitted to contain it. Specific solvent extraction regimens were used depending upon the food commodity analysed, and HPLC analysis techniques coupled with spectral confirmation were used for the determination of the major colouring components. Qualitative and quantitative data on the annatto content of 165 composite and two single retail food samples, covering a wide range of foods, at levels above the limit of quantification (0.1 mg kg(-1)) are reported. Quantitative results are given for the major colouring principles 9'-cis-bixin, 9'-cis-norbixin and trans-bixin. Semi-quantitative results are given for the minor bixin and norbixin isomers mono-cis- (not 9'-), di-cis- and trans-norbixin, for which authentic reference standards were not available. Repeat analyses (n = 4-9) of 12 different types of food commodity (covering the permitted range) spiked with annatto at levels between 1.7 and 27.7 mg kg(-1) gave mean recoveries between 61 and 96%. The corresponding relative SDs (RSD) were between 2.1 and 7.9%.

  17. Development of a PSA-based Loss of Large Area Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Mee Jeong; Jung, Woosik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Myungsu [Korea Hydro Nuclear Power, Central Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    As a result of these initial post-9/11 assessments, in 2002 the NRC issued an interim safeguards and security compensatory measures order, 'Interim Compensatory Measures for High Threat Environment'. Under Section B.5.b (not publicly available) of this order, current NPP licensees had to adopt mitigation measures or restore reactor core cooling, containment, and spent fuel pool (SFP) cooling capabilities to cope with a loss of large area (LOLA) due to large fires and explosions from any cause, including beyond-design-basis threat (BDBT) aircraft impacts. In 2009, the NRC issued amendments to 10 CFR Part 52 and Part 73 on power reactor security requirements for operating and new reactors. Newly licensed U.S. commercial nuclear power plant operators are required to provide a LOLA analysis as per the U.S. Code of Federal Regulations, 10 CFR 50.54(hh)(2); additionally, 10 CFR 52.80(d) specifies the submittal information required for an applicant for a combined operating license (COL) to meet these requirements. It is necessary to prepare our own guidance for the development of LOLA strategies. In this paper, we propose a method to identify significant combinations of rooms in certain targets using a VAI model, and produce insights that could be used to inform LOLA strategies.

  18. X-RAY FLUORESCENCE ANALYSIS OF HANFORD LOW ACTIVITY WASTE SIMULANTS METHOD DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    Jurgensen, A; David Missimer, D; Ronny Rutherford, R

    2007-08-08

    The x-ray fluorescence laboratory (XRF) in the Analytical Development Directorate (ADD) of the Savannah River National Laboratory (SRNL) was requested to develop an x-ray fluorescence spectrometry method for elemental characterization of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) pretreated low activity waste (LAW) stream to the LAW Vitrification Plant. The WTP is evaluating the potential for using XRF as a rapid turnaround technique to support LAW product compliance and glass former batching. The overall objective of this task was to develop an XRF analytical method that provides rapid turnaround time (<8 hours), while providing sufficient accuracy and precision to determine variations in waste.

  19. Pathways to lean software development: An analysis of effective methods of change

    Science.gov (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  20. Developing Vulnerability Analysis Method for Climate Change Adaptation on Agropolitan Region in Malang District

    Science.gov (United States)

    Sugiarto, Y.; Perdinan; Atmaja, T.; Wibowo, A.

    2017-03-01

    Agriculture plays a strategic role in strengthening sustainable development. In the agropolitan concept, the village becomes the center of economic activities by combining agriculture, agro-industry, agribusiness and tourism to create a high value-added economy. The impact of climate change on agriculture and water resources may increase the pressure on agropolitan development. An assessment method is therefore required to measure the vulnerability of area-based communities in the agropolitan region to climate change impacts. An analysis of agropolitan vulnerability was conducted in Malang district based on four aspects, considering the availability and distribution of water as the central problem. The indicators used measure the vulnerability components, consisting of sensitivity and adaptive capacity, and the exposure component. The study yielded 21 indicators derived from the 115 village-based data items. The results of the vulnerability assessment showed that most of the villages were categorized at a moderate level. Around 20% of the 388 villages were categorized at a high to very high level of vulnerability due to a low level of agricultural economy. In the agropolitan region within the sub-district of Poncokusumo, the vulnerability of the villages varies between very low and very high. Most villages were vulnerable due to lower adaptive capacity, even though the levels of sensitivity and exposure of all villages were relatively similar. The existence of water resources was the biggest contributor to the high exposure of the villages in Malang district, while access to credit facilities and the source of family income were among the indicators that led to a high sensitivity component.
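
    A common way to combine such indicators into a composite score is min-max normalization followed by averaging within the sensitivity, adaptive-capacity and exposure components; the sketch below illustrates that generic approach with hypothetical indicators and equal weights (not the study's own indicators or weighting).

        import numpy as np

        def normalize(x):
            """Min-max normalize an indicator across villages to [0, 1]."""
            x = np.asarray(x, dtype=float)
            return (x - x.min()) / (x.max() - x.min())

        # Hypothetical indicators for five villages
        sensitivity = normalize([0.3, 0.7, 0.5, 0.9, 0.2])        # e.g. share of rain-fed farmland
        adaptive_capacity = normalize([0.8, 0.4, 0.6, 0.3, 0.9])  # e.g. access to credit
        exposure = normalize([0.2, 0.6, 0.5, 0.8, 0.3])           # e.g. distance to water source

        # Vulnerability rises with sensitivity and exposure, falls with adaptive capacity
        vulnerability = (sensitivity + (1.0 - adaptive_capacity) + exposure) / 3.0

        categories = np.array(["very low", "low", "moderate", "high", "very high"])
        labels = categories[np.digitize(vulnerability, bins=[0.2, 0.4, 0.6, 0.8])]
        for v, c in zip(vulnerability, labels):
            print(f"vulnerability index {v:.2f} -> {c}")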

  1. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Full Text Available Dapagliflozin was the first of its class (inhibitors of sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies for the quality evaluation and assurance of tablets. The dissolution methodology was developed with apparatus II (paddle) in 900 mL of medium (simulated gastric fluid, pH 1.2), temperature set at 37±0.5°C, and a stirring speed of 50 rpm. For the quantification, a spectrophotometric (λ=224 nm) method was developed and validated. In validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r2=0.998). The precision showed results with RSD values lower than 2%. Recoveries of 80.72, 98.47, and 119.41% proved the accuracy of the method. Through a systematic approach applying a 2^3 factorial design, the robustness of the method was confirmed (p>0.05). The studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution efficiency analyses. The developed method can be used for the quality evaluation of dapagliflozin tablets and can be considered as a scientific basis for future official pharmacopoeial methods.
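
    The f1 (difference) and f2 (similarity) factors mentioned above are conventionally defined as in the sketch below, shown here with made-up dissolution profiles (not the paper's data):

        import numpy as np

        def f1_f2(reference, test):
            """Difference factor f1 and similarity factor f2 for two dissolution
            profiles sampled at the same time points (percent dissolved)."""
            R = np.asarray(reference, dtype=float)
            T = np.asarray(test, dtype=float)
            f1 = 100.0 * np.abs(R - T).sum() / R.sum()
            f2 = 50.0 * np.log10(100.0 / np.sqrt(1.0 + ((R - T) ** 2).mean()))
            return f1, f2

        # Illustrative profiles only (e.g. 5, 10, 15, 20, 30 min)
        ref = [35, 58, 74, 85, 95]
        test = [33, 56, 73, 86, 94]
        f1, f2 = f1_f2(ref, test)
        print(f"f1 = {f1:.1f}, f2 = {f2:.1f}")  # f1 < 15 and f2 > 50 indicate similar profiles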

  2. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    Kozlowski, T.; Carey, T.A.; Maguire, C.F.

    1995-01-01

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented Analysis and Recursive Design method as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method

  3. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  4. Analysis of slippery droplet on tilted plate by development of optical correction method

    Science.gov (United States)

    Ko, Han Seo; Gim, Yeonghyeon; Choi, Sung Ho; Jang, Dong Kyu; Sohn, Dong Kee

    2017-11-01

    Because of distortion effects on the surface of a sessile droplet, the inner flow field of the droplet measured by a PIV (particle image velocimetry) method has low reliability. In order to solve this problem, many researchers have studied and developed optical correction methods. However, these methods cannot be applied to various cases such as a tilted droplet or other asymmetrically shaped droplets, since most methods were developed only for axisymmetrically shaped droplets. For the optical correction of the asymmetrically shaped droplet, the surface function was calculated by three-dimensional reconstruction using the ellipse curve fitting method. The optical correction using the surface function was also verified by numerical simulation. The developed method was then applied to reconstruct the inner flow field of a droplet on a tilted plate. A colloidal droplet of water on the tilted surface was used, and the distortion effect on the surface of the droplet was calculated. Using the obtained results and the PIV method, the corrected flow field for the inner and interface parts of the droplet was reconstructed. Consequently, the error caused by the distortion effect on the velocity vectors located at the apex of the droplet was removed. National Research Foundation (NRF) of Korea, (2016R1A2B4011087).

  5. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H

    2006-03-01

    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. The resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of the averaging window dramatically affects the identification of individual hypomutable regions and the delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates the loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
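
    A compact sketch of the residue-level comparison of observed and expected mutation counts that motivates such a chi-square analysis (synthetic counts; the paper's estimate of the expected mutation frequency is more refined than a flat value):

        import numpy as np
        from scipy.stats import chi2

        # Synthetic data: observed and expected mutation counts per residue
        rng = np.random.default_rng(2)
        expected = np.full(50, 4.0)          # flat expected mutations per residue
        observed = rng.poisson(expected)     # simulated observed counts
        observed[10:15] = 0                  # a hypomutable (conserved) stretch

        # Per-residue chi-square contribution and p-value (1 degree of freedom)
        chi_sq = (observed - expected) ** 2 / expected
        p_values = chi2.sf(chi_sq, df=1)

        candidates = np.where((observed < expected) & (p_values < 0.05))[0]
        print("candidate functionally important residues:", candidates)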

  6. A Product Analysis Method and Its Staging to Develop Redesign Competences

    Science.gov (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  7. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    Science.gov (United States)

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  8. An Observational Analysis of Coaching Behaviors for Career Development Event Teams: A Mixed Methods Study

    Science.gov (United States)

    Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.

    2016-01-01

    School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…

  9. Development of conjugate methods with gas chromatography for inorganic compounds analysis

    International Nuclear Information System (INIS)

    Baccan, N.

    1975-01-01

    The application of gas chromatography combined with mass spectrometry or with nuclear methods for the analysis of inorganic compounds is studied. The advantages of using a gas chromatograph coupled with a quadrupole mass spectrometer or with a high resolution radiation detector are discussed. We also studied the formation and solvent extraction of metal chelates; an aliquot of the organic phase was directly injected into the gas chromatograph and the eluted compounds were detected by mass spectrometry or, when radioactive, by nuclear methods. (author)

  10. Applied research and development of neutron activation analysis - Development of the precise analysis method for plastic materials by the use of NAA

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kil Yong; Sim, Sang Kwan; Yoon, Yoon Yeol; Chun, Sang Ki [Korea Institute of Geology, Mining and Materials, Taejon (Korea)

    2000-04-01

    The demand for inorganic analysis of plastics has significantly increased in the fields of microelectronics, environment, nuclear technology and resource recycling. The difficulties of chemical analysis methods have led to the application of NAA, which has the great advantages of being non-destructive, free from blank, and highly sensitive. The goal of the present work was to optimize and develop NAA procedures for the inorganic analysis of plastics. Even though NAA has unique advantages, it presents two problems for plastics. One is contamination by metallic utensils during sample treatment, and the other is destruction of the sample ampoule due to pressure build-up by hydrogen and methane gas formed through oxyhydrogenation reactions with neutrons. For the first problem, large plastics were cut into pieces after immersion in liquid nitrogen; the second problem was solved by making an aperture in the top side of the sample ampoule. These research results have been applied to the analysis of various plastic materials used in food and drug containers and in children's toys. Moreover, a Korean irradiation rabbit could be produced by applying these results, and plastic standard reference materials used for analysis by XRF and ICP could be produced. 36 refs., 6 figs., 37 tabs (Author)

  11. STAGS Developments for Residual Strength Analysis Methods for Metallic Fuselage Structures

    Science.gov (United States)

    Young, Richard D.; Rose, Cheryl A.

    2014-01-01

    A summary of advances in the Structural Analysis of General Shells (STAGS) finite element code for the residual strength analysis of metallic fuselage structures, realized through collaboration between the structures group at NASA Langley and Dr. Charles Rankin, is presented. The majority of the advancements described were made in the 1990s under the NASA Airframe Structural Integrity Program (NASIP). Example results are presented from studies that used the STAGS code to develop an improved understanding of the nonlinear response of cracked fuselage structures subjected to combined loads. An integrated residual strength analysis methodology for metallic structures that models crack growth to predict the effect of cracks on structural integrity is demonstrated.

  12. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    The field of environmentally sustainable architecture has been under development since the late 1960's when mankind first started to notice the consequences of industrialisation and modern lifestyle. Energy crises in 1973 and 1979, and global climatic changes ascribed to global warming have caused...... architecture, such as: ecological, green, bio-climatic, sustainable, passive, low-energy and environmental architecture. This PhD project sets out to gain a better understanding of environmentally sustainable architecture and the methodical approaches applied in the development of this type of architecture...

  13. Method Development for Pesticide Residue Analysis in Farmland Soil using High Performance Liquid Chromatography

    Science.gov (United States)

    Theresia Djue Tea, Marselina; Sabarudin, Akhmad; Sulistyarti, Hermin

    2018-01-01

    A method for the determination of diazinon and chlorantraniliprole in soil samples has been developed. The analytes were extracted with acetonitrile from a farmland soil sample. Determination and quantification of diazinon and chlorantraniliprole were performed by high performance liquid chromatography (HPLC) with a UV detector. Several parameters of the HPLC method were optimized with respect to sensitivity, high resolution of separation, and accurate determination of diazinon and chlorantraniliprole. Optimum conditions for the separation of the two pesticides were an eluent composition of acetonitrile:water of 60:40, a flow rate of 0.4 mL/min, and a detection wavelength of 220 nm. Under the optimum conditions, diazinon linearity was in the range of 1-25 ppm with R2 of 0.9976, an LOD of 1.19 mgL-1, and an LOQ of 3.98 mgL-1, while the linearity of chlorantraniliprole was in the range of 0.2-5 mgL-1 with R2 of 0.9972, an LOD of 0.39 mgL-1, and an LOQ of 1.29 mgL-1. When the method was applied to the soil sample, both pesticides showed acceptable recoveries of more than 85% for the real sample; thus, the developed method meets the validation requirements. Under this developed method, the concentrations of both pesticides in the soil samples were below the LOD and LOQ (0.577 mgL-1 for diazinon and 0.007 mgL-1 for chlorantraniliprole). Therefore, it can be concluded that the soil samples used in this study contain neither diazinon nor chlorantraniliprole.
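
    One common way to obtain LOD and LOQ figures like those quoted above is the calibration-based definition LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope of the calibration line; a small sketch with made-up calibration data (not the paper's):

        import numpy as np

        # Hypothetical calibration data: concentration (mg/L) vs. peak area
        conc = np.array([1.0, 5.0, 10.0, 15.0, 20.0, 25.0])
        area = np.array([11.8, 60.5, 122.0, 178.9, 241.2, 300.4])

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)        # residual standard deviation

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        print(f"LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")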

  14. Development of a method for the analysis of perfluoroalkylated compounds in whole blood

    Energy Technology Data Exchange (ETDEWEB)

    Kaerrman, A.; Bavel, B. van; Lindstroem, G. [Oerebro Univ. (Sweden). Man-Technology-Environmental Research Centre; Jaernberg, U. [Stockholm Univ. (Sweden). Inst. of Applied Environmental Research

    2004-09-15

    The commercialisation of interfaced high performance liquid chromatography mass spectrometry (HPLC-MS) facilitated selective and sensitive analysis of perfluoroalkylated (PFA) acids, a group of compounds frequently used, for example, as industrial surfactants and which are very persistent and biologically active, in a more convenient way than before. Since then, a number of reports on PFA compounds found in humans and wildlife have been published. The most used technique for the analysis of perfluoroalkylated compounds has been ion-pair extraction followed by high performance liquid chromatography (HPLC) and negative electrospray tandem mass spectrometry (MS/MS). Tetrabutylammonium ion as the counter ion in the ion-pair extraction has been used together with GC analysis, LC-fluorescence and LC-MS/MS. Recently, solid phase extraction (SPE) has been used instead of ion-pair extraction for the extraction of human serum. Previously reported studies on human exposure have mainly been on serum, probably because there are indications that PFA acids bind to plasma proteins. We here present a fast and simple method that involves SPE and is suitable for extracting whole blood samples. Furthermore, 13 PFAs were included in the method, which uses HPLC and single quadrupole mass spectrometry.

  15. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the methods are included. In addition, published work on the analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of 'routine' physical/chemical properties (Chapter 1); preliminary separation of >200 °C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  16. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette

    2008-01-01

    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative...... research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call 'cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement....... The action research based approach, focusing on shop floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development...

  17. Development of an LC/MS method for the trace analysis of triacetone triperoxide (TATP).

    Science.gov (United States)

    Widmer, Leo; Watson, Stuart; Schlatter, Konrad; Crowson, Andrew

    2002-12-01

    The detection and quantification of triacetone triperoxide (TATP) using LC/MS is investigated. GC/MS analysis of TATP is hindered by stationary phase activation within very short periods of time. Due to the lower temperatures used in LC, this problem is not encountered. This study presents a method that is suitable for the detection of TATP at levels as low as 100 pg microl(-1) (10 ng per 100 microl). Initial findings are also reported for the investigation of a secondary chromatographic peak, which is thought to be caused by the separation of two conformers. This study concludes that LC/MS is a suitable technique for the analysis of trace levels of TATP.

  18. Development of a method for the analysis of horizontal-shaft wind turbines by computational fluid dynamics (CFD)

    International Nuclear Information System (INIS)

    Farinnas Wong, E. Y.; Jauregui Rigo, S.; Betancourt Mena, J.

    2009-01-01

    In this paper we describe different approaches to solving computational fluid dynamics problems using the finite element method, giving a perspective on the different problems that must be addressed when choosing a path to develop a code that solves the boundary layer and turbulence problems needed to simulate fluid transport and handling equipment. In principle, turbulent flow is governed by the equations of fluid dynamics. The nonlinearity of the Navier-Stokes equations makes an analytical solution possible only in a few very specific cases, and for high Reynolds numbers the flow equations become more complex, so it is necessary to use certain models that depend on parameters usually obtained experimentally. Powerful numerical techniques for solving these equations, such as direct numerical simulation (DNS) and large eddy simulation (LES), are discussed for use in solving flow machine problems. (author)

  19. Development of an evaluation method for the quality of NPP MCR operators' communication using Work Domain Analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, Inseok; Park, Jinkyun; Seong, Poonghyun

    2011-01-01

    Research highlights: → No evaluation method is available for operators' communication quality in NPPs. → To build this evaluation method, the Work Domain Analysis (WDA) method was adopted. → The proposed method was applied to NPP MCR operators. → The quality of operators' communication can be evaluated with the proposed method. - Abstract: The evolution of work demands has driven industry towards the computerization of these demands, making systems more complex; this field is now known as that of Complex Socio-Technical Systems. As communication failures are problems associated with Complex Socio-Technical Systems, it has been discovered that communication failures are the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the fact that there have been many studies on the severity of communication failures, there is no evaluation method for operators' communication quality in Nuclear Power Plants (NPPs). Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. To develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, including the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the important points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, to apply the proposed method, nine teams working in NPPs participated in a field simulation. The results of this evaluation reveal that operators' communication quality improved as a greater proportion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating the communication quality in any complex system.

  20. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-04

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar; indeed, cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suited stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, maintaining the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  1. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of
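
    The Monte Carlo step described above can be illustrated with the minimal sketch below, which builds an empirical CDF of a response variable (here a single natural frequency) from randomly sampled substructure properties. The lognormal inputs, the series coupling and the 52 Hz threshold are hypothetical stand-ins, not the PDS formulation or its measured modal data.

```python
import numpy as np

# Illustrative Monte Carlo sketch (not the PDS implementation): estimate the CDF of a
# response variable, here a system natural frequency, from randomly sampled substructure
# properties. All distribution parameters below are hypothetical.
rng = np.random.default_rng(42)
n_samples = 10_000

# Hypothetical random inputs: two substructure stiffness-like quantities [N/m]
k1 = rng.lognormal(mean=np.log(2.0e6), sigma=0.05, size=n_samples)
k2 = rng.lognormal(mean=np.log(3.5e6), sigma=0.08, size=n_samples)
mass = 12.0  # kg, treated as deterministic in this sketch

# Simple series coupling of the two substructures -> one system frequency per sample
k_sys = 1.0 / (1.0 / k1 + 1.0 / k2)
freq_hz = np.sqrt(k_sys / mass) / (2.0 * np.pi)

# Empirical CDF evaluated at a design threshold: P(frequency below threshold)
threshold = 52.0
prob_below = np.mean(freq_hz < threshold)
print(f"P(f < {threshold} Hz) = {prob_below:.3f}")
```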

  2. Development of the finite element method of body fit nodalization for mixed convection analysis in rod bundles

    International Nuclear Information System (INIS)

    Lee, G.J.; Chang, S.H.

    1990-01-01

    In reactor rod bundle analysis, mixed convection phenomena are very important after reactor shutdown. In this paper, finite element methods based on body-fit nodalization are developed to analyze mixed convection phenomena in a complex geometry. The velocity distribution and the temperature distribution in the reactor rod bundles are obtained using the above two methods. To validate the developed methods, the present results are compared with the analytic solutions for a concentric tube. The results show that mixed convection in a complex geometry can be treated very well with these two methods, and that the finite element method with body-fit nodalization is more efficient than the finite difference method with the body-fitted coordinate system. (orig.)

  3. The development of trend and pattern analysis methods for incident data by CEC'S joint research at Ispra

    International Nuclear Information System (INIS)

    Amesz, J.; Kalfsbeek, H.W.

    1990-01-01

    The Abnormal Occurrences Reporting System (AORS) of the Commission of the European Communities was developed by the Joint Research Centre at Ispra in the period 1982 through 1985. It collects in a unique format all safety-relevant events from NPPs as recorded in the participating countries. The system has been set up with the specific objective of providing an advanced tool for a synoptic analysis of a large number of events, identifying patterns of sequences, trends, multiple dependencies between incident descriptors, precursors to severe incidents, performance indicators, etc. This paper gives an overview of the development of trend and pattern analysis techniques of two different types: - event sequence analysis; - statistical methods. Though these methods have been developed and applied in relation with the AORS data, they can be regarded as generic in the sense that they may be applied to any incident reporting system satisfying the necessary criteria as to homogeneity and completeness for rendering valid results

  4. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    Directory of Open Access Journals (Sweden)

    Deborah Alcici Salomão

    2015-12-01

    Full Text Available This study examines consumer protection in arbitration, especially under the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper concludes by providing suggestions specific to consumer protection in arbitration based on this analysis.

  5. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...

  6. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation

    International Nuclear Information System (INIS)

    Fournier, D.

    2011-01-01

    The different neutronic parameters have to be calculated with a higher accuracy in order to design 4th generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutron flux, the solution of this equation, depends on energy, angle and space. The different variables are successively discretized: the energy with a multigroup approach, considering the different quantities to be constant within each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting of subdividing a cell into sub-cells, or by order refinement (p-refinement), increasing the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are used to drive the refinement process. Whereas the first one requires strong regularity hypotheses (analytical solution), the second one assumes only the minimal hypotheses required for the solution to exist. The comparison of both estimators is done on benchmarks where the analytic solution is known by the method of manufactured solutions. Thus, the behaviour of the solution with regard to its regularity can be studied. This leads to an hp-refinement method using the two estimators. Then, a comparison is made with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. Actually, as the
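
    As a hedged illustration of the regularity-driven choice between h- and p-refinement discussed above, the sketch below refines cells flagged by a low smoothness indicator in h and the remaining inaccurate cells in p. The error and smoothness indicators, thresholds and reduction factors are hypothetical placeholders, not the estimators analysed in the thesis.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Cell:
    error: float       # local error estimate for the cell (hypothetical indicator)
    smoothness: float  # regularity indicator in [0, 1]; high = smooth solution
    level: int = 0     # h-refinement level
    order: int = 1     # polynomial order of the basis

def hp_refine(cells: List[Cell], error_tol: float = 1e-3,
              smooth_tol: float = 0.7) -> List[Cell]:
    """One sweep of an hp-refinement strategy: p-refine smooth cells, h-refine the rest."""
    refined: List[Cell] = []
    for cell in cells:
        if cell.error <= error_tol:
            refined.append(cell)                               # accurate enough, keep as-is
        elif cell.smoothness >= smooth_tol:
            refined.append(Cell(cell.error / 4, cell.smoothness,
                                cell.level, cell.order + 1))   # p-refinement
        else:
            for _ in range(2):                                 # h-refinement: split the cell
                refined.append(Cell(cell.error / 2, cell.smoothness,
                                    cell.level + 1, cell.order))
    return refined

# Tiny example: one smooth cell and one cell with low regularity, both above tolerance
cells = [Cell(error=5e-3, smoothness=0.9), Cell(error=8e-3, smoothness=0.2)]
print(hp_refine(cells))
```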

  7. Development of a preparation and staining method for fetal erythroblasts in maternal blood : Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ

    1998-01-01

    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi

  8. The development of quantitative and qualitative analysis methods of suppositories with Maclura Pomifera extract

    Directory of Open Access Journals (Sweden)

    V. A. Korotkov

    2014-08-01

    Full Text Available Chronic prostatitis and BPH are still very common diseases. In recent years, herbal preparations have been widely used in the treatment of prostate gland diseases. The effectiveness of herbal medicinal products derived from Maclura pomifera is associated with their content of phytosterols and terpenes. The oil extract derived from the fruit of the Osage orange (Maclura pomifera, Moraceae) is a rich source of terpenes and phytosterols. Previous studies indicated the content of substances such as lupeol and β-sitosterol, which are known for their prostatoprotective properties, as well as the presence of isoflavones possessing anti-inflammatory and antioxidant properties. Aim of the work. The aim of this work is the development of methods that allow the qualitative and quantitative assessment of the suppositories with Maclura pomifera extract. Materials and methods. The objects of this study are the suppositories with oil extract of Maclura pomifera. For the identification of the suppository ingredients, thin layer chromatography has been used. Quantitative determination of the active compounds has been carried out by spectrophotometry in the ultraviolet and visible spectrum. The spectrophotometers Thermo Scientific Evolution S60 (USA) and Apel PD303S (Japan) have been used. Determination of the amount of phytosterols and triterpenes has been performed in the equivalent of a reliable sample of lupeol ('Santa Cruz Biotechnology', USA; CAS: 545-47-1). Determination of the amount of isoflavones has been carried out in the equivalent of a reliable sample of osajin ('BioBioPha Co., Ltd.', China; CAS: 482-53-1). Results and discussion. For the identification of phytosterols and isoflavones in the suppository composition, a method of identifying their joint presence by TLC has been developed. In the alcoholic extraction from the suppository mass, two purple spots with Rf 0.8 and 0.57 are observed at the level of the spots of the reference solution (lupeol and β-sitosterol), and two yellow spots with Rf 0.45 and 0.21 are observed at the level

  9. A development and integration of the concentration database for relative method, k0 method and absolute method in instrumental neutron activation analysis using Microsoft Access

    International Nuclear Information System (INIS)

    Hoh Siew Sin

    2012-01-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the concentration of an element in a sample at the National University of Malaysia, especially by students of the Nuclear Science Program. The lack of a database service leads users to take a longer time to calculate the concentration of an element in the sample. This is because we are more dependent on software developed by foreign researchers, which is costly. To overcome this problem, a study has been carried out to build INAA database software. The objective of this study is to build database software that helps the users of INAA in the Relative Method and the Absolute Method for calculating the element concentration in the sample using Microsoft Excel 2010 and Microsoft Access 2010. The study also integrates k0 data, k0 Concent and k0-Westcott to execute and complete the system. After the integration, a study was conducted to test the effectiveness of the database software by comparing the concentrations between the experiments and the database. The Triple Bare Monitors Zr-Au and Cr-Mo-Au were used in Abs-INAA as monitors to determine the thermal to epithermal neutron flux ratio (f). Calculations involved in determining the concentration are the net peak area (Np), the measurement time (tm), the irradiation time (tirr), the k-factor (k), the thermal to epithermal neutron flux ratio (f), the parameter of the epithermal neutron flux distribution (α) and the detection efficiency (εp). For the Com-INAA database, the reference material IAEA-375 Soil was used to calculate the concentration of elements in the sample. CRMs and SRMs are also used in this database. After the INAA database integration, a verification process to examine the effectiveness of Abs-INAA was carried out by comparing the sample concentration between the database and the experiment. The experimental concentration values of the INAA database software were obtained with high accuracy and precision. ICC
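
    As a rough illustration of the relative (comparator) method that such a database automates, the sketch below derives an element concentration from decay-corrected specific count rates of a sample and a standard assumed to be irradiated and counted under identical conditions. All peak areas, masses, times, the half-life and the certified value are hypothetical.

```python
import math

# Minimal sketch of the INAA relative (comparator) method, assuming equal irradiation,
# decay and counting conditions for sample and standard; every number is hypothetical.
def specific_count_rate(net_peak_area: float, mass_g: float, t_count_s: float,
                        t_decay_s: float, half_life_s: float) -> float:
    """Decay-corrected net count rate per gram of material."""
    decay_correction = math.exp(math.log(2.0) * t_decay_s / half_life_s)
    return net_peak_area / (mass_g * t_count_s) * decay_correction

std = specific_count_rate(net_peak_area=15400, mass_g=0.100, t_count_s=3600,
                          t_decay_s=86400, half_life_s=2.2e5)
smp = specific_count_rate(net_peak_area=8200, mass_g=0.250, t_count_s=3600,
                          t_decay_s=90000, half_life_s=2.2e5)

c_standard_ppm = 50.0  # certified concentration of the element in the standard (assumed)
c_sample_ppm = c_standard_ppm * smp / std
print(f"Element concentration in sample: {c_sample_ppm:.1f} ppm")
```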

  10. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Cluckie, Alice Jane

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)

  11. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho

    2011-01-15

    During the last 2 years of the project, we have tried to develop a technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing health food safety for the public and the stability of the ginseng market. In this year, we collected ginseng samples cultivated in the northeastern provinces of mainland China, namely Liaoning Province, Jilin Province, and the Baekdu Mountain area within Jilin Province. Ten ginseng samples were collected in each province. The elemental concentrations in the ginseng were analyzed by using a neutron activation analysis technique at the HANARO research reactor. The distinction of production area was made by using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, though there was a limitation in that the number of samples we analyzed was very small.

  12. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    International Nuclear Information System (INIS)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho

    2011-01-01

    During the last 2 years of the project, we have tried to develop a technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing health food safety for the public and the stability of the ginseng market. In this year, we collected ginseng samples cultivated in the northeastern provinces of mainland China, namely Liaoning Province, Jilin Province, and the Baekdu Mountain area within Jilin Province. Ten ginseng samples were collected in each province. The elemental concentrations in the ginseng were analyzed by using a neutron activation analysis technique at the HANARO research reactor. The distinction of production area was made by using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, though there was a limitation in that the number of samples we analyzed was very small

  13. METHODS AND MODELS FOR ANALYSIS OF THE ORGANIZATIONAL ECONOMICS ACTIVITY USED FOR DEVELOPMENT OF INFORMATICS SYSTEMS

    Directory of Open Access Journals (Sweden)

    TEODORA VĂTUIU

    2014-10-01

    Full Text Available The study of organizational activity and the highlighting of problem situations that require specific solutions call for a detailed analysis of the models defined for the real system of economic companies, regarded not as a sum of assets, but as organizations in which activities are linked into processes. In addition to the usual approach of using modeling languages in the development of information systems, in this paper we intend to present some examples that demonstrate the usefulness of a standard modeling language (UML) for analyzing organizational activities and for reporting problem situations that may occur in the management of data recorded on primary documents or in the processes that bring activities together. Examples that have been focused on a travel agency can be extrapolated to any other organization, and the diagrams can be used in different contexts, depending on the complexity of the activities identified.

  14. Methods and tools for developing virtual territories for scenario analysis of agro-ecosystems

    Directory of Open Access Journals (Sweden)

    Carlo Giupponi

    2016-08-01

    combinations of different typologies or levels of climate, physical conditions, socio-economic development, etc.; the efficiency and the flexibility of the tools adopted to easily generate realistic landscapes and their variants. The approach is demonstrated through the development of an erosion analysis under climate change scenarios.

  15. Nano-sized aerosol classification, collection and analysis--method development using dental composite materials.

    Science.gov (United States)

    Bogdan, Axel; Buckett, Mary I; Japuntich, Daniel A

    2014-01-01

    This article presents a methodical approach for generating, collecting, and analyzing nano-size (1-100 nm) aerosol from abraded dental composite materials. Existing aerosol sampling instruments were combined with a custom-made sampling chamber to create and sample a fresh, steady-state aerosol size distribution before significant Brownian coagulation. Morphological, size, and compositional information was obtained by Transmission Electron Microscopy (TEM). To create sample sizes suitable for TEM analysis, aerosol concentrations in the test chamber had to be much higher than one would typically expect in a dental office, and therefore, these results do not represent patient or dental personnel exposures. Results show that nano-size aerosol was produced by the dental drill alone, with and without cooling water drip, prior to abrasion of dental composite. During abrasion, aerosol generation seemed independent of the percent filler load of the restorative material and the operator who generated the test aerosol. TEM investigation showed that "chunks" of filler and resin were generated in the nano-size range; however, free nano-size filler particles were not observed. The majority of observed particles consisted of oil droplets, ash, and graphitic structures.

  16. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). So far, no measuring means have been suited to assess the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, together with the associated electronics and software. This new system is specific to the manoeuvre and to the user, and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented is realized on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactility. In spite of inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of displacements do not exceed 10 mm on either hand; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum pressure applied with the thoracic hand is about twice as high as with the abdominal hand; the thrust of the manual compression lasts (590 ± 62) ms. Inter-operator measurements are in progress in order to generalize these results

  17. [Method development for analysis of short- and long-chain perfluorocarboxylates in sewage].

    Science.gov (United States)

    Li, Fei; Zhang, Chao-Jie; Qu, Yan; Chen, Jing; Zhou, Qi; Yan, Xiang-Bo; Ma, Jin-Xing

    2009-09-15

    A method using a weak anion exchange cartridge for solid phase extraction before high performance liquid chromatography-negative electrospray ionization-tandem mass spectrometry [WAX-SPE + HPLC-ESI(-)-MS/MS] detection has been developed to measure C2-C14 perfluorocarboxylates (PFCAs) in sewage. When the weak anion exchange (WAX) cartridges were used for solid phase extraction (SPE), better recoveries of short- and long-chain PFCAs (C2-C14) were achieved when the sewage samples were acidified to pH = 3.0 with formic acid, 2% formic acid was used as the washing solvent and 1% ammonium hydroxide in methanol was used as the elution solvent. The WAX cartridges used for SPE overcame the disadvantage of reversed-phase solid phase extraction cartridges, i.e. that the recoveries of short-chain PFCAs (C2-C5) were very low. The validity of this method was demonstrated by the determination of short- and long-chain PFCAs (C2-C14) in the influents of municipal wastewater treatment plants A and B in Shanghai, China. The results indicate that the recoveries of all PFCAs in the influents of plants A and B are 56%-121% and 54%-120%, respectively; the recovery relative standard deviations (RSD) of plants A and B are ...; the method quantification limits (MQL) are 0.2-1.0 ng/L and 1.0-5.0 ng/L, respectively. In addition, perfluorooctanoate (743 ng/L and 837 ng/L, respectively) and trifluoroacetic acid (139 ng/L and 489 ng/L, respectively) are the most and the second most abundant PFCAs found in both plants A and B.
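
    A minimal sketch of the spike-recovery calculation behind the reported 54%-121% recoveries is given below; the spike level and the measured spiked concentration are hypothetical, and only the 743 ng/L PFOA background is taken from the abstract.

```python
# Minimal sketch of a spike-recovery calculation used to validate the SPE + LC-MS/MS
# method; all concentrations except the 743 ng/L background are hypothetical examples.
def recovery_percent(measured_spiked: float, measured_unspiked: float,
                     spiked_amount: float) -> float:
    """Recovery (%) of an analyte spiked into a sewage sample before extraction."""
    return 100.0 * (measured_spiked - measured_unspiked) / spiked_amount

# Example: PFOA at 743 ng/L background, spiked with 500 ng/L, 1180 ng/L measured after SPE
print(f"PFOA recovery: {recovery_percent(1180.0, 743.0, 500.0):.0f}%")
```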

  18. Development and Analysis of Volume Multi-Sphere Method Model Generation using Electric Field Fitting

    Science.gov (United States)

    Ingram, G. J.

    Electrostatic modeling of spacecraft has wide-reaching applications such as detumbling space debris in the Geosynchronous Earth Orbit regime before docking, servicing and tugging space debris to graveyard orbits, and Lorentz augmented orbits. The viability of electrostatic actuation control applications relies on faster-than-realtime characterization of the electrostatic interaction. The Volume Multi-Sphere Method (VMSM) seeks the optimal placement and radii of a small number of equipotential spheres to accurately model the electrostatic force and torque on a conducting space object. Current VMSM models tuned using force and torque comparisons with commercially available finite element software are subject to the modeled probe size and numerical errors of the software. This work first investigates fitting of VMSM models to Surface-MSM (SMSM) generated electrical field data, removing modeling dependence on probe geometry while significantly increasing performance and speed. A proposed electric field matching cost function is compared to a force and torque cost function, the inclusion of a self-capacitance constraint is explored and 4 degree-of-freedom VMSM models generated using electric field matching are investigated. The resulting E-field based VMSM development framework is illustrated on a box-shaped hub with a single solar panel, and convergence properties of select models are qualitatively analyzed. Despite the complex non-symmetric spacecraft geometry, elegantly simple 2-sphere VMSM solutions provide force and torque fits within a few percent.
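
    To illustrate the electric-field/potential matching idea in a hedged way, the sketch below fits the charges of a small, fixed set of spheres (treated as point charges) to reference potential data by linear least squares. The sphere locations, sample points and 'reference' charges are hypothetical stand-ins for SMSM-generated field data, and the optimisation of sphere positions and radii that VMSM performs is not attempted here.

```python
import numpy as np

# Illustrative least-squares sketch of fitting sphere charges so that a few fixed spheres
# reproduce a reference potential field (stand-in for SMSM-generated data). Geometry,
# reference charges and sample points are hypothetical; spheres are treated as point charges.
k = 8.9875e9                                   # Coulomb constant [N m^2 / C^2]
sphere_pos = np.array([[0.0, 0.0, 0.0],        # assumed sphere locations [m]
                       [1.0, 0.0, 0.0],
                       [0.5, 0.8, 0.0]])

# Exterior sample points where the "true" potential is known
rng = np.random.default_rng(0)
points = rng.uniform(-3.0, 3.0, size=(200, 3))
points = points[np.linalg.norm(points, axis=1) > 1.5]     # keep points away from the body

true_charges = np.array([2.0e-9, -1.0e-9, 0.5e-9])        # hypothetical reference charges [C]
dist = np.linalg.norm(points[:, None, :] - sphere_pos[None, :, :], axis=2)
A = k / dist                                               # potential influence matrix
v_ref = A @ true_charges                                   # stand-in for SMSM potential data

# Least-squares fit of the model charges to the reference potential
q_fit, *_ = np.linalg.lstsq(A, v_ref, rcond=None)
print("fitted charges [nC]:", (q_fit * 1e9).round(3))
```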

  19. Production of polyclonal antibody against madecassoside and development of immunoassay methods for analysis of triterpene glycosides in Centella asiatica.

    Science.gov (United States)

    Tassanawat, Patcharin; Putalun, Waraporn; Yusakul, Gorawit; Sritularak, Boonchoo; Juengwatanatrakul, Thaweesak; Tanaka, Hiroyuki

    2013-01-01

    Centella asiatica (L.) Urban contains two major triterpene glycosides, asiaticoside (AS) and madecassoside (MA), as active components used for wound healing and enhancing memory. The aim was to produce a polyclonal antibody against madecassoside (MA-PAb) and to develop enzyme-linked immunosorbent assay (ELISA) and Eastern blotting methods for the quantitative analysis of triterpene glycosides in Centella asiatica. An ELISA method was developed using the polyclonal antibody against MA. An Eastern blotting method on a PES membrane was established for the determination of MA and AS. The immunoassays were validated for sensitivity, precision, specificity and accuracy. The prepared MA-PAb shows specificity to MA and AS. The measuring range of triterpene glycosides was 0.39-50 µg/mL using the ELISA method. The Eastern blotting method was developed for determining individual MA and AS, which could be detected in the range of 62.5-500 ng. The limit of detection for MA and AS was 31.25 ng. The two methods developed showed good specificity, precision, and accuracy, and also correlated with high-performance liquid chromatography. These immunoassays have several advantages, including high sensitivity, as well as being rapid and facile for the determination of the triterpene glycosides in C. asiatica. Copyright © 2012 John Wiley & Sons, Ltd.

  20. Development of multi dimensional analysis code for containment safety and performance based on staggered semi-implicit finite volume method

    International Nuclear Information System (INIS)

    Hong, Soon Joon; Hwang, Su Hyun; Han, Tae Young; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    A solver for a 3-dimensional thermal hydraulic analysis code for a large building having multiple rooms, such as a reactor containment, was developed based on 2-phase, 3-field conservation equations. The three fields are gas, continuous liquid, and dispersed drops, where the gas field includes steam, air and hydrogen. The gas motion equation and the equation of state were also considered, with homogeneous and equilibrium conditions assumed for the gas motion equation. Source terms related to phase change were explicitly expressed for the implicit scheme. As a result, a total of 17 independent equations were set up, and a total of 17 primitive unknowns were identified. The numerical scheme follows the FVM (Finite Volume Method) based on a staggered orthogonal structured grid and a semi-implicit method. The staggered grid system produces staggered numerical cells, a scalar cell and a vector cell. The porosity method was adopted for easy handling of the complex structures inside a computational cell; this porosity method is known to be very effective in reducing the number of meshes while still acquiring accurate results. For the actual programming, the C++ language with OOP (Object Oriented Programming) was used. The code developed with OOP has features such as information hiding, encapsulation, modularity and inheritance, which offer code developers a more explicit and clearer development method. Classes were designed: Cell and Face, and Volume and Component, are the bases of the largest class, System, and the class Solver was designed in order to run the solver. Sample runs showed physically reasonable results. The foundation of the code was set up through a series of numerical developments. (author)
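
    The porosity idea mentioned above can be illustrated with the hedged one-dimensional sketch below, in which cell and face porosities scale the fluid volume and the flux area blocked by internal structures on a finite-volume grid. It is written in Python purely for illustration, is unrelated to the actual C++ containment solver, and all numbers are arbitrary.

```python
import numpy as np

# Minimal 1D sketch of the porosity approach on a finite-volume grid (illustrative only):
# scalar cells store temperature, face "porosities" scale the area available for the flux
# where internal structures partially block a cell.
n = 20
dx, dt, alpha = 0.1, 0.01, 1.0e-2               # cell size [m], time step [s], diffusivity
cell_porosity = np.ones(n); cell_porosity[8:12] = 0.4       # partially blocked cells
face_porosity = np.minimum(cell_porosity[:-1], cell_porosity[1:])

T = np.full(n, 300.0); T[0] = 400.0             # hot Dirichlet boundary on the left [K]
for _ in range(2000):
    flux = -alpha * face_porosity * (T[1:] - T[:-1]) / dx   # face-centred diffusive fluxes
    dT = np.zeros(n)
    dT[:-1] -= flux * dt / (dx * cell_porosity[:-1])        # donor cell loses the flux
    dT[1:] += flux * dt / (dx * cell_porosity[1:])          # receiver cell gains it
    T += dT
    T[0] = 400.0                                            # re-impose the boundary value
print(T.round(1))
```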

  1. Development of evaluation method for the quality of NPP MCR operators' communication using work domain analysis (WDA)

    International Nuclear Information System (INIS)

    Jang, In Seok

    2010-02-01

    The evolution of work demands has driven industry towards computerization, which makes systems complex and complicated; this field is called Complex Socio-Technical Systems. As communication failure is one problem of Complex Socio-Technical Systems, it has been discovered that communication failure is the reason for many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the fact that there have been many studies on the severity of communication failure, there is no evaluation method for operators' communication quality in NPPs. Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. In order to develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, such as the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, in order to apply the proposed method, nine teams working in NPPs participated in the field simulation. The evaluation results reveal that operators' communication quality was higher as a larger portion of the components in the developed evaluation criteria was mentioned. Therefore, the proposed method could be a useful one for evaluating the communication quality in any complex system. In order to verify that the proposed method is meaningful for evaluating communication quality, the evaluation results were further investigated with objective performance measures. Further investigation of the evaluation results also supports the idea that the proposed method can be used in evaluating communication quality

  2. GEM simulation methods development

    International Nuclear Information System (INIS)

    Tikhonov, V.; Veenhof, R.

    2002-01-01

    A review of methods used in the simulation of processes in gas electron multipliers (GEMs) and in the accurate calculation of detector characteristics is presented. Such detector characteristics as effective gas gain, transparency, charge collection and losses have been calculated and optimized for a number of GEM geometries and compared with experiment. A method and a new special program for calculations of detector macro-characteristics such as signal response in a real detector readout structure, and spatial and time resolution of detectors have been developed and used for detector optimization. A detailed development of signal induction on readout electrodes and electronics characteristics are included in the new program. A method for the simulation of charging-up effects in GEM detectors is described. All methods show good agreement with experiment

  3. [Snoring analysis methods].

    Science.gov (United States)

    Fiz Fernández, José Antonio; Solà Soler, Jordi; Jané Campos, Raimon

    2011-06-11

    Snore is a breathing sound that is originated during sleep, either nocturnal or diurnal. Snoring may be inspiratory, expiratory or it may occupy the whole breathing cycle. It is caused by the vibrations of the different tissues of the upper airway. Many procedures have been used to analyze it, from simple interrogation, to standardized questionnaires, to more sophisticated acoustic methods developed thanks to the advance of biomedical techniques in the last years. The present work describes the current state of the art of snoring analysis procedures. Copyright © 2010 Elsevier España, S.L. All rights reserved.

  4. Decision Analysis in the U.S. Army’s Capabilities Needs Analysis: Applications of Decision Analysis Methods to Capabilities Resource Allocation and Capabilities Development Decisions

    Science.gov (United States)

    2015-10-01

    value functions and their scales; the use of a modified Delphi method to refine the analysis results; and the implementation of an open architecture web... solution approaches. Once analysts made initial assessments, a refinement process using a modified Delphi method enabled final recommendations that

  5. Development of soil-structure interaction analysis method (II) - Volume 1

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others

    1994-02-01

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strains; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents, respectively: a state-of-the-art review of each item and database construction on past research; a theoretical review of soil-structure interaction analysis technology; a proposal of the preferable technology and an estimate of its domestic applicability; and proposed guidelines for the evaluation of safety and the analysis scheme

  6. Analysis and development of the method for calculating calibration of the working plank in the cold tube roller rolling mills

    Directory of Open Access Journals (Sweden)

    S. V. Pilipenko

    2017-05-01

    Full Text Available This paper analyzes and develops the existing method for calculating the calibrated profile of the working planks of cold tube roller rolling (CTRR) mills, so as to ensure the required distribution of energy-power parameters along the deformation cone. In developing the existing calculation method, it is proposed to use Bezier curves for building the profile of the plank working surface. It was established that the use of a Bezier spline curve for calculating the calibration of the supporting planks makes it possible to calculate the parameters proceeding from the reduction over the external diameter. The proposed method for calculating the deformation parameters in CTRR mills is a development of the existing method and as such shows scientific novelty. Comparison of the plots of the distribution of the force parameters of the CTRR process along the deformation cone demonstrates the advantage of the proposed method. The decrease of the reduction value at the end of the deformation zone favours the manufacture of tubes with a smaller wall thickness deviation (especially the longitudinal one) caused by the waviness induced by the cold pilgering process. The practical significance of the proposed method consists in the fact that the calculation of all zones of the plank by means of one dependence simplifies the manufacture of the plank on machines with numerical control. In this case, the change of the reduction parameters over the thickness of the wall will not exert a considerable influence on the character (rather than the value) of the force parameter distribution along the
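
    As a hedged illustration of how a single Bezier dependence can describe the whole working-surface profile, the sketch below evaluates a cubic Bezier curve from four control points; the control-point coordinates are hypothetical and do not correspond to the mill calibration developed in the paper.

```python
import numpy as np

# Minimal sketch of evaluating a cubic Bezier curve for a working-plank profile
# (control points are hypothetical; the paper's actual calibration data are not used).
def bezier(control_points: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Cubic Bezier curve B(t), t in [0, 1], from four control points (x, y)."""
    p0, p1, p2, p3 = control_points
    t = t[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Control points: (position along the deformation zone [mm], reduction profile [mm])
ctrl = np.array([[0.0, 0.0], [120.0, 1.2], [260.0, 2.6], [400.0, 3.0]])
profile = bezier(ctrl, np.linspace(0.0, 1.0, 11))
for x, y in profile:
    print(f"x = {x:6.1f} mm, reduction = {y:4.2f} mm")
```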

  7. Development of systematic evaluation method on nonlinear behavior of the constructions using repeated finite element method analysis

    International Nuclear Information System (INIS)

    Kasahara, Naoto

    1997-01-01

    Supposing that the nuclear reactor stops for any reason, the temperature of the coolant flowing out of the reactor core will decrease, and the temperature of the components in contact with the coolant in the nuclear plant equipment also decreases in response. On the other hand, the temperature response of portions not in contact with the coolant is delayed, so that a thermal stress arises from the temperature difference. In particular, a stress exceeding the yield value is generated at discontinuous portions of the structure due to stress concentration, and repeated occurrence of such thermal stress at high temperature may lead to the formation of a creep-fatigue crack. Since FY 1994, the Power Reactor and Nuclear Fuel Development Corporation has been developing a transient thermal stress real-time simulation code for calculating, quickly and in one pass, the thermal stress formed within a structure accompanying temperature changes of the coolant, and since FY 1995 the development of an FEM simulation technique spanning from the macroscopic to the microscopic region, with target regions ranging from the structural level down to the material texture, has been promoted. In the future, the development of a total simulation technique connecting both, and of an optimum design technique based on its results, is planned. (G.K.)

  8. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  9. Development of bioinformatic tools and application of novel statistical methods in genome-wide analysis

    NARCIS (Netherlands)

    van der Most, Peter Johannes

    2017-01-01

    A genome-wide association study is a method that identifies genes involved in complex phenotypes by scanning the entire genome. These studies generally emphasize quantity rather than quality: simple statistical methods are applied to genetic data of

  10. Development of Distinction Method of Production Area of Ginsengs by Using a Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chung, Yong Sam; Sun, Gwang Min; Lee, Yu Na; Yoo, Sang Ho [KAERI, Daejeon (Korea, Republic of)

    2010-05-15

    Distinction of the production area of Korean ginsengs has been attempted by using neutron activation techniques such as instrumental neutron activation analysis (INAA) and prompt gamma activation analysis (PGAA). The distribution of elements varies according to the part of the plant due to the difference in enrichment effect and the influence of the soil where the plants have been grown, so the correlation between plants and soil has been an issue. In this study, the distribution of trace elements within a Korean ginseng was investigated by using instrumental neutron activation analysis

  11. Comparison of critical methods developed for fatty acid analysis: A review.

    Science.gov (United States)

    Wu, Zhuona; Zhang, Qi; Li, Ning; Pu, Yiqiong; Wang, Bing; Zhang, Tong

    2017-01-01

    Fatty acids are important nutritional substances and metabolites in living organisms. These acids are abundant in Chinese herbs, such as Brucea javanica, Notopterygium forbesii, Isatis tinctoria, Astragalus membranaceus, and Aconitum szechenyianum. This review illustrates the types of fatty acids and their significant roles in the human body. Many analytical methods are used for the qualitative and quantitative evaluation of fatty acids. Some of the methods used to analyze fatty acids in more than 30 kinds of plants, drugs, and other samples are presented in this paper. These analytical methods include gas chromatography, liquid chromatography, near-infrared spectroscopy, and NMR spectroscopy. The advantages and disadvantages of these techniques are described and compared. This review provides a valuable reference for establishing methods for fatty acid determination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Development and comparison of HPLC and MEKC methods for the analysis of cyclic sulfur mustard degradation products.

    Science.gov (United States)

    Lees, Heidi; Vaher, Merike; Kaljurand, Mihkel

    2017-04-01

    In this study, novel, fast, and simple methods based on RP-HPLC and MEKC with DAD are developed and validated for the qualitative and quantitative determination of five cyclic sulfur mustard (HD) degradation products (1,4-thioxane, 1,3-dithiolane, 1,4-dithiane, 1,2,5-trithiepane, and 1,4,5-oxadithiepane) in water samples. The HPLC method employs a C18 column and an isocratic water-ACN (55:45, v/v) mobile phase. This method enables separation of all five cyclic compounds within 8 min. With the CE method, the baseline separation of five compounds was achieved in less than 11 min by applying a simple BGE composed of a 10 mM borate buffer and 90 mM SDS (pH 9.15). Both methods showed good linear correlation (R2 > 0.9904). The detection limits were in the range of 0.08-0.1 μM for the HPLC method and 10-20 μM for MEKC. The precision tests resulted in RSDs for migration times and peak areas less than 0.9 and 5.5%, respectively, for the HPLC method, and less than 1.1 and 7.7% for the MEKC method, respectively. The developed methods were successfully applied to the analysis of five cyclic HD degradation products in water samples. With the HPLC method, the LODs were lowered using the SPE for sample purification and concentration. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique

  14. Ultrasonic method for microscopic analysis of suspended particle aggregation and floc properties - new developments in the water industry

    Energy Technology Data Exchange (ETDEWEB)

    Spengler, J.F. [Gelsenwasser AG, Gelsenkirchen (Germany). Abt. PIA; Jekel, M. [Technische Univ. Berlin (Germany). Dept. of Water Quality Control; Coakley, W.T. [Cardiff Univ., Cardiff (United Kingdom). School of Biosciences

    2002-07-01

    This work reports the investigation, development and optimisation of methods employing MHz ultrasonic standing waves (USSW) for analytical applications in the water industry. The performance of large-scale processing methods for suspensions is significantly influenced by aggregate properties. Laboratory-scale study of the aggregation efficiency and the settling behaviour of suspensions under varied suspension stability ('Jar-Test') is, for example, a common method to optimise flocculation/sedimentation processes. A novel microscopic method, employing an acoustic 'mini-chamber', has been introduced here. The set-up facilitates the observation of aggregation processes under defined conditions in real time and in situ. It was successfully used with a number of suspensions to visualise floc growth, the final aggregate structure and other aggregate properties of practical importance (strength, settling velocity, density). Image analysis was employed for quantitative floc characterisation by, e.g., the fractal dimension. The findings were consistent with the DLVO theory of suspension stability and qualitative models of particle aggregation dynamics. Aggregation kinetics has been quantitatively studied by particle image velocimetry (PIV) analysis. These novel investigations suggest the principal applicability of the mini-chamber/microscope system for various research fields and suspended particle investigations, which are of interest not only for water industry purposes but also for the processing of heterogeneous systems in general. Design studies of simple, low-cost and disposable ultrasonic devices for such studies have been carried out and a prototype tested successfully. (orig.)

  15. Developments in Surrogating Methods

    Directory of Open Access Journals (Sweden)

    Hans van Dormolen

    2005-11-01

    Full Text Available In this paper, I would like to talk about the developments in surrogating methods for preservation. My main focus will be on the technical aspects of preservation surrogates. This means that I will tell you something about my job as Quality Manager Microfilming for the Netherlands’ national preservation program, Metamorfoze, which is coordinated by the National Library. I am responsible for the quality of the preservation microfilms, which are produced for Metamorfoze. Firstly, I will elaborate on developments in preservation methods in relation to the following subjects: · Preservation microfilms · Scanning of preservation microfilms · Preservation scanning · Computer Output Microfilm. In the closing paragraphs of this paper, I would like to tell you something about the methylene blue test. This is an important test for long-term storage of preservation microfilms. Also, I will give you a brief report on the Cellulose Acetate Microfilm Conference that was held in the British Library in London, May 2005.

  16. The analysis of lipophilic marine toxins : development of an alternative method

    NARCIS (Netherlands)

    Gerssen, A.

    2010-01-01

    Lipophilic marine toxins are produced by certain algae species and can accumulate in filter feeding shellfish such as mussels, scallops and oysters. Consumption of contaminated shellfish can lead to severe intoxications such as diarrhea, abdominal cramps and vomiting. Methods described in

  17. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model Originating

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  18. Development of signal analysis method for the motional Stark effect diagnostic on EAST

    Science.gov (United States)

    Fu, Jia; Lyu, Bo; Liu, Haiqing; Li, Yingying; Liu, Dongmei; Wei, Yongqing; Fan, Chao; Shi, Yuejiang; Wu, Zhenwei; Wan, Baonian

    2017-10-01

    A pilot single-channel Motional Stark Effect (MSE) diagnostic has been under development on EAST since 2015. Dual photo-elastic modulators (PEM) were employed to encode the polarization angle into a time-varying signal. The pitch angle was related to the ratio of the modulation amplitude at the second harmonic frequency. A digital harmonic analyzer (DHA) technique was developed for extracting the second harmonic amplitude. The results were validated with a hardware phase lock-in amplifier and are also consistent with the software dual phase-locking algorithm.

  19. Quantitative analysis of concrete using portable x-ray fluorescence: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Washington, Aaron L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Narrows, William [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Christian, Jonathan H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Msgwood, Leroy [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-07-27

    During Decommissioning and Demolition (D&D) activities at SRS, buildings must be screened for radionuclides and heavy metals to ensure that the proper safety and disposal metrics are in place. A major source of contamination at DOE facilities is the accumulation of mercury from nuclear material processing and the Liquid Waste System (LWS). This buildup of mercury could harm the demolition crew or the environment should the material be released. The current standard method is to take core samples at various places in the facility and use X-ray fluorescence (XRF) to detect the contamination. This standard method carries a high financial cost because of the security levels of these facilities with unknown contamination levels. Herein we propose the use of portable XRF units to detect this contamination on-site. To validate this method, the instrument has to be calibrated to detect the heavy metal contamination, be both precise with respect to the known elemental concentrations and consistent in its results for a sample concrete and a pristine contaminant, and be able to detect changes in the sample concrete's composition. After the various concrete samples were received, with their compositions determined by a wavelength-dispersive XRF method, the calibration factors' linear regressions were adjusted to give the baseline concentration of the concrete with no contamination. Samples of both concrete and concrete/flyash were evaluated; their standard deviations revealed that the measurements were consistent with the known compositions. Finally, the samples were contaminated with different concentrations of sodium tungstate dihydrate, allowed to air dry, and measured. When the contaminated samples were analyzed, the heavy metal contamination was visible in the instrument's spectrum, but there was no clear quantitative trend with the concentration of the solution.

  20. Analysis of Investigational Drugs in Biological Fluids - Method Development and Routine Assay.

    Science.gov (United States)

    pyridostigmine in urine, mefloquine in plasma, physostigmine in plasma and WR 6026 in blood) are currently under development, three routine analyses...bioavailability and pharmacokinetic studies, and routine analyses (for mefloquine in plasma and pyridostigmine in plasma) are in progress in support of other studies.

  1. Analysis methods of uranium

    International Nuclear Information System (INIS)

    Bekdemir, N.; Acarkan, S.

    1997-01-01

    There are various methods for the determination of uranium. The most often used are spectrophotometric methods (PAR, DBM and Arsenazo III) and potentiometric titration. For uranium contents between 1-300 g/L U, the potentiometric titration method based on oxidation-reduction reactions gives reliable results. PAR (4-(2-pyridylazo)resorcinol) is a sensitive reagent for uranium, forming complexes in aqueous solutions; it is a suitable method for the determination of uranium at amounts between 2-400 µg U. In this study, the spectrophotometric and potentiometric analysis methods used in the Nuclear Fuel Department are discussed in detail, and other methods and their principles are briefly mentioned

  2. Spatial analysis method of assessing water supply and demand applied to energy development in the Ohio River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, A.D.

    1979-08-01

    The focus of the study is on water availability for energy development in the Ohio River Basin; however, the techniques developed are applicable to water supply investigations for other regions and uses. The study assesses the spatial association between water supply and demand for future energy development in the Basin. The problem is the development of a method that accurately portrays the actual spatial coincidence of water availability and use within a basin. The issues addressed involve questions of scale and methods used to create a model distribution of streamflow and to compare it with projected patterns of water requirements for energy production. The analysis procedure involves the compilation of streamflow data and calculation of 7-day/10-year low-flow estimates within the Basin. Low-flow probabilities are based on historical flows at gaging stations and are adjusted for the effects of reservoir augmentation. Once streamflow estimates have been determined at gaging stations, interpolation of these values is made between known data points to enable direct comparison with projected energy water-use data. Finally, a method is devised to compare the patterns of projected water requirements with the model distribution of streamflow, in sequential downstream order.
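
    The 7-day/10-year low-flow statistic mentioned above can be illustrated with a short calculation: take the 7-day moving average of daily flows, find each year's minimum, and estimate the value with a 10-year recurrence interval. The sketch below uses simulated daily flows and an empirical quantile rather than the study's gauging-station records or a formal frequency distribution.

```python
# A minimal sketch of a 7Q10 (7-day, 10-year) low-flow estimate from daily flows.
# The daily flow series below is a hypothetical placeholder.
import numpy as np
import pandas as pd

dates = pd.date_range("1950-01-01", "1979-12-31", freq="D")
daily_flow = pd.Series(np.random.lognormal(mean=3.0, sigma=0.5, size=len(dates)), index=dates)

# 7-day moving average, then the minimum for each calendar year
seven_day = daily_flow.rolling(window=7).mean()
annual_min = seven_day.groupby(seven_day.index.year).min().dropna()

# Empirical estimate of the flow with a 10-year recurrence interval
# (non-exceedance probability of 1/10); a log-Pearson III fit would be
# the more formal choice.
q7_10 = np.quantile(annual_min, 0.10)
print(f"7Q10 estimate: {q7_10:.1f} (same units as the input flows)")
```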

  3. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  4. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country

    Science.gov (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, it is increasingly common to introduce the Japanese engineering educational model in developing countries to improve the quality of education and research activity. A naive implementation of such a model in different cultures and educational systems may lead to several problems. In this paper, we evaluated the Project Based Learning (PBL) class developed at Waseda University in Japan and employed in the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as: non-homogeneous student backgrounds, disconnection from the students' research, weak adaptation of learning styles, and irregular course conduction. To solve these difficulties at E-JUST, we proposed: the introduction of groupware, project theme choice based on student motivation, and curriculum modification.

  5. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation between the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation between the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models, under the assumption of a free-air anomaly consisting of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano area, Guadeloupe (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from the previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density at La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
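
    As a rough illustration of the density-scan idea behind the Nettleton/Parasnis approaches (choose the reduction density that minimizes the correlation between the Bouguer anomaly and topography), the sketch below scans candidate densities over synthetic free-air data; the station elevations, noise level, and the simple-slab Bouguer correction are assumptions for the example, not the survey data used in the paper.

```python
# A minimal sketch of a Nettleton-style density scan: choose the Bouguer
# reduction density that minimizes the correlation between the Bouguer
# anomaly and station elevation. Free-air anomalies and elevations below
# are hypothetical placeholders.
import numpy as np

G = 6.674e-11                       # gravitational constant (m^3 kg^-1 s^-2)
elev = np.linspace(0, 500, 50)      # station elevations (m)
true_rho = 2670.0                   # density used to synthesize the test data (kg/m^3)
free_air = 2 * np.pi * G * true_rho * elev * 1e5 + np.random.normal(0, 0.05, elev.size)  # mGal

densities = np.arange(1800, 3200, 10)   # candidate reduction densities (kg/m^3)
correlations = []
for rho in densities:
    bouguer_corr = 2 * np.pi * G * rho * elev * 1e5    # simple-slab correction in mGal
    bouguer_anom = free_air - bouguer_corr
    correlations.append(abs(np.corrcoef(bouguer_anom, elev)[0, 1]))

best_rho = densities[int(np.argmin(correlations))]
print(f"Estimated Bouguer density: {best_rho / 1000:.2f} g/cm^3")
```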

  6. Development of an Evaluation Method for Team Safety Culture Competencies using Social Network Analysis

    International Nuclear Information System (INIS)

    Han, Sang Min; Kim, Ar Ryum; Seong, Poong Hyun

    2016-01-01

    In this study, the team safety culture competency of a team was estimated through SNA as a team safety culture index. To overcome the limits of existing safety culture evaluation methods, the concepts of competency and SNA were adopted. To estimate team safety culture competency, we established the definition, scope and goal of team safety culture competencies. Core team safety culture competencies were derived, and behavioral characteristics were identified for each competency from the procedures used in NPPs and from existing criteria for assessing safety culture. Observation was then chosen as the method to provide the input data for the SNA matrix of team members versus insufficient team safety culture competencies. Through matrix operations, the matrix was converted into two meaningful values: the density of team members and the degree centrality of each team safety culture competency. The density of team members and the degree centrality of each team safety culture competency represent the team safety culture index and the priority of the team safety culture competency to be improved, respectively
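
    A minimal illustration of the matrix operation described above, turning a member-versus-competency observation matrix into a density value and per-competency degree centralities, is sketched below; the member names, competencies, and incidence matrix are invented placeholders, not the study's observation data.

```python
# A minimal sketch (hypothetical data) of turning an observation matrix of
# team members versus observed insufficient safety culture competencies
# into two summary values: network density and the degree centrality of
# each competency, in the spirit of a two-mode social network analysis.
import numpy as np

members = ["operator_A", "operator_B", "supervisor", "engineer"]
competencies = ["communication", "questioning_attitude", "procedure_adherence"]

# incidence[i, j] = 1 if member i was observed lacking competency j
incidence = np.array([
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 1],
    [1, 0, 0],
])

# Density: observed ties divided by the number of possible member-competency ties
density = incidence.sum() / (incidence.shape[0] * incidence.shape[1])

# Degree centrality of each competency: share of members linked to it
degree_centrality = incidence.sum(axis=0) / incidence.shape[0]

print(f"Team density: {density:.2f}")
for name, dc in zip(competencies, degree_centrality):
    print(f"  {name}: degree centrality {dc:.2f}")
```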

  7. Development of partitioning method

    International Nuclear Information System (INIS)

    Kubota, Kazuo; Dojiri, Shigeru; Kubota, Masumitsu

    1988-10-01

    A literature survey was carried out on the amount of natural resources, the behavior in the reprocessing process, and the separation and recovery methods of the platinum group elements and technetium contained in spent fuel. The essential results are described below. (1) The platinum group elements contained in spent fuel are quantitatively limited compared with the total demand for them in Japan, and the estimated separation and recovery cost is rather high. In spite of that, development of these techniques is considered to be very important because the supply of these elements in Japan comes almost entirely from foreign resources. (2) For recovery of these elements, studies of recovery from undissolved residue and from high level liquid waste (HLLW) also seem to be required. (3) As separation and recovery methods, the following techniques are considered to be effective: lead extraction, liquid metal extraction, solvent extraction, ion exchange, adsorption, precipitation, distillation, electrolysis or their combination. (4) Each of these methods, however, has both advantages and disadvantages, so the development of such processes largely depends on future work. (author) 94 refs

  8. Methodical Approaches To Analysis And Forecasting Of Development Fuel And Energy Complex And Gas Industry In The Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov

    2014-12-01

    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, one in which the interests of all economic entities intertwine. To ensure economic growth, a region should maintain an internal balance of energy resources, developed with account taken of the regional specifics of economic growth and energy security. The study examines the status of this equilibrium as reflected in the fuel and energy balance of the region (TEB). The aim of the research is the development of the fuel and energy balance, which makes it possible to determine exactly how many and what resources are lacking to support the regional development strategy and what resources need to be brought in. The energy balance brings into focus all issues of regional development, so the TEB is needed both as a mechanism for analyzing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, energy threats and ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that the development of the regional fuel and energy balance has to go beyond the energy sector itself, involving the analysis of other sectors of the economy as well as systems such as the banking, budgetary, legislative and tax systems. Due to the complexity of the problems discussed, there is an obvious need to develop an appropriate forecasting and analytical system that allows regional authorities to make evidence-based predictions of the consequences of management decisions, to carry out multivariant scenario studies of the development of the fuel and energy complex and of individual industries, to use project-based management methods, and to harmonize the application of state regulation with strategic and market mechanisms in the operational directions of development of the fuel and energy complex and of individual industries in the regional economy.

  9. Developing a digital photography-based method for dietary analysis in self-serve dining settings.

    Science.gov (United States)

    Christoph, Mary J; Loman, Brett R; Ellison, Brenna

    2017-07-01

    Current population-based methods for assessing dietary intake, including food frequency questionnaires, food diaries, and 24-h dietary recall, are limited in their ability to objectively measure food intake. Digital photography has been identified as a promising addition to these techniques but has rarely been assessed in self-serve settings. We utilized digital photography to examine university students' food choices and consumption in a self-serve dining hall setting. Research assistants took pre- and post-photos of students' plates during lunch and dinner to assess selection (presence), servings, and consumption of MyPlate food groups. Four coders rated the same set of approximately 180 meals for inter-rater reliability analyses; approximately 50 additional meals were coded twice by each coder to assess intra-rater agreement. Inter-rater agreement on the selection, servings, and consumption of food groups was high at 93.5%; intra-rater agreement was similarly high with an average of 95.6% agreement. Coders achieved the highest rates of agreement in assessing if a food group was present on the plate (95-99% inter-rater agreement, depending on food group) and estimating the servings of food selected (81-98% inter-rater agreement). Estimating consumption, particularly for items such as beans and cheese that were often in mixed dishes, was more challenging (77-94% inter-rater agreement). Results suggest that the digital photography method presented is feasible for large studies in real-world environments and can provide an objective measure of food selection, servings, and consumption with a high degree of agreement between coders; however, to make accurate claims about the state of dietary intake in all-you-can-eat, self-serve settings, researchers will need to account for the possibility of diners taking multiple trips through the serving line. Copyright © 2017 Elsevier Ltd. All rights reserved.
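
    The percent-agreement figures quoted above can be reproduced, in principle, with a one-line comparison of coder ratings; the sketch below uses invented presence/absence ratings for a single food group, and a chance-corrected statistic such as Cohen's kappa could be substituted for simple agreement.

```python
# A minimal sketch of the percent-agreement calculation used to compare
# coders; the ratings below are hypothetical (1 = food group present on
# the plate photo, 0 = absent).
import numpy as np

coder_1 = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
coder_2 = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 0])

agreement = np.mean(coder_1 == coder_2) * 100
print(f"Inter-rater agreement: {agreement:.1f}%")
```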

  10. Development and validation of an infrared spectroscopy-based method for the analysis of moisture content in 5-fluorouracil.

    Science.gov (United States)

    Singh, Parul; Jangir, Deepak K; Mehrotra, Ranjana; Bakhshi, A K

    2009-06-01

    The determination of moisture content in pharmaceuticals is very important as moisture is mainly responsible for the degradation of drugs. Degraded drugs have reduced efficacy and could be hazardous. The objective of the present work is to replace the Karl Fischer (KF) titration method used for moisture analysis with a method that is rapid, involves no toxic materials and is more effective. Diffuse reflectance infrared (IR) spectroscopy, which is explored as a potential alternative to various approaches, is investigated for moisture analysis in 5-fluorouracil, an anticancer drug. A total of 150 samples with varying moisture content were prepared in laboratory by exposing the drug at different relative humidities, for different time intervals. Infrared spectra of these samples were collected with a Fourier transform infrared (FTIR) spectrophotometer using a diffuse reflectance accessory. Reference moisture values were obtained using the Karl Fischer titration method. A number of calibration models were developed using the partial least squares (PLS) regression method. A good correlation was obtained between predicted IR values and reference values in the calibration and validation set. The derived calibration curve was used to predict moisture content in unknown samples. The results show that IR spectroscopy can be used successfully for the determination of moisture content in the pharmaceutical industry. Copyright 2009 John Wiley & Sons, Ltd.
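
    As an illustration of the PLS calibration step described above, the sketch below fits a partial least squares model relating spectra to reference moisture values using scikit-learn; the simulated spectra, the number of latent components, and the train/test split are assumptions for the example, not the study's FTIR data.

```python
# A minimal sketch of a PLS calibration relating IR spectra to Karl Fischer
# moisture values; the spectra and reference values are simulated placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 150, 600
moisture = rng.uniform(0.1, 2.0, n_samples)                    # reference KF values (% w/w)
spectra = np.outer(moisture, rng.normal(size=n_wavenumbers))   # moisture-dependent signal
spectra += rng.normal(scale=0.05, size=spectra.shape)          # instrumental noise

X_train, X_test, y_train, y_test = train_test_split(spectra, moisture, random_state=0)

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
print(f"Calibration R^2: {pls.score(X_train, y_train):.3f}")
print(f"Validation  R^2: {pls.score(X_test, y_test):.3f}")
```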

  11. Method development for the analysis of nitrotoluenes, nitramines and other organic compounds in ammunition waste water

    International Nuclear Information System (INIS)

    Mussmann, P.; Preiss, A.; Levsen, K.; Wuensch, G.

    1994-01-01

    Gas chromatography and high performance liquid chromatography were used to determine explosives and their by- and degradation products near the former ammunition plant Elsnig in Saxony. Previously developed enrichment procedures using liquid/liquid and solid-phase extraction were used to investigate ground and surface water samples. Mono-, di- and trinitrotoluenes as well as aminonitro- and chlorinated nitroaromatics were identified and quantified using GC/MS, the electron capture detector (ECD) and the nitrogen-phosphorus detector (NPD). In addition, some nitrophenols were identified in ground water. RDX, which is difficult to determine by GC, was quantified using high performance liquid chromatography; identification was performed from the UV spectra using a photodiode array detector. (orig.) [de

  12. Development and application of a validated HPLC method for the analysis of dissolution samples of levothyroxine sodium drug products.

    Science.gov (United States)

    Collier, J W; Shah, R B; Bryant, A R; Habib, M J; Khan, M A; Faustino, P J

    2011-02-20

    A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T(4)) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)-methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 μL and the column temperature was maintained at 28°C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r(2)>0.99) over the analytical range of 0.08-0.8 μg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T(4) over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. Published by Elsevier B.V.

  13. Development of a Novel Method for Temporal Analysis of Airborne Microbial Communities

    Science.gov (United States)

    Spring, A.; Domingue, K. D.; Mooney, M. M.; Kerber, T. V.; Lemmer, K. M.; Docherty, K. M.

    2017-12-01

    Microorganisms are ubiquitous in the atmosphere, which serves as an important vector for microbial dispersal to all terrestrial habitats. Very little is known about the mechanisms that control microbial dispersal, because sampling of airborne microbial communities beyond 2 m above the ground is limited. The goal of this study was to construct and test an airborne microbial sampling system to collect sufficient DNA for conducting next generation sequencing and microbial community analyses. The system we designed employs helium-filled helikites as a mechanism for launching samplers to various altitudes. The samplers use a passive collection dish system, weigh under 6 lbs and are operated by remote control from the ground. We conducted several troubleshooting experiments to test sampler functionality. We extracted DNA from sterile collection dish surfaces and examined communities using amplicons of the V4 region of 16S rRNA in bacteria using Illumina Mi-Seq. The results of these experiments demonstrate that the samplers we designed 1) remain decontaminated when closed and collect sufficient microbial biomass for DNA-based analyses when open for 6 hours; 2) are optimally decontaminated with 15 minutes of UV exposure; 3) require 8 collection dish surfaces to collect sufficient biomass. We also determined that DNA extraction conducted within 24 hours of collection has less of an impact on community composition than extraction after frozen storage. Using this sampling system, we collected samples from multiple altitudes in December 2016 and May 2017 at 3 sites in Kalamazoo and Pellston, Michigan. In Kalamazoo, areas sampled were primarily developed or agricultural, while in Pellston they were primarily forested. We observed significant differences between airborne bacterial communities collected at each location and time point. Additionally, bacterial communities did not differ with altitude, suggesting that terrestrial land use has an important influence on the upward

  14. Recent developments in atomic spectrometric methods as tools for the analysis of advanced materials with the example of advanced ceramics

    International Nuclear Information System (INIS)

    Broekaert, J.A.C.

    1995-01-01

    Atomic spectrometric methods are based on emission, absorption and fluorescence processes and use radiation in the UV and VIS region. These methods are suitable for the direct analysis of solids, a capability that is extremely attractive for routine analysis. Atomic spectrometric methods are relative methods and must be characterized against independent methods, which are widely available in analytical laboratories. For solution analysis, inductively coupled plasma (ICP) sources have become available. For electrically non-conducting samples such as ceramics, techniques such as laser ablation in combination with various types of spectrometry are discussed. (A.B.)

  15. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  16. The NASA/Industry Design Analysis Methods for Vibrations (DAMVIBS) Program - A government overview. [of rotorcraft technology development using finite element method

    Science.gov (United States)

    Kvaternik, Raymond G.

    1992-01-01

    An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMVIBS), which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling effort for the CH-47D tandem-rotor helicopter. The DAMVIBS program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.

  17. Generic Analysis Methods for Gas Turbine Engine Performance : The development of the gas turbine simulation program GSP

    NARCIS (Netherlands)

    Visser, W.P.J.

    2015-01-01

    Numerical modelling and simulation have played a critical role in the research and development towards today’s powerful and efficient gas turbine engines for both aviation and power generation. The simultaneous progress in modelling methods, numerical methods, software development tools and methods,

  18. [Novel method of analysis of risk of development of ischemic heart disease with the use of genomic and computer technologies].

    Science.gov (United States)

    Zhuravlev, Iu I; Nazarenko, G I; Riazanov, V V; Kleĭmenova, E B

    2011-01-01

    To assess the effectiveness of modern prognostication methods for the assessment of the risk of development of ischemic heart disease (IHD). We examined 131 patients with a diagnosis of IHD verified by coronary angiography and 159 subjects in a control group. Initial information on each patient included the following parameters: traditional risk factors, laboratory parameters, results of instrumental examination, and genetic markers. We studied 29 polymorphisms in 27 genes which, according to international databases, are associated with IHD. Genotype was assessed under 2 models: dominant and recessive. For each patient we calculated an individual genetic index as the sum of the polymorphic markers present, with the addition of family history data. The data obtained were analyzed with the "RECOGNITION" system, which applies the main approaches and algorithms of the theory of recognition by precedents to prognostication problems. Accuracy of recognition varied from 70 to 75% with a small number of traits and up to 90% on informative trait subsystems. The "linear machine" method showed the highest accuracy. The voting algorithm showed the maximal accuracy of prognosis relative to the individual algorithms. In IHD prognostication, most informative trait systems comprised genetic markers, the most significant of which was the genetic index representing the sum of the polymorphic markers present with the addition of family history data. Analysis using methods of recognition by precedents is a promising technique for stratification of IHD risk and for supporting optimal decision making on prevention. The use of collectives of different prognostication methods allows the accuracy of prognosis to be increased.
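
    The individual genetic index described above is essentially a count of risk-associated markers plus a family-history term; a minimal sketch of that bookkeeping is given below, with illustrative (hypothetical) marker names rather than the 29 polymorphisms used in the study.

```python
# A minimal sketch (hypothetical marker set) of the individual genetic index
# described above: the count of risk-associated polymorphic markers carried
# by a patient, incremented when a positive family history is present.
def genetic_index(genotypes: dict[str, bool], family_history: bool) -> int:
    """genotypes maps each polymorphism to True if the risk variant is present."""
    index = sum(1 for present in genotypes.values() if present)
    if family_history:
        index += 1
    return index

patient = {"rs1333049": True, "rs10757278": False, "rs4977574": True}  # illustrative markers only
print(genetic_index(patient, family_history=True))  # -> 3
```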

  19. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    Ahmad, M.; Usman, M.; Madni, A.; Akhtar, N.; Khalid, N.; Asghar, W.

    2010-01-01

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, with a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed phase C18 column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL/min. The method showed linearity in the concentration range 0.15-20 µg/mL. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 µg/mL, respectively. The accuracy of the method was 98.68%. This method can be applied for bioequivalence studies and therapeutic drug monitoring as well as for the routine analysis of cephradine. (author)

  20. Development of synthetic velocity - depth damage curves using a Weighted Monte Carlo method and Logistic Regression analysis

    Science.gov (United States)

    Vozinaki, Anthi Eirini K.; Karatzas, George P.; Sibetheros, Ioannis A.; Varouchakis, Emmanouil A.

    2014-05-01

    Damage curves are the most significant component of the flood loss estimation models. Their development is quite complex. Two types of damage curves exist, historical and synthetic curves. Historical curves are developed from historical loss data from actual flood events. However, due to the scarcity of historical data, synthetic damage curves can be alternatively developed. Synthetic curves rely on the analysis of expected damage under certain hypothetical flooding conditions. A synthetic approach was developed and presented in this work for the development of damage curves, which are subsequently used as the basic input to a flood loss estimation model. A questionnaire-based survey took place among practicing and research agronomists, in order to generate rural loss data based on the responders' loss estimates, for several flood condition scenarios. In addition, a similar questionnaire-based survey took place among building experts, i.e. civil engineers and architects, in order to generate loss data for the urban sector. By answering the questionnaire, the experts were in essence expressing their opinion on how damage to various crop types or building types is related to a range of values of flood inundation parameters, such as floodwater depth and velocity. However, the loss data compiled from the completed questionnaires were not sufficient for the construction of workable damage curves; to overcome this problem, a Weighted Monte Carlo method was implemented, in order to generate extra synthetic datasets with statistical properties identical to those of the questionnaire-based data. The data generated by the Weighted Monte Carlo method were processed via Logistic Regression techniques in order to develop accurate logistic damage curves for the rural and the urban sectors. A Python-based code was developed, which combines the Weighted Monte Carlo method and the Logistic Regression analysis into a single code (WMCLR Python code). Each WMCLR code execution
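
    The combination of weighted resampling and logistic regression described above can be sketched in a few lines: resample expert responses according to weights, perturb them slightly, and fit a logistic curve of damage probability against depth. The weights, depths, and damage labels below are placeholders, not the questionnaire data, and scikit-learn's LogisticRegression stands in for whatever fitting routine the WMCLR code actually uses.

```python
# A minimal sketch of the two-step idea: enlarge a small expert dataset by
# weighted resampling ("Weighted Monte Carlo"), then fit a logistic curve of
# damage probability against floodwater depth. All numbers are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Expert responses: floodwater depth (m) and whether severe damage was expected
depth_expert = np.array([0.2, 0.5, 0.8, 1.0, 1.5, 2.0, 2.5])
severe_damage = np.array([0, 0, 0, 1, 1, 1, 1])
weights = np.array([1, 2, 2, 3, 3, 2, 1], dtype=float)   # expert confidence weights
weights /= weights.sum()

# Weighted Monte Carlo resampling with small perturbations to build a larger set
idx = rng.choice(len(depth_expert), size=500, p=weights)
depth_synth = depth_expert[idx] + rng.normal(0, 0.1, size=500)
damage_synth = severe_damage[idx]

model = LogisticRegression().fit(depth_synth.reshape(-1, 1), damage_synth)

# Probability-of-damage curve over a range of depths
grid = np.linspace(0, 3, 7).reshape(-1, 1)
print(model.predict_proba(grid)[:, 1].round(2))
```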

  1. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems : Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well

  2. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lim, J.M.; Ji, Y.Y.; Lee, H.; Park, J.H.; Jang, M.; Chung, K.H.; Kang, M.J.; Choi, G.S. [Korea Atomic Energy Research Institute (Korea, Republic of)

    2014-07-01

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment, and concern about radioactivity from these materials has therefore been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., ²³⁸U, ²³⁵U, ²³²Th, ²²⁶Ra, and ⁴⁰K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process consequently plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of ²³⁸U, ²³⁵U, ²³²Th, and ²²⁶Ra. In contrast to α-spectrometry, a measurement technique using ICP-MS allows the radioactivity in many samples to be measured in a short time with a high degree of accuracy and precision. In this study, a method was developed for the rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO₂ fusion and Fe co-precipitation. A magnetic

  3. Development of a cause analysis system for a CPCS trip by using the rule-base deduction method.

    Science.gov (United States)

    Park, Je-Yun; Koo, In-Soo; Sohn, Chang-Ho; Kim, Jung-Seon; Cho, Gi-Ho; Park, Hee-Seok

    2009-07-01

    A Core Protection Calculator System (CPCS) was developed by the Combustion Engineering Company to initiate a reactor trip under certain transients. The major function of the Core Protection Calculator System is to generate contact outputs for the Departure from Nucleate Boiling Ratio (DNBR) trip and the Local Power Density (LPD) trip. In a Core Protection Calculator System, however, the trip cause cannot be identified: only trip signals are transferred to the Plant Protection System (PPS) and only the trip status is displayed. It could take a considerable amount of time and effort for a plant operator to analyze the trip causes of a Core Protection Calculator System. Therefore, a Cause Analysis System for a Core Protection Calculator System (CASCPCS) has been developed using the rule-base deduction method to assist operators in a nuclear power plant. CASCPCS consists of three major parts. The inference engine controls the search of the knowledge base, executes the rules and tracks the inference process using a depth-first search method. The knowledge base consists of four major parts: rules, database constants, trip buffer variables and causes. The user interface is implemented using menu-driven and window display techniques. The advantages of CASCPCS are that it saves the time and effort needed to diagnose the trip causes of a Core Protection Calculator System, it increases a plant's availability and reliability, and it is easy to operate, requiring only cursor control.
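
    The rule-base deduction with depth-first search described above can be illustrated with a toy backward-chaining engine; the rules, symptom names, and trip causes below are invented placeholders and do not reflect the actual CASCPCS knowledge base.

```python
# A minimal sketch of a rule-based, depth-first (backward-chaining) cause
# search of the kind described above. The rules and trip symptoms are
# invented placeholders.
RULES = {
    "low_dnbr_trip": [{"high_power", "low_flow"}, {"high_inlet_temperature"}],
    "high_lpd_trip": [{"asymmetric_rod_position"}],
    "low_flow": [{"pump_speed_drop"}],
}

def explain(goal: str, facts: set[str], depth: int = 0) -> bool:
    """Depth-first search: a goal holds if it is a known fact or if all
    antecedents of at least one of its rules can themselves be explained."""
    if goal in facts:
        return True
    for antecedents in RULES.get(goal, []):
        if all(explain(a, facts, depth + 1) for a in antecedents):
            print("  " * depth + f"{goal} <- {sorted(antecedents)}")
            return True
    return False

observed = {"high_power", "pump_speed_drop"}
print("DNBR trip explained:", explain("low_dnbr_trip", observed))
```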

  4. Development of an evaluation method for seismic isolation systems of nuclear power facilities. Seismic design analysis methods for crossover piping system

    International Nuclear Information System (INIS)

    Tai, Koichi; Sasajima, Keisuke; Fukushima, Shunsuke; Takamura, Noriyuki; Onishi, Shigenobu

    2014-01-01

    This paper provides seismic design analysis methods suitable for a crossover piping system, which connects the seismically isolated building and a non-isolated building in a seismically isolated nuclear power plant. Through a numerical study focused on the main steam crossover piping system, seismic response spectrum analysis applying the ISM (Independent Support Motion) method with SRSS combination, or the CCFS (Cross-oscillator, Cross-Floor response Spectrum) method, has been found to be quite effective for the seismic design of a multiply supported crossover piping system. (author)

  5. Development and validation of a rapid chromatographic method for the analysis of flunarizine and its main production impurities

    Directory of Open Access Journals (Sweden)

    Niamh O’Connor

    2013-06-01

    Full Text Available A rapid selective method for the analysis of flunarizine and its associated impurities was developed and validated according to ICH guidelines. The separation was carried out using a Thermo Scientific Hypersil Gold C18 column (50 mm × 4.6 mm i.d., 1.9 μm particle size) with a gradient mobile phase of acetonitrile-ammonium acetate-tetrabutylammonium hydrogen sulfate buffer, at a flow rate of 1.8 mL/min and UV detection at 230 nm. Naturally aged samples were also tested to determine sample stability. A profile of sample and impurity breakdown was also presented. Keywords: Flunarizine, Sub-2 μm column, Active pharmaceutical ingredient, HPLC

  6. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...

  7. Development and application of a validated HPLC method for the analysis of dissolution samples of levothyroxine sodium drug products

    Science.gov (United States)

    Collier, J.W.; Shah, R.B.; Bryant, A.R.; Habib, M.J.; Khan, M.A.; Faustino, P.J.

    2011-01-01

    A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (l-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)–methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 µL and the column temperature was maintained at 28 °C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 0.08–0.8 µg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for l-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. PMID:20947276

  8. Development of speciation analysis methods for radionuclides. Connecting tests for ICP-MS with various liquid chromatography systems

    International Nuclear Information System (INIS)

    Takaku, Yuichi; Kakiuchi, Hideki; Ohtsuka, Yoshihito; Hisamatsu, Shunichi; Inaba, Jiro

    2002-01-01

    Analytical methods for the physicochemical speciation of radionuclides and related elements, such as lanthanides, actinides and transition metals, in environmental samples are being developed to elucidate the behavior of radioactive materials in the environment. Inductively coupled plasma mass spectrometry was combined with separation techniques such as size exclusion chromatography (SEC), ion exchange chromatography (IC) or capillary electrophoresis (CE). IC and CE are used to separate chemical species of small molecular size, while large organic molecule-metal complexes are analyzed by SEC and CE. In this paper, analytical results for rare earth elements (REE) are reported for fresh water samples using SEC-ICP-MS. Preliminary test results obtained by IC- and CE-ICP-MS are also given. Target substances in fresh water samples are separated by an ultrafiltration technique according to their molecular size. After filtration, samples with large molecules were introduced to SEC-ICP-MS. The SEC column was a ZORBAX GF-250 (Agilent). A Tris-HNO3 buffer solution (pH 7.3) was used as the mobile phase instead of an inorganic salt buffer in order to avoid clogging of the nebulizer torch in the ICP-MS system. In water samples collected at Sakyo Marsh (Higashidori-mura), three UV absorption peaks were found at molecular weights of 700, 100 and 20 kDa. REE were detected online by SEC-ICP-MS at 20 kDa. The results suggested that many REE are present as large organic molecule-metal complexes in this marsh. Preliminary connection tests of the IC and CE chromatography systems with the ICP-MS were also made. A stepwise analysis technique for all lanthanides (La to Lu) was developed using IC-ICP-MS. The IC column was an Excelpak ICS C-15 (Agilent). MIBA buffer solution was used as the mobile phase; in the analysis, the MIBA concentration was changed stepwise from 70 to 400 mM. The total analysis time was 1800 sec. Arsenic chemical species As(III) to As(V) were also separated by CE-ICP-MS. Detection limit of each As species was 10 pg g

  9. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one

  10. Social Phenomenological Analysis as a Research Method in Art Education: Developing an Empirical Model for Understanding Gallery Talks

    Science.gov (United States)

    Hofmann, Fabian

    2016-01-01

    Social phenomenological analysis is presented as a research method to study gallery talks or guided tours in art museums. The research method is based on the philosophical considerations of Edmund Husserl and sociological/social science concepts put forward by Max Weber and Alfred Schuetz. Its starting point is the everyday lifeworld; the…

  11. Development of a traceability analysis method based on case grammar for NPP requirement documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Seong, Poong Hyun; Kim, Man Cheol

    2004-01-01

    Software inspection is widely believed to be an effective method for software verification and validation (V and V). However, software inspection is labor-intensive and, since it uses little technology, it is viewed as unsuitable for a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The KAIST Nuclear I and C and Information Engineering Laboratory (NICIEL) has developed software management and inspection support tools, collectively named 'SIS-RT'. SIS-RT is designed to partially automate the software inspection processes. SIS-RT supports the analyses of traceability between a given set of specification documents. To make SIS-RT compatible with documents written in Korean, certain techniques in natural language processing have been studied. Among the techniques considered, case grammar is the most suitable for analyses of the Korean language. In this paper, we propose a methodology that uses a case grammar approach to analyze the traceability between documents written in Korean. A discussion regarding some examples of such an analysis will follow

  12. DEVELOPMENT OF METHOD OF QUALITATIVE ANALYSIS OF BIRD CHERRY FRUIT FOR INCLUSION IN THE MONOGRAPH OF STATE PHARMACOPOEIA OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Lenchyk L.V.

    2016-06-01

    Full Text Available Introduction. Bird cherry, Padus avium Mill., Rosaceae, is widespread in Ukraine, especially in forest and forest-steppe areas. Bird cherry fruits have long been used in medicine and are a valuable medicinal raw material. They are stated to possess astringent, anti-inflammatory and phytoncidal properties. Bird cherry fruits are included in the USSR Pharmacopoeia IX ed., the State Pharmacopoeia of the Russian Federation, and the State Pharmacopoeia of the Republic of Belarus. In Ukraine there are no contemporary normative documents for this medicinal plant material; it is therefore relevant to develop the draft national monographs "dry bird cherry fruit" and "fresh bird cherry fruit" for inclusion in the State Pharmacopoeia of Ukraine. According to the European Pharmacopoeia recommendation, the method of thin-layer chromatography (TLC) is prescribed only for the identification of the herbal drug. The principles of thin-layer chromatography and the application of the technique in pharmaceutical analysis are described in the State Pharmacopoeia of Ukraine. As it is effective and easy to perform, and the equipment required is inexpensive, the technique is frequently used for evaluating medicinal plant materials and their preparations. The TLC is aimed at elucidating the chromatogram of the drug with respect to selected reference compounds that are described for inclusion as reagents. The aim of this study was to develop methods of qualitative analysis of bird cherry fruits for a monograph in the State Pharmacopoeia of Ukraine (SPU). Materials and Methods. The objects of our study were dried bird cherry fruits (7 samples) and fresh bird cherry fruits (7 samples) harvested in 2013-2015 in the Kharkiv, Poltava, Luhansk, Sumy, Lviv and Mykolaiv regions and the city of Mariupol. Samples were registered in the department of the SPU State Enterprise "Pharmacopeia center". In accordance with the Ph. Eur. and SPU requirements, the "identification C" determination was performed by TLC. TLC was performed on

  13. THE DEVELOPMENT OF METHOD FOR MINT AND TURMERIC ESSENTIAL OILS IDENTIFICATION AND QUANTITATIVE ANALYSIS IN COMPLEX DRUG

    Directory of Open Access Journals (Sweden)

    O. G. Smalyuh

    2015-04-01

    Full Text Available The aim of our study was to develop a method for the identification and assay of essential oils of mint and turmeric in a complex medicinal product in capsule form. Materials and methods. The study used samples of turmeric and mint essential oils and of the complex drug, in the form of capsules containing oil of peppermint, oil of Curcuma longa, and a mixture of extracts of sandy everlasting (Helichrysum arenarium (L.) Moench), marigold (Calendula officinalis L.), wild carrot (Daucus carota) and turmeric (Curcuma longa). Results and discussion. The composition of the complex drug is a dry extract of sandy everlasting flowers, a wild carrot extract, a thick extract of marigold flowers and fruits, a dry extract of Curcuma longa, and essential oils of peppermint and turmeric. Based on research on different samples of peppermint oil, and given the need for its identification and quantification in the finished medicinal product, we decided to choose menthol as the analytical marker. In order to establish the identity of the complex drug, its main components, ar-, α- and β-turmerone, and their total content must meet the quantitative indicator "content of turmerones" in the specifications for turmeric oil. Studies of sample preparation conditions allowed us to propose 96% ethanol for extracting the oil components from the sample, and ultrasonication and centrifugation for improving recovery from the capsule mass. Chromatographic characteristics of the substances were obtained on an Agilent HP-Innowax column. It was established that the other active pharmaceutical ingredients of the capsule (placebo) did not affect the quantification of the components of the essential oils of mint and turmeric. Conclusions. 1. Chromatographic conditions for the identification and assay of the essential oils of mint and turmeric in the complex drug, and optimal conditions for sample preparation and analysis by gas chromatography, have been studied. 2. Methods for the identification and assay of menthol and α-, β- and ar-turmerone in the complex drug based on

  14. Development of the finite element method in the thermal field. TRIO-EF software for thermal and radiation analysis

    International Nuclear Information System (INIS)

    Casalotti, N.; Magnaud, J.P.

    1989-01-01

    The possibilities of the TRIO-EF software in the thermal field are presented. The TRIO-EF is a computer program based on the finite element method and used for three-dimensional incompressible flow analysis. It enables the calculation of three-dimensional heat transfer and the fluid/structure analysis. The geometrically complex radiative reactor systems are taken into account in the form factor calculation. The implemented algorithms are described [fr

  15. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    Science.gov (United States)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing the SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, which has been experimentally verified. It was found that the frequency response of the two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained utilizing the current density of the electrodes via FEM simulations. The two SAW structures were based on XY Lithium Niobate (LiNbO3) substrates and had two and four electrode finger pairs in both of their interdigital transducers, respectively. Later, SAW devices were fabricated in accordance with the simulated models and their measured frequency responses were found to correlate well with the obtained simulations results. The results indicated that better match between calculated and measured frequency response can be obtained when one of the input electrode finger pairs was set at zero volts and all the current density components were taken into account when calculating the frequency response of the simulated SAW device structures.

  16. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  17. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and the frequency time laws of a certain kind of geophysical signals is described here. This method is based upon the analytical signal idea of Gabor and Ville, which is constructed either in the time domain by adding an imaginary part to the real signal (in-quadrature signal), or in the frequency domain by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytical signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined leads then to the examination of the principle of the method and to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the analysis system properties than the usual representation, and which moreover has the advantage of being obtainable practically in real time [fr
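
    As an illustration of the analytic-signal idea described above, the following sketch (not taken from the original paper; it uses a synthetic chirp and SciPy's hilbert routine) computes the envelope as the modulus of the analytic signal and the instantaneous frequency as the time derivative of its phase.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Synthetic test signal: a chirp whose instantaneous frequency sweeps from 5 Hz to 15 Hz.
    fs = 1000.0                          # sampling frequency in Hz
    t = np.arange(0, 2.0, 1.0 / fs)      # 2 s of samples
    signal = np.cos(2 * np.pi * (5.0 * t + 2.5 * t**2))

    # Analytic signal: real part is the original signal, imaginary part its Hilbert transform.
    analytic = hilbert(signal)

    # Envelope (amplitude) = modulus of the analytic signal.
    envelope = np.abs(analytic)

    # Instantaneous frequency = time derivative of the unwrapped phase / (2*pi).
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2.0 * np.pi) * fs

    print(envelope[:5])
    print(inst_freq[1000])               # frequency near t = 1 s, expected ~10 Hz
    ```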

  18. A Contribution To The Development And Analysis Of A Combined Current-Voltage Instrument Transformer By Using Modern CAD Methods

    International Nuclear Information System (INIS)

    Chundeva-Blajer, Marija M.

    2004-01-01

    The principal aim and task of the thesis is the analysis and development of a 20 kV combined current-voltage instrument transformer (CCVIT) by using modern CAD techniques. The CCVIT is a complex electromagnetic system comprising four windings and two magnetic cores in one insulation housing for the simultaneous transformation of high voltages and currents to signal values measurable by standard instruments. Analytical design methods can be applied to simple electromagnetic configurations, which is not the case with the CCVIT, since there is mutual electromagnetic influence between the voltage measurement core (VMC) and the current measurement core (CMC). After the analytical CCVIT design was completed, exact determination of its metrological characteristics was accomplished using the numerical finite element method implemented in the FEM-3D program package. The FEM-3D calculation is made in 19 cross-sectional layers along the z-axis of the three-dimensional CCVIT domain. By FEM-3D application the three-dimensional CCVIT magnetic field distribution is derived. This is the basis for calculation of the initial metrological characteristics of the CCVIT (VMC accuracy class 3 and CMC accuracy class 1). By using a stochastic optimization technique based on a genetic algorithm, the optimal CCVIT design is achieved. The objective function is the minimum of the metrological parameters (VMC voltage error and CMC current error). There are 11 independent input variables in the optimization process, from which the optimal project is derived. The optimal project is then adapted for the realization of a prototype, yielding the optimized project. A full comparative analysis of the metrological and electromagnetic characteristics of the three projects is accomplished. By application of the program package MATLAB/SIMULINK, the CCVIT transient phenomena are analyzed for different regimes in the three design projects. In the Instrument Transformer Factory of EMO A. D.-Ohrid a CCVIT

  19. Development of a physiologically relevant dripping analytical method using simulated nasal mucus for nasal spray formulation analysis

    Directory of Open Access Journals (Sweden)

    Tina Masiuk

    2016-10-01

    Full Text Available Current methods for nasal spray formulations have been elementary, evaluating only the dripping characteristics of a formulation without assessing the behavior of the formulation in the presence of varying types of mucus depending on the indication or diseased state. This research investigated the effects of nasal mucus on the dripping behavior of nasal formulations and focused on developing an improved in vitro analytical test method that is more physiologically relevant in characterizing nasal formulation dripping behavior. Method development was performed using simulated nasal mucus preparations for both healthy and diseased states, representing a wide range of viscosity, as coatings for the dripping experiment. Factors evaluated during development of this in vitro test method included the amount of mucus, the application of mucus, drying times, and the compatibility of the mucus with a C18 Thin Layer Chromatography (TLC) substrate. The dripping behavior of nasal formulations containing 1% to 3.5% Avicel was assessed by actuating the nasal spray onto a perpendicular TLC plate coated with either healthy or diseased simulated nasal mucus. After actuation of the nasal spray, the dripping of the formulation on the coated TLC plate was measured after the plate was repositioned vertically. The method that was developed generated reproducible results on the dripping behavior of nasal formulations and provided critical information about the compatibility of the formulation with the nasal mucus for different diseased states, aiding in nasal spray formulation development and physical characterization of the nasal spray.

  20. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Opp, Daniel; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2014-06-15

    Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion, previously demonstrated for VMAT, is studied for static gantry IMRT. The method's accuracy is improved for the voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region-of-interest is propagated according to the predefined motion kernel throughout time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference in dose-to-moving voxel vs dose-to-static voxel in-phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low dose voxels into high dose regions, causing appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameters optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with the gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period
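
    A minimal NumPy sketch of the perturbation idea described above (illustrative only, not the authors' code): the dose accumulated by a voxel moving through the time-resolved phantom dose grids is compared with the dose to the same voxel held static, and the difference is applied as a correction to the planned dose of the corresponding patient voxel. The array names and the simple integer-shift motion kernel are assumptions made for this sketch.

    ```python
    import numpy as np

    def motion_perturbation(dose_4d, static_index, shifts, planned_patient_dose):
        """dose_4d: (T, nx, ny, nz) time-resolved phantom dose grids (one per phase).
        static_index: (3,) voxel index in the static phantom.
        shifts: (T, 3) integer voxel displacements of the motion kernel per phase.
        planned_patient_dose: planned dose of the corresponding patient voxel."""
        # Dose to the voxel if it stayed static: sum over all phases at the same index.
        dose_static = dose_4d[(slice(None),) + tuple(static_index)].sum()

        # Dose collected by the moving voxel: sample each phase at the displaced index.
        idx = np.asarray(static_index) + np.asarray(shifts)          # (T, 3)
        dose_moving = sum(dose_4d[t, i, j, k] for t, (i, j, k) in enumerate(idx))

        # Perturbation correction applied to the planned patient dose.
        return planned_patient_dose + (dose_moving - dose_static)

    # Toy example: 4 phases of a 10x10x10 phantom dose grid.
    rng = np.random.default_rng(0)
    dose_4d = rng.random((4, 10, 10, 10))
    print(motion_perturbation(dose_4d, (5, 5, 5),
                              [(0, 0, 0), (1, 0, 0), (2, 0, 0), (1, 0, 0)], 2.0))
    ```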

  1. Development and interlaboratory validation of quantitative polymerase chain reaction method for screening analysis of genetically modified soybeans.

    Science.gov (United States)

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2013-01-01

    A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We therefore designed a screening method combining the quantification of P35S with the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMOs.
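
    The conversion-factor step can be illustrated with a few lines of arithmetic. This is a sketch under the assumption, stated here rather than taken from the abstract, that Cf is defined as the GM-specific/taxon-specific copy-number ratio measured in a 100% GM reference, so that dividing a sample's copy-number ratio by Cf gives an approximate weight-based GMO percentage; the numbers are hypothetical.

    ```python
    def gmo_weight_percent(gm_copies, taxon_copies, cf):
        """Convert a measured copy-number ratio to an approximate GMO weight percentage.
        cf is assumed to be the copy-number ratio observed in a 100% GM reference."""
        copy_ratio = gm_copies / taxon_copies
        return copy_ratio / cf * 100.0

    # Hypothetical example: 1.2e3 event-specific copies, 2.4e4 taxon copies, Cf = 0.40.
    print(gmo_weight_percent(1.2e3, 2.4e4, 0.40))   # -> 12.5 (% w/w, approximate)
    ```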

  2. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on the proposed equality principle, which puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concepts and roles of social capital, equity, and doing gender in gender analysis. It is based on the perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory for resolving the gender conflict by using the concept of social and psychological capital.

  3. Development of techniques using DNA analysis method for detection/analysis of radiation-induced mutation. Development of an useful probe/primer and improvement of detection efficacy

    International Nuclear Information System (INIS)

    Maekawa, Hideaki; Tsuchida, Kozo; Hashido, Kazuo; Takada, Naoko; Kameoka, Yosuke; Hirata, Makoto

    1999-01-01

    Previously, it was demonstrated that detection of centromeres became easy and reliable through fluorescent staining by the FISH method using a probe of the sequence preserved in α-satellite DNA. However, this approach was found inappropriate for detecting dicentrics because of differences in the relative amount of DNA probe on each chromosome. A probe which allows homogeneous detection of α-satellite DNA for each chromosome was therefore constructed. A presumed kinetochore-specific sequence, the CENP-B box, was amplified by PCR and the product DNA was used as a probe. However, the variation in the amount of probe DNA among chromosomes was decreased by only about 20%. Next, a program for image processing of the results obtained from FISH using α-satellite DNA as a centromere marker was constructed. When compared with detection of abnormal chromosomes stained by the conventional method, the efficiency of centromere detection alone was improved by the use of this program; the calculation to discriminate normal from abnormal chromosomes remained complicated, however, and the detection efficiency was little improved. Chromosomal abnormalities in lymphocytes were used to detect the effects of radiation. This method requires cells to be brought into metaphase, and radiation-induced mutations might often be repaired during this shift. To exclude this possibility, DNA extraction was conducted at a low temperature immediately after exposure to 137Cs, and a rapid genome detection method was established using the genomic DNA. As model genomes, the following three were used: 1) long repeated sequences widely dispersed over the chromosome, 2) cluster genes, 3) single copy genes. The effects of radiation were detectable at 1-2 Gy for the long repeated sequences and at 7 Gy for the cluster genes, respectively, whereas no significant effects were observed at any dose tested for the single copy genes. Amplification was marked in cells exposed at 1-10 Gy (peak at 4 Gy), suggesting that these regions had

  4. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons
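
    To make the SLM/TSC architecture concrete, here is a minimal, purely illustrative Python sketch (none of these class or method names come from the CAA project) of a task sequence controller stepping a sample through the module chain described above.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Sample:
        sample_id: str
        data: dict = field(default_factory=dict)

    class StandardLabModule:
        """Stand-in for a hardware or software SLM (names are illustrative only)."""
        name = "generic SLM"
        def process(self, sample: Sample) -> Sample:
            sample.data[self.name] = "done"   # record that this protocol step ran
            return sample

    class SoxhletExtractor(StandardLabModule):  name = "soxhlet_extraction"
    class Concentrator(StandardLabModule):      name = "concentration"
    class ColumnCleanup(StandardLabModule):     name = "column_cleanup"
    class GasChromatograph(StandardLabModule):  name = "gc_measurement"
    class DataInterpreter(StandardLabModule):   name = "data_interpretation"

    class TaskSequenceController:
        """Schedules the modules and moves each sample through the protocol in order."""
        def __init__(self, modules):
            self.modules = modules
        def run(self, sample: Sample) -> Sample:
            for module in self.modules:          # robot transport between modules is implied
                sample = module.process(sample)
            return sample

    tsc = TaskSequenceController([SoxhletExtractor(), Concentrator(), ColumnCleanup(),
                                  GasChromatograph(), DataInterpreter()])
    print(tsc.run(Sample("soil-001")).data)
    ```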

  5. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd, in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The levels of heavy metals in the soils were greatly affected by soil formation, atmospheric deposition, and human activities. These findings provide essential information on the possible sources of heavy metals, which will contribute to the monitoring and assessment of agricultural soils in regions worldwide.
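
    For readers unfamiliar with this kind of source-apportionment workflow, a compact sketch of the three multivariate steps (PCA, hierarchical clustering, and a correlation matrix) using scikit-learn and SciPy is shown below; the small random data matrix merely stands in for measured metal concentrations and is not related to the study's data.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    metals = ["Cu", "Zn", "Ni", "Pb", "Cr", "Cd"]
    rng = np.random.default_rng(1)
    X = rng.lognormal(mean=1.0, sigma=0.5, size=(60, len(metals)))   # 60 synthetic soil samples

    Xs = StandardScaler().fit_transform(X)            # standardize before PCA/clustering

    pca = PCA(n_components=2).fit(Xs)
    print("explained variance ratio:", pca.explained_variance_ratio_)
    print("loadings (components x metals):\n", pca.components_)

    # Hierarchical clustering of the metals (variables), as is common in source studies.
    Z = linkage(Xs.T, method="ward")
    print("metal clusters:", dict(zip(metals, fcluster(Z, t=2, criterion="maxclust"))))

    # Pearson correlation matrix between metals.
    print("correlation matrix:\n", np.corrcoef(Xs, rowvar=False).round(2))
    ```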

  6. Methods Of Complex Analysis And Conformity Assessment Of Formation Of Regional Retail Networks To The Principles Of Sustainable Development

    OpenAIRE

    Zoryana Gerasymchuk; Victor Korsak

    2014-01-01

    The article proposes a methodical approach to assessing the conformity of the formation of regional retail sales networks (RRSN) to the principles of sustainable development, which includes: selection of goals, objectives, directions and a scorecard for the assessment of the territorial retail network; calculation of integral indices of RRSN and of socio-economic and environmental security in the region; calculation of the integral index of sustainability of RRSN; and assessment of the balance or imbalance of RRSN development and regional ...

  7. Systematic errors in detecting biased agonism: Analysis of current methods and development of a new model-free approach.

    Science.gov (United States)

    Onaran, H Ongun; Ambrosio, Caterina; Uğur, Özlem; Madaras Koncz, Erzsebet; Grò, Maria Cristina; Vezzi, Vanessa; Rajagopal, Sudarshan; Costa, Tommaso

    2017-03-14

    Discovering biased agonists requires a method that can reliably distinguish the bias in signalling due to unbalanced activation of diverse transduction proteins from that of differential amplification inherent to the system being studied, which invariably results from the non-linear nature of biological signalling networks and their measurement. We have systematically compared the performance of seven methods of bias diagnostics, all of which are based on the analysis of concentration-response curves of ligands according to classical receptor theory. We computed bias factors for a number of β-adrenergic agonists by comparing BRET assays of receptor-transducer interactions with Gs, Gi and arrestin. Using the same ligands, we also compared responses at signalling steps originating from the same receptor-transducer interaction, among which no biased efficacy is theoretically possible. In either case, we found a high level of false positive results and a general lack of correlation among methods. Altogether this analysis shows that all tested methods, including some of the most widely used in the literature, fail to distinguish true ligand bias from "system bias" with confidence. We also propose two novel semi-quantitative methods of bias diagnostics that appear to be more robust and reliable than currently available strategies.

  8. Method development for liquid chromatographic/triple quadrupole mass spectrometric analysis of trace level perfluorocarboxylic acids in articles of commerce

    Science.gov (United States)

    An analytical method to identify and quantify trace levels of C5 to C12 perfluorocarboxylic acids (PFCAs) in articles of commerce (AOC) is developed and rigorously validated. Solid samples were extracted in methanol, and liquid samples were diluted with a solvent consisting of 60...

  9. Analysis of combined conduction and radiation heat transfer in presence of participating medium by the development of hybrid method

    International Nuclear Information System (INIS)

    Mahapatra, S.K.; Dandapat, B.K.; Sarkar, A.

    2006-01-01

    The current study addresses the mathematical modeling of coupled conductive and radiative heat transfer in the presence of an absorbing, emitting and isotropically scattering gray medium within a two-dimensional square enclosure. A blended method, in which the concept of the modified differential approximation is employed by combining the discrete ordinates method and the spherical harmonics method, has been developed for modeling the radiative transport equation. The gray participating medium is bounded by the isothermal walls of the two-dimensional enclosure, which are considered to be opaque, diffuse and gray. The effect of various influencing parameters, i.e., the radiation-conduction parameter, surface emissivity, single scattering albedo and optical thickness, has been illustrated. The adaptability of the present method has also been addressed.

  10. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or to examine the full gradient of genes that cause an effect and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, untreated control and treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems level analysis is sought. In summary, our normalization approaches and experimental design to perform validation experiments are likely to apply to those high-throughput screening systems attempting to identify genes for systems level analysis. PMID:18753689
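
    As a rough illustration of two of the normalization steps compared above (not the authors' implementation), the sketch below applies plate-wise background subtraction followed by quantile normalization to a synthetic 384-well read-out matrix; the background-well layout and all numbers are assumptions.

    ```python
    import numpy as np

    def background_subtract(plate, background_rows=(0,)):
        """Subtract the median of designated background wells (here: whole rows) from a plate."""
        background = np.median(plate[list(background_rows), :])
        return plate - background

    def quantile_normalize(plates):
        """Quantile-normalize a (n_plates, n_wells) matrix so every plate shares one distribution."""
        order = np.argsort(plates, axis=1)                 # sort order of wells within each plate
        ranks = np.argsort(order, axis=1)                  # rank of every well within its plate
        sorted_vals = np.sort(plates, axis=1)
        reference = sorted_vals.mean(axis=0)               # mean of each quantile across plates
        return reference[ranks]                            # map every well to its quantile mean

    rng = np.random.default_rng(2)
    plates = rng.normal(loc=[[100.0], [150.0]], scale=20.0, size=(2, 384))   # two 384-well plates
    plates = np.vstack([background_subtract(p.reshape(16, 24)).ravel() for p in plates])
    normalized = quantile_normalize(plates)
    print(normalized.mean(axis=1))   # plate means coincide after normalization
    ```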

  11. Development of radiochemical method of analysis of binding of tritium labeled drotaverine hydrochloride with human blood serum albumin

    International Nuclear Information System (INIS)

    Kim, A.A.; Djuraeva, G.T.; Shukurov, B.V.; Mavlyanov, I.R.

    2004-01-01

    Full text: Albumin, which binds numerous endogenous and exogenous substances, is the most important protein of blood plasma. In diseases connected with liver dysfunction, metabolites accumulated in the blood reduce the binding ability of albumin. The aim of the present research was the development of a radiochemical method for determining the ability of albumin to bind the tritium-labeled preparation drotaverine hydrochloride (No-Spa). We developed a micromethod for determining the binding ability of albumin that allows the analysis of 20 µl of blood serum. The method consists of incubation of tritium-labeled drotaverine hydrochloride with blood serum in vitro, subsequent fractionation of serum proteins by gel filtration on a Sephadex G-25 microcolumn, and direct measurement of the radioactivity bound to the serum protein fraction. The method was tested on a series of blood sera from a control group of healthy people and on a series of blood sera from patients with hepatitis B. We obtained quantitative characteristics of the binding of drotaverine hydrochloride to the albumin of patients with hepatitis B. It was preliminarily established that the binding ability of serum albumin of children with various forms of acute viral hepatitis tends to decrease in comparison with the control group. The advantages of the developed radiochemical method are its high precision and high sensitivity in detecting impairment of the binding ability of albumin. The application of tritium-labeled drotaverine hydrochloride allows direct measurement of the level of binding of the preparation to albumin

  12. Comparison of infrared spectroscopy techniques: developing an efficient method for high resolution analysis of sediment properties from long records

    Science.gov (United States)

    Hahn, Annette; Rosén, Peter; Kliem, Pierre; Ohlendorf, Christian; Persson, Per; Zolitschka, Bernd; Pasado Science Team

    2010-05-01

    the sample is necessary. This could not be accomplished; therefore, absorbance at higher wavelengths was not recorded correctly. As a result of the poor spectral quality, no calibration model was established for BSi using the Equinox device. Since this is by far the most time-consuming and elaborate conventional measurement, the results give clear advantages for the Alpha device. Further calibration models were developed using spectra from the Visible Near Infrared Spectroscopy (VNIRS) region (400-2500 nm). Sample preparation for VNIRS analysis is also faster than for DRIFTS. However, FTIRS calibrations seem to perform better than those for VNIRS, which show an R of 0.75 (BSi), 0.93 (TOC), 0.93 (TN), and 0.89 (TIC). NIRS primarily measures overtones of molecular vibrations and is typically used for quantitative measurement of organic functional groups. FTIRS is similar to NIRS, but uses longer wavelengths and directly monitors molecular vibrations. As a consequence, FTIRS allows more detailed structural and compositional analyses of both organic and inorganic compounds. Statistical analysis of the FTIRS-PLS models shows that the calibration depends on specific wave numbers, which compare well with spectra of pure compounds. The VNIRS technique gives rise to a spectrum with broad peaks and many overlapping signals, which makes interpretation difficult without statistical analyses. In conclusion, the DRIFTS technique shows the best statistical performance for the analysis of biogeochemical properties. However, the VNIRS techniques and especially the ATR-FTIRS Alpha device show comparable results and can also be used as a rapid screening tool when time and costs are limiting factors. Kellner R, Mermet J-M, Otto M, Widmer HM (1998) Analytical chemistry. Wiley-VCH, Weinheim, etc. Rosén P, Vogel H, Cunnigham L, Reuss N, Conley DJ, Persson P (2009) Fourier transform infrared spectroscopy, a new method for rapid determination of total organic and inorganic carbon and biogenic silica
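
    The FTIRS/VNIRS calibrations referred to above are partial least squares (PLS) regressions of spectra against conventionally measured sediment properties. A generic, hedged sketch with scikit-learn is shown below; the spectra and the "TOC" response are synthetic stand-ins, not the study's data, and the reported R is only analogous to the correlation values quoted in the abstract.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(3)
    n_samples, n_wavenumbers = 80, 400
    spectra = rng.normal(size=(n_samples, n_wavenumbers))            # stand-in FTIRS spectra
    toc = spectra[:, 50] * 2.0 + spectra[:, 200] + rng.normal(scale=0.3, size=n_samples)

    pls = PLSRegression(n_components=5)
    predicted = cross_val_predict(pls, spectra, toc, cv=10).ravel()

    # Cross-validated correlation between predicted and "measured" property.
    r = np.corrcoef(predicted, toc)[0, 1]
    print(f"cross-validated R = {r:.2f}")
    ```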

  13. Development and validation of an alternative to conventional pretreatment methods for residue analysis of butachlor in water, soil, and rice.

    Science.gov (United States)

    Xue, Jiaying; Jiang, Wenqing; Liu, Fengmao; Zhao, Huiyu; Wang, Suli; Peng, Wei

    2014-01-01

    A rapid and effective alternative analytical method for residues of butachlor in water, soil, and rice was established. The operating variables affecting performance of this method, including different extraction conditions and cleanup adsorbents, were evaluated. The determination of butachlor residues in soil, straw, rice hull, and husked rice was performed using GC/MS after extraction with n-hexane and cleanup with graphite carbon black. The average recoveries ranged from 81.5 to 102.7%, with RSDs of 0.6-7.7% for all of the matrixes investigated. The limits of quantitation were 0.05 mg/kg in water and rice plant, and 0.01 mg/kg in soil, straw, rice hull, and husked rice. A comparison among this proposed method, conventional liquid-liquid extraction, the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method, and Soxhlet extraction indicated that this method was more suitable for analyzing butachlor in rice samples. Further validation of the proposed method was carried out by Soxhlet extraction for the determination of butachlor residues in the husked rice samples, and the residue results showed no obvious difference between the two methods. Samples from a rice field were found to contain butachlor residues below the maximum residue limits set by China (0.5 mg/kg) and Japan (0.1 mg/kg). The proposed method has a strong potential for application in routine screening and processing of large numbers of samples. This study developed a more effective alternative to the conventional analytical methods for analyzing butachlor residues in various matrixes.

  14. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids

    DEFF Research Database (Denmark)

    Lue, Bena-Marie; Guo, Zheng; Xu, X.B.

    2008-01-01

    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with ELSD were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine-acetate) was found ...... developed was successfully applied for primary screening of RTILs (> 20), with in depth evaluation of substrates in 10 RTILs, for their evaluation as reaction media....

  15. Development of a Direct Headspace Collection Method from Arabidopsis Seedlings Using HS-SPME-GC-TOF-MS Analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Saito

    2013-04-01

    Full Text Available Plants produce various volatile organic compounds (VOCs), which are thought to be a crucial factor in their interactions with harmful insects, plants and animals. The composition of VOCs may differ when plants are grown under different nutrient conditions, i.e., macronutrient-deficient conditions. However, in plants, the relationships between macronutrient assimilation and VOC composition remain unclear. In order to identify the kinds of VOCs that can be emitted when plants are grown under various environmental conditions, we established a convenient method for VOC profiling in Arabidopsis thaliana (Arabidopsis) involving headspace-solid-phase microextraction-gas chromatography-time-of-flight-mass spectrometry (HS-SPME-GC-TOF-MS). We grew Arabidopsis seedlings in an HS vial to directly perform HS analysis. To maximize the analytical performance for VOCs, we optimized the extraction method and the analytical conditions of HS-SPME-GC-TOF-MS. Using the optimized method, we conducted VOC profiling of Arabidopsis seedlings grown under two different nutrition conditions, nutrition-rich and nutrition-deficient. The VOC profiles clearly showed a distinct pattern with respect to each condition. This study suggests that HS-SPME-GC-TOF-MS analysis has immense potential to detect changes in the levels of VOCs not only in Arabidopsis, but also in other plants grown under various environmental conditions.

  16. Method Development for the Analysis of Pharmaceuticals with Acetylcholinesterase Activity in Water Using HPLC-DAD and Solid Phase Extraction

    Directory of Open Access Journals (Sweden)

    Samuel Budi Wardhana

    2014-03-01

    Full Text Available An SPE followed by HPLC-DAD method with an ion-pair chromatography technique was developed to analyze pharmaceuticals with acetylcholinesterase activity, including pyridostigmine (PYR), galantamine (GAL), neostigmine (NEO), eserine (ESE), and donepezil (DON), in water samples. Acetylcholinesterase (AChE) inhibitors have been used to treat less severe dementias such as Alzheimer’s disease. Chromatographic separation was achieved on a reversed-phase SymmetryShield column using a gradient system with a mobile phase consisting of H2O/ACN (99:1, v/v) as mobile phase A with 10 mM sodium 1-hexanesulfonate and 0.1% acetic acid (HAc). The HPLC/DAD method was linear between concentrations of 5 to 100 ng/μL. The IDL and IQL ranged from 0.50 to 1.25 ng/μL and 1.5 to 3.0 ng/μL, respectively. SPE was used to extract and clean up the target substances in spiked pure water, tap water, and wastewater samples. The extraction of the 5 target substances from wastewater samples was divided into 2 parts: Oasis WCX (6 mL, 500 mg) for PYR and Oasis HLB (6 mL, 200 mg) for GAL, NEO, ESE and DON. The developed SPE and HPLC/DAD method is applicable for quantification of the 5 target substances in water samples in a concentration range > 50 µg/L, and presumably lower for DON (> 25 µg/L).

  17. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward

    2011-01-01

    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable.

  18. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ward, A.M.; Collins, B.S.; Xu, Y.; Downar, Th.J.; Madariaga, M.

    2011-01-01

    In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the Gen PMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable

  19. Analysis of Scientific and Methodical Approaches to Portfolio Investment as a Tool of Financial Provision of Sustainable Economic Development

    Directory of Open Access Journals (Sweden)

    Leus Daryna V.

    2013-12-01

    Full Text Available The article analyses scientific and methodical approaches to portfolio investment. It develops recommendations on refining the categorical apparatus of portfolio investment in the context of differentiating strategic (direct) and portfolio investments as alternative approaches to investment activity. It identifies the composition and functions of the objects and subjects of portfolio investment under conditions of globalisation of the world financial markets. It examines the main postulates of portfolio theory and justifies the need to identify the place, role and functions of the subjects of portfolio investment within it in order to ensure sustainable development of the economy. As one way of further developing portfolio theories, it proposes specifying a separate direction in the financial provision of the economy that takes account of ecological and social components: socially responsible investment.

  20. Methods of nonlinear analysis

    CERN Document Server

    Bellman, Richard Ernest

    1970-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  1. Electroporation-based methods for in vivo, whole mount and primary culture analysis of zebrafish brain development

    Directory of Open Access Journals (Sweden)

    Jesuthasan Suresh

    2007-03-01

    Full Text Available Abstract Background Electroporation is a technique for the introduction of nucleic acids and other macromolecules into cells. In chick embryos it has been a particularly powerful technique for the spatial and temporal control of gene expression in developmental studies. Electroporation methods have also been reported for Xenopus, zebrafish, and mouse. Results We present a new protocol for zebrafish brain electroporation. Using a simple set-up with fixed spaced electrodes and microinjection equipment, it is possible to electroporate 50 to 100 embryos in 1 hour with no lethality and consistently high levels of transgene expression in numerous cells. Transfected cells in the zebrafish brain are amenable to in vivo time lapse imaging. Explants containing transfected neurons can be cultured for in vitro analysis. We also present a simple enzymatic method to isolate whole brains from fixed zebrafish for immunocytochemistry. Conclusion Building on previously described methods, we have optimized several parameters to allow for highly efficient unilateral or bilateral transgenesis of a large number of cells in the zebrafish brain. This method is simple and provides consistently high levels of transgenesis for large numbers of embryos.

  2. Electroporation-based methods for in vivo, whole mount and primary culture analysis of zebrafish brain development.

    Science.gov (United States)

    Hendricks, Michael; Jesuthasan, Suresh

    2007-03-15

    Electroporation is a technique for the introduction of nucleic acids and other macromolecules into cells. In chick embryos it has been a particularly powerful technique for the spatial and temporal control of gene expression in developmental studies. Electroporation methods have also been reported for Xenopus, zebrafish, and mouse. We present a new protocol for zebrafish brain electroporation. Using a simple set-up with fixed spaced electrodes and microinjection equipment, it is possible to electroporate 50 to 100 embryos in 1 hour with no lethality and consistently high levels of transgene expression in numerous cells. Transfected cells in the zebrafish brain are amenable to in vivo time lapse imaging. Explants containing transfected neurons can be cultured for in vitro analysis. We also present a simple enzymatic method to isolate whole brains from fixed zebrafish for immunocytochemistry. Building on previously described methods, we have optimized several parameters to allow for highly efficient unilateral or bilateral transgenesis of a large number of cells in the zebrafish brain. This method is simple and provides consistently high levels of transgenesis for large numbers of embryos.

  3. Development of CAPP code based on the finite element method for the analysis of VHTR cores - HTR2008-58169

    International Nuclear Information System (INIS)

    Lee, H. C.; Jo, C. K.; Noh, J. M.

    2008-01-01

    In this study, we developed a neutron diffusion equation solver based on the finite element method for the CAPP code. Three types of triangular finite elements and five types of rectangular finite elements, depending on the order of the shape functions, were implemented for 2-D applications. Ten types of triangular prismatic finite elements and seventeen types of rectangular prismatic finite elements were also implemented for 3-D applications. Two types of polynomial mapping from the master finite element to a real finite element were adopted for flexibility in dealing with complex geometry: linear mapping and iso-parametric mapping. In linear mapping, only the vertex nodes are used as the mapping points. In iso-parametric mapping, all the nodal points in the finite element are used as the mapping points, which enables the real finite elements to have curved surfaces. For the treatment of the spatial dependency of cross-sections within the finite elements, three types of polynomial expansion of the cross-sections were implemented: constant, linear, and iso-parametric cross-section expansions. The power method with the Wielandt acceleration technique was adopted as the outer iteration algorithm. The BiCGSTAB algorithm with an ILU (incomplete LU) decomposition pre-conditioner was used as the linear equation solver in the inner iteration. The neutron diffusion equation solver developed in this study was verified against two well-known benchmark problems, the IAEA PWR benchmark problem and the OECD/NEA PBMR400 benchmark problem. Results of the numerical tests showed that the solution converged to the reference solution as the finite elements were refined and as the order of the finite elements increased. The numerical tests also showed that the higher order finite element method is much more efficient than the lower order finite element method or the finite difference method. (authors)
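
    The outer/inner iteration scheme mentioned above can be illustrated on a generic one-group eigenvalue problem M φ = (1/k) F φ. The sketch below is purely illustrative (it is not CAPP code and uses a toy 1-D finite-difference diffusion operator rather than finite elements): a Wielandt-shifted power iteration is the outer loop, and SciPy's BiCGSTAB with an ILU preconditioner solves the shifted linear system in the inner loop.

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spilu, LinearOperator, bicgstab

    # Toy 1-D one-group diffusion problem: M * phi = (1/k) * F * phi (illustrative parameters).
    n, length = 50, 10.0
    h = length / (n + 1)
    D, sigma_a, nu_sigma_f = 1.0, 0.1, 0.12
    lap = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
    M = (D * lap + sigma_a * sp.eye(n)).tocsc()           # loss operator
    F = (nu_sigma_f * sp.eye(n)).tocsc()                  # fission operator

    k, k_shift = 1.0, 1.5                                 # initial guess and fixed Wielandt shift
    A_shift = (M - F / k_shift).tocsc()                   # shifted operator for the outer iteration
    ilu = spilu(A_shift)                                  # ILU preconditioner for the inner solves
    precond = LinearOperator(A_shift.shape, matvec=ilu.solve)

    phi = np.ones(n)
    for outer in range(50):
        fission_old = F @ phi
        rhs = (1.0 / k - 1.0 / k_shift) * fission_old
        phi_new, info = bicgstab(A_shift, rhs, M=precond)           # inner iteration (default tolerances)
        k = 1.0 / (1.0 / k_shift + (1.0 / k - 1.0 / k_shift)
                   * fission_old.sum() / (F @ phi_new).sum())       # shifted eigenvalue update
        if np.linalg.norm(phi_new / np.linalg.norm(phi_new)
                          - phi / np.linalg.norm(phi)) < 1e-8:
            phi = phi_new
            break
        phi = phi_new

    print("k-effective of the toy problem:", round(k, 5))
    ```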

  4. Development of a sample preparation method for the analysis of current-use pesticides in sediment using gas chromatography.

    Science.gov (United States)

    Wang, Dongli; Weston, Donald P; Ding, Yuping; Lydy, Michael J

    2010-02-01

    Pyrethroid insecticides have been implicated as the cause of sediment toxicity to Hyalella azteca in both agricultural and urban areas of California; however, for a subset of these toxic sediments (approximately 30%), the cause of toxicity remains unidentified. This article describes the analytical method development for seven additional pesticides that are being examined to determine if they might play a role in the unexplained toxicity. A pressurized liquid extraction method was optimized to simultaneously extract diazinon, methyl parathion, oxyfluorfen, dicofol, fenpropathrin, pyraclostrobin, and indoxacarb from sediment, and the extracts were cleaned using a two-step solid-phase extraction procedure. The final extract was analyzed for the target pesticides by gas chromatography/nitrogen-phosphorus detector (GC/NPD), and gas chromatography/electron capture detector (GC/ECD), after sulfur was removed by shaking with copper and cold crystallization. Three sediments were used as reference matrices to assess method accuracy and precision. Method detection limits were 0.23-1.8 ng/g dry sediment using seven replicates of sediment spiked at 1.0 ng/g dry sediment. Recoveries ranged from 61.6 to 118% with relative standard deviations of 2.1-17% when spiked at 5.0 and 50 ng/g dry sediment. The three reference sediments, spiked with 50 ng/g dry weight of the pesticide mixture, were aged for 0.25, 1, 4, 7, and 14 days. Recoveries of the pesticides in the sediments generally decreased with increased aging time, but the magnitude of the decline was pesticide and sediment dependent. The developed method was applied to field-collected sediments from the Central Valley of California.

  5. Multipoint development of the weighted pairwise correlation (WPC) linkage method for pedigrees of arbitrary size and application to the analysis of breast cancer and alcoholism familial data.

    Science.gov (United States)

    Zinn-Justin, A; Ziegler, A; Abel, L

    2001-07-01

    The weighted pairwise correlation (WPC) method is a simple and powerful model-free method of linkage analysis that has the advantages of being applicable to binary, ordered categorical, quantitative, or censored traits, and to consider all pairs of relatives in large pedigrees. The originally implemented approach was limited to the use of the identical by state (IBS) information, and we recently extended the WPC method to incorporate the identical by descent (IBD) information for two-point linkage analysis. Here, we develop a multipoint WPC method suitable for pedigrees of arbitrary size and large number of markers. The multipoint IBD estimation procedure for relative pairs is based on the efficient regression approach developed for pedigrees implemented in SOLAR. A robust and fast Monte-Carlo procedure is used to determine reliable P values. Application of the method to the 214 pedigrees from the Breast Cancer Linkage Consortium provided for the Genetic Analysis Workshop (GAW) 9 shows that multipoint WPC statistic values were not far from two-point maximum lod-score values obtained by the classical parametric linkage method and were higher than multipoint variance component analysis lod-scores obtained with SOLAR. The multipoint WPC method is also used to analyze the familial Collaborative Study of the Genetics of Alcoholism data on alcoholism released for GAW11. It allows a better specification of the linkage results previously obtained within the chromosome 4 region. Copyright 2001 Wiley-Liss, Inc.

  6. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    IAEA defined safety culture as follows: 'Safety Culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'safety culture is that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis' on the basis of his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined safety culture assessment items. Then we modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria to obtain nominal success probabilities of the assessment items by using 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea.

  7. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun

    2015-01-01

    IAEA defined safety culture as follows: 'Safety Culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'safety culture is that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis' on the basis of his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined safety culture assessment items. Then we modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria to obtain nominal success probabilities of the assessment items by using 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea

  8. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y{sub 1}Ba{sub 2}Cu{sub 3}O{sub 7} (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in understanding a variety of physical processes occurring during the buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  9. The development and error analysis of a kinematic parameters based spatial positioning method for an orthopedic navigation robot system.

    Science.gov (United States)

    Pei, Baoqing; Zhu, Gang; Wang, Yu; Qiao, Huiting; Chen, Xiangqian; Wang, Binbin; Li, Xiaoyun; Zhang, Weijun; Liu, Wenyong; Fan, Yubo

    2017-09-01

    Spatial positioning is the key function of a surgical navigation robot system, and accuracy is the most important performance index of such a system. The kinematic parameters of a six degrees of freedom (DOF) robot arm were used to form the transformation from intraoperative fluoroscopy images to a robot's coordinate system without C-arm calibration and to solve the redundant DOF problem. The influences of three typical error sources and their combination on the final navigation error were investigated through Monte Carlo simulation. The navigation error of the proposed method is less than 0.6 mm, and the feasibility was verified through cadaver experiments. Error analysis suggests that the robot kinematic error has a linear relationship with final navigation error, while the image error and gauge error have nonlinear influences. This kinematic parameters based method can provide accurate and convenient navigation for orthopedic surgeries. The result of error analysis will help error design and assignment for surgical robots. Copyright © 2016 John Wiley & Sons, Ltd.
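
    The Monte Carlo error study described above can be mimicked with a short, purely illustrative simulation: three independent error sources are sampled and pushed through an assumed error-combination model (not the authors' model), and the resulting distribution of navigation error is summarized. All standard deviations and the combination formula are assumptions for this sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_trials = 100_000

    # Assumed standard deviations of the three error sources (mm); values are illustrative.
    kinematic_err = rng.normal(0.0, 0.20, n_trials)      # robot kinematic error
    image_err     = rng.normal(0.0, 0.15, n_trials)      # fluoroscopy image error
    gauge_err     = rng.normal(0.0, 0.10, n_trials)      # registration gauge error

    # Toy combination model: kinematic error enters linearly, the others nonlinearly.
    navigation_err = np.abs(kinematic_err) + np.sqrt(image_err**2 + 2.0 * gauge_err**2)

    print("mean navigation error  (mm):", navigation_err.mean().round(3))
    print("95th percentile        (mm):", np.percentile(navigation_err, 95).round(3))
    ```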

  10. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-01-01

    Through visual assessment by three-dimensional (3D) brain image analysis methods using a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Through quantitative local abnormality assessment using this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). Using twenty-five cases with DAT (mean age, 68.9 years old), all of whom were diagnosed with probable Alzheimer's disease based on the National Institute of Neurological and Communicative Disorders and Stroke-Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-stereotactic surface projections (SSP) program was compared with the data of 20 age-matched cases in the control group. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. We assessed the extent of an abnormal region in each segment (the proportion of coordinates within the segment whose Z-value exceeds the threshold value) and its severity (the average Z-value of the coordinates whose Z-value exceeds the threshold value). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution. (author)
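
    The two segment-level indices defined above (extent and severity) reduce to a few lines of array arithmetic. The sketch below assumes a Z-score map and an integer label map assigning each stereotactic coordinate to one of the 24 anatomical segments; both arrays are synthetic here, and the threshold value is an assumption.

    ```python
    import numpy as np

    def extent_and_severity(z_map, segment_labels, threshold=2.0):
        """Return {segment: (extent, severity)} where extent is the fraction of coordinates
        in the segment whose Z-value exceeds the threshold and severity is their mean Z."""
        results = {}
        for seg in np.unique(segment_labels):
            z_seg = z_map[segment_labels == seg]
            above = z_seg[z_seg > threshold]
            extent = above.size / z_seg.size
            severity = above.mean() if above.size else 0.0
            results[int(seg)] = (round(float(extent), 3), round(float(severity), 3))
        return results

    rng = np.random.default_rng(5)
    z_map = rng.normal(size=5000)                         # synthetic Z-scores for brain coordinates
    segment_labels = rng.integers(1, 25, size=5000)       # synthetic assignment to 24 segments
    print(extent_and_severity(z_map, segment_labels))
    ```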

  11. Experimental design-based isotope-dilution SPME-GC/MS method development for the analysis of smoke flavouring products.

    Science.gov (United States)

    Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas

    2017-12-01

    For the implementation of Regulation (EC) No 2065/2003 related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase microextraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions to quantitatively analyse the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol, and their derivatives, cyclic ketones, and several other heterocyclic compounds were identified. The proposed method was repeatable (in terms of RSD%) and proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.

  12. New analysis methods for skin fine-structure via optical image and development of 3D skin Cycloscan(™).

    Science.gov (United States)

    Han, J Y; Nam, G W; Lee, H K; Kim, M J; Kim, E J

    2015-11-01

    This study was conducted to develop methods for measuring skin fine-structure via optical images, together with an apparatus for photographing them, in order to analyze anti-aging efficacy. We developed an apparatus named 3D Skin CycloScan(™) to evaluate the efficacy of cosmetics through imagification of skin fine-structure such as wrinkles, pores, and skin texture. The semi-sphere-shaped device has 12 different sequentially flashing light sources and captures the corresponding optical images within one second, to exclude the influence of the subject's movement. The normal map extracted through a shape-from-shading method is composed of a face-contour part and a skin fine-structure part. When the low-frequency component, obtained by applying a Gaussian filter, is eliminated, only the skin fine-structure remains. From this normal map it is possible to extract a two-dimensional vector map called a direction map, and the intensity of the image of wrinkles, pores, and skin texture can be regulated by filtering the direction map. We performed a clinical study applying this new apparatus and method to evaluate the anti-aging efficacy of cosmetics visually and to validate it against other conventional methods. After using an anti-aging cream including 2% adenosine for 8 weeks, the total amount of fine wrinkles around the eye area detected via 3D Skin CycloScan(™) was reduced by 12.1%. Also, wrinkles on crow's feet measured by PRIMOS COMPACT(®) (GFMesstechnik GmbH, Germany) were reduced by 11.7%. According to one aspect of the present study, by changing the direction of the lights illuminating the subject's skin, we can obtain information about the fine structures present on the skin, such as wrinkles, pores, or skin texture, and represent it as an image. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
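
    The high-pass step described above, in which the low-frequency face-contour component of the normal map is removed with a Gaussian filter so that only fine structure (wrinkles, pores, texture) remains, can be sketched as follows. This is an assumed illustration, not the CycloScan implementation; the sigma value and the input normal map are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fine_structure(normal_map, sigma=15.0):
    """normal_map: H x W x 3 array of surface normals from shape-from-shading."""
    low_freq = np.stack(
        [gaussian_filter(normal_map[..., c], sigma) for c in range(3)], axis=-1
    )
    return normal_map - low_freq   # high-frequency residual: wrinkles, pores, texture

# Hypothetical normal map (e.g. recovered from the 12 sequentially lit images).
normals = np.random.rand(256, 256, 3) * 2 - 1
detail = fine_structure(normals, sigma=20.0)
```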

  13. Developing a 3D constrained variational analysis method to obtain accurate gridded atmospheric vertical velocity and horizontal advections

    Science.gov (United States)

    Tang, S.; Zhang, M.

    2013-12-01

    Based on the constrained variational analysis (CVA) algorithm developed by Zhang and Lin (1997), a three-dimensional (3D) version of CVA is developed. The new algorithm uses gridded surface and TOA observations as constraints to adjust atmospheric state variables at each grid point so as to satisfy column-integrated conservation of mass, moisture and static energy. From this adjustment process, a set of high-quality 3D large-scale forcing data (vertical velocity and horizontal advections) can be derived to drive single-column models (SCMs), cloud-resolving models (CRMs) and large-eddy simulations (LES) in order to evaluate and improve parameterizations. Since the 3D CVA can adjust gridded state variables from any data source given observed precipitation, radiation and surface fluxes, it also offers the possibility of using the algorithm in data assimilation systems to assimilate precipitation and radiation data.
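
    The core idea of a constrained variational adjustment, nudging a first-guess state as little as possible while enforcing a column-integrated budget, can be illustrated with a minimal linear-constraint sketch. This is a generic assumption about the approach, not the Zhang and Lin (1997) algorithm; the layer weights and budget value below are hypothetical.

```python
import numpy as np

def constrained_adjust(x0, C, b):
    """Minimum-norm correction of x0 so that the linear constraint C @ x = b holds."""
    C = np.atleast_2d(C)
    residual = b - C @ x0
    correction = C.T @ np.linalg.solve(C @ C.T, residual)
    return x0 + correction

# Hypothetical single column: 5 layers of moisture flux convergence (kg m-2 s-1).
x0 = np.array([1.0, 0.8, 0.5, 0.2, 0.1]) * 1e-5
dp_over_g = np.full(5, 2000.0)        # layer mass weights (hypothetical)
C = dp_over_g                          # column-integral operator
b = np.array([3.0e-2])                 # observed P - E budget (hypothetical)
x_adj = constrained_adjust(x0, C, b)
```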

  14. Development and validation of a cleanup method for hydrocarbon containing samples for the analysis of semivolatile organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Hoppe, E.W.; Stromatt, R.W.; Campbell, J.A.; Steele, M.J.; Jones, J.E.

    1992-04-01

    Samples obtained from the Hanford single shell tanks (SSTs) are contaminated with normal paraffin hydrocarbon (NPH) as hydrostatic fluid from the sampling process or can be native to the tank waste. The contamination is usually high enough that a dilution of up to several orders of magnitude may be required before the sample can be analyzed by the conventional gas chromatography/mass spectrometry methodology. This can prevent detection and measurement of organic constituents that are present at lower concentration levels. To eliminate or minimize the problem, a sample cleanup method has been developed and validated and is presented in this document.

  15. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    OpenAIRE

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen,; Naseem, Huma

    2014-01-01

    RP-HPLC ultraviolet detection simultaneous quantification of piracetam and levetiracetam has been developed and validated. The chromatography was obtained on a Nucleosil C18 column of 25 cm × 0.46 cm, 10 μm, dimension. The mobile phase was a (70 : 30 v/v) mixture of 0.1 g/L of triethylamine and acetonitrile. Smooth flow of mobile phase at 1 mL/min was set and 205 nm wavelength was selected. Results were evaluated through statistical parameters which qualify the method reproducibility and sele...

  16. Development and validation of a cleanup method for hydrocarbon containing samples for the analysis of semivolatile organic compounds

    International Nuclear Information System (INIS)

    Hoppe, E.W.; Stromatt, R.W.; Campbell, J.A.; Steele, M.J.; Jones, J.E.

    1992-04-01

    Samples obtained from the Hanford single shell tanks (SSTs) are contaminated with normal paraffin hydrocarbon (NPH) as hydrostatic fluid from the sampling process or can be native to the tank waste. The contamination is usually high enough that a dilution of up to several orders of magnitude may be required before the sample can be analyzed by the conventional gas chromatography/mass spectrometry methodology. This can prevent detection and measurement of organic constituents that are present at lower concentration levels. To eliminate or minimize the problem, a sample cleanup method has been developed and validated and is presented in this document

  17. Development of a method of analysis and computer program for calculating the inviscid flow about the windward surfaces of space shuttle configurations at large angles of attack

    Science.gov (United States)

    Maslen, S. H.

    1974-01-01

    A general method developed for the analysis of inviscid hypersonic shock layers is discussed for application to the case of the shuttle vehicle at high (65 deg) angle of attack. The associated extensive subsonic flow region caused convergence difficulties whose resolution is discussed. It is required that the solution be smoother than anticipated.

  18. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  19. Biological Methods and Manual Development

    Science.gov (United States)

    EPA scientists conduct research to develop and evaluate analytical methods for the identification, enumeration, and evaluation of aquatic organisms exposed to environmental stressors, and to correlate exposures with effects on chemical and biological indicators.

  20. DDOT MXD+ method development report.

    Science.gov (United States)

    2015-09-01

    Mixed-use development has become increasingly common across the country, including Washington, D.C. : However, a straightforward and empirically validated method for evaluating the traffic impacts of such : projects is still needed. The data presente...

  1. Development of Extraction Methods for the Analysis of Perfluorinated Compounds in Leather with High Performance Liquid Chromatography Tandem Mass Spectrometry

    Science.gov (United States)

    Zhang, Yan; Wang, Youchao; Tang, Chuanjiang; Nie, Jingmei; Xu, Chengtao

    2018-01-01

    Perfluorinated compounds (PFCs), used to provide water, oil, grease, heat and stain repellency to a range of textile and other products, have been found to be persistent in the environment and are associated with adverse effects on humans and wildlife. This study presents the development and validation of an analytical method to determine the simultaneous presence of eleven PFCs in leather using solid-phase extraction followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The perfluorinated compounds were first extracted from the samples by an ultrasonic-assisted liquid extraction procedure whose parameters were optimized; the subsequent solid-phase extraction (SPE) clean-up is one of the most important advantages of the developed methodology. The sample volume and elution conditions were optimized by means of an experimental design. The proposed method was applied to determine the PFCs in leather: the detection limits of the eleven compounds were 0.09-0.96 ng/L, and the recoveries of all compounds spiked at the 5 ng/L concentration level were in the range of 65-96%, with RSDs below 19% (n = 7).

  2. Development of new method for simultaneous analysis of piracetam and levetiracetam in pharmaceuticals and biological fluids: application in stability studies.

    Science.gov (United States)

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen; Naseem, Huma

    2014-01-01

    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile. The mobile phase flow rate was set at 1 mL/min and the detection wavelength at 205 nm. Results were evaluated through statistical parameters, which confirm the reproducibility and selectivity of the method for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is of particular importance in that it permits the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was applied to bulk drugs, dosage formulations, physiological conditions, and clinical investigations, with excellent outcomes.

  3. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    Directory of Open Access Journals (Sweden)

    Farhan Ahmed Siddiqui

    2014-01-01

    Full Text Available An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile. The mobile phase flow rate was set at 1 mL/min and the detection wavelength at 205 nm. Results were evaluated through statistical parameters, which confirm the reproducibility and selectivity of the method for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is of particular importance in that it permits the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10000 ng/mL for both drugs. The proposed method was applied to bulk drugs, dosage formulations, physiological conditions, and clinical investigations, with excellent outcomes.

  4. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies.

    Science.gov (United States)

    Torrens, George Edward

    2018-01-01

    Summative content analysis was used to define the methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods, and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as having been applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used (under 10 within the 20 case studies) and the hundreds available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: •The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; •The review within the study suggests only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and, •Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  5. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of existing HRA methods through an example application. To address the bias toward the quantitative aspect of the HRA method, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be readily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance IAD as an HEA method. (author). 33 refs., 17 tabs., 20 figs.

  6. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of existing HRA methods through an example application. To address the bias toward the quantitative aspect of the HRA method, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be readily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance IAD as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  7. Development of quantification analysis software for measuring regional cerebral blood flow by the modified split-dose method with 123I-IMP before and after acetazolamide loading

    International Nuclear Information System (INIS)

    Nagaki, Akio; Kobara, Kouichi; Matsutomo, Norikazu

    2003-01-01

    We developed a quantification analysis software program for measuring regional cerebral blood flow (rCBF) at rest and under acetazolamide (ACZ) stress by the modified split-dose (MSD) method with iodine-123 N-isopropyl-p-iodoamphetamine (IMP) and compared the rCBF values measured by the MSD method and by the split-dose 123I-IMP SPECT (SD) method requiring one continuous withdrawal of arterial blood. Since the MSD method allows the input of two arterial blood sampling parameter values, the background subtraction procedure for obtaining ACZ-induced images in the MSD method is not identical to the procedure in the SD method. With our software program for rCBF quantification, the resting rCBF values determined by the MSD method were closely correlated with the values measured by the SD method (r=0.94), and there was also a good correlation between the ACZ-induced rCBF values obtained by the MSD method and by the SD method (r=0.81). The increase in rCBF under ACZ stress was estimated to be approximately 26% by the SD method and 38% by the MSD method, suggesting that the MSD method tends to overestimate the increase in rCBF under ACZ stress in comparison with the SD method, but the variability of the rCBF values at rest and during ACZ stress analyzed by the MSD method was smaller than the variability with the SD method. Further clinical studies are required to validate our rCBF quantification analysis program for the MSD method. (author)

  8. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    Pereira, Elaine; Silva, Ieda de S.; Gomide, Ricardo G.; Pires, Maria Aparecida F.

    2015-01-01

    This work presents a new, simple and low-cost methodology for the direct determination of uranium in different matrices: an organic phase (UO2(NO3)2·2TBP, the uranyl nitrate-TBP complex) and an aqueous phase (UO2(NO3)2 - NTU - uranyl nitrate), based on Fourier transform infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to define whether a developed methodology is fully suited to its intended purpose and is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO2(NO3)2·2TBP in hexane, embedded in KBr) was linear (r=0.9989) over the range 1.0 g L-1 to 14.3 g L-1, with LD of 92.1 mg L-1 and LQ of 113.1 mg L-1, precise (RSD < 1.6% and p-value < 0.05) and accurate (recovery of 100.1% - 102.9%). The method for uranium in the aqueous phase (UO2(NO3)2 embedded in KBr) was linear (r=0.9964) over the range 5.4 g L-1 to 51.2 g L-1, with LD of 835 mg L-1 and LQ of 958 mg L-1, precise (RSD < 1.0% and p-value < 0.05) and accurate (recovery of 99.1% - 102.0%). The FTIR method is robust with respect to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results for all three methods. The statistical tests (Student's t and Fisher) showed that the techniques are equivalent. (author)
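
    The linearity, LD and LQ figures quoted above are typically derived from a calibration line; a hedged sketch of one common ICH-style calculation (LD = 3.3 s/slope, LQ = 10 s/slope, with s the residual standard deviation of the regression) is given below. The calibration data are hypothetical and this is not necessarily the authors' exact procedure.

```python
import numpy as np

def linearity_lod_loq(conc, signal):
    """Correlation coefficient, LOD and LOQ from a straight-line calibration."""
    slope, intercept = np.polyfit(conc, signal, 1)
    fitted = slope * conc + intercept
    r = np.corrcoef(conc, signal)[0, 1]
    s_res = np.sqrt(np.sum((signal - fitted) ** 2) / (len(conc) - 2))
    return r, 3.3 * s_res / slope, 10.0 * s_res / slope

# Hypothetical calibration: uranium concentration (g/L) vs. FTIR band area.
conc = np.array([1.0, 3.0, 5.0, 8.0, 11.0, 14.3])
area = np.array([0.11, 0.32, 0.51, 0.83, 1.12, 1.45])
print(linearity_lod_loq(conc, area))
```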

  9. SWOT analysis: The analytical method in the process of planning and its application in the development of orthopaedic hospital department

    Directory of Open Access Journals (Sweden)

    Terzić Zorica

    2010-01-01

    Full Text Available Introduction. SWOT analysis is a managerial tool used to evaluate the internal and external environment through strengths, weaknesses, opportunities and threats. Objective. The aim was to demonstrate the application of the SWOT analysis on the example of the Department for Paediatric Orthopaedics and Traumatology at the Institute of Orthopaedic Surgery 'Banjica' in Belgrade. Methods. Qualitative research was conducted during December 2008 at the Department for Paediatric Orthopaedics and Traumatology of the Institute of Orthopaedic Surgery 'Banjica' by applying the focus group technique. Participants were members of the medical staff and patients. In the first phase of the focus group, brainstorming was applied to collect the factors of the internal and external environment and to identify strengths and weaknesses, opportunities and threats, respectively. In the second phase the nominal group technique was applied in order to reduce the list of factors. The factors were assessed according to their influence on the Department and ranked on a three-point Likert scale from 3 (highest impact) to 1 (lowest impact). Results. The most important strengths of the Department are competent and skilled staff, high quality of services, average hospital bed utilization, the Department providing the educational basis of the School of Medicine, satisfied patients, pleasant setting, and additional working hours. The weaknesses are: poor spatial organization, personnel unmotivated to refresh knowledge, lack of specifically trained personnel, inadequate sanitary facilities, services not covered by the Insurance Fund, long average hospital stay, and low economic status of patients. The opportunities are: legislative regulations, a formed paediatric traumatology service at the City level, good regional position of the Institute, and extension of referral areas. The threats are: absent Department autonomy in the personnel policy of the Institute, competitions within

  10. DEVELOPMENT OF METHODS OF ESTIMATION, ANALYSIS, SUBSTANTIATION AND SELECTION OF ORGANIZATIONAL AND TECHNOLOGICAL DECISIONS FOR RECONSTRUCTION OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    SIEDIN V. L.

    2017-02-01

    Full Text Available Raising of the problem. Over the past decade, changes in the economy have led to the decline of many industrial enterprises, which in turn led to the emergence of abandoned buildings and degraded areas that create a social and environmental hazard. Accordingly, the buildings and structures of such enterprises do not function and need reconstruction. Purpose of the article. Study of the development of methods for assessing, analyzing, substantiating and selecting rational organizational and technological decisions for the reconstruction of industrial enterprises. Conclusion. With the aim of transforming degraded and disordered territories into modern centers of vital activity, it is necessary to identify in each populated area the areas of priority renovation and reconstruction, and also to concentrate budgetary funds and private investments for the implementation of such projects. Through the implementation of the above measures, the settlements will be systematically updated in accordance with European standards.

  11. Formulation of an aloe-based product according to Iranian traditional medicine and development of its analysis method.

    Science.gov (United States)

    Moein, Elham; Hajimehdipoor, Homa; Toliyat, Tayebeh; Choopani, Rasool; Hamzeloo-Moghadam, Maryam

    2017-08-29

    Currently, people are increasingly interested in traditional medicine. Traditional formulations should be converted to modern drug delivery systems to make them more acceptable to patients. In the present investigation, a polyherbal medicine, "Ayarij-e-Faiqra" (AF), based on Iranian traditional medicine (ITM) has been formulated and its quality control parameters have been developed. The main ingredients of AF, including the barks of Cinnamomum zeylanicum Blume and Cinnamomum cassia J. Presl, the rhizomes of Nardostachys jatamansi DC., the fruits of Piper cubeba L.f., the flowers of Rosa damascena Herrm., the oleo gum resin of Pistacia terebinthus L. and Aloe spp. dried juice, were powdered and used for preparing seven tablet formulations of the herbal mixture. The flowability of the different formulated powders was examined and the best formulations were selected (F6 and F7). Tablets were prepared from the selected formulations and compared according to their physical characteristics; finally, F7 was selected and coated. Physicochemical characteristics of the core and coated AF tablets were determined, and an HPLC method for quantitation of aloin as a marker of the tablets was selected and verified with respect to selectivity, linearity, precision, recovery, LOD and LOQ. The results showed that the core and coated AF tablets were in agreement with USP requirements for herbal drugs. They had acceptable appearance, disintegration time, friability, hardness, dissolution behavior, weight variation and content uniformity. The amount of aloin in the tablets was found to be 123.1 mg/tab. The HPLC method for aloin determination in AF tablets was verified with respect to selectivity, linearity (5-500 μg/ml, r2: 0.9999), precision (RSD: 1.62%), recovery (108.0%), and LOD and LOQ (0.0053 and 0.0161 μg/ml). The formulated tablets could be a good substitute for the powder and capsules of AF in ITM clinics, with a feasible and precise method for their quality control.

  12. Semianalytical analysis of shear walls with the use of discrete-continual finite element method. Part 2: Numerical examples, future development

    Directory of Open Access Journals (Sweden)

    Akimov Pavel

    2016-01-01

    Full Text Available This paper is devoted to the two-dimensional semi-analytical solution of boundary problems in the analysis of shear walls with the use of the discrete-continual finite element method (DCFEM). This approach allows the exact analytical solution to be obtained in one direction (the so-called "basic" direction) and reduces the problem to a one-dimensional conventional finite element analysis. Two numerical examples of structural analysis with the use of DCFEM are considered, and the conventional finite element method (FEM) is used for verification purposes. The presented examples show some of the advantages of the suggested approach to the semianalytical analysis of shear walls. Future development of DCFEM, particularly in association with a multigrid approach, is under consideration as well.

  13. Insights From Google Play Store User Reviews for the Development of Weight Loss Apps: Mixed-Method Analysis.

    Science.gov (United States)

    Frie, Kerstin; Hartmann-Boyce, Jamie; Jebb, Susan; Albury, Charlotte; Nourse, Rebecca; Aveyard, Paul

    2017-12-22

    Significant weight loss takes several months to achieve, and behavioral support can enhance weight loss success. Weight loss apps could provide ongoing support and deliver innovative interventions, but to do so, developers must ensure user satisfaction. The aim of this study was to conduct a review of Google Play Store apps to explore what users like and dislike about weight loss and weight-tracking apps and to examine qualitative feedback through analysis of user reviews. The Google Play Store was searched and screened for weight loss apps using the search terms weight loss and weight track*, resulting in 179 mobile apps. A content analysis was conducted based on the Oxford Food and Activity Behaviors taxonomy. Correlational analyses were used to assess the association between complexity of mobile health (mHealth) apps and popularity indicators. The sample was then screened for popular apps that primarily focus on weight-tracking. For the resulting subset of 15 weight-tracking apps, 569 user reviews were sampled from the Google Play Store. Framework and thematic analysis of user reviews was conducted to assess which features users valued and how design influenced users' responses. The complexity (number of components) of weight loss apps was significantly positively correlated with the rating (r=.25; P=.001), number of reviews (r=.28; P<.001), and number of downloads (r=.48; P<.001) of the app. In contrast, in the qualitative analysis of weight-tracking apps, users expressed preference for simplicity and ease of use. In addition, we found that positive reinforcement through detailed feedback fostered users' motivation for further weight loss. Smooth functioning and reliable data storage emerged as critical prerequisites for long-term app usage. Users of weight-tracking apps valued simplicity, whereas users of comprehensive weight loss apps appreciated availability of more features, indicating that complexity demands are specific to different target populations

  14. Information Systems Development as a Research Method

    Directory of Open Access Journals (Sweden)

    Helen Hasan

    2003-11-01

    Full Text Available This paper takes the stance that some cases of information systems development can be considered knowledge-creating activities, and, in those cases, information systems development can be a legitimate research method. In these cases not only is knowledge created about the development process itself, but a deeper understanding also emerges about the organisational problem that the system is designed to solve. The paper begins with a brief overview of research in the design sciences and a comparison of research methods that are concerned with the design and use of information systems. This is followed by an assessment of the way systems development as a research method deals with the scientific research processes of data collection, analysis, synthesis and display. A case study, where the systems development research method was used, is described to illustrate the method and give the reader a better understanding of the approach.

  15. Analysis and development of methods for the recovery of tri-n-butylphosphate (TBP)-30%v/v-degraded dodecane

    International Nuclear Information System (INIS)

    Dalston, C.O.

    1984-01-01

    Tri-n-butyl phosphate associated with an inert hydrocarbon is the main solvent used in the reprocessing of irradiated nuclear fuel from pressurized water reactors. The combined action of radiation and nitric acid causes severe damage to the solvent during the reprocessing steps. The recovery of the solvent is thus of great importance, since it decreases the amount of waste and improves the process economy. A comparative analysis of several methods for the recovery of this solvent was carried out: alkaline washing, adsorption with resins, adsorption with aluminium oxide, adsorption by activated carbon and adsorption by vermiculite. Some modifications of the analytical 95Zr test were made, and two new parameters (degradation grade and efficiency of recovery) were defined mathematically. Through this modified 95Zr test, the residence time and the degraded solvent to recuperator ratio were determined. After laboratory tests, vermiculite associated with activated carbon was employed for the treatment of 50 liters of tri-n-butyl phosphate (30% V/V)-dodecane degraded by hydrolysis. Other analyses were performed to check the potential of these solids for this solvent recovery. (Author)

  16. A low-cost method for performing a curriculum gap-analysis in developing countries: medical school competencies in Ghana.

    Science.gov (United States)

    Rominski, Sarah; Donkor, Peter A; Lawson, Aaron; Danso, Kwanbena; Stern, David

    2012-01-01

    In this study, we undertook a comparison of the international, national, and local curricula of Ghanaian medical schools in order to identify any gaps; the objective was to identify gaps in the Ghanaian medical school curriculum. The Ministry of Health and the two major sites for medical education in Ghana (UGMS, KNUST) participated, using the only internationally accepted and validated set of outcome standards for medical education, the Global Minimum Essential Requirements. The competencies were reviewed by two U.S. consultants (DS, SR) and then edited, revised, and validated by individuals who are deeply involved in medical education in Ghana. The KNUST team validated 6 gaps in their curriculum, and the team from UGMS identified 5. The standards were found by the U.S.-based consultants, and validated by the Ghanaian team, to have 6 gaps, many of which overlapped with those found in the Ghanaian medical school curricula. This analysis is a first step toward determining physician training competency and an inexpensive method for identifying agreed-upon gaps in the current national requirements and local curriculum.

  17. HPLC method development for the simultaneous analysis of amlodipine and valsartan in combined dosage forms and in vitro dissolution studies

    Directory of Open Access Journals (Sweden)

    Mustafa Çelebier

    2010-12-01

    Full Text Available A simple, rapid and reproducible HPLC method was developed for the simultaneous determination of amlodipine and valsartan in their combined dosage forms, and for drug dissolution studies. A C18 column (ODS 2, 10 μm, 200 x 4.6 mm) and a mobile phase of phosphate buffer (pH 3.6, 0.01 mol L-1):acetonitrile:methanol (46:44:10 v/v/v) were used for separation and quantification. Analyses were run at a flow rate of 1 mL min-1 and at ambient temperature. The injection volume was 20 μL and the ultraviolet detector was set at 240 nm. Under these conditions, amlodipine and valsartan were eluted at 7.1 min and 3.4 min, respectively. The total run time was shorter than 9 min. The developed method was validated according to the literature and found to be linear within the range 0.1 - 50 μg mL-1 for amlodipine, and 0.05 - 50 μg mL-1 for valsartan. The developed method was applied successfully for the quality control assay of amlodipine and valsartan in their combination drug product and in in vitro dissolution studies.

  18. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    During the formulative period of structural analysis by matrix methods, earnest research was directed to automate the force ... (1973) for the analysis of discrete and continuous systems. IFM is a force method of .... (Nagabhushanam & Patnaik 1989) are being developed, which helps the use of efficient solution techniques for ...

  19. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  20. Development of CAD based on ANN analysis of power spectra for pneumoconiosis in chest radiographs: effect of three new enhancement methods.

    Science.gov (United States)

    Okumura, Eiichiro; Kawashita, Ikuo; Ishida, Takayuki

    2014-07-01

    We have been developing a computer-aided detection (CAD) scheme for pneumoconiosis based on a rule-based plus artificial neural network (ANN) analysis of power spectra. In this study, we developed three enhancement methods for the abnormal patterns to reduce false-positive and false-negative values. The image database consisted of 2 normal and 15 abnormal chest radiographs. The International Labour Organization standard chest radiographs with pneumoconiosis were categorized according to subcategory, size, and shape of pneumoconiosis. Regions of interest (ROIs) with a matrix size of 32 × 32 were selected from normal and abnormal lungs. The three new enhancement methods were based on a window function, top-hat transformation, and gray-level co-occurrence matrix analysis. We calculated the power spectrum (PS) of all ROIs by Fourier transform. For the classification between normal and abnormal ROIs, we applied a combined analysis using the rule-based plus ANN method. To evaluate the overall performance of this CAD scheme, we employed ROC analysis for distinguishing between normal and abnormal ROIs. On the chest radiographs of the highest categories (severe pneumoconiosis) and the lowest categories (early pneumoconiosis), this CAD scheme achieved area under the curve (AUC) values of 0.93 ± 0.02 and 0.72 ± 0.03, respectively. The combined rule-based plus ANN method with the three new enhancement methods obtained the highest classification performance for distinguishing between abnormal and normal ROIs. Our CAD system based on the three new enhancement methods would be useful in assisting radiologists in the classification of pneumoconiosis.
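
    A minimal sketch, under assumed conventions, of the power-spectrum feature on which the rule-based plus ANN classification described above operates: the two-dimensional Fourier power spectrum of a 32 x 32 ROI. The ROI below is random placeholder data, not a radiograph.

```python
import numpy as np

def roi_power_spectrum(roi):
    """2-D power spectrum of a small ROI, with the DC component removed first."""
    roi = roi - roi.mean()
    spectrum = np.fft.fftshift(np.fft.fft2(roi))
    return np.abs(spectrum) ** 2

# Hypothetical 32 x 32 ROI taken from a chest radiograph.
roi = np.random.rand(32, 32)
ps = roi_power_spectrum(roi)
```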

  1. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: • We develop a TGA method for the measurement of the stoichiometric ratio. • A series of formulas are deduced to calculate the stoichiometric ratio. • Four α-CD-based inclusion complexes were successfully prepared. • The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of guest-α-cyclodextrin (guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the respective guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well corroborated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except that the SR of decanoic acid, with a larger size and longer chain, was not consistent. It is, therefore, suggested that the TGA-based method is applicable for following the stoichiometric ratio of polycrystalline α-CD-based inclusion complexes with smaller and shorter-chain guests.
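
    The record states that the stoichiometric ratio is deduced from TGA data through a series of formulas that are not reproduced here; the sketch below shows one plausible way a guest-to-host molar ratio could be computed from weight-loss steps. All masses and the attribution of each step to guest or host are illustrative assumptions, not the paper's derivation.

```python
def stoichiometric_ratio(guest_loss_mg, host_mass_mg, m_guest, m_host):
    """Moles of guest released per mole of alpha-CD, from assumed TGA mass steps."""
    return (guest_loss_mg / m_guest) / (host_mass_mg / m_host)

# Hypothetical TGA steps for a 4-cresol / alpha-CD complex (mg), with
# M(4-cresol) = 108.1 g/mol and M(alpha-CD) = 972.8 g/mol.
print(round(stoichiometric_ratio(1.00, 9.00, 108.1, 972.8), 2))   # about 1.0
```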

  2. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by gas chromatography with flame ionization detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to differences in gas chromatography (GC) conditions and other steps involved in the method, as well as to soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the most significant impact on the analysis. These factors included: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time would offer many benefits, including improved laboratory reporting times and overall improved clean up
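
    A hedged sketch of the screening step described above: a two-level fractional factorial design for the six GC factors, built from a 2^3 base with interaction generators. The specific generators and the coded levels below are illustrative assumptions, not the design used in the study.

```python
from itertools import product
import numpy as np

# Base 2^3 full factorial in coded units (-1 / +1) for three of the six factors.
base_names = ["V_inj", "T_inj", "oven"]
base = np.array(list(product([-1, 1], repeat=3)))

# Remaining factors aliased to interactions (an assumed resolution III layout).
design = np.column_stack([
    base,
    base[:, 0] * base[:, 1],   # T_det   = V_inj * T_inj
    base[:, 0] * base[:, 2],   # flow    = V_inj * oven
    base[:, 1] * base[:, 2],   # solvent = T_inj * oven
])
names = base_names + ["T_det", "flow", "solvent"]
for run in design:
    print(dict(zip(names, run.tolist())))   # 8 screening runs for 6 factors
```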

  3. The development of the continuous orthonormalization and adjoint methods for solar seismology: Eigenfrequency computation and sensitivity analysis for direct and inverse problems

    International Nuclear Information System (INIS)

    Rosenwald, R.D.

    1989-01-01

    Two new analysis methods for solar seismology are developed. Called the continuous orthonormalization (CON) and adjoint methods, their use enables both solar eigenfrequencies and eigenfrequency sensitivities to be computed more accurately and efficiently than with existing methods. The CON method integrates an eighth-order nonlinear system of ordinary differential equations (ODEs) which defines the linear adiabatic nonradial oscillation modes of the Sun. All normal modes of oscillation are treated identically, regardless of their type or their predominant location inside the Sun. The adjoint method integrates a related eighth-order linear inhomogeneous system of ODEs. From the resultant solution, an eigenfrequency's partial derivatives with respect to an extensive set of solar model parameters may be computed simultaneously. Extensive numerical tests confirm the validity of the two new methods. Eigenfrequencies obtained via the CON method have seven significant digits and match within 1 percent the eigenfrequencies obtained via finite difference or mesh approaches. Eigenfrequency sensitivities obtained via the adjoint method match within 2 percent the results obtained by explicitly perturbing the solar model parameters and recomputing the eigenfrequencies. The usefulness and power of the two new methods are demonstrated by applying them to the solution of an elementary solar inversion problem. A sample solar model's f-mode frequencies are iteratively driven into agreement with an observed set of f-mode frequencies. Adjoint sensitivity results are used to alter solar model parameters within hundreds of radial bins

  4. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Full Text Available Biological samples present a range of complexities, from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  5. Stir bar sorptive extraction-gas chromatography-mass spectrometry analysis of tetramethylene disulfotetramine in food: Method development and comparison to solid-phase microextraction.

    Science.gov (United States)

    De Jager, Lowri S; Perfetti, Gracia A; Diachenko, Gregory W

    2009-03-09

    A stir bar sorptive extraction-gas chromatography-mass spectrometry (SBSE-GC-MS) method for the determination of tetramethylene disulfotetramine is presented. The limit of detection (LOD) of the optimized method was 0.2 ng g-1 for extractions from water and 0.3-2.1 ng g-1 for extractions from foods. Recovery was highly matrix dependent (36-130%) and quantification required standard addition calibrations. Standard addition calibration lines had high linearity (R2 > 0.97) and replicate extractions had good reproducibility (RSD = 4.4-9.8%). A comparison of the SBSE method and a previously developed headspace (HS)-solid-phase microextraction (SPME) method was performed. Generally, SBSE provided higher sensitivity with decreased analysis time.
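
    Because recovery was matrix dependent, quantification relied on standard addition; a minimal sketch of that calculation is shown below, assuming spiked levels and peak areas that are purely hypothetical. The concentration in the unspiked sample is the magnitude of the x-intercept of the calibration line.

```python
import numpy as np

def standard_addition(added_conc, responses):
    """Concentration in the original (unspiked) sample from a standard-addition line."""
    slope, intercept = np.polyfit(added_conc, responses, 1)
    return intercept / slope   # equals |x-intercept| of the calibration line

# Hypothetical spiking levels (ng/g) and peak areas for tetramine in a food matrix.
added = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
area = np.array([120.0, 235.0, 355.0, 580.0, 1045.0])
print(standard_addition(added, area))   # about 1 ng/g in this made-up example
```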

  6. Field sampling and data analysis methods for development of ecological land classifications: an application on the Manistee National Forest.

    Science.gov (United States)

    George E. Host; Carl W. Ramm; Eunice A. Padley; Kurt S. Pregitzer; James B. Hart; David T. Cleland

    1992-01-01

    Presents technical documentation for development of an Ecological Classification System for the Manistee National Forest in northwest Lower Michigan, and suggests procedures applicable to other ecological land classification projects. Includes discussion of sampling design, field data collection, data summarization and analyses, development of classification units,...

  7. Progressive methods in multiple criteria decision analysis

    OpenAIRE

    Meyer, Patrick

    2007-01-01

    Our work mainly focusses on the study and the development of progressive methods in the field of Multiple Criteria Decision Analysis, i.e., iterative procedures presenting partial conclusions to the Decision Maker that can be refined at further steps of the analysis. The thesis is divided into three parts. The first one is intended to be a general analysis of the concept of progressiveness. The last two parts develop progressive methods related first to Multiattribute Value Theory and sec...

  8. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  9. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  10. Analysis of human serum and whole blood for mineral content by ICP-MS and ICP-OES: development of a mineralomics method.

    Science.gov (United States)

    Harrington, James M; Young, Daniel J; Essader, Amal S; Sumner, Susan J; Levine, Keith E

    2014-07-01

    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This paper describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision for the mineral content of the blood (250 μL) and serum (250 μL) samples was measured for eight essential minerals--sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se)--by plasma spectrometric methods and ranged from 0.635 to 10.1% relative standard deviation (RSD) for serum and 0.348-5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples provided good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals and potentially of other minerals.
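
    The interday and intraday precision values quoted above are percent relative standard deviations; a short sketch of that calculation on hypothetical replicate data follows, as an illustration rather than the authors' code.

```python
import numpy as np

def rsd_percent(values):
    """Percent relative standard deviation of replicate determinations."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical intraday replicates for serum magnesium (mg/L).
print(round(rsd_percent([19.8, 20.1, 20.4, 19.9, 20.2]), 2))
```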

  11. Analysis and Development of Finite Element Methods for the Study of Nonlinear Thermomechanical Behavior of Structural Components

    Science.gov (United States)

    Oden, J. Tinsley

    1995-01-01

    Underintegrated methods are investigated with respect to their stability and convergence properties. The focus was on identifying regions where they work and regions where techniques such as hourglass viscosity and hourglass control can be used. Results obtained show that underintegrated methods typically lead to finite element stiffness with spurious modes in the solution. However, problems exist (scalar elliptic boundary value problems) where underintegrated with hourglass control yield convergent solutions. Also, stress averaging in underintegrated stiffness calculations does not necessarily lead to stable or convergent stress states.

  12. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples are analyzed quantitatively, and the accuracy and precision of the method are measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace elemental concentrations of samples collected monthly at urban and rural sites are determined, and then statistical calculations and factor analysis are carried out to investigate emission sources. Facilities for NAA are installed in the new HANARO reactor, and functional tests are performed for routine operation. In addition, a unified software code for NAA is developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  13. New design procedure development of future reactor critical power estimation. (1) Practical design-by-analysis method for BWR critical power design correlation

    International Nuclear Information System (INIS)

    Yamamoto, Yasushi; Mitsutake, Toru

    2007-01-01

    For present BWR fuels, full mock-up thermal-hydraulic tests, such as critical power measurement and pressure drop measurement tests, have been needed. However, the full mock-up test requires high costs and a large-scale test facility. At present, there are only a few test facilities in the world able to perform the full mock-up thermal-hydraulic test. Moreover, for future BWRs, the bundle size tends to be larger in order to reduce plant construction costs and minimize the routine inspection period. For instance, AB1600, an improved ABWR, was proposed by Toshiba, with a bundle size 1.2 times larger than the conventional BWR fuel size. It is too expensive and far from realistic to perform the full mock-up thermal-hydraulic test for such a large fuel bundle. A new design procedure is required to realize large-scale bundle design development, especially for future reactors. Therefore, a new design procedure, the Practical Design-by-Analysis (PDBA) method, has been developed. This new procedure consists of a partial mock-up test and numerical analysis. At present, only the subchannel analysis method based on a three-fluid two-phase flow model is a realistic choice. Firstly, a partial mock-up test is performed, for instance with a 1/4 partial mock-up bundle. Then, the first-step critical power correlation coefficients are evaluated with the measured data. The input data for the subchannel analysis, such as the spacer effect model coefficient, are also estimated from the data. Next, the radial power effect on the critical power of the full bundle size is estimated with the subchannel analysis. Finally, the critical power correlation is modified by the subchannel analysis results. In the present study, the critical power correlation of the conventional 8x8 BWR fuel was developed with the PDBA method using 4x4 partial mock-up tests and the subchannel analysis code. The accuracy of the estimated critical power was 3.8%. The several themes remain to

  14. Comparative urine analysis by liquid chromatography-mass spectrometry and multivariate statistics : Method development, evaluation, and application to proteinuria

    NARCIS (Netherlands)

    Kemperman, Ramses F. J.; Horvatovich, Peter L.; Hoekman, Berend; Reijmers, Theo H.; Muskiet, Frits A. J.; Bischoff, Rainer

    2007-01-01

    We describe a platform for the comparative profiling of urine using reversed-phase liquid chromatography-mass spectrometry (LC-MS) and multivariate statistical data analysis. Urinary compounds were separated by gradient elution and subsequently detected by electrospray Ion-Trap MS. The lower limit

  15. Development of an SDS-gel electrophoresis method on SU-8 microchips for protein separation with LIF detection: Application to the analysis of whey proteins.

    Science.gov (United States)

    Del Mar Barrios-Romero, Maria; Crevillén, Agustín G; Diez-Masa, José Carlos

    2013-08-01

    This work describes the development of an SDS-gel electrophoresis method for the analysis of major whey proteins (α-lactalbumin, β-lactoglobulin, and BSA) carried out in SU-8 microchips. The method uses a low-viscosity solution of dextran as a sieving polymer. A commercial coating agent (EOTrol LN) was added to the separation buffer to control the EOF of the chips. The potential of this coating agent to prevent protein adsorption on the walls of the SU-8 channels was also evaluated. Additionally, the fluorescence background of the SU-8 material was studied to improve the sensitivity of the method. By selecting an excitation wavelength of 532 nm at which the background fluorescence remains low and by replacing the mercury arc lamp by a laser in the detection system, an LOD in the nanomolar range was achieved for proteins derivatized with the fluorogenic reagent Chromeo P540. Finally, the method was applied to the analysis of milk samples, demonstrating the potential of SU-8 microchips for the analysis of proteins in complex food samples. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  17. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    A practical guide to the methods in general use for the complete analysis of silicate rock material and for the determination of all those elements present in major, minor or trace amounts in silicate...

  18. Design of experiments as a tool for LC-MS/MS method development for the trace analysis of the potentially genotoxic 4-dimethylaminopyridine impurity in glucocorticoids.

    Science.gov (United States)

    Székely, Gy; Henriques, B; Gil, M; Ramos, A; Alvarez, C

    2012-11-01

    The present study reports on a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method development strategy supported by design of experiments (DoE) for the trace analysis of 4-dimethylaminopyridine (DMAP). Conventional approaches to LC-MS/MS method development usually proceed by trial and error, varying the experimental factors one at a time, which is time consuming and does not account for interactions between factors. The LC factors chosen for the DoE study were flow (F), gradient (G) and injection volume (V(inj)), while cone voltage (E(con)) and collision energy (E(col)) were chosen as MS parameters; all five factors were studied simultaneously. The method was optimized with respect to four responses: separation of peaks (Sep), peak area (A(peak)), length of the analysis (T) and the signal-to-noise ratio (S/N). A quadratic model, namely a face-centred central composite (CCF) design featuring 29 runs, was used instead of a less powerful linear model, since the increase in the number of injections was insignificant. To determine the robustness of the method, a new set of DoE experiments was carried out around the optimal conditions using a fractional factorial design of resolution III with 11 runs, in which additional factors - such as column temperature and quadrupole resolution - were considered. The method utilizes a Phenomenex Gemini NX C-18 HPLC analytical column with electrospray ionization and a triple quadrupole mass detector in multiple reaction monitoring (MRM) mode, resulting in short analyses with a 10 min runtime. Drawbacks of derivatization, namely incomplete reaction and time-consuming sample preparation, have been avoided, and the change from SIM to MRM mode resulted in increased sensitivity and a lower LOQ. The DoE method development strategy led to a method allowing the trace analysis of DMAP at 0.5 ng/ml absolute concentration, which corresponds to a 0.1 ppm limit of quantification in 5mg
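
    The 29-run face-centred central composite design mentioned above is consistent with a half-fraction two-level factorial (16 runs) plus 10 axial points and 3 centre points; the sketch below builds such a coded design with plain NumPy. The half-fraction generator (fifth factor equal to the product of the other four) and the use of exactly 3 centre points are assumptions made to reproduce 29 runs, not details taken from the paper.

    # Sketch of a 29-run face-centred central composite design (CCF) for five
    # coded factors (here F, G, Vinj, Econ, Ecol); levels are -1, 0, +1.
    import itertools
    import numpy as np

    k = 5
    # Half-fraction 2^(5-1) factorial: full factorial in 4 factors, 5th = product of the four
    factorial = np.array([row + (row[0] * row[1] * row[2] * row[3],)
                          for row in itertools.product((-1, 1), repeat=k - 1)])
    # Face-centred axial points (alpha = 1): +/-1 on one factor, 0 on the others
    axial = np.vstack([v for i in range(k) for v in (np.eye(k)[i], -np.eye(k)[i])])
    center = np.zeros((3, k))

    design = np.vstack([factorial, axial, center])
    print(design.shape)  # (29, 5) coded levels, to be mapped onto the real factor ranges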

  19. Scenario development methods and practice

    International Nuclear Information System (INIS)

    2001-01-01

    The safe management of radioactive waste is an essential aspect of all nuclear power programmes. Although a general consensus has been reached in OECD countries on the use of geological repositories for the disposal of high-level radioactive waste, analysis of the long-term safety of these repositories, using performance assessment and other tools, is required prior to implementation. The initial stage in developing a repository safety assessment is the identification of all factors that may be relevant to the long-term safety of the repository and their combination to form scenarios. This must be done in a systematic and transparent way in order to assure the regulatory authorities that nothing important has been forgotten. Scenario development has become the general term used to describe the collection and organisation of the scientific and technical information necessary to assess the long-term performance or safety of radioactive waste disposal systems. This includes the identification of the relevant features, events and processes (FEPs), the synthesis of broad models of scientific understanding, and the selection of cases to be calculated. Scenario development provides the overall framework in which the cases and their calculated consequences can be discussed, including biases or shortcomings due to omissions or lack of knowledge. The NEA Workshop on Scenario Development was organised in Madrid, in May 1999, with the objective of reviewing developments in scenario methodologies and applications in safety assessments since 1992. The outcome of this workshop is the subject of this book. It is a review of developments in scenario methodologies based on a large body of practical experience in safety assessments. It will be of interest to radioactive waste management experts as well as to other specialists involved in the development of scenario methodologies. (author)

  20. Development of the analysis methods for charm triangular flow measurement in Pb-Pb collisions in ALICE

    CERN Document Server

    Alemanno, Francesca

    2017-01-01

    The aim of my project is to implement the technique and the tools to study D0 mesons within the ALICE analysis framework. During my work I analyzed ALICE data using the Computing Grid facilities, and in this report I present the measurement of the azimuthal anisotropy of D0 meson production in Pb-Pb collisions at √s_NN = 5.02 TeV with the ALICE detector.

  1. Development and validation of AccuTOF-DART™ as a screening method for analysis of bank security device and pepper spray components.

    Science.gov (United States)

    Pfaff, Allison M; Steiner, Robert R

    2011-03-20

    Analysis of bank security devices, containing 1-methylaminoanthraquinone (MAAQ) and o-chlorobenzylidenemalononitrile (CS), and pepper sprays, containing capsaicin, is a lengthy process with no specific screening technique to aid in identifying samples of interest. Direct Analysis in Real Time (DART™) ionization coupled with an Accurate Time of Flight (AccuTOF) mass detector is a fast, ambient ionization source that could significantly reduce time spent on these cases and increase the specificity of the screening process. A new method for screening clothing for bank dye and pepper spray, using AccuTOF-DART™ analysis, has been developed. Detection of MAAQ, CS, and capsaicin was achieved via extraction of each compound onto cardstock paper, which was then sampled in the AccuTOF-DART™. All results were verified using gas chromatography coupled with electron impact mass spectrometry. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Development of liquid chromatography-tandem mass spectrometry method for analysis of polyphenolic compounds in liquid samples of grape juice, green tea and coffee.

    Science.gov (United States)

    Sapozhnikova, Yelena

    2014-05-01

    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation "dilute and shoot" approach, and LC-MS/MS quantification using genistein-d4 as an internal standard. The performance of six different syringeless filter devices was tested for sample preparation. The method was evaluated for recoveries of polyphenols at three spiking levels in juice, tea, and coffee samples. The recoveries of the majority of polyphenols were satisfactory (70-120%), but some varied significantly (20-138%) depending on the matrix. NIST Standard Reference Materials (SRM) 3257 Catechin Calibration Solutions and 3255 Camellia sinensis (Green Tea) Extract with certified concentrations of catechin and epicatechin were used for method validation. The measurement accuracy in two SRMs was 71-113%. The method was successfully applied to the analysis of liquid samples of grape juice, green tea, and coffee. Published by Elsevier Ltd.

  3. Development of an efficient method for multi residue analysis of 160 pesticides in herbal plant by ethyl acetate hexane mixture with direct injection to GC-MS/MS.

    Science.gov (United States)

    Taha, Sherif M; Gadalla, Sohair A

    2017-11-01

    A simple and efficient multi-residue method was developed for the analysis of 160 pesticides in herbal plants by GC-MS/MS. The method employs pesticide residue extraction with EtAC/n-hexane (6:4) followed by a cleanup step using a florisil/PSA mixture. The optimized conditions resulted in lower amounts of co-extracted matrix components than extraction with EtAC or MeCN (QuEChERS method), according to FTIR and full-scan GC/MS analyses. In addition, the developed method (EtAC/hexane) eliminates the evaporation step that is usually needed when MeCN is used as the extraction solvent prior to GC-MS/MS injection. The method was fully validated on chamomile according to the SANTE/11945/2015 guidelines: intraday recoveries were estimated at three concentration levels of 10, 50 and 250 μg kg⁻¹, interday recoveries were determined at 250 μg kg⁻¹, and intraday recoveries were also estimated for two other herbal plants (thyme and marjoram) at 250 μg kg⁻¹. Three-point calibration mixtures were prepared in ethyl acetate and in blank extracts of chamomile, thyme, and marjoram in order to check linearity and matrix effects. The average recoveries for most of the studied pesticides ranged from 70% to 100% at 50 and 250 μg kg⁻¹, with relative standard deviations below 20%. The validated method was successfully applied to the determination of pesticide residues in 20 herb samples collected from the Egyptian market. Copyright © 2017 Elsevier B.V. All rights reserved.
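
    As a small illustration of the validation figures quoted above (mean recoveries of 70-100% with RSDs below 20%, against the SANTE acceptance criteria of 70-120% and ≤20% RSD), the sketch below computes recovery and RSD for one spiking level. The replicate concentrations are invented placeholders, not data from the study.

    # Recovery and RSD for one spiking level, checked against SANTE criteria
    # (mean recovery 70-120 %, RSD <= 20 %); replicate values are placeholders.
    import numpy as np

    spiked_level = 250.0                                       # ug/kg
    measured = np.array([238.0, 245.0, 230.0, 252.0, 241.0])   # replicate results, ug/kg

    recovery = measured / spiked_level * 100.0
    mean_recovery = recovery.mean()
    rsd = measured.std(ddof=1) / measured.mean() * 100.0

    print(f"mean recovery = {mean_recovery:.1f} %, RSD = {rsd:.1f} %")
    print("within SANTE acceptance:", 70.0 <= mean_recovery <= 120.0 and rsd <= 20.0)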

  4. Development and validation of an HPLC-MS/MS method for the analysis of dexamethasone from pig synovial fluid using dried matrix spotting.

    Science.gov (United States)

    Christianson, Chad D; Laine, Derek F; Zimmer, Jennifer Sd; Johnson, Casey Jl; Sheaff, Chrystal N; Carpenter, Anna; Needham, Shane R

    2010-11-01

    Dried matrix spot techniques were employed to validate an HPLC-MS/MS assay for the determination of dexamethasone in clear Yorkshire pig synovial fluid using 15 µl of sample. We have adopted the term dried matrix spot to indicate that the techniques used for dried blood spots can be applied to nonblood matrices. The dried matrix spot method employs a color-indicating process developed at Alturas Analytics that enhances the ability to analyze transparent fluids spotted onto collection paper by allowing the analyst to visually verify the location of the dried sample spot. The method was shown to be accurate (±4.3%) and precise (14.2% at the LLOQ and ≤10.0% at all other concentrations) across the dynamic range of the assay. The method shows the potential application of dried matrix spot techniques for the analysis of transparent biological fluids.

  5. Neurocognitive Pattern Analysis of an Auditory and Visual Numeric Motor Control Task. Part 1. Development of Methods.

    Science.gov (United States)

    1984-10-01

    of stereophotogrammetry developed for orthodontic use. A series of stereo pictures of the participant’s head with electrode cap in place was taken...strip chart write-outs may be largely attributed to movements of the electrode wire or to movement of the quick insert ball in the plastic holder

  6. Methods of Strength Development in Middle Schoolers

    Directory of Open Access Journals (Sweden)

    В. О. Нарижний

    2015-06-01

    Full Text Available Since strength is the foundation of motor ability development, the question arises of how to rationalize the process of its development. The purpose of the research is to improve the conventional methods of strength development in middle schoolers. To achieve the objectives set, the study used the following methods: theoretical analysis and collation of methodological literature, testing, a pedagogical experiment, and methods of mathematical statistics. Research results. The analysis of the test results showed a statistically significant increase in the indicators in two of five tests in girls and three of five tests in boys when the combined method was used. The other results also show a tendency to improve, but the changes are not statistically significant. Repeated use of the method yields a statistically reliable indicator in one of five exercises in boys, whereas no such indicator appears in girls. Conclusions. Using the combined method makes it possible to influence several types of strength simultaneously, which rationalizes the development of strength abilities. The tests "bending and unbending of arms in suspension lying" and "remaining in suspension lying on bent arms" proved most informative.

  7. [SWOT analysis: the analytical method in the process of planning and its application in the development of orthopaedic hospital department].

    Science.gov (United States)

    Terzić, Zorica; Vukasinović, Zoran; Bjegović-Mikanović, Vesna; Jovanović, Vesna; Janicić, Radmila

    2010-01-01

    SWOT analysis is a managerial tool used to evaluate the internal and external environment through strengths, weaknesses, opportunities and threats. The aim was to demonstrate the application of SWOT analysis on the example of the Department for Paediatric Orthopaedics and Traumatology at the Institute of Orthopaedic Surgery "Banjica" in Belgrade. Qualitative research was conducted during December 2008 at the Department for Paediatric Orthopaedics and Traumatology of the Institute of Orthopaedic Surgery "Banjica" by applying the focus group technique. Participants were members of the medical staff and patients. In the first phase of the focus group, brainstorming was applied to collect the factors of the internal and external environment and to identify strengths and weaknesses, opportunities and threats, respectively. In the second phase the nominal group technique was applied in order to reduce the list of factors. The factors were assessed according to their influence on the Department and ranked on a three-point Likert scale from 3 (highest impact) to 1 (lowest impact). The most important strengths of the Department are competent and skilled staff, high quality of services, average hospital bed utilization, the Department providing the educational basis of the School of Medicine, satisfied patients, a pleasant setting, and additional working hours. The weaknesses are: poor spatial organization, personnel unmotivated to refresh their knowledge, lack of specifically trained personnel, inadequate sanitary facilities, services not covered by the Insurance Fund, a long average hospital stay, and the low economic status of patients. The opportunities are: legislative regulations, a paediatric traumatology service established at the City level, the good regional position of the Institute, and extension of referral areas. The threats are: the absence of Department autonomy in the personnel policy of the Institute, competition within the Institute, impossibility to increase the Department

  8. Development of a gas-liquid chromatographic method for the analysis of fatty acid tryptamides in cocoa products.

    Science.gov (United States)

    Hug, Bernadette; Golay, Pierre-Alain; Giuffrida, Francesca; Dionisi, Fabiola; Destaillats, Frédéric

    2006-05-03

    The determination of the occurrence and level of cocoa shells in cocoa products and chocolate is an important analytical issue. The recent European Union directive on cocoa and chocolate products (2000/36/EC) has not retained the former limit of a maximum amount of 5% of cocoa shells in cocoa nibs (based on fat-free dry matter), previously authorized for the elaboration of cocoa products such as cocoa mass. In the present study, we report a reliable gas-liquid chromatography procedure suitable for the determination of the occurrence of cocoa shells in cocoa products by detection of fatty acid tryptamides (FATs). The precision of the method was evaluated by analyzing nine different samples (cocoa liquors with different ranges of shells) six times (replicate repeatability). The variations of the robust coefficient of variation of the repeatability demonstrated that FAT(C22), FAT(C24), and total FATs are good markers for the detection of shells in cocoa products. The trueness of the method was evaluated by determining the FAT content in two spiked matrices (cocoa liquors and cocoa shells) at different levels (from 1 to 50 mg/100 g). A good relation was found between the results obtained and the spiking (recovery varied between 90 and 130%), and the linearity range was established between 1 and 50 mg/100 g in cocoa products. For total FAT contents of cocoa liquor containing 5% shells, the measurement uncertainty allows us to conclude that FAT is equal to 4.01 +/- 0.8 mg/100 g. This validated method is perfectly suitable to determine shell contents in cocoa products using FAT(C22), FAT(C24), and total FATs as markers. The results also confirmed that cocoa shells contain FAT(C24) and FAT(C22) in a constant ratio of nearly 2:1.

  9. Development of a low-cost method to estimate the seismic signature of a geothermal field from ambient noise analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Tibuleac, Ileana [Univ. of Nevada, Reno, NV (United States)

    2016-06-30

    A new, cost-effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. The material included in this report demonstrates that, with the advantage of initial S-velocity models estimated from ambient-noise surface waves, the seismic reflection survey, although of lower resolution, reproduces the results of the active survey when the ambient seismic noise is not contaminated by strong cultural noise. Ambient-noise resolution is lower at depth (below 1000 m) compared with the active survey. In general, the results are promising and useful information can be recovered from ambient seismic noise, including dipping features and fault locations.

  10. Development and validation of quantitative methods for the analysis of clinical biomarkers of iron metabolism applying stable isotopes

    OpenAIRE

    Konz, Tobías

    2016-01-01

    The continuous discovery of new biomarkers is reflected not only in the scientific literature but also in its application in current clinical practice. The urgent need to determine the values of appropriate clinical parameters (biomarkers) in biological fluids is driving the development of new analytical methodologies capable of providing quantitative, precise, accurate and validated information on the concentration of those biomarkers, both in control individuals as well as in patients affec...

  11. Development of genetic diagnosing method for diabetes and cholecystitis based on gene analysis of CCK-A receptor

    Energy Technology Data Exchange (ETDEWEB)

    Kono, Akira [National Kyushu Cancer Center, Fukuoka (Japan)

    2000-02-01

    Based on the gene analysis of the cholecystokinin type A receptor (CCKAR) from normal mouse and its sequence analysis in the previous year, a CCKAR knock-out gene which allows mRNA expression of the β-galactosidase gene instead of the CCKAR gene was constructed. Since abnormalities in the CCKAR gene are thought to be a causal factor of diabetes and cholecystitis, a knock-out mouse that expressed LacZ but not CCKAR was constructed to investigate the correlation between the clinical features of diabetes and cholecystitis and CCKAR gene abnormalities. F2 mice carrying mutations in the CCKAR gene were born according to Mendel's law. The expression of the CCKAR gene was investigated in detail based on the expression of the LacZ gene in various tissues of homozygous (-/-) and heterozygous (-/+) knockout mice. A comparative study of blood sugar level, blood insulin level, the formation of biliary calculi, etc. is underway with wild-type, heterozygous and homozygous knockout mice. (M.N.)

  12. Analysis of urban metabolic processes based on input-output method: model development and a case study for Beijing

    Science.gov (United States)

    Zhang, Yan; Liu, Hong; Chen, Bin; Zheng, Hongmei; Li, Yating

    2014-06-01

    Discovering ways in which to increase the sustainability of the metabolic processes involved in urbanization has become an urgent task for urban design and management in China. As cities are analogous to living organisms, the disorders of their metabolic processes can be regarded as the cause of "urban disease". Therefore, identification of these causes through metabolic process analysis and ecological element distribution through the urban ecosystem's compartments will be helpful. By using Beijing as an example, we have compiled monetary input-output tables from 1997, 2000, 2002, 2005, and 2007 and calculated the intensities of the embodied ecological elements to compile the corresponding implied physical input-output tables. We then divided Beijing's economy into 32 compartments and analyzed the direct and indirect ecological intensities embodied in the flows of ecological elements through urban metabolic processes. Based on the combination of input-output tables and ecological network analysis, the description of multiple ecological elements transferred among Beijing's industrial compartments and their distribution has been refined. This hybrid approach can provide a more scientific basis for management of urban resource flows. In addition, the data obtained from distribution characteristics of ecological elements may provide a basic data platform for exploring the metabolic mechanism of Beijing.
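
    The embodied-intensity calculation underlying this kind of hybrid input-output/ecological network analysis is conventionally written as ε = d(I − A)⁻¹, with A the technical-coefficient matrix and d the direct ecological input per unit of sectoral output. The sketch below shows only that standard formulation; the three-sector numbers are placeholders, and the authors' 32-compartment Beijing model is not reproduced.

    # Standard embodied-intensity calculation from input-output analysis,
    # epsilon = d (I - A)^-1; placeholder 3-sector data, not Beijing's tables.
    import numpy as np

    Z = np.array([[10.0,  5.0,  2.0],      # inter-sector monetary flows
                  [ 4.0, 12.0,  6.0],
                  [ 3.0,  2.0,  8.0]])
    x = np.array([50.0, 60.0, 40.0])       # total sector outputs
    d = np.array([0.8, 0.3, 1.2])          # direct ecological input per unit output (e.g. water)

    A = Z / x                              # technical coefficients A[i, j] = Z[i, j] / x[j]
    L = np.linalg.inv(np.eye(3) - A)       # Leontief inverse
    epsilon = d @ L                        # total (direct + indirect) embodied intensities

    print(epsilon)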

  13. A Case Study Analysis of Clt Methods to Develop Grammar Competency for Academic Writing Purposes at Tertiary Level

    Directory of Open Access Journals (Sweden)

    Almodad Biduk Asmani

    2013-10-01

    Full Text Available The purpose of the research project is to find out how effectively grammar teaching and learning using the Principled CLT method can improve the ability of freshman Binus University students to understand and use grammar knowledge for academic writing purposes. The research project is expected to result in a computer-animated format which can be used as one of the main tools in teaching and learning grammar at tertiary level. The research project applies a descriptive quantitative approach and thus uses numeric data. It involves two subject groups, experimental and control. The two groups are pre-tested to find out their level of grammar competency through their academic writing work. The experimental group receives the treatment of grammar learning using the Principled CLT approach, while the control group receives the standard CLT approach. The two groups then take a post-test, and the results are compared. The statistics show that there is no significant difference between the two methods' results, and as a result, each method has its own strengths and weaknesses. If one is to be implemented, it must be linked to the specific goals and purposes that each entails.

  14. Analysis of water-soluble polysaccharides in an edible medicinal plant Epimedium: method development, validation, and application.

    Science.gov (United States)

    Zhang, Hua-Feng; Niu, Li-Li; Yang, Xiao-Hua; Li, Lu

    2014-01-01

    Water-soluble polysaccharides are important constituents with evident health benefits in Epimedium. The aim of this study was to establish a specific, accurate, reproducible, and sensitive phenol-sulfuric acid method for the quantitative assay of Epimedium polysaccharides and to determine polysaccharides in Epimedium samples from Chinese markets. Galactose was adopted as the standard monosaccharide, and 486 nm was chosen as the detection wavelength. The optimal conditions for the color reaction were obtained using single-factor experiments and an orthogonal test: temperature, 20°C; amount of 5% phenol, 0.3 mL; amount of concentrated sulfuric acid, 3.5 mL; incubation time, 20 min; and addition sequence, phenol-sample-sulfuric acid. The colored sample solution after the chromogenic reaction exhibited high stability within 2 h. The calibration curve was linear within the range 5.00-60.00 μg/mL, and the correlation coefficient of the regression equation was 0.999. The LOD and LOQ were 1.65 and 5.00 μg/mL, respectively. Recovery, intraday precision, interday precision, and accuracy were 97.43 to 103.80%, 0.73 to 3.48%, 1.21 to 2.75%, and 97.74 to 101.62%, respectively. Polysaccharides in 26 samples of Epimedium collected from different provinces of China were quantified by the proposed colorimetric method, and a large variation in polysaccharide content was observed among these samples.
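
    A minimal sketch of the calibration arithmetic behind the assay described above: fit absorbance at 486 nm against galactose standards over the reported 5.00-60.00 μg/mL range and back-calculate a sample concentration. The absorbance readings and the sample value are invented placeholders.

    # Linear calibration for a phenol-sulfuric acid assay; placeholder readings.
    import numpy as np

    conc = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 60.0])         # ug/mL standards
    absorbance = np.array([0.06, 0.12, 0.24, 0.35, 0.47, 0.70])  # placeholder A(486 nm)

    slope, intercept = np.polyfit(conc, absorbance, 1)
    r = np.corrcoef(conc, absorbance)[0, 1]

    sample_abs = 0.41                                            # placeholder sample reading
    sample_conc = (sample_abs - intercept) / slope               # ug/mL (multiply by any dilution factor)

    print(f"A = {slope:.4f}*C + {intercept:.4f}, r = {r:.4f}, sample = {sample_conc:.1f} ug/mL")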

  15. Validation of hip joint center localization methods during gait analysis using 3D EOS imaging in typically developing and cerebral palsy children.

    Science.gov (United States)

    Assi, Ayman; Sauret, Christophe; Massaad, Abir; Bakouny, Ziad; Pillet, Hélène; Skalli, Wafa; Ghanem, Ismat

    2016-07-01

    Localization of the hip joint center (HJC) is essential in the computation of gait data. EOS low-dose biplanar X-rays have been shown to be a good reference for evaluating various methods of HJC localization in adults. The aim is to evaluate predictive and functional techniques for HJC localization in typically developing (TD) and cerebral palsy (CP) children, using EOS as an image-based reference. Eleven TD and 17 CP children underwent 3D gait analysis. Six HJC localization methods were evaluated bilaterally in each group: 3 predictive (Plug in Gait, Bell and Harrington) and 3 functional methods based on the star-arc technique (symmetrical center of rotation estimate, center transformation technique and geometrical sphere fitting). All children then underwent EOS low-dose biplanar radiographs. The pelvis, lower limbs and their corresponding external markers were reconstructed in 3D. The center of the femoral head was considered as the reference (HJC_EOS). Euclidean distances between the HJCs estimated by each of the 6 methods and HJC_EOS were calculated; these distances were shown to be lower for predictive than for functional methods (p < 0.0001). Contrary to findings in adults, functional methods were shown to be less accurate than predictive methods in TD and CP children, which could be mainly due to the shorter thigh segment in children. The Harrington method was shown to be the most accurate in the prediction of the HJC (mean error ≈ 18 mm, SD = 9 mm) and quasi-equivalent to the Bell method. The bias for each method was quantified, allowing its correction for an improved HJC estimation. Copyright © 2016 Elsevier B.V. All rights reserved.
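
    The accuracy figure reported above is simply the Euclidean distance between each method's HJC estimate and the EOS-based reference, summarised as a mean and SD over hips. A small sketch with invented placeholder coordinates (in mm):

    # Euclidean error of an HJC estimate against the EOS reference; placeholder data.
    import numpy as np

    hjc_eos = np.array([[  5.0, -60.0, 880.0],
                        [178.0, -58.0, 879.0]])   # reference, one row per hip (mm)
    hjc_est = np.array([[ 12.0, -48.0, 885.0],
                        [170.0, -45.0, 872.0]])   # e.g. a predictive method's output (mm)

    errors = np.linalg.norm(hjc_est - hjc_eos, axis=1)
    print(f"mean error = {errors.mean():.1f} mm, SD = {errors.std(ddof=1):.1f} mm")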

  16. Development of Chiral LC-MS Methods for Small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Desai, Meera Jay [Iowa State Univ., Ames, IA (United States)

    2004-01-01

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods for the determination of enantiomeric composition and for the study of pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reverse phase and polar organic modes for chiral LC-API/MS method development. Reverse phase methods containing high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued with chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane, a fluorocarbon with low flammability and no flashpoint, was used as a substitute solvent for hexane/heptane mobile phases for LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  17. DNA Lesions in Medaka (O. latipes): Development of a Micro-Method for Tissue Analysis Using Gas Chromatography-Mass Spectrometry

    Science.gov (United States)

    1993-07-29

    AD-A275 291. Grant No. DAMD17-92-J-2006. Title: DNA Lesions in Medaka (O. latipes): Development of a Micro-Method for Tissue Analysis Using Gas Chromatography-Mass Spectrometry.

  18. Development of a volumetric analysis method to determine uranium in the loaded phosphoric acid and the loaded organic phase (DEHPA/TOPO)

    International Nuclear Information System (INIS)

    Shlewit, H.; Koudsi, Y.

    2003-01-01

    A rapid and reliable volumetric analysis method has been developed for the on-line determination of uranium, in both the aqueous and organic phases, at a uranium extraction unit treating wet-process phosphoric acid. The procedure enables up to 300 mg of uranium to be determined in the presence of nitric acid, in a sample volume of up to at least 10 ml. The volume of the sample, the amounts of reagents added, the temperature of the reagents and the standing time of the various stages were investigated to ensure that the conditions selected for the final procedure were reasonably non-critical.

  19. Remote sensing analysis of depositional landforms in alluvial settings: Method development and application to the Taquari megafan, Pantanal (Brazil)

    Science.gov (United States)

    Zani, Hiran; Assine, Mario Luis; McGlue, Michael Matthew

    2012-08-01

    Traditional Shuttle Radar Topography Mission (SRTM) topographic datasets hold limited value in the geomorphic analysis of low-relief terrains. To address this shortcoming, this paper presents a series of techniques designed to enhance digital elevation models (DEMs) of environments dominated by low-amplitude landforms, such as a fluvial megafan system. These techniques were validated through the study of a wide depositional tract composed of several megafans located within the Brazilian Pantanal. The Taquari megafan is the most remarkable of these features, covering an area of approximately 49,000 km2. To enhance the SRTM-DEM, the megafan global topography was calculated and found to be accurately represented by a second order polynomial. Simple subtraction of the global topography from altitude produced a new DEM product, which greatly enhanced low amplitude landforms within the Taquari megafan. A field campaign and optical satellite images were used to ground-truth features on the enhanced DEM, which consisted of both depositional (constructional) and erosional features. The results demonstrate that depositional lobes are the dominant landforms on the megafan. A model linking baselevel change, avulsion, clastic sedimentation, and erosion is proposed to explain the microtopographic features on the Taquari megafan surface. The study confirms the potential promise of enhanced DEMs for geomorphological research in alluvial settings.
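
    An illustrative version of the detrending step described above: fit a second-order polynomial surface z(x, y) to the elevations by least squares and subtract it to expose low-amplitude landforms. This is a generic sketch with a synthetic grid standing in for the SRTM data, not the authors' processing chain.

    # Fit and remove a second-order polynomial "global topography" from a DEM.
    import numpy as np

    ny, nx = 200, 300
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    dem = (150.0 - 0.05 * x + 0.02 * y + 0.0001 * x * y
           + np.random.default_rng(1).normal(0, 0.5, (ny, nx)))   # synthetic stand-in grid

    # Design matrix for z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    X = np.column_stack([np.ones(dem.size), x.ravel(), y.ravel(),
                         x.ravel() ** 2, (x * y).ravel(), y.ravel() ** 2])
    coeffs, *_ = np.linalg.lstsq(X, dem.ravel(), rcond=None)

    global_topography = (X @ coeffs).reshape(dem.shape)
    residual_dem = dem - global_topography   # enhanced DEM highlighting lobes, channels, scarps
    print(residual_dem.std())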

  20. Prediction of fat-free mass by bioelectrical impedance analysis in older adults from developing countries: a cross-validation study using the deuterium dilution method

    International Nuclear Information System (INIS)

    Mateo, H. Aleman; Romero, J. Esparza; Valencia, M.E.

    2010-01-01

    Objective: Several limitations of published bioelectrical impedance analysis (BIA) equations have been reported. The aims were to develop, in a multiethnic elderly population, a new prediction equation and to cross-validate it along with some published BIA equations for estimating fat-free mass, using deuterium oxide dilution as the reference method. Design and setting: Cross-sectional study of elderly people from five developing countries. Methods: Total body water (TBW) measured by deuterium dilution was used to determine fat-free mass (FFM) in 383 subjects. Anthropometric and BIA variables were also measured. Only 377 subjects were included in the analysis; they were randomly divided into development and cross-validation groups after stratification by gender. Stepwise model selection was used to generate the model, and Bland-Altman analysis was used to test agreement. Results: FFM = 2.95 - 3.89 (Gender) + 0.514 (Ht²/Z) + 0.090 (Waist) + 0.156 (Body weight). The model fit parameters R², total F-ratio, and SEE were 0.88, 314.3, and 3.3, respectively. None of the published BIA equations met the criteria for agreement. The new BIA equation underestimated FFM by just 0.3 kg in the cross-validation sample. The means of FFM by TBW and by the new BIA equation were not significantly different; 95% of the differences lay between the limits of agreement of -6.3 to 6.9 kg of FFM. There was no significant association between the means of the differences and their averages (r = 0.008 and p = 0.2). Conclusions: This new BIA equation offers a valid option compared with some of the currently published BIA equations to estimate FFM in elderly subjects from five developing countries. (Authors)
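
    The published equation can be applied directly, as in the sketch below. The measurement units (height in cm, impedance Z in ohm, waist in cm, weight in kg) and the 0/1 coding of Gender are assumptions made for illustration; the original paper should be consulted before any actual use.

    # Direct evaluation of the reported prediction equation
    #   FFM = 2.95 - 3.89*Gender + 0.514*(Ht^2/Z) + 0.090*Waist + 0.156*Weight
    # Units and the 0/1 gender coding are assumptions, not taken from the record.
    def fat_free_mass(gender: int, height_cm: float, impedance_ohm: float,
                      waist_cm: float, weight_kg: float) -> float:
        """Fat-free mass (kg) from the cross-validated BIA equation."""
        return (2.95 - 3.89 * gender
                + 0.514 * (height_cm ** 2 / impedance_ohm)
                + 0.090 * waist_cm
                + 0.156 * weight_kg)

    # Example with placeholder measurements
    print(round(fat_free_mass(gender=0, height_cm=165, impedance_ohm=520,
                              waist_cm=92, weight_kg=70), 1))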

  1. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    Mattsson, Soeren; Thomas, Brian J

    2006-01-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  2. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  3. Evaluating public involvement in research design and grant development: Using a qualitative document analysis method to analyse an award scheme for researchers.

    Science.gov (United States)

    Baxter, Susan; Muir, Delia; Brereton, Louise; Allmark, Christine; Barber, Rosemary; Harris, Lydia; Hodges, Brian; Khan, Samaira; Baird, Wendy

    2016-01-01

    money was used, including a description of the aims and outcomes of the public involvement activities. The purpose of this study was to analyse the content of these reports. We aimed to find out what researchers' views and experiences of public involvement activities were, and what lessons might be learned. Methods: We used an innovative method of data analysis, drawing on group participatory approaches, qualitative content analysis, and Framework Analysis to sort and label the content of the reports. We developed a framework of categories and sub-categories (or themes and sub-themes) from this process. Results: Twenty-five documents were analysed. Four main themes were identified in the data: the added value of public involvement; planning and designing involvement; the role of public members; and valuing public member contributions. Within these themes, sub-themes related to the timing of involvement (prior to the research study/intended during the research study), and also to specific benefits of public involvement such as: validating ideas; ensuring appropriate outcomes; ensuring the acceptability of data collection methods/tools; and advice regarding research processes. Other sub-themes related to: finding and approaching public members; timing of events; training/support; the format of sessions; setting up public involvement panels; use of public contributors in the analysis and interpretation of data; and using public members to assist with dissemination and translation into practice. Conclusions: The analysis of reports submitted by researchers following involvement events provides evidence of the value of public involvement during the development of applications for research funding, and details a method for involving members of the public in data analysis which could be of value to other researchers. The findings of the analysis indicate recognition amongst researchers of the variety of potential roles for public members in research, and also an acknowledgement of how

  4. Dynamic management of sustainable development methods for large technical systems

    CERN Document Server

    Krishans, Zigurds; Merkuryev, Yuri; Oleinikova, Irina

    2014-01-01

    Dynamic Management of Sustainable Development presents a concise summary of the authors' research in dynamic methods analysis of technical systems development. The text illustrates mathematical methods, with a focus on practical realization and applications.

  5. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis.

    Science.gov (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet

    2016-02-01

    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over or under controlled eating behaviours in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for the quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is the extension of a previously proposed univariate method so that the 24 PAS currently considered allergens can be determined. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method performance. PARAFAC2 was shown to model the data adequately in the face of different instrumental and chemical issues, such as co-elution profiles, overlapping spectra, unknown interfering compounds, retention time shifts and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R² ≥ 0.94), as well as meaningful chromatographic and spectral profiles (r ≥ 0.97). Moreover, low prediction errors for external validation standards (below 15% in most cases), as well as acceptable quantification errors in real spiked samples (recoveries from 82 to 119%), confirmed the suitability of PARAFAC2 for the resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Development of a rapid multi-element method of analysis of antitussive syrups by inductively coupled plasma atomic emission spectrometry and direct sample introduction.

    Science.gov (United States)

    Zachariadis, G A; Kapsimali, D C

    2006-06-16

    A new rapid method was developed and optimized for the routine multi-element determination of trace metals in antitussive syrups, using direct introduction of diluted syrup into the nebulization system of an inductively coupled plasma atomic emission spectrometer (ICP-AES). Using a Scott-type double-pass spray chamber combined with a cross-flow nebulizer, the optimum ICP conditions, such as RF incident power, argon gas flow rate and nebulizer sample uptake flow rate, were determined. A critical objective of the study was to evaluate the matrix effect on signal intensity and, consequently, on the sensitivity of the developed method; thus, the maximum syrup concentration that could be introduced into the argon plasma was estimated. The sensitivity variation was calculated relative to the corresponding sensitivity obtained from aqueous solutions for each analyte. The performance characteristics of the proposed method were evaluated for quantitative and semi-quantitative determination, and finally the method was applied to the analysis of various commercial antitussives.

  8. Sampling method development and optimization in view of human hand odor analysis by thermal desorption coupled with gas chromatography and mass spectrometry.

    Science.gov (United States)

    Cuzuel, Vincent; Portas, Eglantine; Cognon, Guillaume; Rivals, Isabelle; Heulard, François; Thiébaut, Didier; Vial, Jérôme

    2017-08-01

    Forensic profiling of human odor is challenging and would be useful to support information provided by dogs in courts of justice. Analyses of volatile compounds constitutive of human odor are commonly performed with gas chromatography coupled with mass spectrometry. All developed methods and sampling prototypes have to be easy to use in the field by crime scene investigators. This paper will focus on techniques for human hand odor sampling prior to analysis by a thermodesorption device coupled with gas chromatography and mass spectrometry. Thermodesorption and gas chromatography methods were developed using a sorbent phase spiked with a mixture of 80 compounds representative of human hand odor. Then, the crucial sampling step was performed indirectly with a homemade device based on air suction and trapping on a sorbent. This indirect sampling device was evaluated with the same synthetic mixture for optimization. An innovative polymer sorbent called Sorb-Star ® was compared to classic Tenax TA ® packed tubes. Sorb-Star ® provided similar recovery to Tenax TA ® packed tubes and a smaller pooled coefficient of variation (6 vs 13%). Thus, it appeared to be fully suited to the indirect sampling of human hand odor. The developed methods were successfully applied to real samples, the ultimate aim being the comparison of a suspect's sample to a sample collected from a crime scene.

  9. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    Jaklevic, J.M.

    1976-01-01

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  10. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-05

    A platform analytical quality by design (AQbD) approach for method development is presented in this paper. This approach is not limited to developing individual methods that follow the same logical AQbD process; it is also exploited across a range of applications in method development with commonality in equipment and procedures. As demonstrated by the development of 3 methods, the systematic strategy offers a thorough understanding of each method's scientific strength. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method for the same protein. In addition, the platform AQbD strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes, while in the commercial phase high-quality data ensure timely data release, reduced regulatory risk, and lower lab operational costs. Moreover, the large, reliable database and knowledge gained during AQbD method development provide strong justification during regulatory filing for the selection of important parameters or for parameter changes needed for method validation, and help to justify the removal of unnecessary tests used for product specifications. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Development, validation and comparison of two stability-indicating RP-LC methods using charged aerosol and UV detectors for analysis of lisdexamfetamine dimesylate in capsules

    Directory of Open Access Journals (Sweden)

    Graciela Carlos

    2016-11-01

    Full Text Available Two new stability-indicating liquid chromatographic methods using two detectors, an ultraviolet (UV) detector and a charged aerosol detector (CAD) connected simultaneously in series, were validated for the assessment of lisdexamfetamine dimesylate (LDX) in capsules. The method was optimized and the influence of individual parameters on UV and CAD response and sensitivity was studied. Chromatography was performed on a Zorbax CN column (250 mm × 4.6 mm, 5 μm) in isocratic elution mode, using acetonitrile and 20 mM ammonium formate at pH 4.0 (50:50, v/v) as the mobile phase and UV detection at 207 nm. The developed method was validated according to ICH guidelines, and the parameters specificity, limit of detection, limit of quantitation, linearity, accuracy, precision and robustness were evaluated. Although the CAD is regarded as a non-linear detector over a wide dynamic range, the method was linear over the concentration range of 70-130 μg mL⁻¹ with both detectors. The method was precise and accurate. The robustness study was performed using a Plackett-Burman design, delivering results within the acceptable range. Neither the excipients nor the degradation products interfered in the method, as shown by the specificity studies and under stress conditions. The results of the LC-UV and LC-CAD methods were statistically compared by ANOVA and showed no significant difference (p > 0.05). Both proposed methods can be considered interchangeable and stability-indicating, and can be applied as an appropriate quality control tool for the routine analysis of LDX in capsules.
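
    The statistical comparison reported above (one-way ANOVA on the LC-UV and LC-CAD results, with p > 0.05 indicating interchangeable methods) can be sketched as follows; the assay values are invented placeholders, and scipy is used as a stand-in for whatever software the authors applied.

    # One-way ANOVA comparing two detectors' assay results; placeholder data.
    import numpy as np
    from scipy.stats import f_oneway

    uv_assay = np.array([99.1, 100.4, 98.7, 101.2, 99.8, 100.1])   # % of label claim
    cad_assay = np.array([99.6, 100.9, 99.0, 100.5, 99.3, 100.7])  # % of label claim

    f_stat, p_value = f_oneway(uv_assay, cad_assay)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}",
          "-> no significant difference" if p_value > 0.05 else "-> significant difference")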

  12. Microlocal methods in the analysis of the boundary element method

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1993-01-01

    The application of the boundary element method in numerical analysis is based upon the use of boundary integral operators stemming from multiple layer potentials. The regularity properties of these operators are vital in the development of boundary integral equations and error estimates. We show...

  13. Method development for speciation analysis of nanoparticle and ionic forms of gold in biological samples by high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry

    Science.gov (United States)

    Malejko, Julita; Świerżewska, Natalia; Bajguz, Andrzej; Godlewska-Żyłkiewicz, Beata

    2018-04-01

    A new method based on coupling high performance liquid chromatography (HPLC) to inductively coupled plasma mass spectrometry (ICP MS) has been developed for the speciation analysis of gold nanoparticles (AuNPs) and dissolved gold species (Au(III)) in biological samples. The column type and the composition and flow rate of the mobile phase were carefully investigated in order to optimize the separation conditions. The usefulness of two polymeric reversed-phase columns (PLRP-S with 100 nm and 400 nm pore size) for separating the gold species was investigated for the first time. Under the optimal conditions (PLRP-S400 column, 10 mmol L⁻¹ SDS and 5% methanol as the mobile phase, 0.5 mL min⁻¹ flow rate), detection limits of 2.2 ng L⁻¹ for Au(III), 2.8 ng L⁻¹ for 10 nm AuNPs and 3.7 ng L⁻¹ for 40 nm AuNPs were achieved. The accuracy of the method was proved by analysis of reference material RM 8011 (NIST), gold nanoparticles with a nominal diameter of 10 nm. The HPLC-ICP MS method has been successfully applied to the detection and size characterization of gold species in lysates of the green alga Acutodesmus obliquus, a typical representative of the phytoplankton flora, incubated with 10 nm AuNPs or Au(III).

  14. Development of plant dynamic analysis code for integrated self-pressurized water reactor (ISPDYN), and comparative study of pressure control methods

    International Nuclear Information System (INIS)

    Kusunoki, Tsuyoshi; Yokomura, Takeyoshi; Nabeshima, Kunihiko; Shimazaki, Junya; Shinohara, Yoshikuni.

    1988-01-01

    This report describes the development of a plant dynamic analysis code (ISPDYN) for an integrated self-pressurized water reactor, and a comparative study of pressure control methods performed with this code. ISPDYN was developed for the integrated self-pressurized water reactor, one of the trial designs by JAERI. In the transient responses, the results calculated by ISPDYN are in good agreement with the DRUCK calculations. In addition, this report presents some sensitivity studies for selected cases. The computing time of this code is very short, about one fifth of real time. The comparative study of the self-pressurized system with a forced-pressurized system using this code, for rapid load decrease and increase cases, has provided useful information. (author)

  15. Development of the Method of Bacterial Leaching of Metals out of Low-Grade Ores, Rocks, and Industrial Wastes Using Neutron Activation Analysis

    CERN Document Server

    Tsertsvadze, L A; Petriashvili, Sh G; Chutkerashvili, D G; Kirkesali, E I; Frontasyeva, M V; Pavlov, S S; Gundorina, S F

    2001-01-01

    The results of preliminary investigations aimed at the development of an economical and easy-to-apply technique of bacterial leaching of rare and valuable metals out of low-grade ores, complex-composition ores, rocks, and industrial wastes in Georgia are discussed. The main groups of the microbiological community of the peat suspension used in the bacterial leaching experiments are investigated, and the activity of particular microorganisms in the leaching of probes with different mineral compositions is assessed. The element composition of the primary and processed samples was investigated by the epithermal neutron activation analysis method, and the enrichment/subtraction level is estimated for various elements. The efficiency of the developed technique to purify wastes, extract some scarce metals, and enrich ores or rocks in some elements, e.g. Au, U, Th, Cs, Sr, Rb, Sc, Zr, Hf, Ta, Gd, Er, Lu, Ce, etc., is demonstrated.

  16. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum.

    Science.gov (United States)

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie

    2016-03-11

    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, such as the aerospace, gas and oil, nuclear and electronics industries, but their separation is very difficult due to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media with alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. This method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the nature of the BGE counter-ion modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0°C and in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized
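
    The record above mentions fitting responses over an experimental design and predicting the optimum separation conditions. A generic sketch of that kind of step is given below: a quadratic response-surface model of resolution as a function of ionic strength and temperature is fitted by least squares and then evaluated on a grid to locate a predicted optimum. The factor levels, responses and model form are assumptions made for illustration, not values from the paper.

        # Hedged sketch: quadratic response-surface fit and grid search for the
        # predicted optimum. All design points and responses are hypothetical.
        import numpy as np

        I  = np.array([20.0, 20.0, 60.0, 60.0, 40.0, 40.0, 40.0])   # ionic strength, mM (assumed)
        T  = np.array([20.0, 35.0, 20.0, 35.0, 27.5, 27.5, 27.5])   # temperature, deg C (assumed)
        Rs = np.array([1.2, 1.6, 1.4, 1.1, 1.8, 1.7, 1.9])          # resolution (assumed)

        # design matrix for a full quadratic model in the two factors
        X = np.column_stack([np.ones_like(I), I, T, I*T, I**2, T**2])
        coef, *_ = np.linalg.lstsq(X, Rs, rcond=None)

        ii, tt = np.meshgrid(np.linspace(20, 60, 81), np.linspace(20, 35, 61))
        grid = np.column_stack([np.ones(ii.size), ii.ravel(), tt.ravel(),
                                (ii*tt).ravel(), (ii**2).ravel(), (tt**2).ravel()])
        pred = grid @ coef
        best = pred.argmax()
        print(f"predicted optimum: I = {ii.ravel()[best]:.1f} mM, T = {tt.ravel()[best]:.1f} C")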

  17. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments.

    Science.gov (United States)

    Magnusson, R; Nordlander, T; Östin, A

    2016-01-15

    Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All

  18. Simplified analysis of 11-hydroxy-delta-9-tetrahydrocannabinol and 11-carboxy-delta-9-tetrahydrocannabinol in human meconium: method development and validation.

    Science.gov (United States)

    Tynon, Marykathryn; Porto, Marcellino; Logan, Barry K

    2015-01-01

    We describe the development of a sensitive analytical method for the analysis of 11-hydroxy-delta-9-tetrahydrocannabinol (11-OH-THC) and 11-carboxy-delta-9-tetrahydrocannabinol (THCC) in meconium using a gas chromatography-mass spectrometry (GC/MS) platform. The method was validated according to protocols that included assessment of accuracy, precision, robustness, stability in meconium and in-process stability, interference, sensitivity and specificity. The method consists of a solid phase extraction with alkaline hydrolysis and derivatization of the analytes with N,O-bis(trimethylsilyl)trifluoroacetamide, followed by GC/MS analysis using selected ion monitoring. The method uses deuterated internal standards for both analytes. Calibration curves had r(2) values >0.998, and extraction efficiency was determined to be 84.7% for THCC and 78.6% for 11-OH-THC. The detection limit for both analytes was 5 ng/g. This confirmatory method was successfully applied to 183 meconium samples that had screened positive by enzyme-linked immunosorbent assay; 67.2% were confirmed positive for THCC and 2.2% for 11-OH-THC. The mean (SD) and median (range) THCC (n = 123) concentrations detected were 55.0 ng/g (±59.0) and 33.75 ng/g (5-265 ng/g), while the mean and median (range) 11-OH-THC (n = 4) concentrations were 8.25 ng/g (±4.71) and 6.5 ng/g (5-15 ng/g). © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: fukusaki@bio.eng.osaka-u.ac.jp [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-08-26

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R2 values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of the biological matrix as compared to purification using the Oasis HLB cartridge alone. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  20. Development of a rapid method for the quantitative analysis of four methoxypyrazines in white and red wine using multi-dimensional Gas Chromatography-Mass Spectrometry.

    Science.gov (United States)

    Botezatu, Andreea; Pickering, Gary J; Kotseridis, Yorgos

    2014-10-01

    Alkyl-methoxypyrazines (MPs) are important odour-active constituents of many grape cultivars and their wines. Recently, a new MP - 2,5-dimethyl-3-methoxypyrazine (DMMP) - has been reported as a possible constituent of wine. This study sought to develop a rapid and reliable method for quantifying DMMP, isopropyl methoxypyrazine (IPMP), secbutyl methoxypyrazine (SBMP) and isobutyl methoxypyrazine (IBMP) in wine. The proposed method is able to rapidly and accurately resolve all 4 MPs in a range of wine styles, with limits of detection between 1 and 2 ng L(-1) for IPMP, SBMP and IBMP and 5 ng L(-1) for DMMP. Analysis of a set of 11 commercial wines agrees with previously published values for IPMP, SBMP and IBMP, and shows for the first time that DMMP may be an important and somewhat common odorant in red wines. To our knowledge, this is the first analytical method developed for the quantification of DMMP in wine. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Development of a new extraction technique and HPLC method for the analysis of non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp).

    Science.gov (United States)

    Brighenti, Virginia; Pellati, Federica; Steinbach, Marleen; Maran, Davide; Benvenuti, Stefania

    2017-09-05

    The present work was aimed at the development and validation of a new, efficient and reliable technique for the analysis of the main non-psychoactive cannabinoids in fibre-type Cannabis sativa L. (hemp) inflorescences belonging to different varieties. This study was designed to identify samples with a high content of bioactive compounds, with a view to underscoring the importance of quality control in derived products as well. Different extraction methods, including dynamic maceration (DM), ultrasound-assisted extraction (UAE), microwave-assisted extraction (MAE) and supercritical-fluid extraction (SFE) were applied and compared in order to obtain a high yield of the target analytes from hemp. Dynamic maceration for 45min with ethanol (EtOH) at room temperature proved to be the most suitable technique for the extraction of cannabinoids in hemp samples. The analysis of the target analytes in hemp extracts was carried out by developing a new reversed-phase high-performance liquid chromatography (HPLC) method coupled with diode array (UV/DAD) and electrospray ionization-mass spectrometry (ESI-MS) detection, by using an ion trap mass analyser. An Ascentis Express C 18 column (150mm×3.0mm I.D., 2.7μm) was selected for the HPLC analysis, with a mobile phase composed of 0.1% formic acid in both water and acetonitrile, under gradient elution. The application of the fused-core technology allowed us to obtain a significant improvement of the HPLC performance compared with that of conventional particulate stationary phases, with a shorter analysis time and a remarkable reduction of solvent usage. The analytical method optimized in this study was fully validated to show compliance with international requirements. Furthermore, it was applied to the characterization of nine hemp samples and six hemp-based pharmaceutical products. As such, it was demonstrated to be a very useful tool for the analysis of cannabinoids in both the plant material and its derivatives for

  2. SUBSURFACE CONSTRUCTION AND DEVELOPMENT ANALYSIS

    International Nuclear Information System (INIS)

    N.E. Kramer

    1998-01-01

    The purpose of this analysis is to identify appropriate construction methods and develop a feasible approach for construction and development of the repository subsurface facilities. The objective of this analysis is to support development of the subsurface repository layout for License Application (LA) design. The scope of the analysis for construction and development of the subsurface Repository facilities covers: (1) Excavation methods, including application of knowledge gained from construction of the Exploratory Studies Facility (ESF). (2) Muck removal from excavation headings to the surface. This task will examine ways of preventing interference with other subsurface construction activities. (3) The logistics and equipment for the construction and development rail haulage systems. (4) Impact of ground support installation on excavation and other construction activities. (5) Examination of how drift mapping will be accomplished. (6) Men and materials handling. (7) Installation and removal of construction utilities and ventilation systems. (8) Equipping and finishing of the emplacement drift mains and access ramps to fulfill waste emplacement operational needs. (9) Emplacement drift and access mains and ramps commissioning prior to handover for emplacement operations. (10) Examination of ways to structure the contracts for construction of the repository. (11) Discussion of different construction schemes and how to minimize the schedule risks implicit in those schemes. (12) Surface facilities needed for subsurface construction activities

  3. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.

    2013-07-15

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth', to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks growing until the formation of a stable leak. The calculation of the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, a restructuring of PROST was performed, including the possibility to combine damage mechanisms during a calculation. Furthermore, several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that with the provided flexible probabilistic analysis method quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as
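
    For readers unfamiliar with how leak probabilities of this kind are typically estimated, the following minimal Monte Carlo sketch (not the PROST code itself) samples an assumed initial crack depth and growth rate, propagates the crack over an operating period, and counts the fraction of samples in which the crack penetrates the wall. The distributions, wall thickness and operating time are invented for illustration.

        # Minimal Monte Carlo leak-probability sketch under invented distributions.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        wall = 8.0                                                   # wall thickness, mm (assumed)
        a0 = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n)      # initial crack depth, mm (assumed)
        rate = rng.lognormal(mean=np.log(0.05), sigma=0.7, size=n)   # growth rate, mm per year (assumed)
        years = 40.0

        a_end = a0 + rate * years                 # simple linear growth model for the sketch
        p_leak = np.mean(a_end >= wall)           # fraction of samples penetrating the wall
        print(f"estimated leak probability over {years:.0f} years: {p_leak:.2e}")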

  4. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  5. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and non-statistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all users of financial statements. In order to be applied correctly, the method must be understood by all its users, and mainly by auditors. Otherwise, the risk of applying it incorrectly would lead to loss of reputation and credibility, litigation and even imprisonment. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows the strengths, weaknesses, opportunities and threats of a subject. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being largely used as an audit method and being a factor in a correct audit opinion, the sampling method's strengths, weaknesses, opportunities and threats must be understood by auditors.

  6. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    Science.gov (United States)

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal polarity microemulsion electrokinetic chromatography (MEEKC) as operative mode, which allowed a good selectivity to be achieved in a low analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH, on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in a MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. DS was identified by risk of failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets
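
    As a generic illustration of the risk-of-failure maps mentioned above, the sketch below evaluates, on a coarse grid of two process variables, the Monte Carlo probability that a critical quality attribute (here a resolution of at least 1.5) is met, and flags the region where that probability exceeds 90%. The response model, its noise level and the factor ranges are assumptions made for the example, not the fitted models of the paper.

        # Hedged sketch of a design-space / risk-of-failure map via Monte Carlo.
        import numpy as np

        rng = np.random.default_rng(0)

        def predicted_resolution(voltage_kV, temp_C):
            # hypothetical fitted model of resolution vs. voltage and temperature
            return 2.0 - 0.002 * (voltage_kV - 25) ** 2 - 0.004 * (temp_C - 30) ** 2

        voltages = np.linspace(15, 30, 16)
        temps = np.linspace(20, 40, 21)
        prob_ok = np.zeros((temps.size, voltages.size))
        for i, t in enumerate(temps):
            for j, v in enumerate(voltages):
                # propagate assumed model uncertainty and score the CQA
                sims = predicted_resolution(v, t) + rng.normal(0.0, 0.2, size=2000)
                prob_ok[i, j] = np.mean(sims >= 1.5)

        # design space: region where the probability of acceptability exceeds 90%
        design_space = prob_ok >= 0.90
        print(f"{design_space.sum()} of {design_space.size} grid points inside the design space")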

  7. LC-MS/MS based method development for the analysis of florfenicol and its application to estimate relative distribution in various tissues of broiler chicken.

    Science.gov (United States)

    Imran, Muhammad; Fazal-E-Habib; Tawab, Abdul; Rauf, Waqar; Rahman, Moazur; Khan, Qaiser Mehmood; Asi, Muhammad Rafique; Iqbal, Mazhar

    2017-09-15

    Florfenicol, a broad-spectrum bacteriostatic antibiotic belonging to the amphenicol class, is widely used in poultry and livestock for the treatment of various infections. The major metabolite of florfenicol in different animal species is florfenicol amine, which is exploited as the marker residue for the determination of florfenicol. Analysis of florfenicol merely by solvent extraction cannot determine the accurate amount of the drug present in incurred tissues (muscle, liver and kidney) of treated birds, as indicated by this study. Thus, methods based solely on solvent extraction may lead to false negative results. A reliable LC-MS/MS based confirmatory method for the analysis of florfenicol and its metabolites in chicken muscle was developed and validated according to the European Union Commission Decision 2002/657/EC. The method was based on acid hydrolysis to liberate non-extractable residues presumed to be covalently bound to tissues, and to convert all the florfenicol residues as well as its metabolites into florfenicol amine. The amine was subsequently recovered with ethyl acetate at pH 10.5, defatted and further cleaned up with dispersive solid phase extraction (dSPE). The LC separation was achieved on a reversed-phase C-18 column with isocratic elution using an acetonitrile/water mobile phase, and the analysis was performed on a linear ion trap mass spectrometer. The calibration curve was obtained over a concentration range of 25-600μg/kg for chicken muscle. The accuracy values ranged from 84 to 101.4%, and the precision values for within-day and between-day analyses ranged from 1.2 to 11.7%. Limit of detection (LOD), limit of quantification (LOQ), CCα and CCβ values were 0.98, 3.2, 113 and 126μg/kg, respectively. The developed method was highly robust and was further applied to estimate the relative distribution of solvent-extractable against solvent-non-extractable florfenicol drug residues in muscle, liver and kidney samples of broiler chicken after 5

  8. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.
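
    A small, generic example in the spirit of the overview above: comparing a bioimpedance-derived parameter between two groups, with a normality check deciding between a parametric and a non-parametric test. The data are simulated and the parameter name is only a placeholder, not a recommendation from the paper.

        # Generic two-group comparison sketch with simulated data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        group_a = rng.normal(50.0, 5.0, size=20)   # e.g. resistance at 50 kHz, ohms (simulated)
        group_b = rng.normal(55.0, 5.0, size=20)

        normal_a = stats.shapiro(group_a).pvalue > 0.05
        normal_b = stats.shapiro(group_b).pvalue > 0.05

        if normal_a and normal_b:
            stat, p = stats.ttest_ind(group_a, group_b)
            test = "independent t-test"
        else:
            stat, p = stats.mannwhitneyu(group_a, group_b)
            test = "Mann-Whitney U"
        print(f"{test}: statistic = {stat:.2f}, p = {p:.4f}")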

  9. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    Putra, Sugili

    1994-01-01

    An assessment of thorium analytical methods for mixed power fuel, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with lower accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, in general, analytical methods without molecular reactions are better than those involving molecular reactions (author). 19 refs., 1 tab

  10. Scientific methods for developing ultrastable structures

    International Nuclear Information System (INIS)

    Gamble, M.; Thompson, T.; Miller, W.

    1990-01-01

    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 μrad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs

  11. Survey of Task Analysis Methods

    Science.gov (United States)

    1978-02-14

    Taylor, for example, referred to task analysis in his work on scientific management (65). In the same time frame, the Gilbreths developed the first...

  12. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma.

    Science.gov (United States)

    Kepekci Tekkeli, Serife Evrim

    2013-01-01

    A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besilate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combinations. The combinations are found in various forms, especially in current pharmaceuticals as three-component combinations: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7 : 13 : 80, v/v/v) was used as the mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found to be 0.1-18.5 μg/mL, 0.4-25.6 μg/mL, 0.3-15.5 μg/mL, and 0.3-22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate, as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and of plasma from patients using drugs containing these substances.

  13. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    Full Text Available A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besilate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combinations. The combinations are found in various forms, especially in current pharmaceuticals as three-component combinations: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7 : 13 : 80, v/v/v) was used as the mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found to be 0.1–18.5 μg/mL, 0.4–25.6 μg/mL, 0.3–15.5 μg/mL, and 0.3–22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate, as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and of plasma from patients using drugs containing these substances.

  14. Method of photon spectral analysis

    Science.gov (United States)

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and gamma-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy gamma rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The gamma-ray portion of each spectrum is analyzed by a standard Ge gamma-ray analysis program. This method can be applied to any analysis involving x- and gamma-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the gamma-ray analysis and accommodated during the x-ray analysis.
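
    The linear least-squares fitting step mentioned above can be illustrated generically: model a measured spectral region as a linear combination of known component shapes plus a flat background and solve for the amplitudes. The peak positions, widths and count levels below are assumptions chosen only to make the example self-contained, not parameters of the patented method.

        # Generic linear least-squares spectral fitting sketch with synthetic data.
        import numpy as np

        energy = np.linspace(12.0, 22.0, 400)                    # keV
        def gauss(e, mu, sigma):
            return np.exp(-0.5 * ((e - mu) / sigma) ** 2)

        templates = np.column_stack([
            gauss(energy, 13.6, 0.25),      # first x-ray component (assumed shape)
            gauss(energy, 17.2, 0.28),      # second x-ray component (assumed shape)
            np.ones_like(energy),           # flat background
        ])

        true_amps = np.array([500.0, 300.0, 40.0])
        rng = np.random.default_rng(3)
        measured = templates @ true_amps + rng.normal(0, 5, energy.size)   # simulated spectrum

        fitted_amps, *_ = np.linalg.lstsq(templates, measured, rcond=None)
        print("fitted amplitudes:", np.round(fitted_amps, 1))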

  15. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede

    2000-01-01

    Full Text Available This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously, by virtue of cloth stretch and flexing. The sail shape determines the pressure distribution and, at the same time, the pressure distribution on the sail stretches and flexes the sail material, determining its shape. This characteristic non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. Firstly, pressure loads for a known sail shape are computed (aerodynamic analysis). Secondly, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained by using an iterative analysis process which involves both the aerodynamic and the structural analysis. When the solution converges, it is possible to make design modifications.
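
    The iterative aeroelastic procedure described above can be sketched schematically as a fixed-point loop that alternates an aerodynamic step and a structural step until the shape stops changing. The two "solvers" below are trivial stand-ins (not the paper's Bezier/finite element models), included only to show the structure of the iteration and the use of under-relaxation for stability.

        # Schematic aeroelastic coupling loop with placeholder solvers.
        import numpy as np

        def aerodynamic_analysis(shape):
            # placeholder: pressure load increases with local camber
            return 1.0 + 0.5 * shape

        def structural_analysis(load):
            # placeholder: deformed shape proportional to the applied load
            return 0.3 * load

        shape = np.zeros(50)                         # initial (flat) sail shape parameters
        for iteration in range(100):
            load = aerodynamic_analysis(shape)       # aerodynamic step for the current shape
            new_shape = structural_analysis(load)    # structural step for the obtained loads
            if np.max(np.abs(new_shape - shape)) < 1e-8:
                break
            shape = 0.5 * shape + 0.5 * new_shape    # under-relaxation for stability
        print(f"converged after {iteration + 1} iterations")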

  16. Development and validation of a robust high-performance liquid chromatographic method for the analysis of monacolins in red yeast rice.

    Science.gov (United States)

    Theunis, Mart; Naessens, Tania; Verhoeven, Veronique; Hermans, Nina; Apers, Sandra

    2017-11-01

    A robust analytical method, using reversed-phase high-performance liquid chromatography with diode array detection, was developed and validated for the quantification of monacolins in red yeast rice bulk products. The composition of the extraction solvent, the extraction time and the number of repeated extractions were evaluated with the aim of achieving complete extraction of the monacolins and minimal interconversion between the monacolins during analysis. Monacolin K (acid form), monacolin K (lactone form) and minor monacolin peaks were separated on a C18 column (250×4.6mm, 5µm) using acetonitrile/0.1% trifluoroacetic acid as the mobile phase. For the calibration curve of monacolin K (lactone form), a linear correlation in the range 6-119µg/mL was found. The precision of the method over time and concentration gave a relative standard deviation of less than 5%, which was deemed acceptable. The recovery of the method was 98.75%. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Development of a Method for the Analysis of Multiclass Antibiotic Residues in Milk Using QuEChERS and Liquid Chromatography-Tandem Mass Spectrometry.

    Science.gov (United States)

    Wang, Yuan-long; Liu, Zhen-min; Ren, Jing; Guo, Ben-heng

    2015-08-01

    A precise and simplified sample preparation method for the simultaneous quantification of antibiotics of the β-lactam, macrolide, tetracycline, sulfonamide, and quinolone classes in bovine milk was developed. A central composite design from response surface methodology was used to design and optimize the method for the determination of six different antibiotic residues in milk. The recovery of each antibiotic was studied using a quick, easy, cheap, effective, rugged, and safe (QuEChERS) method. The amounts of octadecylsilane (C18), primary secondary amine (PSA), and sodium acetate (Na acetate) were the main factors affecting the recovery of each antibiotic. After optimization, the maximum predicted recovery was 84.18% for erythromycin under the optimized conditions of 101.20 mg C18, 52.00 mg PSA, and 1.01 g Na acetate. The recoveries of the five other antibiotic residues ranged from 86.09% to 115.99%. The results suggest that the modified QuEChERS method can effectively be implemented in the analysis of antibiotic residues in milk.

  18. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS

    International Nuclear Information System (INIS)

    Vestner, Jochen; Fritsch, Stefanie; Rauhut, Doris

    2010-01-01

    The aim of this research work was focused on the replacement of the time-consuming soaking of cork stoppers which is mainly used as screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) of natural cork stoppers. Releasable TCA from whole cork stoppers was analysed with the application of a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that an MAE at 40 deg. C for 120 min with 90 min of cooling time are suitable conditions to avoid an over-extraction of TCA of low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow the measuring of almost the same amounts of releasable TCA as with the application of the soaking procedure in the relevant range (<25 ng L-1 releasable TCA from one cork) to evaluate the TCA level of cork stoppers. Stable isotope dilution assay (SIDA) was applied to optimise quantification of the released TCA with deuterium-labelled TCA (TCA-d5) using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows the measuring of releasable TCA from the whole cork stopper under improved conditions and in connection with a low use of solvent and a higher sample throughput.

  19. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS

    Energy Technology Data Exchange (ETDEWEB)

    Vestner, Jochen [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Hochschule RheinMain, Fachbereich Geisenheim, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Fritsch, Stefanie [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Rauhut, Doris, E-mail: doris.rauhut@fa-gm.de [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany)

    2010-02-15

    The aim of this research work was focused on the replacement of the time-consuming soaking of cork stoppers which is mainly used as screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) of natural cork stoppers. Releasable TCA from whole cork stoppers was analysed with the application of a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that an MAE at 40 deg. C for 120 min with 90 min of cooling time are suitable conditions to avoid an over-extraction of TCA of low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow the measuring of almost the same amounts of releasable TCA as with the application of the soaking procedure in the relevant range (<25 ng L-1 releasable TCA from one cork) to evaluate the TCA level of cork stoppers. Stable isotope dilution assay (SIDA) was applied to optimise quantification of the released TCA with deuterium-labelled TCA (TCA-d5) using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows the measuring of releasable TCA from the whole cork stopper under improved conditions and in connection with a low use of solvent and a higher sample throughput.

  20. Liquid chromatography-tandem mass spectrometry multiresidue method for the analysis of quaternary ammonium compounds in cheese and milk products: Development and validation using the total error approach.

    Science.gov (United States)

    Slimani, Kahina; Féret, Aurélie; Pirotais, Yvette; Maris, Pierre; Abjean, Jean-Pierre; Hurtaud-Pessel, Dominique

    2017-09-29

    Quaternary ammonium compounds (QACs) are both cationic surfactants and biocidal substances widely used as disinfectants in the food industry. A sensitive and reliable method for the analysis of benzalkonium chlorides (BACs) and dialkyldimethylammonium chlorides (DDACs) has been developed that enables the simultaneous quantitative determination of ten quaternary ammonium residues in dairy products below the provisional maximum residue level (MRL), set at 0.1 mg kg-1. To the best of our knowledge, this could be the first method applicable to milk and to the three major processed milk products selected, namely processed or hard pressed cheeses, and whole milk powder. The method comprises solvent extraction using a mixture of acetonitrile and ethyl acetate, without any further clean-up. Analyses were performed by liquid chromatography coupled with electrospray tandem mass spectrometry detection (LC-ESI-MS/MS) operating in positive mode. A C18 analytical column was used for chromatographic separation, with a mobile phase composed of acetonitrile and water, both containing 0.3% formic acid, and methanol in gradient mode. Five deuterated internal standards were added to obtain the most accurate quantification. Extraction recoveries were satisfactory and no matrix effects were observed. The method was validated using the total error approach in accordance with the NF V03-110 standard in order to characterize the trueness, repeatability, intermediate precision and analytical limits within the range of 5-150 μg kg-1 for all matrices. These performance criteria, calculated with the e.noval® 3.0 software, were satisfactory and in full accordance with the proposed provisional MRL and with the recommendations of the European Union SANTE/11945/2015 regulatory guidelines. The limit of detection (LOD) was low, and the method is suitable for quantifying quaternary ammonium compounds in foodstuffs from the dairy industry at residue levels; it could be used for biocide residue monitoring plans and to measure

  1. Developing the Business Modelling Method

    NARCIS (Netherlands)

    Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria; Shishkov, B; Shishkov, Boris

    2011-01-01

    Currently, business modelling is an art, instead of a science, as no scientific method for business modelling exists. This, and the lack of using business models altogether, causes many projects to end after the pilot stage, unable to fulfil their apparent promise. We propose a structured method to

  2. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flow method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  3. Development and Evaluation of a Spectral Analysis Method to Eliminate Organic Interference with Cavity Ring-Down Measurements of Water Isotope Ratios.

    Science.gov (United States)

    Lin, Z.; Kim-Hak, D.; Popp, B. N.; Wallsgrove, N.; Kagawa-Viviani, A.; Johnson, J.

    2017-12-01

    Cavity ring-down spectroscopy (CRDS) is a technology based on the spectral absorption of gas molecules of interest at specific spectral regions. The CRDS technique enables the analysis of hydrogen and oxygen stable isotope ratios of water by directly measuring individual isotopologue absorption peaks such as H16OH, H18OH, and D16OH. Early work demonstrated that the accuracy of isotope analysis by CRDS and other laser-based absorption techniques could be compromised by spectral interference from organic compounds, in particular methanol and ethanol, which can be prevalent in ecologically derived waters. Several methods have been developed by various research groups, including Picarro, to address the organic interference challenge. Here, we describe an organic fitter and a post-processing algorithm designed to improve the accuracy of the isotopic analysis of "organic contaminated" water, specifically for Picarro models L2130-i and L2140-i. To create the organic fitter, the absorption features of methanol around 7200 cm-1 were characterized and incorporated into the spectral analysis. Since residual interference remained after applying the organic fitter, a statistical model was also developed for post-processing correction. To evaluate the performance of the organic fitter and the post-processing correction, we conducted controlled experiments on the L2130-i for two water samples with different isotope ratios blended with varying amounts of methanol (0-0.5%) and ethanol (0-5%). When the original fitter was used for spectral analysis, the addition of 0.5% methanol changed the apparent isotopic composition of the water samples by +62‰ for δ18O values and +97‰ for δ2H values, and the addition of 5% ethanol changed the apparent isotopic composition by -0.5‰ for δ18O values and -3‰ for δ2H values. When the organic fitter was used for spectral analysis, the maximum methanol-induced errors were reduced to +4‰ for δ18O values and +5‰ for δ2H values.

  4. Development of an UPLC mass spectrometry method for measurement of myofibrillar protein synthesis: application to analysis of murine muscles during cancer cachexia.

    Science.gov (United States)

    Lima, Maria; Sato, Shuichi; Enos, Reilly T; Baynes, John W; Carson, James A

    2013-03-15

    Cachexia, characterized by loss of skeletal muscle mass, is a major contributory factor to patient morbidity and mortality during cancer. However, there are no reports on the rate of myofibrillar protein synthesis (MPS) in skeletal muscles that vary in primary metabolic phenotype during cachexia, in large part because of the small size of mouse muscles and the regional differences within larger muscles. Here, we describe a sensitive method for the measurement of MPS and its application to the analysis of MPS in specific muscles of mice with (Apc(Min/+)) and without (C57BL/6) cancer cachexia. Mice were injected with a loading dose of deuterated phenylalanine (D5F), and myofibrillar proteins were extracted from skeletal muscles at 30 min. The relative concentrations of D5F and naturally occurring phenylalanine (F) in the myofibrillar proteins and in the amino acid pool were quantified by ultra-performance liquid chromatography (UPLC) mass spectrometry (MS). The rate of MPS was determined from the D5F-to-F ratio in the protein fraction compared with that in the amino acid pool. The rate of MPS, measured in 2-5 mg of muscle protein, was reduced by up to 65% with cachexia in the soleus, plantaris, diaphragm, and the oxidative and glycolytic regions of the gastrocnemius. The rate of MPS was significantly higher in the oxidative than in the glycolytic gastrocnemius muscle. A sufficiently sensitive UPLC MS method requiring a very small amount of muscle has been developed to measure the rate of MPS in various mouse muscles. This method should be useful for studies in other animal models for quantifying the effects of cancer and anti-cancer therapies on protein synthesis in cachexia, and particularly for the analysis of sequential muscle biopsies in a wide range of animal and human studies.
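
    A back-of-the-envelope sketch of the calculation implied above: a fractional synthesis rate obtained from the labelled-to-unlabelled phenylalanine ratio in the protein fraction relative to that in the free amino acid (precursor) pool over the labelling period. The ratios and the simple precursor-product formula used here are illustrative assumptions, not numbers or equations taken from the paper.

        # Illustrative precursor-product calculation with invented ratios.
        protein_d5f_to_f = 0.00020        # D5F/F ratio measured in myofibrillar protein (assumed)
        precursor_d5f_to_f = 0.040        # D5F/F ratio in the free amino acid pool (assumed)
        labelling_time_h = 0.5            # 30 min between injection and tissue collection

        fsr_per_hour = (protein_d5f_to_f / precursor_d5f_to_f) / labelling_time_h
        print(f"fractional synthesis rate ~ {fsr_per_hour * 100:.2f} % per hour")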

  5. Development of a proxy for past surface UV-B irradiation : A thermally assisted hydrolysis and methylation py-GC/MS method for the analysis of pollen and spores

    NARCIS (Netherlands)

    Blokker, Peter; Yeloff, Dan; Boelen, Peter; Broekman, Rob A; Rozema, Jelte

    2005-01-01

    A method was developed for the analysis of the UV-absorbing sporopollenin monomers p-coumaric acid and ferulic acid in very low numbers of pollen. This enables the analysis of pollen or spores from cultured plants, from herbarium collections, and from sediment, soil, and peat cores. The method

  6. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  7. METHODS TO DEVELOP A TOROIDAL SURFACE

    Directory of Open Access Journals (Sweden)

    DANAILA Ligia

    2017-05-01

    Full Text Available The paper presents two practical methods for drawing the development of a surface that cannot be developed using the classical methods of Descriptive Geometry: the toroidal surface, frequently met in technical practice. The described methods are approximate ones; the development is obtained with the help of points. The accuracy of the methods is determined by the number of points used in the drawing. As with any other approximate method, the development may need to be adjusted on site when the part is actually manufactured.

  8. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicology

  9. Advanced development of the boundary element method for elastic and inelastic thermal stress analysis. Ph.D. Thesis, 1987 Final Report

    Science.gov (United States)

    Henry, Donald P., Jr.

    1991-01-01

    The focus of this dissertation is on advanced development of the boundary element method for elastic and inelastic thermal stress analysis. New formulations for the treatment of body forces and nonlinear effects are derived. These formulations, which are based on particular integral theory, eliminate the need for volume integrals or extra surface integrals to account for these effects. The formulations are presented for axisymmetric, two-, and three-dimensional analysis. Also in this dissertation, two-dimensional and axisymmetric formulations for elastic and inelastic inhomogeneous stress analysis are introduced. The derivations account for inhomogeneities due to spatially dependent material parameters and for thermally induced inhomogeneities. The nonlinear formulations of the present work are based on an incremental initial stress approach. Two inelastic solution algorithms are implemented: an iterative approach and a variable-stiffness approach. The von Mises yield criterion with variable hardening and the associated flow rules are adopted in these algorithms. All formulations are implemented in a general-purpose, multi-region computer code with the capability of local definition of boundary conditions. Quadratic, isoparametric shape functions are used to model the geometry and field variables of the boundary (and domain) of the problem. The multi-region implementation permits a body to be modeled in substructured parts, thus dramatically reducing the cost of analysis. Furthermore, it allows a body consisting of regions of different (homogeneous) materials to be studied. To test the program, results obtained for simple test cases are checked against their analytic solutions. Thereafter, a range of problems of practical interest are analyzed. In addition to displacement and traction loads, problems with body forces due to self-weight, centrifugal, and thermal loads are considered.

  10. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC), coupled to different detectors and in combination with different sample extraction methods, are the most widely used techniques for the analysis of triazine herbicides in environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods, such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace solid-phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others, have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy of each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC to the analysis of triazine herbicide residues in various samples.

  11. ASAAM: Aspectual Software Architecture Analysis Method

    NARCIS (Netherlands)

    Tekinerdogan, B.

    Software architecture analysis methods aim to predict the quality of a system before it has been developed. In general, the quality of the architecture is validated by analyzing the impact of predefined scenarios on architectural components. Hereby, it is implicitly assumed that an appropriate

  12. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда

    2015-06-01

    Full Text Available In this paper, physical-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of our own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are presented

  13. Development of a detailed BWR core thermal-hydraulic analysis method based on the Japanese post-BT standard using a best-estimate code

    International Nuclear Information System (INIS)

    Ono, H.; Mototani, A.; Kawamura, S.; Abe, N.; Takeuchi, Y.

    2004-01-01

    The post-BT standard is a new fuel integrity standard of the Atomic Energy Society of Japan that allows a temporary boiling transition condition in the evaluation of BWR anticipated operational occurrences. For application of the post-BT standard to the evaluation of BWR anticipated operational occurrences, it is important to identify which fuel assemblies, and which axial and radial positions of fuel rods, have temporarily experienced the post-BT condition, and to evaluate how high the fuel cladding temperature rise was and how long the dryout duration continued. Therefore, whole-bundle simulation, in which each fuel assembly is simulated independently by one thermal-hydraulic component, is considered to be an effective analytical method. In the present study, a best-estimate thermal-hydraulic code, TRACG02, has been modified to extend its predictive capability by implementing post-BT evaluation models, such as the post-BT heat transfer correlation and rewetting correlation, and by enlarging the number of components used for BWR plant simulation. Based on the new evaluation methods, BWR core thermal-hydraulic behavior has been analyzed for typical anticipated operational occurrence conditions. The location where boiling transition occurs and the severity for the fuel assembly under boiling transition conditions, such as the fuel cladding temperature, which are important factors in determining whether reuse of the fuel assembly can be permitted, were well predicted by the proposed evaluation method. In summary, a new evaluation method for detailed BWR core thermal-hydraulic analysis based on the post-BT standard of the Atomic Energy Society of Japan has been developed and applied to the evaluation of the post-BT standard during actual BWR plant anticipated operational occurrences. (author)

  14. Embedding methods: application and development

    Science.gov (United States)

    Cheng, Jin; Libisch, Florian; Carter, Emily

    2013-03-01

    Correlated-wavefunction/density functional theory (CW/DFT) embedding methods aim to combine the formally exact correlation treatment in CW methods with the high efficiency of DFT. By partitioning a system into a cluster and its environment, each part can be treated independently. Different embedding schemes have been proposed. The density-based scheme searches for a global embedding potential mediating the interaction on the DFT level. The potential can then be used in CW calculations, e.g., to investigate hot-electron assisted H2 dissociation on Al and Au surfaces. Experimentally, optical excitations of plasmons efficiently create the required hot electrons. The embedded CW calculations validate that the hot electrons play a key role. However, this method neglects the back-action of the cluster on the environment. To solve this problem, a potential-based scheme has been proposed [J. Chem. Phys., 135, 194104 (2011)] that allows for a self-consistent combination of different ab-initio methods. Such an embedding potential thus goes beyond the DFT level. The heterogeneity involved poses various numerical challenges. We report on efforts to construct appropriate basis sets and pseudopotentials as well as to optimize the numerical procedure.

  15. Hybrid methods for cybersecurity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  16. Measurement of -OH groups in coals of different rank using microwave methodology, and the development of quantitative solid state n.m.r. methods for in situ analysis

    Energy Technology Data Exchange (ETDEWEB)

    Monsef-Mirzai, P.; McWhinnie, W.R.; Perry, M.C.; Burchill, P. [Aston University, Birmingham (United Kingdom). Dept. of Chemical Engineering and Applied Chemistry

    1995-05-01

    Experiments with both model compounds (substituted phenols) and with 11 coals (nine British and two American) have established that microwave heating will greatly accelerate silylation reactions of the phenolic -OH groups, e.g. for Creswell coal complete silylation of -OH groups occurs in 35 min in the microwave oven, whereas 24 h is required using a bench reflux technique. Microwave reaction times for coals vary from 35 min to 3 h for more dense coals such as Cortonwood. The above observations have allowed the development of a 'one pot' silylation of coal, followed by an in situ analysis of the added Me₃Si- groups by quantitative ²⁹Si magic angle spinning nuclear magnetic resonance (MAS n.m.r.) spectroscopy. The development of a quantitative n.m.r. method required the determination of ²⁹Si spin lattice relaxation times, T₁, e.g. for silylated coals T₁ ≈ 8 s; for silylated phenols, T₁ ≈ 25 s; for the synthetic smectite clay laponite, T₁ ≈ 25 s; and for Ph₃SiH, T₁ ≈ 64 s. Inert laponite was selected as the standard. The requirement to wait for five T₁(max) between pulses, together with the relatively low natural abundance of ²⁹Si (4.71%), results in rather long accumulation times to obtain spectra of analytical quality (8-48 h). However, in comparison with other methods, even in the most unfavourable case, the total time from commencement of analysis to result may be described as 'rapid'. The results for O(OH)/O(total) obtained are compared with other literature data. Comparison with ketene data, for example, shows agreement to vary from excellent (Creswell) through satisfactory (Cortonwood) to poor (Pittsburgh). Even in cases where agreement with ketene data is less good, the silylation results may be close to estimates made via other acetylation methods. Possible reasons for the variations observed are discussed. 18 refs., 2 figs., 7 tabs.

  17. Developing the Model of "Pedagogical Art Communication" Using Social Phenomenological Analysis: An Introduction to a Research Method and an Example for Its Outcome

    Science.gov (United States)

    Hofmann, Fabian

    2016-01-01

    Social phenomenological analysis is presented as a research method for museum and art education. After explaining its methodological background, it is shown how this method has been applied in a study of gallery talks or guided tours in art museums: Analyzing the situation by description and interpretation, a model for understanding gallery talks…

  18. Developments in gamma-ray spectrometry: systems, software, and methods-I. 5. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    International Nuclear Information System (INIS)

    Lasche, G.P.; Coldwell, R.L.

    2001-01-01

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques has been recently developed into a package suitable for public use. The methodology behind this approach was originally made available to the public as the RobFit command-line code, but it was extremely slow and difficult to use. Recent advances in microprocessor power and the development of a graphical user interface to make its use more intuitive have made this approach, which is quite computationally intensive, feasible for more routine applications. A brief description of some of the fundamental differences in the approach used by RobFit from the more common methods of nuclear spectral analysis involving local peak searches is presented here. Popular nuclear spectral analysis applications generally perform a peak search at their heart. The continuum in the neighborhood of each peak is estimated from local data and is subtracted from the data to yield the area and the energy of the peak. These are matched to a user-selected library of radionuclides containing the energies and areas of the most significant peaks, after accounting for the effects of detector efficiency and attenuation. With these codes, the energy-to-channel calibration, the peak width as a function of energy (or 'resolution calibration'), the detector intrinsic efficiency, and attenuation effects must usually be predetermined and provided as static input for the analysis. Most of these codes focus on regions of interest that represent many small pieces of the sample spectrum. In contrast, the RobFit approach works with an entire continuous spectrum to simultaneously determine the coefficients of all of the user-selected free variables that yield the best fit to the data. Peak searches are generally used only in interim steps to help suggest new radionuclides to include in the search library. Rather than first concentrate on the location of peaks, RobFit first concentrates on the determination of the continuum
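
    To make the contrast with local peak-search codes concrete, the following sketch fits a low-order continuum plus Gaussian peaks to an entire synthetic spectrum using a robust loss function. It only illustrates the whole-spectrum idea, not RobFit itself; the channel data, peak positions, and model form are invented for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic spectrum: smooth continuum plus two Gaussian peaks, with Poisson noise
channels = np.arange(1024)

def gauss(x, area, mu, sigma):
    return area * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

true_counts = 200.0 * np.exp(-channels / 600.0) \
    + gauss(channels, 5000, 300, 4.0) + gauss(channels, 3000, 700, 5.0)
counts = np.random.poisson(true_counts)

def model(p, x):
    # p: three continuum coefficients followed by (area, centroid, width) for each peak
    c0, c1, c2, a1, m1, s1, a2, m2, s2 = p
    t = x / x.max()
    continuum = c0 + c1 * t + c2 * t ** 2            # one smooth continuum over the whole spectrum
    return continuum + gauss(x, a1, m1, s1) + gauss(x, a2, m2, s2)

def residuals(p, x, y):
    return model(p, x) - y

p0 = [180, -150, 20, 4000, 305, 5, 2500, 695, 5]      # rough initial guesses
fit = least_squares(residuals, p0, args=(channels, counts),
                    loss="soft_l1", f_scale=10.0)     # robust loss downweights outlying channels
print("fitted peak areas and centroids:", fit.x[3], fit.x[4], fit.x[6], fit.x[7])
```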

  19. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey.

    Science.gov (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose

    2015-01-01

    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method and resulted in no loss of sensitivity or reduction in chromatographic peak shape. Isocratic elution allows for significantly shorter run times (6 min), eliminates the requirement for column equilibration periods and, thus, has the advantage of facilitating high-throughput analysis, which is particularly important for regulatory testing laboratories. In excess of two hundred injections are possible with this new isocratic methodology within a 24-h period, which is more than a 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg L⁻¹; R² ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with % RSD values (n = 4) less than 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were determined with spiked PA and PANO samples, giving an average LOD of 1.6 µg kg⁻¹ and LOQ of 5.4 µg kg⁻¹. This method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with the mean total sum of PAs being 153 µg kg⁻¹. Echimidine and lycopsamine were predominant and found in 76% and 88%, respectively, of the positive samples. The average daily exposures, based on the results presented in this study, were 0.051 µg kg⁻¹ bw day⁻¹ for adults and 0.204 µg kg⁻¹ bw day⁻¹ for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR - Federal Institute of Risk Assessment, Germany) maximum
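
    The exposure figures quoted above follow from a simple intake calculation (concentration × daily consumption / body weight). The sketch below shows the general form with assumed consumption and body-weight values; these assumptions are illustrative and are not necessarily those used by the authors.

```python
def daily_exposure(conc_ug_per_kg, intake_g_per_day, body_weight_kg):
    """Estimated exposure in micrograms per kg body weight per day."""
    return conc_ug_per_kg * (intake_g_per_day / 1000.0) / body_weight_kg

# Mean PA level from the study (153 ug/kg) with assumed honey intakes and body weights
print(daily_exposure(153, 20, 60))  # adult: ~0.051 ug/kg bw/day, assuming 20 g/day and 60 kg
print(daily_exposure(153, 8, 20))   # child: ~0.061 ug/kg bw/day, assuming 8 g/day and 20 kg
# (the published child estimate of 0.204 ug/kg bw/day reflects the authors' own intake assumptions)
```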

  20. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  1. Developing Scoring Algorithms (Earlier Methods)

    Science.gov (United States)

    We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.

  2. SIGNIFICANCE OF TARGETED EXOME SEQUENCING AND METHODS OF DATA ANALYSIS IN THE DIAGNOSIS OF GENETIC DISORDERS LEADING TO THE DEVELOPMENT OF EPILEPTIC ENCEPHALOPATHY

    Directory of Open Access Journals (Sweden)

    Tatyana Victorovna Kozhanova

    2017-08-01

    Full Text Available Epilepsy is the most common serious neurological disorder, and there is a genetic basis in almost 50% of people with epilepsy. The diagnosis of genetic epilepsies makes it possible to establish the cause of seizures in a patient. The last decade has shown tremendous growth in gene sequencing technologies, which have made genetic tests available. The aim is to show the significance of targeted exome sequencing and methods of data analysis in the diagnosis of hereditary syndromes leading to the development of epileptic encephalopathy. We examined 27 patients with early EE (resistant to antiepileptic drugs, with psychomotor and speech development delay) in the psycho-neurological department. Targeted exome sequencing was performed for patients without a previously identified molecular diagnosis using the 454 Sequencing GS Junior sequencer (Roche) and the Illumina NextSeq 500 platform. As a result of the analysis, specific epilepsy genetic variants were diagnosed in 27 patients. The greatest number of cases was due to mutations in the SCN1A gene (7/27). The structure of mutations for the other genes (mutations with a minor allele frequency of less than 0.5%) is presented: ALDH7A1 (n=1), CACNA1C (n=1), CDKL5 (n=1), CNTNAP2 (n=2), DLGAP2 (n=2), DOCK7 (n=2), GRIN2B (n=2), HCN1 (n=1), NRXN1 (n=3), PCDH19 (n=1), RNASEH2B (n=2), SLC2A1 (n=1), UBE3A (n=1). The use of exome sequencing in genetic practice significantly improves the effectiveness of medical genetic counseling, as it makes it possible to diagnose particular variants within genetically heterogeneous groups of diseases with similar clinical manifestations.

  3. Trace analysis of fluoxetine and its metabolite norfluoxetine. Part I: development of a chiral liquid chromatography-tandem mass spectrometry method for wastewater samples.

    Science.gov (United States)

    Barclay, Victoria K H; Tyrefors, Niklas L; Johansson, I Monika; Pettersson, Curt E

    2011-08-19

    An enantioselective method for the determination of fluoxetine (a selective serotonin reuptake inhibitor) and its pharmacologically active metabolite norfluoxetine has been developed for raw and treated wastewater samples. The stable isotope-labeled fluoxetine and norfluoxetine were additionally used for extraction recovery calculations at trace level concentrations in wastewater. Wastewater samples were enriched by solid phase extraction (SPE) with Evolute CX-50 extraction cartridges. The obtained extraction recoveries ranged between 65 and 82% in raw and treated wastewater at a trace level concentration of 50 pM (15-16 ng L⁻¹). The target compounds were identified by the use of chiral liquid chromatography tandem mass spectrometry (LC-MS/MS) in selected reaction monitoring (SRM) mode. The enantiomers were successfully resolved on a chiral α₁-acid glycoprotein column (chiral AGP) with acetonitrile and 10 mM ammonium acetate buffer at pH 4.4 (3/97, v/v) as the mobile phase. The effects of pH, organic modifier content, and buffer concentration in the mobile phase on the enantiomeric resolution (Rs) of the target compounds were investigated. Enantiomeric Rs values above 2.0 (RSD 1.03%, n=3) were achieved for the enantiomers of fluoxetine and norfluoxetine in all mobile phases investigated. The method was validated by assessing parameters such as cross-contamination and carryover during SPE and during LC analysis. Cross-talk effects were examined during the detection of the analytes in SRM mode. In addition, the isotopic purity of fluoxetine-d₅ and norfluoxetine-d₅ was assessed to exclude the possibility of self-contamination. The interassay precision of the chromatographic separation was excellent, with relative standard deviations (RSD) equal to or lower than 0.56 and 0.81% in raw and treated wastewaters, respectively. The method detection and quantification limits (respectively, MDL and MQL) were determined by the use of fluoxetine-d₅ and

  4. Method development for the analysis of ionophore antimicrobials in dairy manure to assess removal within a membrane-based treatment system.

    Science.gov (United States)

    Hurst, Jerod J; Wallace, Josh S; Aga, Diana S

    2018-04-01

    Ionophore antimicrobials are heavily used in the livestock industries, both for preventing animal infection by coccidia protozoa and for increasing feed efficiency. Ionophores are excreted mostly unmetabolized and are released into the environment when manure is land-applied to fertilize croplands. Here, an analytical method was optimized to study the occurrences of five ionophore residues (monensin, lasalocid, maduramycin, salinomycin, and narasin) in dairy manure after solid-liquid separation and further treatment of the liquid manure by a membrane-based treatment system. Ionophore residues from the separated solid manure (dewatered manure) and suspended solids of manure slurry samples were extracted using ultrasonication with methanol, followed by sample clean-up using solid phase extraction (SPE) and subsequent analysis via liquid chromatography-tandem mass spectrometry (LC-MS/MS). The use of an ethyl acetate and methanol (1:1 v:v) mixture as an SPE eluent resulted in higher recoveries and lower method quantitation limits (MQL), when compared to using methanol. Overall recoveries from separated solid manure ranged from 73 to 134%. Liquid manure fractions were diluted with Nanopure™ water and cleaned up using SPE, where recoveries ranged from 51 to 100%. The developed extraction and LC-MS/MS methods were applied to analyze dairy manure samples subjected to an advanced manure treatment process involving a membrane-based filtration step (reverse osmosis). Monensin and lasalocid were detected at higher concentrations in the suspended solid fractions (4.40-420 ng/g for lasalocid and 85-1950 ng/g for monensin) compared to the liquid fractions (

  5. Development and validation of a liquid chromatography tandem mass spectrometry method for the analysis of beta-agonists in animal feed and drinking water.

    Science.gov (United States)

    Juan, C; Igualada, C; Moragues, F; León, N; Mañes, J

    2010-09-24

    A reproducible, sensitive and selective multiresidue analytical method for seven beta-agonists - clenbuterol (CBT), clenpenterol (CPT), ractopamine (RTP), brombuterol (BBT), mabuterol (MBT), mapenterol (MPT), and hydroxymethylclenbuterol (HMCBT) - was developed and validated using liquid chromatography tandem mass spectrometry (LC-MS/MS) in feed and drinking water samples. The validation was carried out according to the criteria laid down in Commission Decision 2002/657/EC; however, it was necessary to use the minimum required performance limits (MRPLs) proposed by the Community Reference Laboratories (CRLs) because of the lack of maximum residue limits (MRLs) for beta-agonists. Setting these MRPLs allows their use to be controlled, since beta-agonists are sometimes used fraudulently in veterinary medicine to increase animal weight. The values set for the two matrices studied are 50 microg/kg for animal feed and a range from 0.2 to 10 microg/L for drinking water. The CCalpha values calculated were below the suggested MRPLs; for drinking water the lowest value obtained was 0.12 microg/L, and for animal feed 0.87 microg/kg. CCbeta values ranged from 0.08 to 0.13 microg/L in drinking water and from 0.5 to 0.92 microg/kg in animal feed samples. The results obtained allow us to conclude that the proposed analytical method is capable of controlling the beta-agonists studied in both matrices and that it can be successfully applied as a routine method in laboratories performing residue analysis for veterinary food control. Copyright 2010 Elsevier B.V. All rights reserved.

  6. Developments in geophysical exploration methods

    CERN Document Server

    1982-01-01

    One of the themes in current geophysical development is the bringing together of the results of observations made on the surface and those made in the subsurface. Several benefits result from this association. The detailed geological knowledge obtained in the subsurface can be extrapolated for short distances with more confidence when the geological detail has been related to well-integrated subsurface and surface geophysical data. This is of value when assessing the characteristics of a partially developed petroleum reservoir. Interpretation of geophysical data is generally improved by the experience of seeing the surface and subsurface geophysical expression of a known geological configuration. On the theoretical side, the understanding of the geophysical processes themselves is furthered by the study of the phenomena in depth. As an example, the study of the progress of seismic wave trains downwards and upwards within the earth has proved most instructive. This set of original papers deals with some of ...

  7. Radioactivation analysis: methods and biomedical applications

    International Nuclear Information System (INIS)

    Maziere, B.

    1976-01-01

    After a brief survey of activation analysis and its fields of application in biomedicine, the physical bases of neutron activation are reviewed and the different neutron sources and nuclear reactions used are described. In the next chapter, 'in vitro' analysis techniques are described and some biomedical applications are developed. Carried out in the Frederic Joliot Hospital Service (S.H.F.J.), these applications concern research on thyroid metabolism or hydromineral equilibrium in young patients under chronic dialysis, together with a nutritional study of some oligo-elements in infants. Chapter three deals with analysis 'in vivo', its methods and applications. Three examples are described: thyroid iodine determination, elementary analysis of the living animal and in vivo analysis of bone tissue in man. The article concludes with a discussion of the future prospects offered by the use of charged particles or muons for activation analysis purposes [fr

  8. Nuclear methods in national development

    International Nuclear Information System (INIS)

    1993-01-01

    This volume of the proceedings of the First National Conference on Nuclear Methods held at Kongo Conference Hotel Zaria from 2-4 September 1993, contains the full text of about 30 technical papers and speeches of invited dignitaries presented at the conference. The technical papers are original or review articles containing results and experiences in nuclear and related analytical techniques. Topics treated include neutron generator operation and control, nuclear data, application of nuclear techniques in environment, geochemistry, medicine, biology, agriculture, material science and industries. General topics in nuclear laboratory organization and research experiences were also covered. The papers were fully discussed during the conference and authors were requested to make changes in the manuscripts where necessary. However, they were further edited. The organizing committee wishes to thank all authors for their presentation and cooperation in submitting their manuscripts promptly and the participants for their excellent contribution during the conference

  9. COMPUTER-ASSISTED HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY METHOD DEVELOPMENT WITH APPLICATIONS TO THE ISOLATION AND ANALYSIS OF PHYTOPLANKTON PIGMENTS. (R826944)

    Science.gov (United States)

    We used chromatography modeling software to assist in HPLC method development, with the goal of enhancing separations through the exclusive use of gradient time and column temperature. We surveyed nine stationary phases for their utility in pigment purification and natur...

  10. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    HP

    Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with copper. Method: A coloured complex based on UV/Vis spectroscopic method was developed for the determination of losartan potassium concentration in pharmaceutical ...

  11. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  12. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth
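
    As a rough illustration of the two metrics, the sketch below computes a recall-like consistency score between the pathway sets found on a full dataset and on a sub-dataset, and a discrimination-like score between pathway sets from two different experiments. This is a schematic set-overlap interpretation under stated assumptions, not the authors' exact definitions.

```python
def consistency(full_pathways, subset_pathways):
    """Recall-like metric: fraction of pathways found on the full dataset
    that are also recovered from a sub-dataset (schematic definition)."""
    full, sub = set(full_pathways), set(subset_pathways)
    return len(full & sub) / len(full) if full else 0.0

def discrimination(pathways_exp1, pathways_exp2):
    """Discrimination-like metric: 1 minus the Jaccard overlap between
    pathway sets identified in two unrelated experiments (schematic)."""
    a, b = set(pathways_exp1), set(pathways_exp2)
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

# Hypothetical pathway identifiers
full = ["apoptosis", "p53 signaling", "cell cycle", "DNA repair"]
sub = ["apoptosis", "cell cycle"]
other = ["olfactory transduction", "cell cycle"]
print(consistency(full, sub))       # 0.5
print(discrimination(full, other))  # 0.8
```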

  13. Rapid quantitative analysis of magnesium stearate in pharmaceutical powders and solid dosage forms by atomic absorption: method development and application in product manufacturing.

    Science.gov (United States)

    Sugisawa, Keiichi; Kaneko, Takashi; Sago, Tsuyoshi; Sato, Tomonobu

    2009-04-05

    The distribution of magnesium stearate (MgSt) in tablet granules has a significant impact on the compression process. A rapid quantitative method for evaluating magnesium stearate content by atomic absorption was established. The MgSt was extracted from the granules in 0.1 mol/L nitric acid and the resulting free magnesium ion was quantitated by atomic absorption. The total analysis time was significantly shortened in comparison to the previously used sample ignition method. This newly established method was evaluated with several drug products and several types of blender. The analytical method was also applied to tablets with poor compression (rough tablet surfaces). The MgSt content in these rough-surface tablets was significantly lower than in tablets with smooth surfaces from the same batch. From these results, this atomic absorption method is considered to be an accurate and useful method for evaluating MgSt distribution and can be applied to tablet manufacturing process validation.

  14. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    This thesis presents analyses of sequence data, generated using next-generation sequencing (NGS) technologies, from either forensic (Chapter 1) or ancient (Chapters 2-5) materials. These chapters present projects very different in nature, reflecting the diversity of questions that have become possible to address in the ancient DNA field, thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project for which more data is currently being generated; therefore it should be interpreted as a preliminary report. In addition to the five chapters, an introduction and five appendices are included. Appended articles are included for the reader's interest; these represent the collaborations I have been part of.

  15. Analysis of potential genotoxic impurities in rabeprazole active pharmaceutical ingredient via Liquid Chromatography-tandem Mass Spectrometry, following quality-by-design principles for method development.

    Science.gov (United States)

    Iliou, Katerina; Malenović, Anđelija; Loukas, Yannis L; Dotsikas, Yannis

    2018-02-05

    A novel Liquid Chromatography-tandem mass spectrometry (LC-MS/MS) method is presented for the quantitative determination of two potential genotoxic impurities (PGIs) in rabeprazole active pharmaceutical ingredient (API). In order to overcome the analytical challenges in the trace analysis of PGIs, a development procedure supported by Quality-by-Design (QbD) principles was evaluated. Efficient separation between rabeprazole and the two PGIs in the shortest analysis time was set as the defined analytical target profile (ATP), and to this purpose a switching valve allowed the flow to be sent to waste when rabeprazole was eluted. The selected critical quality attributes (CQAs) were the separation criterion s between the critical peak pair and the capacity factor k of the last eluted compound. The effect of the following critical process parameters (CPPs) on the CQAs was studied: the acetonitrile content (%ACN), the pH and the concentration of the buffer salt in the mobile phase, as well as the stationary phase of the analytical column. A D-optimal design was implemented to set the plan of experiments with UV detection. In order to define the design space, Monte Carlo simulations with 5000 iterations were performed. The acceptance criteria were met for a C8 column (50 × 4 mm, 5 μm), and the region having a probability π ≥ 95% of achieving satisfactory values of all defined CQAs was computed. The working point was selected with a mobile phase consisting of ACN and 11 mM ammonium formate at a ratio of 31/69 (v/v), with the aqueous phase at pH 6.8. The LC protocol was transferred to LC-MS/MS and validated according to ICH guidelines. Copyright © 2017 Elsevier B.V. All rights reserved.
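
    To make the design-space idea concrete, here is a minimal Monte Carlo sketch in the spirit of the approach described: the critical process parameters are perturbed around a candidate working point and the fraction of simulated runs meeting all CQA acceptance criteria is taken as the probability of success. The response models, parameter uncertainties, and acceptance limits below are invented placeholders, not the published ones.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 5000  # iterations, as in the Monte Carlo study described above

# Hypothetical working point and uncertainties for three CPPs
acn = rng.normal(31.0, 0.5, N)        # % acetonitrile
ph = rng.normal(6.8, 0.05, N)         # aqueous-phase pH
buffer_mm = rng.normal(11.0, 0.5, N)  # buffer concentration, mM

# Placeholder response models for the two CQAs (separation criterion s, capacity factor k)
s = 2.5 - 0.08 * (acn - 31.0) + 0.6 * (ph - 6.8) - 0.02 * (buffer_mm - 11.0) + rng.normal(0, 0.05, N)
k = 6.0 - 0.25 * (acn - 31.0) + rng.normal(0, 0.1, N)

meets_criteria = (s >= 2.0) & (k <= 8.0)   # assumed acceptance limits
print("P(all CQAs met) =", meets_criteria.mean())
```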

  16. Development of methods for evaluating active faults

    International Nuclear Information System (INIS)

    2013-01-01

    The report on the long-term evaluation of active faults was published by the Headquarters for Earthquake Research Promotion in November 2010. After the occurrence of the 2011 Tohoku-oki earthquake, the safety review guide with regard to the geology and ground of a site was revised by the Nuclear Safety Commission in March 2012 to reflect scientific knowledge of the earthquake. The Nuclear Regulation Authority, established in September 2012, is newly planning the New Safety Design Standard related to Earthquakes and Tsunamis of Light Water Nuclear Power Reactor Facilities. With respect to those guides and standards, our investigations for developing methods of evaluating active faults are as follows: (1) For better evaluation of offshore fault activity, we proposed a workflow for dating marine terraces (an indicator of offshore fault activity) over the last 400,000 years. We also developed fault-related fold analysis for evaluating blind faults. (2) To clarify the activities of active faults without superstratum, we carried out color analysis of fault gouge and divided the activities into timescales of thousands of years and tens of thousands of years. (3) To reduce uncertainties in fault activities and earthquake frequencies, we compiled the survey data and possible errors. (4) To improve seismic hazard analysis, we compiled the fault activities of the Yunotake and Itozawa faults, induced by the 2011 Tohoku-oki earthquake. (author)

  17. IoT System Development Methods

    NARCIS (Netherlands)

    Giray, G.; Tekinerdogan, B.; Tüzün, E.

    2018-01-01

    It is generally believed that the application of methods plays an important role in developing quality systems. A development method is mainly necessary for structuring the process of producing large-scale and complex systems that involve high costs. Similar to the development of other systems, it is

  18. Method development and validations: characterization of critical ...

    African Journals Online (AJOL)

    Method development and validations: characterization of critical elements in the development of pharmaceuticals. ... International Journal of Health Research ... Although a thorough validation cannot rule out all potential problems, the process of method development and validation should address the most common ones.

  19. Practical Fourier analysis for multigrid methods

    CERN Document Server

    Wienands, Roman

    2004-01-01

    Before applying multigrid methods to a project, mathematicians, scientists, and engineers need to answer questions related to the quality of convergence, whether a development will pay off, whether multigrid will work for a particular application, and what the numerical properties are. Practical Fourier Analysis for Multigrid Methods uses a detailed and systematic description of local Fourier k-grid (k=1,2,3) analysis for general systems of partial differential equations to provide a framework that answers these questions. This volume contains software that confirms written statements about convergence and efficiency of algorithms and is easily adapted to new applications. Providing theoretical background and the linkage between theory and practice, the text and software quickly combine learning by reading and learning by doing. The book enables understanding of basic principles of multigrid and local Fourier analysis, and also describes the theory important to those who need to delve deeper into the detai...

  20. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  1. Development and Optimization of Voltammetric Methods for Real Time Analysis of Electrorefiner Salt with High Concentrations of Actinides and Fission Products

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Michael F.; Phongikaroon, Supathorn; Zhang, Jinsuo

    2018-03-30

    This project addresses the problem of achieving accurate material control and accountability (MC&A) around pyroprocessing electrorefiner systems. Spent nuclear fuel pyroprocessing poses a unique challenge with respect to reprocessing technology in that the fuel is never fully dissolved in the process fluid. In this case, the process fluid is molten, anhydrous LiCl-KCl salt. Therefore, there is no traditional input accountability tank. However, electrorefiners (ER) accumulate very large quantities of fissile nuclear material (including plutonium) and should be well safeguarded in a commercial facility. Idaho National Laboratory (INL) currently operates a pyroprocessing facility for treatment of spent fuel from Experimental Breeder Reactor-II with two such ER systems. INL implements MC&A via a mass tracking model in combination with periodic sampling of the salt and other materials followed by destructive analysis. This approach is projected to be insufficient to meet international safeguards timeliness requirements. A real time or near real time monitoring method is, thus, direly needed to support commercialization of pyroprocessing. A variety of approaches to achieving real time monitoring for ER salt have been proposed and studied to date—including a potentiometric actinide sensor for concentration measurements, a double bubbler for salt depth and density measurements, and laser induced breakdown spectroscopy (LIBS) for concentration measurements. While each of these methods shows some promise, each also involves substantial technical complexity that may ultimately limit their implementation. Yet another alternative is voltammetry—a very simple method in theory that has previously been tested for this application to a limited extent. The equipment for a voltammetry system consists of off-the-shelf components (three electrodes and a potentiostat), which results in substantial benefits relative to cost and robustness. Based on prior knowledge of electrochemical
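
    As background on why voltammetry is attractive for concentration monitoring, the well-known Randles-Sevcik relation links the cyclic-voltammetry peak current to the analyte concentration. The sketch below is a generic textbook calculation with assumed electrode and diffusion parameters; it is not the calibration model developed in this project, and molten-salt systems may require different correlations.

```python
import math

def randles_sevcik_concentration(i_peak_a, n, area_cm2, diff_cm2_s, scan_v_s):
    """Estimate bulk concentration (mol/cm^3) from a cyclic-voltammetry peak current
    using the Randles-Sevcik equation (reversible, diffusion-limited case, ~25 C)."""
    k = 2.69e5  # room-temperature Randles-Sevcik constant
    return i_peak_a / (k * n ** 1.5 * area_cm2 * math.sqrt(diff_cm2_s) * math.sqrt(scan_v_s))

# Assumed illustrative values: 3-electron reduction, 0.3 cm^2 electrode,
# D = 1e-5 cm^2/s, 100 mV/s scan rate, 2 mA peak current
c = randles_sevcik_concentration(2e-3, 3, 0.3, 1e-5, 0.1)
print(f"estimated concentration: {c * 1e3:.4f} mol/L")  # mol/cm^3 -> mol/L
```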

  2. Machine Learning Methods for Production Cases Analysis

    Science.gov (United States)

    Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.

    2018-03-01

    An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing the applied models. k-Nearest Neighbors and Random forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
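
    A minimal sketch of the kind of classification workflow the abstract describes, using scikit-learn's k-nearest neighbours and random forest with precision, recall, and accuracy reported on a held-out split. The synthetic feature matrix stands in for the internal production-network descriptors, which are not public.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for production-event descriptors (binary: normal vs hazard)
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("Random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.3f} "
          f"precision={precision_score(y_te, pred):.3f} recall={recall_score(y_te, pred):.3f}")
```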

  3. Study on the Development of New BWR Core Analysis Scheme Based on the Continuous Energy Monte Carlo Burn-up Calculation Method

    OpenAIRE

    東條, 匡志; tojo, masashi

    2007-01-01

    In this study, a BWR core calculation method is developed. A continuous energy Monte Carlo burn-up calculation code is newly applied to production-level BWR assembly calculations. The applicability of the present new calculation method is verified through tracking calculations of a commercial BWR. The mechanisms and quantitative effects of error propagation, spatial discretization, and the temperature distribution in the fuel pellet on the Monte Carlo burn-up calculations are clari...

  4. Moral counselling: a method in development.

    Science.gov (United States)

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  5. Development and Validation of a Stability-Indicating HPTLC Method for Analysis of Rasagiline Mesylate in the Bulk Drug and Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    Singaram Kathirvel

    2012-01-01

    Full Text Available A simple and sensitive thin-layer chromatographic method has been established for the analysis of rasagiline mesylate in pharmaceutical dosage forms. Chromatography on silica gel 60 F254 plates with butanol-methanol-water, 6:1:2 (v/v/v), as the mobile phase furnished compact spots at Rf 0.76 ± 0.01. Densitometric analysis was performed at 254 nm. To show the specificity of the method, rasagiline mesylate was subjected to acid, base and neutral hydrolysis, oxidation, photolysis, and thermal decomposition, and the peaks of the degradation products were well resolved from that of the pure drug. Linear regression analysis revealed a good linear relationship between peak area and amount of rasagiline mesylate in the range of 100-350 ng/band. The minimum amounts of rasagiline mesylate that could be reliably detected and quantified were 11.12 and 37.21 ng/band, respectively. The method was validated in accordance with ICH guidelines for precision, accuracy, and robustness. Since the method could effectively separate the drug from its degradation products, it can be regarded as stability indicating.
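
    Detection and quantification limits of this kind are often estimated from the calibration line itself, following the ICH Q2 convention LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The sketch below uses made-up calibration points, not the published data.

```python
import numpy as np

# Hypothetical calibration: amount per band (ng) vs densitometric peak area
amount = np.array([100, 150, 200, 250, 300, 350], dtype=float)
area = np.array([1520, 2230, 2960, 3710, 4430, 5180], dtype=float)

slope, intercept = np.polyfit(amount, area, 1)
residuals = area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)      # residual standard deviation of the regression

lod = 3.3 * sigma / slope          # ICH Q2 convention
loq = 10.0 * sigma / slope
print(f"slope={slope:.2f}, LOD={lod:.1f} ng/band, LOQ={loq:.1f} ng/band")
```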

  6. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and A Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ginting, Victor

    2014-03-15

    It was demonstrated that a posteriori analyses in general, and in particular those that use adjoint methods, can accurately and efficiently compute numerical error estimates and sensitivities for critical Quantities of Interest (QoIs) that depend on a large number of parameters. Activities include: analysis and implementation of several time integration techniques for solving systems of ODEs as typically obtained from spatial discretization of PDE systems; multirate integration methods for ordinary differential equations; formulation and analysis of an iterative multi-discretization Galerkin finite element method for multi-scale reaction-diffusion equations; investigation of an inexpensive postprocessing technique to estimate the error of finite element solutions of second-order quasi-linear elliptic problems measured in some global metrics; investigation of an application of residual-based a posteriori error estimates to the symmetric interior penalty discontinuous Galerkin method for solving a class of second-order quasi-linear elliptic problems; a posteriori analysis of explicit time integrations for systems of linear ordinary differential equations; derivation of accurate a posteriori goal-oriented error estimates for a user-defined quantity of interest for two classes of first- and second-order IMEX schemes for advection-diffusion-reaction problems; postprocessing of the finite element solution; and a Bayesian framework for uncertainty quantification of porous media flows.

  7. Method development for the analysis of resinous materials with MALDI-FT-ICR-MS: novel internal standards and a new matrix material for negative ion mode.

    Science.gov (United States)

    Teearu, A; Vahur, S; Rodima, T; Herodes, K; Bonrath, W; Netscher, T; Tshepelevitsh, S; Trummal, A; Lõkov, M; Leito, I

    2017-09-01

    Matrix-assisted laser desorption/ionization (MALDI) is a mass spectrometry (MS) ionization technique suitable for a wide variety of sample types, including highly complex ones such as natural resinous materials. Coupled with a Fourier transform ion cyclotron resonance (FT-ICR) mass analyser, which provides mass spectra with high resolution and accuracy, the method gives a wealth of information about the composition of the sample. One of the key aspects in MALDI-MS is the right choice of matrix compound. We have previously demonstrated that 2,5-dihydroxybenzoic acid is suitable for the positive ion mode analysis of resinous samples. However, 2,5-dihydroxybenzoic acid was found to be unsuitable for the analysis of these samples in the negative ion mode. The second problem addressed was the limited choice of calibration standards offering a flexible selection of m/z values under m/z 1000. This study presents a modified MALDI-FT-ICR-MS method for the analysis of resinous materials, which incorporates a novel matrix compound, 2-aminoacridine, for the negative ion mode analysis and extends the selection of internal standards for the negative ion mode with the anions of four fluorine-rich sulpho-compounds. The novel internal calibration compounds and matrix material were tested for the analysis of various natural resins and real-life varnish samples taken from cultural heritage objects. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Development and validation of multi-residue and multi-class method for antibacterial substances analysis in non-target feed by liquid chromatography - tandem mass spectrometry.

    Science.gov (United States)

    Patyra, Ewelina; Nebot, Carolina; Gavilán, Rosa Elvira; Cepeda, Alberto; Kwiatek, Krzysztof

    2018-03-01

    A confirmatory HPLC-MS/MS method for the determination of residues of 11 antibacterial substances from different therapeutic classes (β-lactams, lincosamides, fluoroquinolones, macrolides, pleuromutilins and sulfonamides) in animal feeds has been developed. The sample preparation is based on extraction with 0.1% formic acid in acetonitrile. Separation of the analytes was performed on a biphenyl column with a gradient of 0.1% formic acid in acetonitrile and 0.1% formic acid in Milli-Q water. The developed method was validated following the guidelines included in European Union Commission Decision 2002/657/EC. Limits of detection ranged from 79.22 to 193.60 µg/kg; instrumental and analytical linearity coefficients were above 0.99 for matrix-matched calibration; and relative recoveries ranged from 76.04% to 117.39%. Repeatability of the method was in the range of 2.41-19.76% (CV, %), whereas reproducibility ranged from 6.52 to 28.40% (CV, %). The method was shown to be efficient and precise for quantification of the 11 antibacterial substances in animal feed. The results demonstrate the feasibility of the method for routine use to monitor these substances in feed. The validated method was successfully applied to eight suspect feed samples collected from the Association of American Feed Control Officials (AFFCO) and feed manufacturers from Galicia (Spain) in June and July 2017. Of these 8 non-target feeds, 5 were positive for the presence of tiamulin, tylosin and sulfadiazine.

  9. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  10. PIXE - a new method for elemental analysis

    International Nuclear Information System (INIS)

    Johansson, S.A.E.

    1983-01-01

    By elemental analysis we mean the determination of which chemical elements are present in a sample and of their concentrations. This is an old and important problem in chemistry. The earliest methods were purely chemical and many such methods are still used. However, various methods based on physical principles have gradually become more and more important. One such method is neutron activation. When the sample is bombarded with neutrons it becomes radioactive, and the various radioactive isotopes produced can be identified by the radiation they emit. From the measured intensity of the radiation one can calculate how much of a certain element is present in the sample. Another possibility is to study the light emitted when the sample is excited in various ways. A spectroscopic investigation of the light can identify the chemical elements and also allows a determination of their concentrations in the sample. In the same way, if a sample can be brought to emit X-rays, this radiation is also characteristic of the elements present and can be used to determine the elemental concentrations. One such X-ray method which has been developed recently is PIXE. The name is an acronym for Particle Induced X-ray Emission and indicates the principle of the method. Particles in this context means heavy charged particles such as protons and α-particles of rather high energy. Hence, in PIXE analysis the sample is irradiated in the beam of an accelerator and the emitted X-rays are studied. (author)

  11. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Table of contents (abridged): Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  12. Substoichiometric method in the simple radiometric analysis

    International Nuclear Information System (INIS)

    Ikeda, N.; Noguchi, K.

    1979-01-01

    The substoichiometric method is applied to simple radiometric analysis. Two methods - the standard reagent method and the standard sample method - are proposed. The validity of the principle of the methods is verified experimentally in the determination of silver by the precipitation method, or of zinc by the ion-exchange or solvent-extraction method. The proposed methods are simple and rapid compared with the conventional superstoichiometric method. (author)

  13. The SIESTA method; developments and applicability

    International Nuclear Information System (INIS)

    Artacho, Emilio; Anglada, E; Dieguez, O; Gale, J D; Garcia, A; Junquera, J; Martin, R M; Ordejon, P; Pruneda, J M; Sanchez-Portal, D; Soler, J M

    2008-01-01

    Recent developments in and around the SIESTA method of first-principles simulation of condensed matter are described and reviewed, with emphasis on (i) the applicability of the method for large and varied systems, (ii) efficient basis sets for the standards of accuracy of density-functional methods, (iii) new implementations, and (iv) extensions beyond ground-state calculations

  14. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  15. Further developments in the study of harmonic analysis by the correlation and spectral density methods, and its application to the adult rabbit EEG

    International Nuclear Information System (INIS)

    Meilleurat, Michele

    1973-07-01

    The application of harmonic analysis to the spontaneous electrical activity of the brain has been studied theoretically and practically in 30 adult rabbits chronically implanted with electrodes. Theoretically, an accurate energetic study of the signal can only be achieved by the calculation of the autocorrelation function and its Fourier transform, the power density spectrum. Secondly, a comparative study was made of analog methods, using analog or hybrid devices, and the digital method, based on an analysis and computing program (covering the sampling rate, the delay, the integration period, and the problems raised by the amplification and sampling of the biological signals). Data handling is discussed; the method mainly involves the study of variance, the calculation of the total energy carried by the signal and of the energies carried in each frequency band ΔF, their percentages relative to the total energy, and the relationships between these values for various electroencephalographic states. Experimentally, the general aspect of the spontaneous electrical activity of the dorsal hippocampus and the visual cortex during variations in vigilance is accurately described by the calculation of the variance, by the position of the maxima of the power density spectra on the frequency axis, and by the calculation of the energies carried in various frequency bands (0-4, 4-8, 8-12 Hz). With the same theoretical bases, both the analog and digital methods lead to similar results, the former being easier to operate, the latter more accurate. (author) [fr
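
    The two quantities at the core of this analysis, the autocorrelation function and its Fourier transform (the power density spectrum), together with band-energy fractions, are straightforward to compute numerically today. A minimal sketch on a synthetic signal follows; the sampling rate and signal are illustrative assumptions, not the original analog/digital EEG pipeline.

```python
import numpy as np
from scipy import signal

fs = 200.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)
# Synthetic "EEG-like" trace: a 6 Hz rhythm plus broadband noise
x = np.sin(2 * np.pi * 6.0 * t) + 0.5 * np.random.randn(t.size)

# Autocorrelation function (normalized to the zero-lag value)
x0 = x - x.mean()
acf = np.correlate(x0, x0, mode="full")[x0.size - 1:]
acf /= acf[0]

# Power density spectrum via Welch's method (Fourier transform of the autocorrelation)
freqs, psd = signal.welch(x, fs=fs, nperseg=1024)

# Fraction of total power carried in the 0-4, 4-8 and 8-12 Hz bands
total_power = psd.sum()
for lo, hi in [(0, 4), (4, 8), (8, 12)]:
    band = (freqs >= lo) & (freqs < hi)
    print(f"{lo}-{hi} Hz: {psd[band].sum() / total_power:.2%} of total power")
```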

  16. Development and validation of confirmatory method for analysis of nitrofuran metabolites in milk, honey, poultry meat and fish by liquid chromatography-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Fatih Alkan

    2016-03-01

    Full Text Available In this study we have developed and validated a confirmatory analysis method for nitrofuran metabolites that is in accordance with European Commission Decision 2002/657/EC requirements. Nitrofuran metabolites in honey, milk, poultry meat and fish samples were subjected to acid hydrolysis followed by derivatisation with nitrobenzaldehyde and liquid-liquid extraction with ethyl acetate. The quantitative and confirmatory determination of nitrofuran metabolites was performed by liquid chromatography/electrospray ionisation tandem mass spectrometry (LC/ESI-MS/MS) in the positive ion mode. In-house method validation was performed and the validation data (specificity, linearity, recovery, CCα and CCβ) are reported. The advantage of this method is that it avoids clean-up by solid-phase extraction (SPE). Furthermore, low levels of nitrofuran metabolites can be rapidly detected and quantitatively confirmed in all samples.

  17. Development and validation of a normal-phase HPTLC method for the simultaneous analysis of Lamivudine and Zidovudine in fixed-dose combination tablets

    Directory of Open Access Journals (Sweden)

    Palani Venkatesh

    2012-04-01

    Full Text Available A method for the simultaneous quantification of Lamivudine and Zidovudine in tablets by HPTLC was developed and validated. The chromatograms were developed using a mobile phase of toluene:ethyl acetate:methanol (4:4:2, v/v/v) on pre-coated silica gel GF aluminium TLC plates and quantified in densitometric absorbance mode at 276 nm. The Rf values were 0.41±0.03 and 0.60±0.04 for Lamivudine and Zidovudine, respectively. The linearity of the method was found to be within the concentration range of 50-250 ng/spot for Lamivudine and 100-500 ng/spot for Zidovudine. The lower limits of detection and quantification were 2.23 ng/spot and 7.90 ng/spot for Lamivudine, and 2.90 ng/spot and 8.85 ng/spot for Zidovudine. The method was also validated for precision, specificity and recovery. This developed method was used to analyze fixed-dose tablet (Duovir, Cipla Ltd) samples of Lamivudine and Zidovudine. Keywords: Normal-phase HPTLC, Lamivudine, Zidovudine, Methanol

  18. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics – or researchers seeking a no-nonsense approach – it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singularly perturbed reaction-advection-diffusion equations in one- and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  19. Fundamental analysis and development of the current and voltage control method by changing the driving frequency for the transcutaneous energy transmission system.

    Science.gov (United States)

    Miura, Hidekazu; Yamada, Akihiro; Shiraishi, Yasuyuki; Yambe, Tomoyuki

    2015-08-01

    We have been developing a transcutaneous energy transmission system (TETS) for ventricular assist devices, shape memory alloy (SMA) fibered artificial organs and similar applications; the system has high efficiency and a compact size. In this paper, we summarize the development, design method and characteristics of the TETS. New control methods for stabilizing the output voltage or current of the TETS are proposed. These methods act on the primary side, outside the body, and do not depend on a communication link from inside the body. Basically, the TETS operates at a fixed frequency with a suitable compensation capacitor so that the internal impedance is minimized and a flat load characteristic is obtained. However, when the coil shifts from the optimal position, the coupling factor changes and the output fluctuates. The TETS has a resonant property, so its output can be controlled by changing the driving frequency. A continuous-current to continuous-voltage driving method was implemented by changing the driving frequency and setting a lower limit on the driving frequency. This method is useful for the battery charging system of electrically driven artificial hearts and also for SMA fibered artificial organs, which need intermittent high peak power consumption. In this system, the internal storage capacitor is charged slowly while the fibers are turned off and discharges its energy when the fibers are turned on. We examined the effect of the system. It was found that the size and maximum output of the TETS could be reduced.

  20. Developing methods of controlling quality costs

    OpenAIRE

    Gorbunova A. V.; Maximova O. N.; Ekova V. A.

    2017-01-01

    The article examines issues of managing quality costs, problems of applying economic methods of quality control, and the implementation of progressive methods of quality cost management in enterprises with a view to improving the efficiency of their evaluation and analysis. With the aim of increasing the effectiveness of the cost management mechanism, the authors introduce controlling as a tool of deviation analysis from the standpoint of the process approach. A list of processes and corresponding eva...

  1. Development and comparison of two multi-residue methods for the analysis of select pesticides in honey bees, pollen, and wax by gas chromatography-quadrupole mass spectrometry.

    Science.gov (United States)

    Li, Yuanbo; Kelley, Rebecca A; Anderson, Troy D; Lydy, Michael J

    2015-08-01

    One of the hypotheses that may help explain the loss of honey bee colonies worldwide is the increasing potential for exposure of honey bees to complex mixtures of pesticides. To better understand this phenomenon, two multi-residue methods based on different extraction and cleanup procedures have been developed and compared for the determination of 11 relevant pesticides in honey bees, pollen, and wax by gas chromatography-quadrupole mass spectrometry. Sample preparatory methods included solvent extraction followed by gel permeation chromatography (GPC) cleanup, and cleanup using dispersive solid-phase extraction with zirconium-based sorbents (Z-Sep). Matrix effects, method detection limits, recoveries, and reproducibility were evaluated and compared. Method detection limits (MDL) of the pesticides for the GPC method in honey bees, pollen, and wax ranged from 0.65 to 5.92 ng/g dw, 0.56 to 6.61 ng/g dw, and 0.40 to 8.30 ng/g dw, respectively, while MDLs for the Z-Sep method were from 0.33 to 4.47 ng/g dw, 0.42 to 5.37 ng/g dw, and 0.51 to 5.34 ng/g dw, respectively. The mean recoveries in all matrices and at three spiking concentrations ranged from 64.4% to 149.5% and 71.9% to 126.2% for the GPC and Z-Sep methods, with relative standard deviations of 1.5-25.3% and 1.3-15.9%, respectively. The results showed that the Z-Sep method was more suitable for the determination of the target pesticides, especially chlorothalonil, in bee hive samples. The Z-Sep method was then validated using a series of field-collected bee hive samples taken from honey bee colonies in Virginia. Copyright © 2015 Elsevier B.V. All rights reserved.
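
    The record does not state which MDL convention was used; a widely used convention derives the MDL from replicate low-level spikes as t(n-1, 0.99) times their standard deviation, sketched below in Python with hypothetical replicate values:

    import numpy as np
    from scipy import stats

    replicates = np.array([2.1, 1.8, 2.4, 2.0, 2.3, 1.9, 2.2])   # ng/g dw, hypothetical spikes
    s = np.std(replicates, ddof=1)
    t99 = stats.t.ppf(0.99, df=len(replicates) - 1)
    mdl = t99 * s
    print(f"MDL ~ {mdl:.2f} ng/g dw")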

  2. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    .... Such methods include those based upon spectrophotometry, flame emission spectrometry and atomic absorption spectroscopy, as well as gravimetry, titrimetry and the use of ion-selective electrodes...

  3. Development of a Radial Deconsolidation Method

    Energy Technology Data Exchange (ETDEWEB)

    Helmreich, Grant W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Montgomery, Fred C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hunn, John D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  4. Development of a chromatographic separation method hyphenated to electro-spray ionization mass spectrometry (ESI-MS) and inductively coupled plasma mass spectrometry (ICP-MS): application to the lanthanides speciation analysis

    International Nuclear Information System (INIS)

    Beuvier, Ludovic

    2015-01-01

    This work focuses on the development of a chromatographic separation method coupled to both ESI-MS and ICP-MS in order to achieve a comprehensive speciation analysis of lanthanides in aqueous phases representative of the back-extraction phases of advanced spent nuclear fuel treatment processes. This analytical method allowed the separation, characterization and quantitation of lanthanide complexes with poly-aminocarboxylic ligands, such as DTPA and EDTA, used as complexing agents in these processes. A HILIC separation method for the lanthanide complexes was developed with an amide-bonded stationary phase. A screening of a wide range of mobile phase compositions demonstrated that the adsorption mechanism was predominant and also yielded optimized separation conditions. Faster analysis conditions with a shorter amide column packed with sub-2 μm particles reduced the analysis time by a factor of 2.5 and the solvent consumption by 25%. Isotopic and structural characterization by HILIC ESI-MS was performed, as well as the development of an external calibration quantitation method, whose analytical performances were determined. Finally, the coupling of HILIC to ESI-MS and ICP-MS was achieved. A simultaneous quantitation method by ESI-MS and ICP-MS was applied to determine the quantitative distribution of the species in solution, and the analytical performances of this quantitation method were also determined. (author) [fr

  5. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    densitometric method, was based on the separation of the mixture on silica gel plates using chloroform: methanol (93:7, v/v) as a mobile phase. Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage ...

  6. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests regarding information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and to fulfil this role libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  7. Development of fatigue cracks from mechanically machined scratches on 2024-T351 aluminium alloy - Part II: finite element analysis and prediction method

    OpenAIRE

    Cini, Andrea; Irving, Phil E.

    2016-01-01

    A prediction method to evaluate the effect of scratch geometry on fatigue life of aluminium structures containing scribe marks was developed on the basis of the experimental results described in Part I of this paper. Finite element calculations were performed on scribed samples to investigate the local stress around scribes. Elastic and elastic plastic stress and strain distributions at the scribe root were computed under monotonic and cyclic tensile and bending loads evaluating the driving f...

  8. Development and Validation of a GC-MS Method for the Analysis of Homogentisic Acid in Strawberry Tree (Arbutus unedo L.) Honey.

    Science.gov (United States)

    Brčić Karačonji, Irena; Jurica, Karlo

    2017-07-01

    To confirm the botanical origin of strawberry tree (Arbutus unedo L.) honey, a liquid-liquid extraction followed by GC-MS was developed for the quantitative determination of homogentisic acid (HGA), the main phenolic compound in this honey. Different parameters affecting extraction, such as the type and volume of extraction solvents, the pH of the solution, and the amount of salt, were optimized. The method showed good linearity (r2 = 0.9990) over the tested concentration range (50-500 mg/kg) and a low LOD (0.3 mg/kg). Precision was expressed as RSD. The method was applied to strawberry tree honey samples from Croatia; the HGA content in the analyzed samples (n = 7) ranged from 245.1 to 485.9 mg/kg. The proposed method provided reliable performance and can be easily implemented for the routine monitoring of HGA in strawberry tree honey in order to assure honey QC.

  9. Development of an efficient fungal DNA extraction method to be used in random amplified polymorphic DNA-PCR analysis to differentiate cyclopiazonic acid mold producers.

    Science.gov (United States)

    Sánchez, Beatriz; Rodríguez, Mar; Casado, Eva M; Martín, Alberto; Córdoba, Juan J

    2008-12-01

    A variety of previously established mechanical and chemical treatments to achieve fungal cell lysis, combined with a semiautomatic system operated by a vacuum pump, were tested to obtain DNA extracts to be used directly in randomly amplified polymorphic DNA (RAPD)-PCR to differentiate cyclopiazonic acid-producing and -nonproducing mold strains. A DNA extraction method that includes digestion with proteinase K and lyticase prior to grinding with a mortar and pestle and a semiautomatic vacuum system yielded DNA of high quality in all the fungal strains and species tested, at concentrations ranging from 17 to 89 ng/µl in 150 µl of the final DNA extract. Two microliters of DNA extracted with this method was used directly for RAPD-PCR with primer (GACA)4. Reproducible RAPD fingerprints showing clear differences between producer and nonproducer strains were observed. These differences in the RAPD patterns did not cluster all the strains tested by cyclopiazonic acid production, but may be very useful to distinguish cyclopiazonic acid producer strains from nonproducer strains by a simple RAPD analysis. Thus, the DNA extracts obtained could be used directly, without prior purification and quantification, for RAPD analysis to differentiate cyclopiazonic acid producer from nonproducer mold strains. This combined analysis could be adapted to other toxigenic fungal species to enable differentiation of toxigenic and non-toxigenic molds, a procedure of great interest in food safety.

  10. Development of a liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous analysis of intact glucosinolates and isothiocyanates in Brassicaceae seeds and functional foods.

    Science.gov (United States)

    Franco, P; Spinozzi, S; Pagnotta, E; Lazzeri, L; Ugolini, L; Camborata, C; Roda, A

    2016-01-08

    A new high pressure liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous determination of glucosinolates, such as glucoraphanin and glucoerucin, and the corresponding isothiocyanates, such as sulforaphane and erucin, was developed and applied to quantify these compounds in Eruca sativa defatted seed meals and enriched functional foods. The method involved solvent extraction; separation was achieved in gradient mode using water with 0.5% formic acid and acetonitrile with 0.5% formic acid on a reversed-phase C18 column. The electrospray ion source operated in negative and positive mode for the detection of glucosinolates and isothiocyanates, respectively, and multiple reaction monitoring (MRM) was selected as the acquisition mode. The method was validated following the ICH guidelines, and replicate experiments demonstrated good accuracy (bias%) for food products enriched with glucosinolates, such as nutraceutical bakery products. In addition, the developed method was applied to the simultaneous determination of glucosinolates and isothiocyanates in a bakery product enriched with glucosinolates, to evaluate their thermal stability after different industrial processes from cultivation phases to consumer processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    Bruin, M. de.

    1983-01-01

    This thesis describes the way in which instrumental neutron activation analysis (INAA) has been developed at IRI into an automated system for routine analysis. The work is based on 20 publications describing the development of INAA since 1968. (Auth.)

  12. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  13. Report of Research Cooperation Sub-Committee 46 on research and development of methods for inelastic (EPICC: Elastic-PlastIC-Creep) structural analysis

    International Nuclear Information System (INIS)

    Yamada, Yoshiaki

    1977-05-01

    This report succeeds the preceding one on ''Verification and Qualification of Nonlinear Structural Analysis Computer Program''. PNC (Power Reactor and Nuclear Fuel Development Corporation) decided to sponsor an extended research project on inelastic structural analysis for a period spanning September 1976 to May 1978. Responding to the PNC proposal, RC Sub-Committee 46 was formed in the Japan Society of Mechanical Engineers and began the cooperative work in October 1976. Besides the verification and/or qualification of available general-purpose computer programs, which were the major objectives of the previous contract, the Committee carried out research on topics categorized into the following three fields of interest: 1. Material data for use in inelastic analysis, 2. Inelastic analysis procedures and computer program verification, 3. Design codes and processing of computer solutions. This report summarizes the efforts during the first year of the Sub-Committee and consists of three parts, each corresponding to the research topics stated above: Part I. Inelastic constitutive equations for materials under high temperature service conditions; Part II. EPICC standard benchmark test problem and solutions; Part III. Examination of postprocessors and development. Although the research is still at an intermediate stage, the main activities under way are: 1. evaluative review and nationwide collection of material data, and recommendation of tentative constitutive equations for elastic-plastic and creep analyses of the benchmark test problem; 2. revision and augmentation of the EPICC standard benchmark test problem and competitive and/or cooperative execution of solutions; 3. review of existing prototypical postprocessors, and development of a processor for piping design. (author)

  14. Development and validation of a GC-C-IRMS method for the confirmation analysis of pseudo-endogenous glucocorticoids in doping control.

    Science.gov (United States)

    de la Torre, Xavier; Curcio, Davide; Colamonici, Cristiana; Molaioni, Francesco; Cilia, Marta; Botrè, Francesco

    2015-01-01

    Glucocorticoids are included in section S9 of the World Anti-Doping Agency (WADA) Prohibited List International Standard. Some of them are pseudo-endogenous steroids, like cortisol and cortisone, which have the same chemical structure as endogenously produced steroids. We propose an analytical method based on gas chromatography coupled to isotope ratio mass spectrometry (GC-C-IRMS) which allows discrimination between the endogenous and synthetic origin of the urinary metabolites of the pseudo-endogenous glucocorticoids. A preliminary purification of the target compounds (TC) (i.e., cortisol, tetrahydrocortisone (THE), 5α-tetrahydrocortisone (aTHE), tetrahydrocortisol (THF), and 5α-tetrahydrocortisol (aTHF)) by high-performance liquid chromatography (HPLC) allows collection of extracts with adequate purity for the subsequent analysis by IRMS. A population of 40 urine samples was analyzed for the TCs and for the endogenous reference compounds (ERC: i.e., 11-desoxy-tetrahydrocortisol (THS) or pregnanediol). For each sample, the difference between the delta values of the ERC and the TCs (Δδ values) was calculated, and on that basis decision limits for atypical findings are proposed. The limits are below 3‰ except for cortisol. The fitness for purpose of the method was confirmed by the analysis of urine samples collected from two patients under treatment with 25 mg of cortisone acetate (p.o.). The samples showed Δδ values higher than 3 for at least 24 h following administration, depending on the TC considered. The method can easily be integrated into existing procedures already used for the HPLC purification and IRMS analysis of pseudo-endogenous steroids with androgenic/anabolic activity. Copyright © 2015 John Wiley & Sons, Ltd.
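
    A minimal Python sketch of the Δδ criterion described here, with hypothetical delta values and a hypothetical decision limit; the record proposes compound-specific limits that are not reproduced:

    DECISION_LIMIT = 3.0   # per mil (hypothetical, for illustration)

    def delta_delta(delta_erc, delta_tc):
        """Return the difference between the delta value of the endogenous
        reference compound and that of the target compound, both in per mil."""
        return delta_erc - delta_tc

    sample = {"THS": -21.5, "THE": -25.1, "aTHE": -24.9, "THF": -24.4}   # hypothetical values
    erc = sample["THS"]
    for tc in ("THE", "aTHE", "THF"):
        dd = delta_delta(erc, sample[tc])
        flag = "atypical" if dd > DECISION_LIMIT else "within limits"
        print(f"{tc}: delta-delta = {dd:.1f} per mil -> {flag}")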

  15. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correction of imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image, and image measurements. Digitisation is made by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits; eight bits are summarised in one byte, so grey values can take one of 2^8 = 256 values. The human eye seems to be quite content with a display of 5-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination across the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel by pixel, or division of the original image by the background image]. The brightness of an image, represented by its grey values, can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, the mean grey value and the entropy. The distribution of the grey values within an image is one of its most important characteristics; however, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
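
    A minimal Python sketch of some of the pixel-based descriptors mentioned above (grey-value histogram, mean grey value, entropy) and of the simple contrast stretch that spreads the histogram over the full 8-bit range, using a synthetic image in place of real data:

    import numpy as np

    img = np.random.randint(60, 140, size=(256, 256)).astype(np.uint8)   # hypothetical image

    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    mean_grey = img.mean()
    p = hist / hist.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))           # in bits

    lo, hi = int(img.min()), int(img.max())
    stretched = ((img.astype(float) - lo) / (hi - lo) * 255).astype(np.uint8)
    print(f"mean grey value {mean_grey:.1f}, entropy {entropy:.2f} bits")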

  16. A development of the direct Lyapunov method for the analysis of transient stability of a system of synchronous generators based on the determination of non-stable equilibria on a multidimensional sphere

    Directory of Open Access Journals (Sweden)

    A. V. Stepanov

    2014-01-01

    Full Text Available We consider the problem of transient stability analysis for a system of synchronous generators under the action of strong perturbations. The aim of our work is to develop methods for analyzing the transient stability of a system of synchronous generators which give trustworthy results on the transient stability margin under different perturbations. For the analysis of transient stability, we use the direct Lyapunov method. One of the problems in applying this method is to find a Lyapunov function that reflects well the properties of a parallel system of synchronous generators; the most reliable results were obtained when the analysis of transient stability was performed with a Lyapunov function of energy type. Another problem in applying the direct Lyapunov method is to determine the critical value of the Lyapunov function, which requires finding the non-stable equilibria of the system. Determination of the non-stable equilibria requires studying the Lyapunov function in a multidimensional space in a neighborhood of a stable equilibrium of the post-breakdown system; this is a complicated non-linear problem. In the paper, we propose a method for determining the non-stable equilibria on a multidimensional sphere. The method is based on a search for a minimum of the Lyapunov function on a multidimensional sphere whose center is a stable equilibrium. Compared with other approaches, e.g. gradient methods, our method allows a non-stable equilibrium to be found reliably and the critical value to be calculated; its reliability is demonstrated by numerical experiments. The developed methods and a program realized in a MATLAB package can be recommended for design of a post-breakdown control system of synchronous generators or as a
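
    A minimal Python sketch of the idea described in this record, searching for the minimum of an energy-type Lyapunov function on a sphere centred at a stable equilibrium; the function, the equilibrium and the radius are hypothetical stand-ins, since the record does not give the actual system data:

    import numpy as np
    from scipy.optimize import minimize

    def V(x):
        """Toy energy-type function with several wells (hypothetical)."""
        return 0.5 * np.dot(x, x) - np.sum(np.cos(2.0 * x))

    x_s = np.zeros(3)     # stable equilibrium of the post-breakdown system (hypothetical)
    r = 1.5               # radius of the search sphere (hypothetical)

    def on_sphere(angles):
        """Parametrise points of the sphere around x_s by two angles,
        which avoids an explicit equality constraint."""
        th, ph = angles
        return x_s + r * np.array([np.sin(th) * np.cos(ph),
                                   np.sin(th) * np.sin(ph),
                                   np.cos(th)])

    res = minimize(lambda a: V(on_sphere(a)), x0=np.array([1.0, 1.0]))
    x_min = on_sphere(res.x)             # direction towards a non-stable equilibrium
    critical_value_estimate = V(x_min)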

  17. Development of a method to determine the specific environment as a starting point for the strategic analysis and the approach to competitors' Knowledge: presentation and applications

    Directory of Open Access Journals (Sweden)

    Emilio García Vega

    2015-09-01

    Full Text Available The determination of the specific environment is important for the formulation of efficient enterprise strategies on the basis of a properly focused strategic analysis. This paper suggests a method to help delimit and identify that environment. It aims to offer a simple and practical tool that allows a more accurate identification of the industry to be analysed, as well as a clearer specification of direct and substitute competition. With this tool, the managers of a business idea, or of an established or new organization, will gain an approach to these themes, which are of strategic importance in any type of management. Likewise, two applications of the proposed method are presented: the first oriented to a business idea and the second to supermarkets with a high service charge in Lima, Peru.

  18. Development of Tsunami PSA method for Korean NPP site

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In Kil; Park, Jin Hee

    2010-01-01

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, the evaluation of the tsunami return period is the major task; for this evaluation, numerical analysis and empirical methods can be applied. The method was applied to the Ulchin 5 and 6 nuclear power plants, located on the east coast of the Korean peninsula. Through this study, the whole tsunami PSA working procedure was established and an example calculation was performed for a real nuclear power plant in Korea

  19. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment

  20. Uni-dimensional double development HPTLC-densitometry method for simultaneous analysis of mangiferin and lupeol content in mango (Mangifera indica) pulp and peel during storage.

    Science.gov (United States)

    Jyotshna; Srivastava, Pooja; Killadi, Bharti; Shanker, Karuna

    2015-06-01

    Mango (Mangifera indica) fruit is one of the important commercial fruit crops of India. Like other tropical fruits, it is highly perishable. During storage/ripening, changes in its physico-chemical quality parameters, viz. firmness, titratable acidity, total soluble solid content (TSSC), carotenoid content, and other biochemicals, are inevitable. A uni-dimensional double-development high-performance thin-layer chromatography (UDDD-HPTLC) method was developed for the real-time monitoring of mangiferin and lupeol in mango pulp and peel during storage. The quantitative determination of both compounds, which belong to different classes, was achieved by a densitometric HPTLC method. Silica gel 60F254 HPTLC plates and two solvent systems, toluene/EtOAc/MeOH and EtOAc/MeOH, respectively, were used for optimum separation and selective evaluation. Densitometric quantitation of mangiferin was performed at 390 nm, while lupeol was quantified at 610 nm after post-chromatographic derivatization. The validated method was used for real-time monitoring of the mangiferin and lupeol content during storage in four Indian cultivars, e.g. Bombay green (Bgreen), Dashehari, Langra, and Chausa. Significant correlations (p<0.05) of acidity and TSSC with mangiferin and lupeol in pulp and peel during storage were also observed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Prognostic Analysis System and Methods of Operation

    Science.gov (United States)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  2. Development of analysis method of material flow cost accounting using lean technique in food production: A case study of Universal Food Public (UFC) Co., Ltd.

    Directory of Open Access Journals (Sweden)

    Wichai Chattinnawat

    2015-06-01

    Full Text Available This research aims to apply the Lean technique, in conjunction with Material Flow Cost Accounting (MFCA) analysis, to the production process of canned sweet corn in order to increase process efficiency, eliminate waste and reduce the cost of production. The research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, integrated cost allocation based on activity types results in a higher proportion of negative product cost in comparison to that computed from conventional MFCA cost allocation. Thus, the types of activities and the process efficiency have great impacts on the cost structure, especially on the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases with a higher proportion of positive product cost.
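
    A minimal Python sketch of the basic MFCA allocation idea referred to above, in which input costs are split between positive products (good output) and negative products (material losses) in proportion to the material flow; the figures are hypothetical and the record's activity-based refinement is not reproduced:

    material_cost = 100_000.0       # cost of materials entering the process (hypothetical)
    system_energy_cost = 20_000.0   # processing (system + energy) cost (hypothetical)
    output_mass = 800.0             # kg of good product
    loss_mass = 200.0               # kg of material losses (waste, rework)

    positive_share = output_mass / (output_mass + loss_mass)
    negative_share = 1.0 - positive_share

    positive_cost = positive_share * (material_cost + system_energy_cost)
    negative_cost = negative_share * (material_cost + system_energy_cost)
    print(f"positive product cost: {positive_cost:.0f}, negative product cost: {negative_cost:.0f}")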

  3. Regional Development Sustainability Analysis Concept

    Directory of Open Access Journals (Sweden)

    Janno Reiljan

    2014-08-01

    Full Text Available Problems associated with the qualitative analysis and quantitative measurement of sustainability, and opportunities for connecting the concept with the methodological basis of development assessment and with the essence of the subject that values sustainability, are dealt with. The goal of the article is to work out the basics for the analysis of regional development in a country within the terms and framework of the sustainability concept. The article starts by outlining the definition of sustainability, which is followed by an analysis of the nature of sustainability. The third subsection highlights the demands placed on the decision-making process in guaranteeing sustainability and then considers sustainability in a competitive environment. In the second part of the article, the sustainable development concept is implemented in regional development sustainability analysis.

  4. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines

  5. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    A validated, modified Lycopodium spore method has been developed for the simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was performed on the ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  6. Development of spectrophotometric fingerprinting method for ...

    African Journals Online (AJOL)

    Selective and efficient analytical methods are required not only for quality assurance but also for authentication of herbal formulations. A simple, rapid and validated fingerprint method has been developed for the estimation of piperine in 'Talisadi churna', a well known herbal formulation in India. The estimation was carried out in two ...

  7. Development of seismic design method for piping system supported by elastoplastic damper. 3. Vibration test of three-dimensional piping model and its response analysis

    International Nuclear Information System (INIS)

    Namita, Yoshio; Kawahata, Jun-ichi; Ichihashi, Ichiro; Fukuda, Toshihiko.

    1995-01-01

    Component and piping systems in current nuclear power plants and chemical plants are designed with many supports to maintain safety and reliability against earthquakes. However, these supports are rigid and have only a slight energy-dissipating effect. It is well known that applying high-damping supports to a piping system is very effective for reducing its seismic response. In this study, we investigated the design method of the elastoplastic damper [energy absorber (EAB)] and the seismic design method for a piping system supported by the EAB. Our final goal is to develop technology for applying the EAB to the piping system of an actual plant. In this paper, the vibration test results of the three-dimensional piping model are presented. From the test results, it is confirmed that the EAB has a large energy-dissipating effect and is effective in reducing the seismic response of the piping system, and that the seismic design method for the piping system - a response spectrum mode superposition method using the damping of each mode and requiring iterative calculation of the EAB displacement - is applicable to the three-dimensional piping model. (author)

  8. Developing a TQM quality management method model

    OpenAIRE

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This model describes the primary quality management methods which may be used to assess an organization's present strengths and weaknesses with regard to its use of quality management methods. This model ...

  9. Development and application of a mass spectrometric isotope dilution analysis using negative thermal ionisation for the determination of boron traces

    International Nuclear Information System (INIS)

    Zeininger, H.

    1984-01-01

    A mass spectrometric trace boron determination using negative thermal ionisation was developed. It is based on the determination of the isotope ratio of BO2(-) ions ((10)B and (11)B). The high stability and constant intensity of the BO2(-) ion currents at a given temperature allow a computer-controlled measurement with programmed heating. The reproducibility lies at around 0.004-0.08%. Boron determination by potentiometry with a BF4(-) ion-selective electrode was used as an analytical comparison method. The MS-IDA was first used on metal samples, such as Al, Zr and steel; later the boron content of reagents, biological materials (milk powder, spinach, water plants) and water was determined. For this, material-dependent hydrolysis and separation procedures were worked out. In comparison to all other analytical methods used by the other collaborators, the MS-IDA offers the greatest accuracy. (RB) [de
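
    A minimal Python sketch of the isotope dilution calculation that underlies MS-IDA: a spike enriched in (10)B is blended with the sample, and the amount of boron in the sample follows from the measured (11)B/(10)B ratio of the blend. All numbers are hypothetical and the exact working equation used in the thesis is not reproduced:

    def idms_amount(n_spike, x10_spike, x10_sample, r_spike, r_sample, r_blend):
        """n_spike: mol B added as spike; x10_*: 10B atom fractions of spike
        and sample; r_*: 11B/10B ratios of spike, sample and measured blend."""
        return (n_spike * (x10_spike / x10_sample)
                * (r_spike - r_blend) / (r_blend - r_sample))

    n_sample = idms_amount(n_spike=1.0e-7,        # hypothetical values
                           x10_spike=0.95, x10_sample=0.199,
                           r_spike=0.053, r_sample=4.04, r_blend=0.50)
    print(f"boron in sample: {n_sample:.2e} mol")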

  10. Regional Development Sustainability Analysis Concept

    OpenAIRE

    Janno Reiljan

    2014-01-01

    Problems associated with the qualitative analysis and quantitative measurement of sustainability, and opportunities for connecting the concept with the methodological basis of development assessment and with the essence of the subject that values sustainability, are dealt with. The goal of the article is to work out the basics for the analysis of regional development in a country within the terms and framework of the sustainability concept. The article starts by outlining the definition of sustainability, which is fol...

  11. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education.

    Science.gov (United States)

    Bland, Andrew J; Tobbell, Jane

    2015-11-01

    Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to the pre-constructed categories imposed by some questionnaire and interview methods may provide only partial understanding. This is problematic for understanding the mechanisms of learning in simulation-based education, as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures the interaction of individuals within the simulation experience and can be analysed through multiple lenses, including context and the perspectives of both researcher and learner. The study utilised a grounded theory approach involving 31 undergraduate third-year student nurses. Data were collected and analysed through non-participant observation, digital recordings of simulation activity, and focus-group deconstruction of the recorded simulations by the participants and the researcher; focus-group interviews enabled further clarification. The method revealed multiple levels of dynamic data, leading to the conclusion that, in order to better understand how students learn in social and active learning strategies, dynamic data are required that enable researchers and participants to unpack what is happening as it unfolds in action. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Determination of 222Rn in fresh water: development of a robust method of analysis by alpha/beta separation liquid scintillation spectrometry.

    Science.gov (United States)

    Pates, Jacqueline M; Mullinger, Neil J

    2007-01-01

    Liquid scintillation spectrometry is used widely for determining (222)Rn in natural waters; however, the benefits of alpha/beta separation have not been fully explored. The extractants toluene and Ultima Gold F were compared, and both performed well for a range of extreme waters. A robust method for calibrating extraction and counting efficiencies has been developed. Detection limits are 20 mBq l(-1) (toluene) and 16 mBq l(-1) (UGF) for a 60 min count and a 600-ml sample, halving the required sample volume.

  13. Spectroscopic Methods of Steroid Analysis

    Science.gov (United States)

    Kasal, Alexander; Budesinsky, Milos; Griffiths, William J.

    Modern chemical laboratories contain equipment capable of measuring many of the physical properties of single chemical compounds and mixtures of compounds, particularly their spectral properties, which can, if interpreted correctly, provide valuable information about both structure (of single compounds) and composition (of mixtures). Over the past 50 years, the authors have witnessed enormous progress in the technical capabilities of this equipment. Automation and speed of analysis have greatly improved the ease of use and the versatility of the technology.

  14. Convergence of the homotopy analysis method

    OpenAIRE

    Turkyilmazoglu, Mustafa

    2010-01-01

    The homotopy analysis method is studied in the present paper. The question of convergence of the homotopy analysis method is resolved. It is proven that under a special constraint the homotopy analysis method does converge to the exact solution of nonlinear ordinary or partial differential equations. An optimal value of the convergence control parameter is given through the square residual error. An error estimate is also provided. Examples, including the Blasius flow, ...

  15. Root Cause Analysis: Methods and Mindsets.

    Science.gov (United States)

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  16. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)

    2011-09-02

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with dual analytical line was developed and applied in the comparison of two different detection systems viz; a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO4(3-)) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.

  17. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    International Nuclear Information System (INIS)

    Mesquita, Raquel B.R.; Ferreira, M. Teresa S.O.B.; Toth, Ildiko V.; Bordalo, Adriano A.; McKelvie, Ian D.; Rangel, Antonio O.S.S.

    2011-01-01

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with dual analytical line was developed and applied in the comparison of two different detection systems viz; a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO4(3-)) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.

  18. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  19. UHPLC/MS-MS Analysis of Six Neonicotinoids in Honey by Modified QuEChERS: Method Development, Validation, and Uncertainty Measurement

    Directory of Open Access Journals (Sweden)

    Michele Proietto Galeano

    2013-01-01

    Full Text Available Rapid and reliable multiresidue analytical methods were developed and validated for the determination of 6 neonicotinoid pesticides (acetamiprid, clothianidin, imidacloprid, nitenpyram, thiacloprid, and thiamethoxam) in honey. A modified QuEChERS method allowed a very rapid and efficient single-step extraction, while the detection was performed by UHPLC/MS-MS. The recovery studies were carried out by spiking the samples at two concentration levels (10 and 40 μg/kg). The methods were subjected to a thorough validation procedure. The mean recovery was in the range of 75 to 114% with repeatability below 20%. The limits of detection were below 2.5 μg/kg, while the limits of quantification did not exceed 4.0 μg/kg. The total uncertainty was evaluated taking the main independent uncertainty sources into consideration. The expanded uncertainty did not exceed 49% for the 10 μg/kg concentration level and was in the range of 16-19% for the 40 μg/kg fortification level.
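
    A minimal Python sketch of how an expanded uncertainty of the kind quoted above can be assembled from independent contributions combined in quadrature and multiplied by a coverage factor k = 2; the individual budget terms are hypothetical, since the record does not list them:

    import math

    contributions = {            # relative standard uncertainties (hypothetical)
        "precision":   0.08,
        "recovery":    0.05,
        "calibration": 0.04,
        "volumes":     0.02,
    }
    u_combined = math.sqrt(sum(u ** 2 for u in contributions.values()))
    U_expanded = 2.0 * u_combined      # coverage factor k = 2 (approx. 95 % confidence)
    print(f"relative expanded uncertainty ~ {100 * U_expanded:.0f} %")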

  20. Development and Validation of a Normal Phase Chiral HPLC Method for Analysis of Afoxolaner Using a Chiralpak® AD-3 Column.

    Science.gov (United States)

    Zhuang, Jinyou; Kumar, Satish; Rustum, Abu

    2016-11-01

    Afoxolaner is a new antiparasitic molecule from the isoxazoline family that acts on the insect and acarine gamma-aminobutyric acid and glutamate receptors. The isoxazoline family of compounds has been employed as active pharmaceutical ingredients in drug products prescribed for the control of fleas and ticks in dogs. Afoxolaner, with a chiral center at the isoxazoline ring, exists as a racemic mixture. A normal-phase chiral high performance liquid chromatography method has been developed to verify that afoxolaner is a racemic mixture, as demonstrated by specific rotation, as well as to determine the enantiomeric purity of single-enantiomer samples. A Chiralpak® AD-3 column (150 × 4.6 mm I.D.) maintained at 35°C was used in the method. Analytes were analyzed with an isocratic elution using n-hexane/IPA/MeOH (89:10:1, v/v/v) as the mobile phase with a detection wavelength of 312 nm. The desired separation of the two enantiomers was achieved in <10 minutes, with resolution and selectivity factors of 5.0 and 1.54, respectively. The analytical method was appropriately validated according to ICH guidelines for its intended use. ® All marks are the property of their respective owners. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. A simple LC-MS/MS method for quantitative analysis of underivatized neurotransmitters in rats urine: assay development, validation and application in the CUMS rat model.

    Science.gov (United States)

    Zhai, Xue-jia; Chen, Fen; Zhu, Chao-ran; Lu, Yong-ning

    2015-11-01

    Many amino acid neurotransmitters in urine are associated with chronic stress as well as major depressive disorders. To better understand depression, an analytical LC-MS/MS method for the simultaneous determination of 11 underivatized neurotransmitters (4-aminohippurate, 5-HIAA, glutamate, glutamine, hippurate, pimelate, proline, tryptophan, tyramine, tyrosine and valine) in a single analytical run was developed. The advantage of this method is the simple preparation in that there is no need to deconjugate the urine samples. The quantification range was 25-12,800 ng mL(-1) with >85.8% recovery for all analytes. The nocturnal urine concentrations of the 11 neurotransmitters in chronic unpredictable mild stress (CUMS) model rats and control group (n = 12) were analyzed. A series of significant changes in urinary excretion of neurotransmitters could be detected: the urinary glutamate, glutamine, hippurate and tyramine concentrations were significantly lower in the CUMS group. In addition, the urinary concentrations of tryptophan as well as tyrosine were significantly higher in chronically stressed rats. This method allows the assessment of the neurotransmitters associated with CUMS in rat urine in a single analytical run, making it suitable for implementation as a routine technique in depression research. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  3. Development of a perfusion reversed-phase high performance liquid chromatography method for the characterisation of maize products using multivariate analysis.

    Science.gov (United States)

    Rodriguez-Nogales, J M; Garcia, M C; Marina, M L

    2006-02-03

    A perfusion reversed-phase high performance liquid chromatography (RP-HPLC) method has been designed to allow rapid (3.4 min) separations of maize proteins with high resolution. Several factors, such as extraction conditions, temperature, detection wavelength, and the type and concentration of the ion-pairing agent, were optimised. A fine optimisation of the gradient elution was also performed by applying experimental design. Commercial maize products for human consumption (flours, precooked flours, fried snacks and extruded snacks) were characterised for the first time by perfusion RP-HPLC, and their chromatographic profiles allowed a differentiation among products related to the different technological processes used for their preparation. Furthermore, applying discriminant analysis makes it possible to group the samples according to the technological process undergone by the maize products, obtaining a correct prediction for 92% of the samples.
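
    A minimal Python sketch of the discriminant-analysis step described above, using synthetic data in place of the real chromatographic profiles and product classes, which are not reproduced here:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    classes = ["flour", "precooked", "fried", "extruded"]
    n_per_class, n_peaks = 12, 8
    X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_peaks))
                   for i, _ in enumerate(classes)])      # synthetic peak areas
    y = np.repeat(classes, n_per_class)

    lda = LinearDiscriminantAnalysis()
    scores = cross_val_score(lda, X, y, cv=4)
    print(f"cross-validated classification rate: {scores.mean():.0%}")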

  4. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for the right decision-making shortly after a disaster. Exposed areas and areas possibly vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods for the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster management policies and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  5. A modified VIKOR method for multiple criteria analysis.

    Science.gov (United States)

    Chang, Chia-Ling

    2010-09-01

    The VIKOR method was developed to solve multiple criteria decision making (MCDM) problems with conflicting or non-commensurable criteria. The method assumes that compromise is acceptable for conflict resolution. Although the VIKOR method is popular in multi-criteria analysis (MCA), it has some problems when solving MCDM problems. This study discusses the existing problems of the traditional VIKOR method. The objective of this study was to develop a modified VIKOR method that avoids the numerical difficulties encountered when solving problems with the traditional VIKOR method. Several synthetic experiments were designed and assessed to verify the improvement in solution efficiency of the modified VIKOR method in MCA.
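
    For reference, a minimal Python sketch of the traditional VIKOR procedure that this record sets out to modify (the modification itself is not reproduced); the decision matrix and weights are hypothetical, and the normalisation terms show where divisions by zero - one of the numerical difficulties - can arise in degenerate cases:

    import numpy as np

    F = np.array([[7.0, 300.0, 0.8],      # rows = alternatives,
                  [9.0, 250.0, 0.6],      # columns = benefit criteria (hypothetical)
                  [6.0, 400.0, 0.9]])
    w = np.array([0.4, 0.35, 0.25])       # criteria weights
    v = 0.5                               # weight of the "group utility" strategy

    f_best = F.max(axis=0)                # ideal values
    f_worst = F.min(axis=0)               # anti-ideal values

    norm = (f_best - F) / (f_best - f_worst)
    S = (w * norm).sum(axis=1)            # group utility measure
    R = (w * norm).max(axis=1)            # individual regret measure
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))

    ranking = np.argsort(Q)               # smaller Q = better compromise
    print("S:", S, "R:", R, "Q:", Q, "ranking:", ranking)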

  6. Development of new NIR-spectroscopy method combined with multivariate analysis for detection of adulteration in camel milk with goat milk.

    Science.gov (United States)

    Mabood, Fazal; Jabeen, Farah; Ahmed, Manzor; Hussain, Javid; Al Mashaykhi, Saaida A A; Al Rubaiey, Zainb M A; Farooq, Saim; Boqué, Ricard; Ali, Liaqat; Hussain, Zahid; Al-Harrasi, Ahmed; Khan, Abdul Latif; Naureen, Zakira; Idrees, Mohammed; Manzoor, Suryyia

    2017-04-15

    A new NIR spectroscopy method combined with multivariate analysis for the detection and quantification of camel milk adulteration with goat milk was investigated. Camel milk samples were collected from the Aldhahira and Sharqia regions of the Sultanate of Oman and were measured by NIR spectroscopy in absorption mode over the wavelength range 700 to 2500 nm, at 2 cm⁻¹ resolution, using a 0.2 mm path length CaF₂ sealed cell. Multivariate methods, namely PCA, PLS-DA and PLS regression, were used for interpretation of the NIR spectral data. PLS-DA was used to detect the discrimination between pure and adulterated milk samples; for the PLS-DA model the R-square value obtained was 0.974 with an RMSE of 0.08. Furthermore, a PLS regression model was used to quantify adulteration levels of 0%, 2%, 5%, 10%, 15% and 20%. The PLS model showed RMSEC = 1.10% with R² = 94%. The method is simple and reproducible, with excellent sensitivity; the limit of detection was found to be 0.5% and the limit of quantification 2%. Copyright © 2016 Elsevier Ltd. All rights reserved.
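    A minimal sketch of the PLS regression step is given below, with synthetic spectra standing in for the measured NIR data; the spectral model, noise level and number of latent variables are assumptions for illustration, not values from the paper.

```python
# Hypothetical sketch: PLS regression to predict adulteration level (%)
# from NIR spectra. The spectra here are synthetic; in the study the
# predictors were absorbance values measured between 700 and 2500 nm.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
levels = np.repeat([0, 2, 5, 10, 15, 20], 10).astype(float)   # % goat milk
wavelengths = np.linspace(700, 2500, 300)

# Toy spectral model: baseline + a band whose intensity scales with level.
band = np.exp(-((wavelengths - 1450) / 40.0) ** 2)
X = (1.0 + 0.01 * levels[:, None] * band
     + rng.normal(scale=0.005, size=(levels.size, wavelengths.size)))

X_tr, X_te, y_tr, y_te = train_test_split(X, levels, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
print(f"RMSEP on held-out samples: {rmsep:.2f} %")
```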

  7. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of VTT in reliability and safety design and analysis techniques has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety continues in a number of research and development projects

  8. Development of a loop-mediated isothermal amplification method for detecting Streptococcus equi subsp. zooepidemicus and analysis of its use with three simple methods of extracting DNA from equine respiratory tract specimens.

    Science.gov (United States)

    Kinoshita, Yuta; Niwa, Hidekazu; Katayama, Yoshinari

    2014-09-01

    Streptococcus equi subsp. zooepidemicus (S. zooepidemicus) is a dominant pathogenic bacterium in equine pneumonia. We developed a specific loop-mediated isothermal amplification (LAMP) method, which targets the gene encoding sorbitol-6-phosphate 2-dehydrogenase (sorD), for detecting S. zooepidemicus and examined the clinical efficacies of its use in combination with each of 3 DNA extraction methods easily used by veterinary practitioners, namely the Loopamp PURE DNA Extraction Kit, InstaGene Matrix and a conventional boiling method. The LAMP method plus the Loopamp PURE DNA Extraction Kit gave higher rates of positivity than the other combinations in both clinical and spiked samples containing clinically significant concentrations (>1 × 10⁴ CFU/ml) of S. zooepidemicus.

  9. Strategic Options Development and Analysis

    Science.gov (United States)

    Ackermann, Fran; Eden, Colin

    Strategic Options Development and Analysis (SODA) enables a group or individual to construct a graphical representation of a problematic situation, and thus explore options and their ramifications with respect to a complex system of goals or objectives. In addition, the method aims to help groups arrive at a negotiated agreement about how to act to resolve the situation. It is based upon the use of causal mapping - a formally constructed means-ends network - as the representation form. Because the picture has been constructed using the natural language of the problem owners, it becomes a model of the situation that is 'owned' by those who define the problem. The use of formalities for the construction of the model makes it amenable to a range of analyses as well as encouraging reflection and a deeper understanding. These analyses can be used in a 'rough and ready' manner by visual inspection or through the use of specialist causal mapping software (Decision Explorer). Each of the analyses helps a group or individual discover important features of the problem situation, and these features facilitate agreeing a good solution. The SODA process is aimed at helping a group learn about the situation they face before they reach agreements. Most significantly, the exploration through the causal map leads to a higher probability of more creative solutions and promotes solutions that are more likely to be implemented, because the problem construction process is wider and more likely to include richer social dimensions about the blockages to action and organizational change. The basic theories that inform SODA derive from cognitive psychology and social negotiation, where the model acts as a continuously changing representation of the problematic situation - changing as the views of a person or group shift through learning and exploration. This chapter, jointly written by two leading practitioner academics and the original developers of SODA, Colin Eden and Fran Ackermann

  10. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  11. Development of a nondestructive method for underglaze painted tiles--demonstrated by the analysis of Persian objects from the nineteenth century.

    Science.gov (United States)

    Reiche, Ina; Röhrs, Stefan; Salomon, Joseph; Kanngiesser, Birgit; Höhn, Yvonne; Malzer, Wolfgang; Voigt, Friederike

    2009-02-01

    The paper presents an analytical method developed for the nondestructive study of nineteenth-century Persian polychrome underglaze painted tiles. As an example, 9 tiles from French and German museum collections were investigated. Before this work was undertaken, little was known about the materials used in pottery at that time, although the broad range of colors and shades, together with their brilliant glazes, made these objects stand out when compared with Iranian ceramics of the preceding periods and suggested the use of new pigments, colorants, and glaze compositions. These materials are thought to be related to provenance and as such are appropriate criteria for art-historical attribution. The analytical method is based on the combination of different nondestructive spectroscopic techniques using microfocused beams, such as proton-induced X-ray emission/proton-induced gamma-ray emission, X-ray fluorescence, 3D X-ray absorption near edge structure, and confocal Raman spectroscopy, as well as visible spectroscopy. It was established to address the specific difficulties raised by these objects and by the technique of underglaze painting. The exact definition of the colors observed on the tiles using the Natural Color System helped to attribute them to different colorants. It was possible to establish the presence of Cr- and U-based colorants as new materials in nineteenth-century Persian tilemaking. The difference in glaze composition (Pb, Sn, Na, and K contents) as well as the use of B and Sn were identified as potential markers for different workshops.

  12. Development and validation of automatic HS-SPME with a gas chromatography-ion trap/mass spectrometry method for analysis of volatiles in wines.

    Science.gov (United States)

    Paula Barros, Elisabete; Moreira, Nathalie; Elias Pereira, Giuliano; Leite, Selma Gomes Ferreira; Moraes Rezende, Claudia; Guedes de Pinho, Paula

    2012-11-15

    An automated headspace solid-phase microextraction (HS-SPME) method combined with gas chromatography-ion trap/mass spectrometry (GC-IT/MS) was developed in order to quantify a large number of volatile compounds in wines, such as alcohols, esters, norisoprenoids and terpenes. The procedure was optimized for SPME fiber selection, pre-incubation temperature and time, extraction temperature and time, and salt addition. A central composite experimental design was used in the optimization of the extraction conditions. The volatile compounds showed optimal extraction using a DVB/CAR/PDMS fiber, incubation of 5 ml of wine with 2 g NaCl at 45 °C for 5 min, and subsequent extraction for 30 min at the same temperature. The method allowed the identification of 64 volatile compounds. Afterwards, the method was validated successfully for the most significant compounds and was applied to study the volatile composition of different white wines. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Development of a LC-MS/MS method for the analysis of enniatins and beauvericin in whole fresh and ensiled maize.

    Science.gov (United States)

    Sørensen, Jens Laurids; Nielsen, Kristian Fog; Rasmussen, Peter Have; Thrane, Ulf

    2008-11-12

    An LC-MS/MS method for the detection of beauvericin and the four enniatins A, A1, B, and B1 in maize and maize silage was developed. The method uses direct injection of maize extracts without any tedious and laborious cleanup procedures. The limit of quantification was determined at 13 ng g⁻¹ for beauvericin and at 17, 34, 24, and 26 ng g⁻¹ for enniatins A, A1, B, and B1, respectively. The method was used in surveys of the compounds in fresh maize samples collected at harvest in 2005 and 2006. All samples had the same distribution of the enniatins: B > B1 > A1 > A. Enniatin B was present in 90% of the samples in 2005 and in 100% in 2006, at levels up to 489 and 2598 ng g⁻¹, respectively. Beauvericin contamination was more frequently detected in 2006 than in 2005 (89 and 10%, respectively) and in higher amounts (988 and 71 ng g⁻¹, respectively). The occurrence of beauvericin and the four enniatins was examined in 3-month-old maize silage stacks from 20 different farms. As observed in fresh maize, enniatin B was the most abundant compound in ensiled maize and was found in 19 stacks at levels up to 218 ng g⁻¹. The stability of enniatin B in maize silage was assessed by analyzing samples from 10 of the silage stacks taken after 3, 7, and 11 months of ensiling. Enniatin B could be detected at all locations after 11 months and appeared to be stable during ensiling.

  14. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    Force method in the pre-computer era was the popular analysis tool for civil, mechanical and aerospace engineering structures. This popularity can be attributed to its ability to determine accurate estimates of forces in the structure. During the formulative period of structural analysis by matrix methods, earnest research was ...

  15. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...

  16. Development of eddy current analysis code "INCANET"

    International Nuclear Information System (INIS)

    Uesaka, Mitsuru; Hoshi, Yuichi

    1987-01-01

    The eddy current analysis code INCANET (IHI Induced Current Analysis Code by the Network Mesh Method) for arbitrary thin shell structures was developed based on the Network Mesh Method. In this method, developed at Princeton University in the United States in 1979, a continuous surface is approximated by a network of equivalent circuits. The eddy current, the magnetic field it generates, the Joule power loss and the Lorentz force are calculated using INCANET, which is installed in the VAX/CAE system, and a subsequent stress analysis under the Lorentz force is available. The results of INCANET were confirmed in the International Workshop for Eddy Current Code Comparison (Tokyo, 1986). INCANET has been successfully applied to advanced technical fields such as magnetic fusion devices, a positron storage ring, and Magnetic Resonance Imaging (MRI). (author)

  17. Rapid Method Development in Hydrophilic Interaction Liquid Chromatography for Pharmaceutical Analysis Using a Combination of Quantitative Structure-Retention Relationships and Design of Experiments.

    Science.gov (United States)

    Taraji, Maryam; Haddad, Paul R; Amos, Ruth I J; Talebi, Mohammad; Szucs, Roman; Dolan, John W; Pohl, Chris A

    2017-02-07

    A design-of-experiment (DoE) model was developed that describes the retention times of a mixture of pharmaceutical compounds in hydrophilic interaction liquid chromatography (HILIC) under all possible combinations of acetonitrile content, salt concentration, and mobile-phase pH with R² > 0.95. Further, a quantitative structure-retention relationship (QSRR) model was developed to predict retention times for new analytes, based only on their chemical structures, with a root-mean-square error of prediction (RMSEP) as low as 0.81%. A compound classification based on the concept of similarity was applied prior to QSRR modeling. Finally, we utilized a combined QSRR-DoE approach to propose an optimal design space in a quality-by-design (QbD) workflow to facilitate HILIC method development. The mathematical QSRR-DoE model was shown to be highly predictive when applied to an independent test set of unseen compounds in unseen conditions, with an RMSEP value of 5.83%. The QSRR-DoE-computed retention times of the pharmaceutical test analytes, and the separation selectivity calculated from them, were used to optimize the chromatographic conditions for efficient separation of the targets. A Monte Carlo simulation was performed to evaluate the risk of uncertainty in the model's prediction and to define the design space where the desired quality criterion was met. Experimental realization of peak selectivity between targets under the selected optimal working conditions confirmed the theoretical predictions. These results demonstrate how the discovery of optimal conditions for the separation of new analytes can be accelerated by the use of appropriate theoretical tools.

  18. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. P. Kalinov

    2012-01-01

    Full Text Available The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of applying each method, the following criteria have been proposed: volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to these criteria, a classification of vibration diagnostics methods is presented to identify their advantages and disadvantages and to guide their further development and improvement. The paper contains a comparative assessment of the methods in accordance with the proposed criteria, according to which the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
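    Since the comparison singles out spectral analysis and envelope spectral analysis as the most efficient techniques, a brief sketch of both is given below; the simulated fault signal, resonance frequency and fault frequency are illustrative assumptions, not data from the paper.

```python
# Sketch: spectrum and envelope spectrum of a simulated bearing-fault-like
# vibration signal (amplitude-modulated resonance). All signal parameters
# are illustrative.
import numpy as np
from scipy.signal import hilbert

fs = 10_000                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f_res, f_fault = 2_000, 120    # structural resonance and fault frequency, Hz

x = ((1 + 0.8 * np.square(np.cos(np.pi * f_fault * t)))      # modulation at f_fault
     * np.sin(2 * np.pi * f_res * t)
     + 0.3 * np.random.default_rng(2).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(x)) / t.size                    # ordinary spectrum
envelope = np.abs(hilbert(x))                                 # signal envelope
env_spectrum = np.abs(np.fft.rfft(envelope - envelope.mean())) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# The fault frequency shows up in the envelope spectrum rather than the raw one.
print("peak of envelope spectrum at %.0f Hz" % freqs[env_spectrum.argmax()])
```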

  19. The delayed neutron method of uranium analysis

    International Nuclear Information System (INIS)

    Wall, T.

    1989-01-01

    The technique of delayed neutron analysis (DNA) is discussed. The DNA rig installed on the MOATA reactor, the assay standards and the types of samples which have been assayed are described. Of the total sample throughput of about 55,000 units since the uranium analysis service began, some 78% has been concerned with analysis of uranium ore samples derived from mining and exploration. Delayed neutron analysis provides a high sensitivity, low cost uranium analysis method for both uranium exploration and other applications. It is particularly suitable for analysis of large batch samples and for non-destructive analysis over a wide range of matrices. 8 refs., 4 figs., 3 tabs

  20. Radiochemistry and nuclear methods of analysis

    International Nuclear Information System (INIS)

    Ehmann, W.D.; Vance, D.

    1991-01-01

    This book provides both the fundamentals of radiochemistry and specific applications of nuclear techniques to analytical chemistry. It includes such areas of application as radioimmunoassay and activation techniques using very short-lived indicator radionuclides. It emphasizes current nuclear methods of analysis such as neutron activation, PIXE, nuclear reaction analysis, Rutherford backscattering, isotope dilution analysis and others

  1. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.

  2. Analysis and development of methods for the recovery of degraded tri-n-butyl phosphate (TBP)-30%V/V-dodecane

    International Nuclear Information System (INIS)

    Dalston, C.O.

    1984-01-01

    Tri-n-butyl phosphate, associated with an inert hydrocarbon, is nowadays the principal solvent used in the reprocessing of irradiated nuclear fuel from pressurized water reactors. The combined action of radiation and nitric acid causes severe damage to the solvent during the reprocessing steps. The recovery of the solvent is therefore important, since it decreases the amount of waste and improves the economy of the process. A comparative analysis of several methods for the recovery of this solvent was carried out: alkaline washing, adsorption with resins, adsorption with aluminium oxide, adsorption by active carbon and adsorption by vermiculite. Some modifications of the ⁹⁵Zr analytical test were made, and two new parameters were mathematically defined: the degradation grade and the recovery efficiency. Through this modified ⁹⁵Zr test, the residence time and the ratio of degraded solvent to recuperator were determined. After the laboratory tests had been performed, vermiculite associated with active carbon was employed in the treatment of 50 liters of tri-n-butyl phosphate (30%V/V)-dodecane degraded by hydrolysis. Subsequent analyses were made to check the potential of these solids for recovering this solvent. (Author) [pt

  3. Method development for the determination of bromine in coal using high-resolution continuum source graphite furnace molecular absorption spectrometry and direct solid sample analysis

    Science.gov (United States)

    Pereira, Éderson R.; Castilho, Ivan N. B.; Welz, Bernhard; Gois, Jefferson S.; Borges, Daniel L. G.; Carasek, Eduardo; de Andrade, Jailson B.

    2014-06-01

    This work reports a simple approach for Br determination in coal using direct solid sample analysis in a graphite tube furnace and high-resolution continuum source molecular absorption spectrometry. The molecular absorbance of the calcium monobromide (CaBr) molecule has been measured using the rotational line at 625.315 nm. Different chemical modifiers (zirconium, ruthenium, palladium and a mixture of palladium and magnesium nitrates) have been evaluated in order to increase the sensitivity of the CaBr absorption, and Zr showed the best overall performance. The pyrolysis and vaporization temperatures were 800 °C and 2200 °C, respectively. Accuracy and precision of the method have been evaluated using certified coal reference materials (BCR 181, BCR 182, NIST 1630a, and NIST 1632b) with good agreement (between 98 and 103%) with the informed values for Br. The detection limit was around 4 ng Br, which corresponds to about 1.5 μg g⁻¹ Br in coal, based on a sample mass of 3 mg. In addition, the results were in agreement with those obtained using electrothermal vaporization inductively coupled plasma mass spectrometry, based on a Student t-test at a 95% confidence level. A mechanism for the formation of the CaBr molecule is proposed, which might be considered for other diatomic molecules as well.

  4. Development of a highly precise ID-ICP-SFMS method for analysis of low concentrations of lead in rice flour reference materials.

    Science.gov (United States)

    Zhu, Yanbei; Inagaki, Kazumi; Yarita, Takashi; Chiba, Koichi

    2008-07-01

    Microwave digestion and isotope dilution inductively coupled plasma mass spectrometry (ID-ICP-SFMS) has been applied to the determination of Pb in rice flour. In order to achieve highly precise determination of low concentrations of Pb, the digestion blank for Pb was reduced to 0.21 ng g⁻¹ after optimization of the digestion conditions, in which 20 mL analysis solution was obtained after digestion of 0.5 g rice flour. The observed value of Pb in a non-fat milk powder certified reference material (CRM), NIST SRM 1549, was 16.8 ± 0.8 ng g⁻¹ (mean ± expanded uncertainty, k = 2; n = 5), which agreed with the certified value of 19 ± 3 ng g⁻¹ and indicated the effectiveness of the method. Analytical results for Pb in three brown rice flour CRMs, NIST SRM 1568a, NIES CRM 10-a, and NIES CRM 10-b, were 7.32 ± 0.24 ng g⁻¹ (n = 5), 1010 ± 10 ng g⁻¹ (n = 5), and 1250 ± 20 ng g⁻¹ (n = 5), respectively. The concentration of Pb in a candidate white rice flour reference material (RM) sample prepared by the National Metrology Institute of Japan (NMIJ) was observed to be 4.36 ± 0.28 ng g⁻¹ (n = 10 bottles).

  5. A Framework for Teaching Software Development Methods

    Science.gov (United States)

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  6. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel; Tekinerdogan, B.; van den Broek, P.M.; Saeki, M.; Hruby, P.; Sunye, G.

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  7. Developing a TQM quality management method model

    NARCIS (Netherlands)

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This

  8. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Frohner, A´ kos; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  9. Usability Evaluation Method for Agile Software Development

    Directory of Open Access Journals (Sweden)

    Saad Masood Butt

    2015-02-01

    Full Text Available Agile methods are the best fit for the tremendously growing software industry due to their flexible and dynamic nature. But does software developed using agile methods meet usability standards? To answer this question, note that the majority of agile software development projects currently involve interactive user interface designs, which is only possible by following User Centered Design (UCD) within agile methods. The question is how to integrate UCD with agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and development of the software, whereas UCD focuses on its user interface. Similarly, both of them have testing features: the agile model involves automated testing of code, while UCD involves an expert or a user testing the user interface. In this paper, a new agile usability model is proposed and an evaluation of the proposed model is presented by practically implementing it in three real-life projects. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods and improves the ability of usability experts to work with agile software experts; in addition, it allows agile developers to incorporate the results from UCD into subsequent iterations.

  10. Relativity Concept Inventory: Development, Analysis, and Results

    Science.gov (United States)

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  11. Overview of the South African mechanistic pavement design analysis method

    CSIR Research Space (South Africa)

    Theyse, HL

    1996-01-01

    Full Text Available A historical overview of the South African mechanistic pavement design method, from its development in the early 1970s to the present, is presented. Material characterization, structural analysis, and pavement life prediction are discussed...

  12. CARBON SEQUESTRATION: A METHODS COMPARATIVE ANALYSIS

    International Nuclear Information System (INIS)

    Christopher J. Koroneos; Dimitrios C. Rovas

    2008-01-01

    All human activities are related to energy consumption. Energy requirements will continue to rise due to modern lifestyles and the growth of developing countries. Most of the energy demand is met by fossil fuels. Fossil fuel combustion has negative environmental impacts, with CO₂ production being the dominant one. Fulfillment of the Kyoto protocol criteria requires the minimization of CO₂ emissions, so the management of CO₂ emissions is an urgent matter. The use of appliances with low energy consumption and the adoption of an energy policy that prevents unnecessary energy use can lead to a reduction of carbon emissions. A different route is the introduction of "clean" energy sources, such as renewable energy sources. Last but not least, the development of carbon sequestration methods is a promising technique with large future potential. The objective of this work is the analysis and comparison of different carbon sequestration and deposition methods. Ocean deposition, land ecosystem deposition, deposition in geological formations, and radical biological and chemical approaches will be analyzed

  13. Structural Analysis of Communication Development.

    Science.gov (United States)

    Conville, Richard L.

    This paper discusses the question of the legitimacy of applying structural analysis to actual human behavior and illustrates its legitimacy by using the reasoning in an essay by Paul Ricoeur. It then asks if the principles of communication development (obliqueness, exchange, and dying) derived from Helen Keller's experience of communication…

  14. Novel Method of Production Decline Analysis

    Science.gov (United States)

    Xie, Shan; Lan, Yifei; He, Lei; Jiao, Yang; Wu, Yong

    2018-02-01

    Arps decline curves are the most commonly used method in oil and gas fields due to their minimal data requirements and ease of application. Prediction of production decline based on Arps analysis relies on a known decline type. However, when the decline exponents obtained under different decline types are very close, it is difficult to directly recognize the decline trend of the matched curves. Because of these difficulties, and based on the simulation results of multi-factor response experiments, a new dynamic decline prediction model is introduced using multiple linear regression of the influencing factors. First, according to a study of the factors affecting production decline, interaction experimental schemes are designed. Based on the simulated results, the annual decline rate is predicted by the decline model. Moreover, the new method is applied to the A gas field of the Ordos Basin as an example to illustrate its reliability. The results show that the new model can directly predict the decline tendency without needing to recognize the decline type, and it also offers high accuracy. Finally, the new method improves the evaluation of gas well production decline in low-permeability gas reservoirs and provides technical support for further understanding of tight gas field development behaviour.
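    For context, the conventional Arps analysis that the new model seeks to improve on can be sketched as a hyperbolic decline fit; the synthetic rate history, initial-guess parameters and bounds below are assumptions for illustration, not field data from the paper.

```python
# Sketch of conventional Arps hyperbolic decline-curve fitting, the baseline
# the proposed model tries to improve on. Production data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(0, 60.0)                                 # months on production
rng = np.random.default_rng(3)
q_obs = arps_hyperbolic(t, qi=1000.0, di=0.08, b=0.6) * rng.normal(1, 0.03, t.size)

popt, _ = curve_fit(arps_hyperbolic, t, q_obs, p0=[900.0, 0.05, 0.5],
                    bounds=([0, 0, 0.01], [np.inf, 1.0, 2.0]))
qi, di, b = popt
annual_decline = 1 - arps_hyperbolic(12.0, qi, di, b) / qi   # first-year decline rate
print(f"fitted qi={qi:.0f}, di={di:.3f} 1/month, b={b:.2f}; "
      f"first-year decline = {annual_decline:.1%}")
```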

  15. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  16. Development of medical application methods using radiation. Radionuclide therapy

    International Nuclear Information System (INIS)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C.; Oh, B. H.; Hong, H. J.

    1999-04-01

    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: 1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials; 2) the newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry or other basic life science research; 3) the new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research such as radiation oncology and radiation biology

  17. Development of medical application methods using radiation. Radionuclide therapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul, (Korea, Republic of); Oh, B. H. [Seoul National University. Hospital, Seoul (Korea, Republic of); Hong, H. J. [Antibody Engineering Research Unit, Taejon (Korea, Republic of)

    1999-04-01

    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials; (2) the newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry or other basic life science research; (3) the new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research such as radiation oncology and radiation biology.

  18. Nuclear analysis methods. Rudiments of radiation protection

    International Nuclear Information System (INIS)

    Roth, E.

    1998-01-01

    Nuclear analysis methods are generally used to analyse radioactive elements, but they can also be used for chemical analysis, in fields such as the analysis and characterization of traces. The principles of radiation protection (ALARA) are explained, the biological effects of ionizing radiation are given, and the elements and units used in radiation protection are recalled in tables. Part of this article is devoted to how radiation protection is applied in a nuclear analysis laboratory. (N.C.)

  19. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  20. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and that DSM's linear separation algorithm can then largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
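    A minimal numerical sketch of the linear combination assumption underlying this kind of separation is given below: the mixture is taken as a convex combination of a relevance and an irrelevance distribution, and the relevance part is recovered by subtraction; the distributions and mixing coefficient are invented for illustration and this is not the paper's DSM algorithm itself.

```python
# Sketch of the linear-combination idea behind distribution separation:
# p_mix = lam * p_irrel + (1 - lam) * p_rel, so given p_mix, p_irrel and an
# estimate of lam, the relevance distribution can be recovered by subtraction.
# All distributions and lam are invented for illustration.
import numpy as np
from scipy.stats import entropy   # entropy(p, q) = KL(p || q)

rng = np.random.default_rng(4)
vocab = 50
p_rel = rng.dirichlet(np.ones(vocab))      # "relevance" term distribution
p_irrel = rng.dirichlet(np.ones(vocab))    # seed "irrelevance" distribution
lam = 0.4
p_mix = lam * p_irrel + (1 - lam) * p_rel  # observed mixture distribution

# Separation step: subtract the seed distribution and renormalise.
p_rel_hat = (p_mix - lam * p_irrel) / (1 - lam)
p_rel_hat = np.clip(p_rel_hat, 0, None)
p_rel_hat /= p_rel_hat.sum()

print("KL(true relevance || recovered):", entropy(p_rel, p_rel_hat))  # ~0
```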

  1. Raman spectroscopic analysis of cyanogenic glucosides in plants: development of a Flow Injection Surface-Enhanced Raman Scatter (FI-SERS) method for determination of cyanide

    DEFF Research Database (Denmark)

    Thygesen, Lisbeth Garbrecht; Jørgensen, Kirsten; Møller, Birger Lindberg

    2004-01-01

    -dried sorghum leaf was also obtained using this instrument. Surface-enhanced Raman Spectroscopy (SERS) was demonstrated to be a more sensitive method that enabled determination of the cyanogenic potential of plant tissue. The SERS method was optimized by flow injection (FI) using a colloidal gold dispersion...... as effluent. Potential problems and pitfalls of the method are discussed....

  2. Preliminary hazard analysis using sequence tree method

    International Nuclear Information System (INIS)

    Huang Huiwen; Shih Chunkuan; Hung Hungchih; Chen Minghuei; Yih Swu; Lin Jiinming

    2007-01-01

    A system-level PHA using the sequence tree method was developed to perform the SSA of safety-related digital I and C systems. The conventional PHA is a brainstorming session among experts on various portions of the system to identify hazards through discussions. However, this conventional PHA is not a systematic technique, and the analysis results strongly depend on the experts' subjective opinions; the analysis quality cannot be appropriately controlled. This research therefore developed a system-level sequence-tree-based PHA, which can clarify the relationships among the major digital I and C systems. Two major phases are included in this sequence-tree-based technique. The first phase uses a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as the RPS. The second phase uses the sequence tree to recognize which I and C systems are involved in the event, how the safety-related systems work, and how the backup systems can be activated to mitigate the consequences if the primary safety systems fail. In the sequence tree, the defense-in-depth echelons, including the Control echelon, Reactor trip echelon, ESFAS echelon, and Indication and display echelon, are arranged to construct the sequence tree structure. All the related I and C systems, including the digital systems and the analog back-up systems, are allocated to their specific echelons. With this system-centric sequence-tree-based analysis, not only can preliminary hazards be identified systematically, but the vulnerability of the nuclear power plant can also be recognized. Therefore, an effective simplified D3 evaluation can be performed as well. (author)

  3. Development of an Aerodynamic Analysis Method and Database for the SLS Service Module Panel Jettison Event Utilizing Inviscid CFD and MATLAB

    Science.gov (United States)

    Applebaum, Michael P.; Hall, Leslie, H.; Eppard, William M.; Purinton, David C.; Campbell, John R.; Blevins, John A.

    2015-01-01

    This paper describes the development, testing, and utilization of an aerodynamic force and moment database for the Space Launch System (SLS) Service Module (SM) panel jettison event. The database is a combination of inviscid Computational Fluid Dynamic (CFD) data and MATLAB code written to query the data at input values of vehicle/SM panel parameters and return the aerodynamic force and moment coefficients of the panels as they are jettisoned from the vehicle. The database encompasses over 5000 CFD simulations with the panels either in the initial stages of separation where they are hinged to the vehicle, in close proximity to the vehicle, or far enough from the vehicle that body interference effects are neglected. A series of viscous CFD check cases were performed to assess the accuracy of the Euler solutions for this class of problem and good agreement was obtained. The ultimate goal of the panel jettison database was to create a tool that could be coupled with any 6-Degree-Of-Freedom (DOF) dynamics model to rapidly predict SM panel separation from the SLS vehicle in a quasi-unsteady manner. Results are presented for panel jettison simulations that utilize the database at various SLS flight conditions. These results compare favorably to an approach that directly couples a 6-DOF model with the Cart3D Euler flow solver and obtains solutions for the panels at exact locations. This paper demonstrates a method of using inviscid CFD simulations coupled with a 6-DOF model that provides adequate fidelity to capture the physics of this complex multiple moving-body panel separation event.
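    The database-plus-query-code idea can be illustrated (in Python rather than the MATLAB used in the paper) by tabulating coefficients on a grid of panel parameters and interpolating at query points; the grid variables, coefficient values and interpolation scheme below are assumptions for illustration, not the actual SLS database.

```python
# Illustrative sketch (Python, not the paper's MATLAB) of querying a gridded
# aerodynamic database: a coefficient tabulated versus two panel parameters
# and interpolated at arbitrary query points. All values are invented.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

hinge_angle = np.linspace(0.0, 90.0, 19)        # deg, panel hinge rotation
separation = np.linspace(0.0, 10.0, 21)         # panel distance, body diameters

# Hypothetical axial-force coefficient table, shape (19, 21).
A, S = np.meshgrid(hinge_angle, separation, indexing="ij")
ca_table = 0.5 * np.cos(np.radians(A)) * np.exp(-0.3 * S)

ca_interp = RegularGridInterpolator((hinge_angle, separation), ca_table)

# Query the database along a notional jettison trajectory.
query = np.array([[15.0, 0.5], [35.0, 2.0], [60.0, 6.0]])
print("interpolated CA:", ca_interp(query))
```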

  4. Evaluation methods for corrosion damage of components in cooling systems of nuclear power plants by coupling analysis of corrosion and flow dynamics (1). Major targets and development strategies of the evaluation methods

    International Nuclear Information System (INIS)

    Naitoh, Masanori; Uchida, Shunsuke; Koshizuka, Seiichi; Ninokata, Hisashi; Hiranuma, Naoki; Dosaki, Koji; Nishida, Koji; Akiyama, Minoru; Saitoh, Hiroaki

    2008-01-01

    Problems in major components and structural materials in nuclear power plants have often been caused by flow induced vibration and corrosion and their overlapping effects. In order to establish safe and reliable plant operation, future problems for structural materials should be predicted based on combined analyses of flow dynamics and corrosion and they should be mitigated before becoming serious issues for plant operation. Three approaches have been prepared for predicting future problems in structural materials: 1. Computer program packages for predicting future corrosion fatigue on structural materials, 2. Computer program packages for predicting future corrosion damage on structural materials, and 3. Computer program packages for predicting wall thinning caused by flow accelerated corrosion. General features of evaluation methods and their computer packages, technical innovations required for their development, and application plans for the developed approaches for plant operation are introduced in this paper. (author)

  5. Behaviour change techniques: the development and evaluation of a taxonomic method for reporting and describing behaviour change interventions (a suite of five studies involving consensus methods, randomised controlled trials and analysis of qualitative data).

    Science.gov (United States)

    Michie, Susan; Wood, Caroline E; Johnston, Marie; Abraham, Charles; Francis, Jill J; Hardeman, Wendy

    2015-11-01

    Meeting global health challenges requires effective behaviour change interventions (BCIs). This depends on advancing the science of behaviour change which, in turn, depends on accurate intervention reporting. Current reporting often lacks detail, preventing accurate replication and implementation. Recent developments have specified intervention content into behaviour change techniques (BCTs) - the 'active ingredients', for example goal-setting, self-monitoring of behaviour. BCTs are 'the smallest components compatible with retaining the postulated active ingredients, i.e. the proposed mechanisms of change. They can be used alone or in combination with other BCTs' (Michie S, Johnston M. Theories and techniques of behaviour change: developing a cumulative science of behaviour change. Health Psychol Rev 2012;6:1-6). Domain-specific taxonomies of BCTs have been developed, for example healthy eating and physical activity, smoking cessation and alcohol consumption. We need to build on these to develop an internationally shared language for specifying and developing interventions. This technology can be used for synthesising evidence, implementing effective interventions and testing theory. It has enormous potential added value for science and global health. (1) To develop a method of specifying content of BCIs in terms of component BCTs; (2) to lay a foundation for a comprehensive methodology applicable to different types of complex interventions; (3) to develop resources to support application of the taxonomy; and (4) to achieve multidisciplinary and international acceptance for future development. Four hundred participants (systematic reviewers, researchers, practitioners, policy-makers) from 12 countries engaged in investigating, designing and/or delivering BCIs. Development of the taxonomy involved a Delphi procedure, an iterative process of revisions and consultation with 41 international experts; hierarchical structure of the list was developed using inductive

  6. Statistical methods for categorical data analysis

    CERN Document Server

    Powers, Daniel

    2008-01-01

    This book provides a comprehensive introduction to methods and models for categorical data analysis and their applications in social science research. Companion website also available, at https://webspace.utexas.edu/dpowers/www/

  7. Development of Gocing Storage Method for Cocoyam

    OpenAIRE

    Chukwu, G.O; Nwosu, K.I; Madu, T.U; Chinaka, C; Okoye, B.C

    2008-01-01

    Lack of good storage reduces the shelf life of harvested cocoyam (Colocasia spp and Xanthosoma spp) corms and cormels. This is a major challenge facing cocoyam farmers, processors, and marketers in Nigeria. The National Root Crops Research Institute (NRCRI), Umudike, Nigeria, which has a national mandate to research into root and tuber crops of economic importance, has developed the 'Gocing Storage' for improved storage of cocoyam. The paper highlights this improved method of storing cocoya...

  8. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were

  9. Developing methods of controlling quality costs

    Directory of Open Access Journals (Sweden)

    Gorbunova A. V.

    2017-01-01

    Full Text Available The article examines issues of managing quality costs, problems of applying economic methods of quality control, and the implementation of progressive methods of quality cost management in enterprises, with the aim of improving the efficiency of their evaluation and analysis. To increase the effectiveness of the cost management mechanism, the authors introduce controlling as a tool for deviation analysis from the standpoint of the process approach. A list of processes and corresponding evaluation criteria in the quality management system of enterprises is introduced. The authors also introduce a method of controlling quality costs and propose it for practical application, since it allows useful and unnecessary costs to be determined at an existing operating plant. Implementing the proposed recommendations in an enterprise's cost management system will make it possible to improve the productivity of operating processes and to reduce wasted expenditure on process quality, on the basis of determining the useful and useless quality costs according to the criteria for the functioning of processes in the quality management system.

  10. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises."-Zentralblatt MATH ". . . carefully structured with many detailed worked examples."-The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  11. Innovative Method of the Power Analysis

    Directory of Open Access Journals (Sweden)

    Z. Martinasek

    2013-06-01

    Full Text Available This paper describes an innovative method of power analysis, a typical example of successful attacks against trusted cryptographic devices such as RFID (Radio-Frequency IDentification) and contact smart cards. The proposed method analyzes the power consumption of the AES (Advanced Encryption Standard) algorithm with a neural network, which successively classifies the first byte of the secret key. This approach to power analysis is entirely new and is designed to combine the advantages of simple and differential power analysis. In the extreme case, this feature allows the whole secret key of a cryptographic module to be determined from only one measured power trace. This attribute makes the proposed method very attractive to potential attackers. Besides the theoretical design of the method, we also provide the first implementation results. We assume that the method will certainly be optimized to obtain more accurate classification results in the future.
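    The classification idea described above, learning to predict a key-related value from measured power traces, can be sketched with synthetic traces and a small neural network; the Hamming-weight leakage model, trace length, noise level and classifier settings are all assumptions for illustration, not the authors' measurement setup or network architecture.

```python
# Sketch of neural-network classification of a key-dependent intermediate
# from power traces. Traces are synthetic and follow a simple Hamming-weight
# leakage model; this is an illustration, not the authors' setup.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_traces, trace_len, leak_sample = 3000, 200, 80

values = rng.integers(0, 256, size=n_traces)              # key-dependent byte
hw = np.array([bin(v).count("1") for v in values])        # Hamming weight 0..8

traces = rng.normal(scale=1.0, size=(n_traces, trace_len))
traces[:, leak_sample] += 2.0 * hw                        # leakage at one sample

X_tr, X_te, y_tr, y_te = train_test_split(traces, hw, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print("Hamming-weight classification accuracy:", clf.score(X_te, y_te))
```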

  12. Analysis of Cryptocurrencies Price Development

    OpenAIRE

    Jan Lansky

    2016-01-01

    Cryptocurrencies are a type of digital currency based on cryptographic principles. Cryptocurrencies are a unique combination of three characteristics: they provide anonymity, they are independent of a central authority and they provide protection from the double-spending attack. The aim of this paper is to capture trends in the area of significant cryptocurrency price developments and to explain their causes. The current research in this area is exclusively limited to an analysis of the price de...

  13. Laboratory theory and methods for sediment analysis

    Science.gov (United States)

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help insure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  14. Development of a multiple bulked segregant analysis (MBSA) method used to locate a new stem rust resistance gene (Sr54) in the winter wheat cultivar Norin 40.

    Science.gov (United States)

    Ghazvini, Habibollah; Hiebert, Colin W; Thomas, Julian B; Fetch, Thomas

    2013-02-01

    An important aspect of studying putative new genes in wheat is determining their position on the wheat genetic map. The primary difficulty in mapping genes is determining which chromosome carries the gene of interest. Several approaches have been developed to address this problem, each with advantages and disadvantages. Here we describe a new approach called multiple bulked segregant analysis (MBSA). A set of 423 simple sequence repeat (SSR) markers was selected based on profile simplicity, frequency of polymorphism, and distribution across the wheat genome. SSR primers were preloaded in 384-well PCR plates, with each primer occupying 16 wells. In practice, 14 wells are reserved for "mini-bulks" equivalent to four gametes (e.g. two F₂ individuals) comprised of individuals from a segregating population that have a known homozygous genotype for the gene of interest. The remaining two wells are reserved for the parents of the population. Each well containing a mini-bulk can have one of three allele compositions for each SSR: only the allele from one parent, only the allele from the other parent, or both alleles. Simulation experiments were performed to determine the pattern of mini-bulk allele composition that would indicate putative linkage between the SSR in question and the gene of interest. As a test case, MBSA was employed to locate an unidentified stem rust resistance (Sr) gene in the winter wheat cultivar Norin 40. A doubled haploid (DH) population (n = 267) was produced from hybrids of the cross LMPG-6S/Norin 40. The DH population segregated for a single gene (χ²(1:1) = 0.093, p = 0.76) for resistance to Puccinia graminis f.sp. tritici race LCBN. Four resistant DH lines were included in each of the 14 mini-bulks for screening. The Sr gene was successfully located to the long arm of chromosome 2D using MBSA. Further mapping confirmed the chromosome location and revealed that the Sr gene was located in a linkage block that may represent an alien

  15. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy

  16. Error Analysis of Band Matrix Method

    OpenAIRE

    Taniguchi, Takeo; Soga, Akira

    1984-01-01

    Numerical error in the solution of the band matrix method based on the elimination method in single precision is investigated theoretically and experimentally, and the behaviour of the truncation error and the roundoff error is clarified. Some important suggestions for the useful application of the band solver are proposed using the results of the above error analysis.

  17. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that, although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed largely in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). Against this background, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper is the first part of a two-phased study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The likelihood of occurrence of risk was analysed by logistic regression and percentages to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
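
    A hedged sketch of the kind of analysis the abstract reports: a logistic regression of risk occurrence followed by a hand-rolled Hosmer-Lemeshow goodness-of-fit check. The data, predictor names and number of groups are invented for illustration and are not the study's variables.

```python
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100  # e.g. one record per risk analyst (illustrative)
X = pd.DataFrame({
    "threat_capability": rng.uniform(0, 1, n),
    "vulnerability": rng.uniform(0, 1, n),
    "control_effectiveness": rng.uniform(0, 1, n),
})
# Synthetic outcome: 1 = risk occurred, 0 = did not
logit = 2 * X["threat_capability"] + 1.5 * X["vulnerability"] - 2 * X["control_effectiveness"]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
p_hat = model.predict_proba(X)[:, 1]

def hosmer_lemeshow(y_true, p_pred, groups=10):
    """Chi-square statistic comparing observed and expected events per risk decile."""
    order = np.argsort(p_pred)
    bins = np.array_split(order, groups)
    chi2 = 0.0
    for idx in bins:
        obs = y_true[idx].sum()          # observed events in this decile
        exp = p_pred[idx].sum()          # expected events in this decile
        n_g = len(idx)
        chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n_g) + 1e-12)
    p_value = stats.chi2.sf(chi2, groups - 2)
    return chi2, p_value

chi2, p = hosmer_lemeshow(np.asarray(y), p_hat)
print(f"Hosmer-Lemeshow chi2 = {chi2:.3f}, p = {p:.3f}")  # non-significant p suggests adequate fit
```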

  18. Recent Developments in the Methods of Estimating Shooting Distance

    Directory of Open Access Journals (Sweden)

    Arie Zeichner

    2002-01-01

    Full Text Available A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  19. Spectral Analysis Methods of Social Networks

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2017-01-01

    Full Text Available Online social networks (such as Facebook, Twitter, VKontakte, etc.), being an important channel for disseminating information, are often used to influence social consciousness for various purposes, from advertising products or services to full-scale information war, which makes them a very relevant object of research. The paper reviews methods for the analysis of social networks (primarily online ones) based on the spectral theory of graphs. Such methods use the spectrum of the social graph, i.e. the set of eigenvalues of its adjacency matrix, as well as the eigenvectors of the adjacency matrix. Measures of centrality are described (in particular, eigenvector centrality and PageRank), which reflect the degree of influence of a given user of the social network. The very popular PageRank measure uses as the centrality of a graph vertex the final probabilities of a Markov chain whose matrix of transition probabilities is calculated from the adjacency matrix of the social graph; the vector of final probabilities is an eigenvector of the matrix of transition probabilities. A method of dividing the graph vertices into two groups is presented, based on maximizing the network modularity by computing the eigenvector of the modularity matrix. A method for detecting bots is also considered, based on a non-randomness measure of the graph computed from the spectral coordinates of vertices, i.e. the sets of eigenvector components of the adjacency matrix of the social graph. In general, there are a number of algorithms for analysing social networks based on the spectral theory of graphs. These algorithms show very good results, but their disadvantage is their relatively high (albeit polynomial) computational complexity for large graphs. At the same time, the practical potential of spectral graph theory methods is clearly still underestimated, and they may serve as a basis for developing new methods. The work
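
    A brief sketch of the spectral tools the review covers, using networkx and numpy on a stand-in graph: eigenvector centrality and PageRank as influence measures, and a two-group split from the sign of the leading eigenvector of the modularity matrix. The example graph is illustrative only.

```python
import networkx as nx
import numpy as np

G = nx.karate_club_graph()  # stand-in for a social graph

# Influence measures based on eigenvectors of (functions of) the adjacency matrix
centrality = nx.eigenvector_centrality_numpy(G)
pagerank = nx.pagerank(G, alpha=0.85)

# Two-group split: sign of the leading eigenvector of the modularity matrix B = A - k k^T / (2m)
B = nx.modularity_matrix(G)
eigvals, eigvecs = np.linalg.eigh(B)
leading = np.asarray(eigvecs[:, np.argmax(eigvals)]).ravel()
group = {node: int(leading[i] > 0) for i, node in enumerate(G.nodes())}

print(sorted(group.items())[:5])   # first few nodes with their community label
```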

  20. Development of precursors recognition methods in vector signals

    Science.gov (United States)

    Kapralov, V. G.; Elagin, V. V.; Kaveeva, E. G.; Stankevich, L. A.; Dremin, M. M.; Krylov, S. V.; Borovov, A. E.; Harfush, H. A.; Sedov, K. S.

    2017-10-01

    Precursor recognition methods in vector signals of plasma diagnostics are presented. Their requirements and possible options for their development are considered. In particular, the variants of using symbolic regression for building a plasma disruption prediction system are discussed. The initial data preparation using correlation analysis and symbolic regression is discussed. Special attention is paid to the possibility of using algorithms in real time.

  1. A Comparative Analysis of Short Time Series Processing Methods

    OpenAIRE

    Kiršners, A; Borisovs, A

    2012-01-01

    This article analyzes the traditional time series processing methods that are used to perform short time series analysis in demand forecasting. The main aim of this paper is to scrutinize the ability of these methods to be used when analyzing short time series. The analyzed methods include exponential smoothing, exponential smoothing with a trend component, and the moving average method. The paper describes the structure and main operating princi...
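
    A minimal NumPy sketch of the three approaches named in the abstract: simple exponential smoothing, exponential smoothing with a trend term, and a moving average. The demand series and smoothing parameters are invented.

```python
import numpy as np

demand = np.array([12., 14., 13., 17., 16., 18., 21., 19.])

def exp_smoothing(y, alpha=0.3):
    """Simple exponential smoothing."""
    s = [y[0]]
    for t in range(1, len(y)):
        s.append(alpha * y[t] + (1 - alpha) * s[-1])
    return np.array(s)

def exp_smoothing_with_trend(y, alpha=0.3, beta=0.1):
    """Exponential smoothing with an additive trend term (Holt-type update)."""
    level, trend = y[0], y[1] - y[0]
    out = [level]
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        out.append(level)
    return np.array(out)

def moving_average(y, window=3):
    """Trailing moving average over a fixed window."""
    return np.convolve(y, np.ones(window) / window, mode="valid")

print(exp_smoothing(demand)[-1],
      exp_smoothing_with_trend(demand)[-1],
      moving_average(demand)[-1])
```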

  2. Statistical Methods for Conditional Survival Analysis.

    Science.gov (United States)

    Jung, Sin-Ho; Lee, Ho Yun; Chow, Shein-Chung

    2017-11-29

    We investigate the survival distribution of the patients who have survived over a certain time period. This is called a conditional survival distribution. In this paper, we show that one-sample estimation, two-sample comparison and regression analysis of conditional survival distributions can be conducted using the regular methods for unconditional survival distributions that are provided by the standard statistical software, such as SAS and SPSS. We conduct extensive simulations to evaluate the finite sample property of these conditional survival analysis methods. We illustrate these methods with real clinical data.
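
    A hedged sketch of the conditional-survival idea: estimate survival among patients who have already survived past a landmark time t0 by applying the ordinary Kaplan-Meier estimator to the restricted sample, here via the lifelines package (assumed available). The data are synthetic.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
time = rng.exponential(scale=24.0, size=200)    # follow-up times, e.g. months
event = rng.binomial(1, 0.7, size=200)          # 1 = event observed, 0 = censored

t0 = 12.0                                       # landmark: condition on surviving 12 months
mask = time > t0

kmf = KaplanMeierFitter()
# Left-truncate at t0 so the curve represents S(t | T > t0) and starts at 1 at t0
kmf.fit(time[mask], event_observed=event[mask], entry=np.full(mask.sum(), t0))
print(kmf.survival_function_.tail())
```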

  3. Hydropower development priority using MCDM method

    International Nuclear Information System (INIS)

    Supriyasilp, Thanaporn; Pongput, Kobkiat; Boonyasirikul, Thana

    2009-01-01

    Hydropower is recognized as a renewable and clean energy source, and its potential should be realized in an environmentally sustainable and socially equitable manner. Traditionally, the decision criteria used when analyzing hydropower projects have been mostly technical and economic, focused on the production of electricity. However, environmental awareness and sensitivity to locally affected people should also be considered. Multi-criteria decision analysis has been applied to study the potential for developing hydropower projects with electric power greater than 100 kW in the Ping River Basin, Thailand, and to determine the advantages and disadvantages of the projects according to five main criteria: electricity generation, engineering and economics, socio-economics, environment, and stakeholder involvement. There are 64 potential sites in the study area. Criteria weights were discussed and assigned by expert groups for each main criterion and subcriterion. As a consequence of the weight assignment, the environmental aspect is the most important in the view of the experts. Two scenarios, using expert weights and fair weights, were studied to determine the development priority of each project. This study was carried out to assist policy making for hydropower development in the Ping River Basin.

  4. Hydropower development priority using MCDM method

    Energy Technology Data Exchange (ETDEWEB)

    Supriyasilp, Thanaporn [Department of Civil Engineering, Chiang Mai University, Chiang Mai 50200 (Thailand); Pongput, Kobkiat [Department of Water Resource Engineering, Kasetsart University, Bangkhen Campus, Bangkok 10900 (Thailand); Boonyasirikul, Thana [Electricity Generating Authority of Thailand, Bang Kruai, Nonthaburi 11130 (Thailand)

    2009-05-15

    Hydropower is recognized as a renewable and clean energy source, and its potential should be realized in an environmentally sustainable and socially equitable manner. Traditionally, the decision criteria used when analyzing hydropower projects have been mostly technical and economic, focused on the production of electricity. However, environmental awareness and sensitivity to locally affected people should also be considered. Multi-criteria decision analysis has been applied to study the potential for developing hydropower projects with electric power greater than 100 kW in the Ping River Basin, Thailand, and to determine the advantages and disadvantages of the projects according to five main criteria: electricity generation, engineering and economics, socio-economics, environment, and stakeholder involvement. There are 64 potential sites in the study area. Criteria weights were discussed and assigned by expert groups for each main criterion and subcriterion. As a consequence of the weight assignment, the environmental aspect is the most important in the view of the experts. Two scenarios, using expert weights and fair weights, were studied to determine the development priority of each project. This study was carried out to assist policy making for hydropower development in the Ping River Basin. (author)
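
    A toy weighted-sum scoring of candidate sites across the five criteria listed above, under the two weighting scenarios (expert weights and fair weights). The scores and weights are invented, and the study itself may use a different MCDM aggregation.

```python
import numpy as np

# Columns: electricity generation, engineering & economics, socio-economics,
# environment, stakeholder involvement (normalized scores in [0, 1]).
scores = np.array([
    [0.8, 0.7, 0.5, 0.4, 0.6],   # site A
    [0.6, 0.8, 0.7, 0.7, 0.5],   # site B
    [0.9, 0.5, 0.4, 0.9, 0.7],   # site C
])
expert_weights = np.array([0.15, 0.20, 0.15, 0.30, 0.20])  # environment weighted highest
fair_weights = np.full(5, 0.2)                             # equal-weight scenario

for name, w in [("expert weights", expert_weights), ("fair weights", fair_weights)]:
    ranking = np.argsort(scores @ w)[::-1]                 # best site first
    print(name, "-> priority order of sites:", ranking)
```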

  5. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included only in research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
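
    A toy tiering rule in the spirit of the screening procedure described above, combining measured occurrence, a predicted concentration relative to a toxicity benchmark, and other agencies' priorities. The thresholds and field names are hypothetical, not the USGS criteria.

```python
from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    detection_freq: float              # fraction of samples with detections
    pred_conc_over_benchmark: float    # predicted concentration / toxicity benchmark
    priority_elsewhere: bool           # flagged by another agency or program

def assign_tier(c: Compound) -> int:
    """Classify a compound into Tier 1 (high), 2 (moderate) or 3 (low priority)."""
    if c.detection_freq >= 0.10 or c.pred_conc_over_benchmark >= 1.0:
        return 1
    if c.detection_freq >= 0.01 or c.priority_elsewhere:
        return 2
    return 3

print(assign_tier(Compound("atrazine", 0.35, 2.4, True)))   # -> 1 under these toy thresholds
```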

  6. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation; Analyse et developpement de methodes de raffinement hp en espace pour l'equation de transport des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, D.

    2011-10-10

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutron flux, the solution of this equation, depends on energy, angle and space. The different variables are successively discretized: the energy with a multigroup approach, considering the different quantities to be constant within each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting of subdividing a cell into sub-cells, or by order refinement (p-refinement), increasing the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are used to guide the refinement process. Whereas the first one requires strong regularity hypotheses (analytical solution), the second one assumes only the minimal hypotheses required for the solution to exist. The comparison of both estimators is done on benchmarks where the analytic solution is known by the method of manufactured solutions, so that the behaviour of the solution with respect to its regularity can be studied. This leads to an hp-refinement method using the two estimators. Then, a comparison is made with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. Actually, as the
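
    A schematic sketch of the generic hp-decision loop described above: refine the mesh (h) where the estimated error is large and the solution is irregular, and raise the polynomial order (p) where it is smooth. The error estimate and smoothness indicator here are placeholders, not the estimators developed in the thesis.

```python
def hp_refine(cells, error_estimate, smoothness, tol=1e-3, smooth_threshold=0.8):
    """Return per-cell refinement decisions: 'keep', 'h' (split the cell) or 'p' (raise the order)."""
    decisions = {}
    for c in cells:
        if error_estimate[c] <= tol:
            decisions[c] = "keep"
        elif smoothness[c] >= smooth_threshold:
            decisions[c] = "p"   # smooth solution: increasing the polynomial order converges fast
        else:
            decisions[c] = "h"   # irregular solution: subdivide the cell instead
    return decisions

cells = ["c0", "c1", "c2"]
print(hp_refine(cells,
                {"c0": 5e-4, "c1": 2e-2, "c2": 3e-2},
                {"c0": 0.90, "c1": 0.95, "c2": 0.30}))
```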

  7. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  8. DEVELOPMENT OF A RISK SCREENING METHOD FOR CREDITED OPERATOR ACTIONS

    International Nuclear Information System (INIS)

    HIGGINS, J.C.; O'HARA, J.M.; LEWIS, P.M.; PERSENSKY, J.; BONGARRA, J.

    2002-01-01

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors aspects of proposed license amendments that impact human actions credited in a plant's safety analysis. The staff is committed to a graded approach to these reviews that focuses resources on the most risk-important changes. Therefore, a risk-informed screening method was developed based on an adaptation of existing guidance for risk-informed regulation and human factors. The method uses both quantitative and qualitative information to divide the amendment requests into different levels of review. The method was evaluated using a variety of tests. This paper summarizes the development of the methodology and the evaluations that were performed to verify its usefulness.

  9. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. The second part of the paper presents the theoretical and methodological framework applied to a specific company. The third part presents models that banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in our country, as well as to adjust to market conditions in an optimal way.

  10. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard...

  11. Diffraction as a Method of Critical Policy Analysis

    Science.gov (United States)

    Ulmer, Jasmine B.

    2016-01-01

    Recent developments in critical policy analysis have occurred alongside the new materialisms in qualitative research. These lines of scholarship have unfolded along two separate, but related, tracks. In particular, the new materialist method of "diffraction" aligns with many elements of critical policy analysis. Both involve critical…

  12. An analysis of the influence of production conditions on the development of the microporous structure of the activated carbon fibres using the LBET method

    Science.gov (United States)

    Kwiatkowski, Mirosław

    2017-12-01

    The paper presents the results of research on the application of new analytical models of multilayer adsorption on heterogeneous surfaces with a unique fast multivariant identification procedure, together called the LBET method, as a tool for analysing the microporous structure of activated carbon fibres obtained from polyacrylonitrile by chemical activation using potassium and sodium hydroxides. The novel LBET method was employed in particular to evaluate the impact of the activator used and of the hydroxide-to-polyacrylonitrile ratio on the resulting microporous structure of the activated carbon fibres.

  13. Dynamic relaxation method in analysis of reinforced concrete bent elements

    Directory of Open Access Journals (Sweden)

    Anna Szcześniak

    2015-12-01

    Full Text Available The paper presents a method for the analysis of the nonlinear behaviour of reinforced concrete bent elements subjected to short-term static load. The modelling of deformation processes in reinforced concrete elements is considered. A method for analysing the structural effort (stress state) was developed using the finite difference method. The Dynamic Relaxation Method, which, after the introduction of critical damping, allows the static behaviour of a structural element to be described, was used to solve the system of nonlinear equilibrium equations. In order to increase the effectiveness of the method in post-critical analysis, the arc-length parameter on the equilibrium path was introduced into the computational procedure. Keywords: reinforced concrete elements, physical nonlinearity, geometrical nonlinearity, dynamic relaxation method, arc-length method
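
    A one-degree-of-freedom sketch of dynamic relaxation, assuming a toy softening spring in place of the reinforced concrete model: the static equilibrium R(u) = F is reached by integrating fictitious damped dynamics, with the damping chosen near critical for the initial stiffness.

```python
import math

K0, MASS = 50.0, 1.0                      # initial stiffness and fictitious mass (toy values)

def internal_force(u):
    return 50.0 * u - 8.0 * u**3          # toy softening spring, not an RC section model

def dynamic_relaxation(F, u0=0.0, dt=0.01, tol=1e-8, max_steps=200000):
    damping = 2.0 * math.sqrt(K0 * MASS)  # near-critical damping for the initial stiffness
    u, v = u0, 0.0
    for _ in range(max_steps):
        residual = F - internal_force(u)  # out-of-balance force
        if abs(residual) < tol:
            return u
        a = (residual - damping * v) / MASS
        v += a * dt                        # fictitious damped dynamics, run until motion dies out
        u += v * dt
    raise RuntimeError("dynamic relaxation did not converge")

print(dynamic_relaxation(F=20.0))          # static solution of 50*u - 8*u**3 = 20
```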

  14. Development and validation of a comprehensive two-dimensional gas chromatography-mass spectrometry method for the analysis of phytosterol oxidation products in human plasma

    NARCIS (Netherlands)

    Menéndez-Carreño, M.; Steenbergen, H.; Janssen, H.-G.

    2012-01-01

    Phytosterol oxidation products (POPs) have been suggested to exert adverse biological effects similar to, although less severe than, their cholesterol counterparts. For that reason, their analysis in human plasma is highly relevant. Comprehensive two-dimensional gas chromatography (GC×GC) coupled

  15. Development of numerical methods for reactive transport

    International Nuclear Information System (INIS)

    Bouillard, N.

    2006-12-01

    When radioactive waste is stored in deep geological disposal facilities, it is expected that the waste package will be damaged by the action of water (concrete leaching, iron corrosion). To understand these damaging processes, chemical reactions and solute transport are modelled. Numerical simulations of reactive transport can be done sequentially through the coupling of several codes. This is the case for the software platform ALLIANCES, which is developed jointly by CEA, ANDRA and EDF. Stiff reactions like precipitation-dissolution are crucial for radioactive waste storage applications, but standard sequential iterative approaches such as Picard's method fail to solve reactive transport simulations rapidly in the presence of such stiff reactions. In the first part of this work, we focus on a simplified precipitation and dissolution process: a system made up of one solid species and two aqueous species moving by diffusion is studied mathematically. It is assumed that a precipitation-dissolution reaction occurs between them, and it is modelled by a discontinuous kinetic law of unknown sign. By using monotonicity properties, the convergence of a finite volume scheme on an admissible mesh is proved. The existence of a weak solution is obtained as a by-product of the convergence of the scheme. The second part is dedicated to coupling algorithms which improve on Picard's method and can easily be used in an existing coupling code. By extending previous work, we propose a general and adaptable framework for solving nonlinear systems. Indeed, by selecting special options, we can either recover well-known methods, such as nonlinear conjugate gradient methods, or design a specific method. This algorithm has two main steps, a preconditioning step and an acceleration step. The algorithm is tested on several examples, some of them rather academic and others more realistic. We test it on the 'three species model' example. Other reactive transport simulations use the external chemical code CHESS. For a
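
    A scalar sketch of a Picard (fixed-point) coupling iteration with Aitken relaxation, illustrating the kind of acceleration step discussed above. The fixed-point map G stands in for one transport-plus-chemistry sweep and is purely illustrative.

```python
import math

def G(x):
    # toy fixed-point map standing in for one transport + chemistry sweep
    return math.cos(x)

def picard_aitken(x0, tol=1e-12, max_iter=200):
    x_prev, x = x0, G(x0)
    omega = 1.0
    for _ in range(max_iter):
        r_prev, r = G(x_prev) - x_prev, G(x) - x   # successive coupling residuals
        if abs(r) < tol:
            return x
        omega = -omega * r_prev / (r - r_prev)     # Aitken update of the relaxation factor
        x_prev, x = x, x + omega * r               # relaxed Picard step
    raise RuntimeError("Picard iteration did not converge")

print(picard_aitken(0.5))   # converges to the fixed point of cos, about 0.739085
```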

  16. Instrumental methods of analysis, 7th edition

    International Nuclear Information System (INIS)

    Willard, H.H.; Merritt, L.L. Jr.; Dean, J.A.; Settle, F.A. Jr.

    1988-01-01

    The authors have prepared an organized and generally polished product. The book is fashioned to be used as a textbook for an undergraduate instrumental analysis course, a supporting textbook for graduate-level courses, and a general reference work on analytical instrumentation and techniques for professional chemists. Four major areas are emphasized: data collection and processing, spectroscopic instrumentation and methods, liquid and gas chromatographic methods, and electrochemical methods. Analytical instrumentation and methods have been updated, and a thorough citation of pertinent recent literature is included

  17. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis, and lead to incorrect conclusions. To remedy the problem of outliers, robust methods are developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the 'good' data to primarily...
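
    A short sketch of one common robust alternative to the classical covariance: flagging multivariate outliers with the Minimum Covariance Determinant estimator from scikit-learn (assumed available). The data are synthetic, and the chapter may well discuss other robust estimators.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(42)
X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=100)
X[:5] += 8.0  # inject a few gross outliers

mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)                       # robust squared Mahalanobis distances
cutoff = chi2.ppf(0.975, df=X.shape[1])       # chi-square cut-off for flagging
print("flagged outliers:", np.where(d2 > cutoff)[0])
```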

  18. Software Development for Decision Analysis

    Science.gov (United States)

    1975-03-01

    ...place random variable 1 after decision 3 in the tree. In the next phase of our research, we hope to develop general algorithms for translating any... Stanford, California, 1974. Howard, R. A., "Proximal Decision Analysis," Management Science, Vol. 17, No. 9, May 1971.

  19. Analysis of Cryptocurrencies Price Development

    Directory of Open Access Journals (Sweden)

    Jan Lansky

    2016-12-01

    Full Text Available Cryptocurrencies are a type of digital currency based on cryptographic principles. Cryptocurrencies are a unique combination of three characteristics: they provide anonymity, they are independent of a central authority, and they provide protection from double-spending attacks. The aim of this paper is to capture trends in the price development of significant cryptocurrencies and to explain their causes. Current research in this area is limited exclusively to analysis of the price development of the most important cryptocurrency, Bitcoin; our research is the first to focus on other cryptocurrencies as well. The economic perspective on cryptocurrencies is based on IT knowledge of the principles of their functioning. We have created a database of the prices of 1278 cryptocurrencies from 2013 to 2016. This database is publicly available. The SQL query language was used to analyse the data.

  20. Response Matrix Method Development Program at Savannah River Laboratory

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1976-01-01

    The Response Matrix Method Development Program at Savannah River Laboratory (SRL) has concentrated on the development of an effective system of computer codes for the analysis of Savannah River Plant (SRP) reactors. The most significant contribution of this program to date has been the verification of the accuracy of diffusion theory codes as used for routine analysis of SRP reactor operation. This paper documents the two steps carried out in achieving this verification: confirmation of the accuracy of the response matrix technique through comparison with experiment and Monte Carlo calculations; and establishment of agreement between diffusion theory and response matrix codes in situations which realistically approximate actual operating conditions