WorldWideScience

Sample records for analysis methods developed

  1. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study, and we categorized them into four classes according to the problems they address. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas addressed by existing authors and the gaps they leave, untouched areas of cloud based development can be discovered for future research work.

  2. Probabilistic structural analysis methods development for SSME

    Science.gov (United States)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.
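The cumulative distribution function of a structural response variable, given assumed uncertainties on primitive variables, can be estimated by sampling. The sketch below is illustrative only: the response function, distributions, and allowable limit are all hypothetical stand-ins, not the SSME models of the abstract.

```python
import bisect
import random

random.seed(42)

def blade_stress(pressure, temperature, speed):
    # Hypothetical response function standing in for a detailed FE model:
    # stress grows with pressure, with speed squared, and with temperature.
    return 1.2 * pressure + 0.5e-3 * speed ** 2 + 0.1 * temperature

# Assumed (illustrative) uncertainties on the primitive variables.
samples = sorted(
    blade_stress(random.gauss(30.0, 2.0),    # pressure, MPa
                 random.gauss(900.0, 25.0),  # temperature, K
                 random.gauss(500.0, 10.0))  # speed, rad/s
    for _ in range(20000)
)

def cdf(x):
    """Empirical cumulative distribution function of the response."""
    return bisect.bisect_right(samples, x) / len(samples)

# Failure probability = P(response exceeds an assumed allowable limit).
limit = 280.0
p_fail = 1.0 - cdf(limit)
```

A production probabilistic code would use variance-reduction or fast probability integration rather than brute-force sampling, but the CDF-and-exceedance structure is the same.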

  3. Methods development for criticality safety analysis

    International Nuclear Information System (INIS)

    A status review on the work at Oak Ridge to develop improved methods for performing multigroup, discrete-ordinates, and Monte Carlo criticality safety analyses is presented. In the area of multigroup cross section preparation this work entails the testing of ENDF/B-IV based and other cross-section libraries in the SCALE system, the development of improved cross-section processing methods for the AMPX system, and the generation of an ENDF/B-V based library. In the area of systems analysis this work entails improvements to the one-dimensional discrete-ordinates code XSDRNPM-S, the testing of the combinatorial geometry version of KENO, KENO-IV/CG, and development of an advanced version of KENO, KENO-V. Also presented is a brief review of the existing criticality safety analytical sequences in the SCALE system, CSAS1 and CSAS2, and the development of the advanced analytical sequences CSAS3 and CSAS4.

  4. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out during 1996-1999 under the IAEA CRP on the intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures, are briefly described. The work aimed to develop numerical analysis methods and to compare the analysis results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by the participating countries. Certain progress in the analysis procedures for isolation bearings and isolated nuclear structures has been made throughout the IAEA CRPs, and the analysis methods developed can be improved for future nuclear facility applications. (author)

  5. Method development of gas analysis with mass spectrometer

    International Nuclear Information System (INIS)

    Dissolved gas content in deep saline groundwater is an important factor, which has to be known and taken into account when planning the deep repository for the spent nuclear fuel. Posiva has investigated dissolved gases in deep groundwaters since the 1990s. In 2002 Posiva started a project that focused on developing the mass spectrometric method for measuring the dissolved gas content in deep saline groundwater. The main idea of the project was to analyse the dissolved gas content of both the gas phase and the water phase by a mass spectrometer. The development of the method started in the autumn of 2003. One of the aims was to create a method for gas analysis parallel to the gas chromatographic method. The starting point of this project was to test whether gases could be analysed directly from water using a membrane inlet in the mass spectrometer. The main objective was to develop mass spectrometric methods for gas analysis with direct and membrane inlets. An analysis method for dissolved gases was developed for direct gas inlet mass spectrometry. The accuracy of the analysis method was tested with parallel real PAVE samples analysed in the laboratory of Insinoeoeritoimisto Paavo Ristola Oy. The results were good. The development of the membrane inlet mass spectrometric method still continues. Two different membrane materials (silicone and Teflon) were tested. Some basic tests (linearity, repeatability and detection limits for different gases) will be done by this method. (orig.)

  6. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V

    2016-01-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods to solar data. Relating solar surface observations as well as helioseismic proxies to solar dynamo models by means of data assimilation techniques is a promising new approach to explore and to predict the magnetic activity cycle of the Sun.
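Sequential data assimilation blends a model forecast with an observation, weighting each by its error variance. The toy scalar Kalman filter below illustrates that analysis step; the persistence model, the "activity index" observations, and the variances are all assumptions for the sketch, not the solar dynamo assimilation of the paper.

```python
# Minimal scalar Kalman filter: assimilate noisy observations of a
# slowly varying activity index into a simple persistence model.
def kalman_step(x_est, p_est, obs, model_var=0.05, obs_var=0.5):
    # Forecast step: persistence model with assumed model error growth.
    x_f = x_est
    p_f = p_est + model_var
    # Analysis step: blend forecast and observation by their variances.
    gain = p_f / (p_f + obs_var)
    x_a = x_f + gain * (obs - x_f)
    p_a = (1.0 - gain) * p_f
    return x_a, p_a

x, p = 0.0, 1.0                       # vague initial state
observations = [1.2, 0.9, 1.1, 1.0, 0.95]
for y in observations:
    x, p = kalman_step(x, p, y)
# After a few updates the estimate is drawn toward the observed level
# and the estimated uncertainty p shrinks.
```

Real solar assimilation uses high-dimensional ensemble or variational variants of this same forecast/analysis cycle.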

  7. Development of sample preparation method for honey analysis using PIXE

    International Nuclear Information System (INIS)

    We developed an original preparation method for honey samples (samples in a paste-like state) specifically designed for PIXE analysis. The results of PIXE analysis of thin targets prepared by adding a standard containing nine elements to honey samples demonstrated that the preparation method provided sufficient accuracy for quantitative analysis. PIXE analysis of 13 kinds of honey was performed, and eight mineral components (Si, P, S, K, Ca, Mn, Cu and Zn) were detected in all honey samples. The principal mineral components were K and Ca, and the quantitative value for K accounted for the majority of the total value for mineral components. K content in honey varies greatly depending on the plant source. Chestnut honey had the highest K content; in fact, it was 2-3 times that of Manuka, which is known as a high-quality honey. The K content of false-acacia honey, which is produced in the greatest abundance, was 1/20 that of chestnut. (author)

  8. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

    Kubik, Donna L.; Greenwood, John A.

    2003-01-01

    A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). The FEA results indicate a strong sensitivity of strain to the window thickness. Photogrammetric methods were chosen to measure the thickness of the window, thus providing more accurate input data to the FEA. This, together with improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.
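Quantifying "substantial agreement" between photogrammetric measurements and an FEA prediction typically comes down to a residual statistic such as the root-mean-square deviation. The deflection values below are hypothetical, chosen only to illustrate the comparison.

```python
import math

# Hypothetical window deflections (mm) at photogrammetry targets along
# a diameter, and the corresponding FEA-predicted values.
measured  = [0.02, 0.15, 0.31, 0.44, 0.52, 0.44, 0.30, 0.14, 0.03]
predicted = [0.00, 0.14, 0.30, 0.45, 0.53, 0.45, 0.31, 0.13, 0.02]

def rms_deviation(a, b):
    """Root-mean-square difference between measurement and prediction."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

rms = rms_deviation(measured, predicted)
# With a 5 micron (0.005 mm) measurement precision, an RMS residual of
# a few hundredths of a millimetre would support the FEA model.
```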

  9. Development of Analysis Methods for Designing with Composites

    Science.gov (United States)

    Madenci, E.

    1999-01-01

    The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  10. Development of rapid urine analysis method for uranium

    International Nuclear Information System (INIS)

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although the ICP-MS has very high sensitivity, it requires more time than conventional analysis, such as fluorescence analysis, because the matrix must be sufficiently removed from a urine sample. To shorten the time required for the urine bioassay by ICP-MS, a rapid uranium analysis method using the ICP-MS connected with a flow injection system was developed. Since this method does not involve chemical separation steps, the time required is equivalent to that of the conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample. The required volume of urine solution is 5 ml. The main chemical treatment is only digestion with 5 ml of nitric acid using a microwave oven, to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of the digested sample solution was adjusted to 10 ml. The prepared sample solutions were directly introduced to the ICP-MS without any chemical separation procedure. The ICP-MS was connected with a flow injection system and an auto sampler. The flow injection system can minimize the matrix effects caused by salt dissolved in a high-matrix solution, such as a urine sample without chemical separation, because it introduces only a micro volume of sample solution into the ICP-MS. The ICP-MS detected uranium within 2 min/sample using the auto sampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% relative standard deviation. The detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples. In the series of measurements, no memory effect was observed.
    The present analysis method using the ICP-MS equipped with the flow injection system demonstrated that the time required for high-matrix samples such as urine can be shortened considerably.
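The reported figures (7.5 ng/l mean, ~10% relative standard deviation, ~1 ng/l detection limit) follow from standard bioassay statistics. The readings below are hypothetical values chosen to reproduce the 7.5 ng/l mean; the blank standard deviation is likewise assumed.

```python
import math

# Hypothetical uranium readings (ng/l) for the 10 solutions prepared
# from one urine sample (values chosen to give a 7.5 ng/l mean).
readings = [7.2, 7.8, 6.9, 8.1, 7.5, 7.4, 8.3, 6.8, 7.6, 7.4]

mean = sum(readings) / len(readings)
# Sample variance (n - 1 denominator) and relative standard deviation.
var = sum((x - mean) ** 2 for x in readings) / (len(readings) - 1)
rsd_percent = 100.0 * math.sqrt(var) / mean

# A common convention estimates the detection limit as 3 times the
# standard deviation of repeated blank measurements (assumed here).
blank_sd = 0.33  # ng/l, assumed
detection_limit = 3 * blank_sd
```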

  11. Multiphysics methods development for high temperature gas reactor analysis

    Science.gov (United States)

    Seker, Volkan

    Multiphysics computational methods were developed to perform design and safety analysis of the next generation Pebble Bed High Temperature Gas Cooled Reactors. A suite of code modules was developed to solve the coupled thermal-hydraulics and neutronics field equations. The thermal-hydraulics module is based on the three dimensional solution of the mass, momentum and energy equations in cylindrical coordinates within the framework of the porous media method. The neutronics module is a part of the PARCS (Purdue Advanced Reactor Core Simulator) code and provides a fine mesh finite difference solution of the neutron diffusion equation in three dimensional cylindrical coordinates. Coupling of the two modules was performed by mapping the solution variables from one module to the other. Mapping is performed automatically in the code system by the use of a common material mesh in both modules. The standalone validation of the thermal-hydraulics module was performed with several cases of the SANA experiment and the standalone thermal-hydraulics exercise of the PBMR-400 benchmark problem. The standalone neutronics module was validated by performing the relevant exercises of the PBMR-268 and PBMR-400 benchmark problems. Additionally, the validation of the coupled code system was performed by analyzing several steady state and transient cases of the OECD/NEA PBMR-400 benchmark problem.
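Coupling two field solvers by mapping solution variables on a common mesh is usually iterated to a fixed point. The toy Picard iteration below shows the pattern; the linear "neutronics" and "thermal-hydraulics" relations and all coefficients are invented for illustration, not taken from PARCS or the benchmark.

```python
# Minimal fixed-point (Picard) coupling of two solvers on a shared mesh:
# power depends on temperature feedback, temperature depends on power.
def solve_neutronics(temperature):
    # Toy feedback: power decreases as temperature rises (Doppler-like).
    return [100.0 - 0.05 * t for t in temperature]

def solve_thermal_hydraulics(power):
    # Toy heat balance: temperature rises with local power.
    return [300.0 + 2.0 * q for q in power]

# Common material mesh: variables map one-to-one between the modules.
temperature = [300.0] * 4
for _ in range(50):
    power = solve_neutronics(temperature)
    new_temperature = solve_thermal_hydraulics(power)
    converged = max(abs(a - b)
                    for a, b in zip(new_temperature, temperature)) < 1e-8
    temperature = new_temperature
    if converged:
        break
# The iteration converges to the coupled solution t = 500/1.1 K here,
# because the combined feedback loop is a contraction (|gain| < 1).
```

Real coupled codes alternate full 3-D solves in the same way, often with under-relaxation when the feedback is stronger.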

  12. Development of root observation method by image analysis system

    OpenAIRE

    Kim, Giyoung

    1995-01-01

    Knowledge of plant roots is important for determining plant-soil relationships, managing soil effectively, studying nutrient and water extraction, and creating a soil quality index. Plant root research is limited by the large amount of time and labor required to wash the roots from the soil and measure the viable roots. A root measurement method based on image analysis was proposed to reduce the time and labor requirement. A thinning algorithm-based image analysis method was us...

  13. Task analysis method for procedural training curriculum development.

    Science.gov (United States)

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments. PMID:24366759

  14. Development of methods for the analysis of NPP operating experience

    International Nuclear Information System (INIS)

    Component failure-events and safety-related abnormal events collected in nuclear power plants of many countries, after having been homogenized both in language and in format, are stored in data banks of the European Reliability Data System, developed by the Joint Research Centre (JRC). The tools available, or still under study, at the JRC for the analysis of the contents of these banks are presented and some results of analysis are commented on. (author)

  15. Multiphysics methods development for high temperature gas cooled reactor analysis

    International Nuclear Information System (INIS)

    Gas cooled reactors have been characterized as one of the most promising nuclear reactor concepts in the Generation-IV technology road map. Considerable research has been performed on the design and safety analysis of these reactors. However, the calculational tools being used to perform these analyses are not state-of-the-art and are not capable of performing detailed three-dimensional analyses. This paper presents the results of an effort to develop an improved thermal-hydraulic solver for pebble bed type high temperature gas cooled reactors. The solution method is based on the porous medium approach, and the momentum equation, including the modified Ergun resistance model for pebble beds, is solved in three-dimensional geometry. The heat transfer in the pebble bed is modeled considering the local thermal non-equilibrium between the solid and gas, which results in two separate energy equations, one for each medium. The effective thermal conductivity of the pebble bed can be calculated from either the Zehner-Schluender or the Robold correlation. Both the fluid flow and the heat transfer are modeled in three-dimensional cylindrical coordinates and can be solved in steady-state and time-dependent modes. The spatial discretization is performed using the finite volume method, and the theta-method is used in the temporal discretization. A preliminary verification was performed by comparing the results with the experiments conducted at the SANA test facility. This facility is located at the Institute for Safety Research and Reactor Technology (ISR), Julich, Germany. Various experimental cases were modeled and good agreement in the gas and solid temperatures was observed. An on-going effort is to model the control rod ejection scenarios described in the OECD/NEA/NSC PBMR-400 benchmark problem. In order to perform these analyses, the PARCS reactor simulator code will be coupled with the new thermal-hydraulic solver.
    Furthermore, some of the other anticipated accident scenarios in the benchmark will also be analyzed.
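The theta-method mentioned for the temporal discretization interpolates between explicit and implicit Euler. A minimal sketch on the scalar decay problem du/dt = -a·u (standing in for the discretized energy equations) shows the scheme and why theta = 0.5 (Crank-Nicolson) is more accurate than fully implicit stepping:

```python
import math

# Theta-method step for du/dt = -a*u:
# theta = 0 is explicit Euler, 1 is implicit Euler, 0.5 Crank-Nicolson.
def theta_step(u, a, dt, theta):
    # Solves (1 + theta*dt*a) * u_new = (1 - (1 - theta)*dt*a) * u_old
    return u * (1.0 - (1.0 - theta) * dt * a) / (1.0 + theta * dt * a)

a, dt = 2.0, 0.1
u_cn = u_imp = 1.0
for _ in range(10):                        # integrate to t = 1.0
    u_cn = theta_step(u_cn, a, dt, 0.5)    # Crank-Nicolson
    u_imp = theta_step(u_imp, a, dt, 1.0)  # implicit Euler

exact = math.exp(-a * 1.0)                 # analytic solution, e^-2
```

In the coupled solver the same update applies to the finite-volume system matrix rather than a scalar, with a linear solve per step.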

  16. Development of direct transmission probability method for criticality safety analysis

    International Nuclear Information System (INIS)

    We have developed a new deterministic Sn-type two-dimensional transport calculation method using direct transmission probabilities. The present method has an advantage in calculation accuracy for geometries with large void or neutron absorption regions, because neutron paths are calculated from generation to reaction without approximation. Checking calculations were carried out for a criticality safety problem of fuel assemblies in a spent fuel storage pool with neutron absorption materials, which show differences between the present method and the conventional Sn methods of DOT3.5 in eigenvalues and flux distributions. Further checking calculations for a neutron shielding problem show the advantage of the present method over the conventional Sn methods from the viewpoint of ray effects. (author)
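The transmission probability along a straight neutron path is the exponential of the accumulated optical depth, exp(-Σᵢ σₜ,ᵢ·dᵢ), over the regions the path crosses. The sketch below illustrates this with invented cross sections and chord lengths; a void region contributes nothing, which is why path-based methods avoid the ray effects of angular-mesh Sn sweeps through voids.

```python
import math

# Transmission probability of a neutron along a straight path through
# a sequence of regions, computed exactly from generation point to
# reaction point: T = exp(-sum_i sigma_t_i * d_i).
def transmission(path):
    """path: list of (total cross section [1/cm], chord length [cm])."""
    optical_depth = sum(sigma * d for sigma, d in path)
    return math.exp(-optical_depth)

# Illustrative path: fuel region, void gap, absorber plate.  The void
# adds zero optical depth, so the long gap is handled exactly.
prob = transmission([(0.3, 2.0),   # fuel
                     (0.0, 10.0),  # void
                     (1.5, 0.5)])  # absorber
```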

  17. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination to arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio.
    However, their disadvantages are in completeness and complexity.

  18. Rapid software development : ANALYSIS OF AGILE METHODS FOR APP STARTUPS

    OpenAIRE

    Wahlqvist, Daniel

    2014-01-01

    This thesis is focused on software development using so-called Agile methods. The scope of the research is startup companies creating consumer apps. The thesis work was performed at a Swedish app startup, Storypic/Accelit AB. An overview of current research on Agile methods is given. A qualitative case study was undertaken in four parts: 1. observing the team; 2. testing business hypotheses; 3. interviews with the team; and 4. user feedback. Analyzing the findings, some conclusions are drawn: An ag...

  19. Organic analysis and analytical methods development: FY 1995 progress report

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Hoopes, V.; Rau, J. [and others]

    1995-09-01

    This report describes the status of organic analyses and developing analytical methods to account for the organic components in Hanford waste tanks, with particular emphasis on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-103 (Tank 103-SY). The analytical data are to serve as an example of the status of methods development and application. Samples of the convective and nonconvective layers from Tank 103-SY were analyzed for total organic carbon (TOC). The TOC value obtained for the nonconvective layer using the hot persulfate method was 10,500 {mu}g C/g. The TOC value obtained from samples of Tank 101-SY was 11,000 {mu}g C/g. The average value for the TOC of the convective layer was 6400 {mu}g C/g. Chelator and chelator fragments in Tank 103-SY samples were identified using derivatization gas chromatography/mass spectrometry (GC/MS). Organic components were quantified using GC/flame ionization detection. Major components in both the convective and nonconvective-layer samples include ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), succinic acid, nitrosoiminodiacetic acid (NIDA), citric acid, and ethylenediaminetriacetic acid (ED3A). Preliminary results also indicate the presence of C16 and C18 carboxylic acids in the nonconvective-layer sample. Oxalic acid was one of the major components in the nonconvective layer as determined by derivatization GC/flame ionization detection.

  20. Method development and analysis of arsenolipids in marine oils

    OpenAIRE

    Sele, Veronika

    2014-01-01

    Arsenic in marine oils is mainly present in the form of lipid-soluble compounds; collectively called arsenolipids. Although total arsenic concentrations in marine oils typically range from 0.2 to 16 mg kg-1 [1-3], knowledge regarding the chemical structures and distribution of arsenolipids in oils is limited. The present work describes the development of analytical methods for the determination of arsenolipids, and their application to marine oil, including fish oil and oil of ...

  1. Leadership Development Expertise: A Mixed-Method Analysis

    Science.gov (United States)

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward

    2011-01-01

    In this study, the impact of graduate curriculum, experience, and standards on the development of leadership expertise was examined. The major goals of the study were to (1) examine the impact of college content curriculum on the development of leadership expertise, and (2) examine the impact of on-the-job experience in the development of leadership…

  2. Analysis of cultural development of Isfahan city Using Factor analysis method

    Directory of Open Access Journals (Sweden)

    J.Mohammadi

    2013-01-01

    Full Text Available Extended abstract. 1 - Introduction: Cultural spaces are considered as one of the main factors for development. Cultural development is a qualitative and valuable process; to assess it, quantitative indicators in cultural planning are used to obtain development objectives in the pattern of goods and services. The aim of the study is to determine and analyze the cultural development level and regional inequality of different districts of Isfahan using the factor analysis technique. The statistical population of the study is the 14 districts of Isfahan municipality. The dominant approach of this study is quantitative, descriptive and analytical. In this study, 35 indices have been summarized by the factor analysis method, reduced to 5 factors, and combined into significant ones. 2 - Theoretical bases: The most important objective of spatial planning, given the limitation of resources, is the optimum distribution of facilities and services among the different locations in which people live. To do this, there is a need to identify different locations in terms of the facilities and services they have, so that developed locations are specified and planners can work toward spatial equilibrium and reduce the privilege gap between districts. The present study was conducted to move toward equal development in Isfahan's urban districts by identifying the situation and the manner of distribution of the selected cultural development indices in the different districts. 3 - Discussion: The cultural development of societies is evaluated by considering the changes and improvement of its indices and measured in quantitative frames. Cultural development indices are the most important tools for cultural planning in a given district of a society. In this study, cultural development indices have been used to determine the levels of the districts. Using the factor analysis model, the share of influential factors in the cultural development of each district was determined.
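Factor analysis of development indices typically standardizes the indicators, forms their correlation matrix, and extracts factors from its leading eigenvectors. The dependency-free sketch below extracts only the first factor by power iteration; the district data are hypothetical and far smaller than the study's 35 indices over 14 districts.

```python
# Power-iteration sketch of factor extraction: standardize indicators,
# form the correlation matrix, and take its leading eigenvector as the
# loadings of the first (dominant) development factor.
def correlation_matrix(data):
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    sds = [(sum((row[j] - means[j]) ** 2 for row in data) / n) ** 0.5
           for j in range(m)]
    z = [[(row[j] - means[j]) / sds[j] for j in range(m)] for row in data]
    return [[sum(z[i][a] * z[i][b] for i in range(n)) / n
             for b in range(m)] for a in range(m)]

def leading_factor(corr, iters=200):
    m = len(corr)
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(corr[a][b] * v[b] for b in range(m)) for a in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigenvalue = sum(corr[0][b] * v[b] for b in range(m)) / v[0]
    return v, eigenvalue

# Hypothetical cultural indicators for six districts
# (e.g. libraries, cinemas, museums, cultural centres per 10,000 people).
data = [[4, 3, 2, 5], [8, 7, 6, 9], [2, 2, 1, 3],
        [6, 5, 5, 7], [9, 8, 7, 10], [3, 3, 2, 4]]
loadings, eigenvalue = leading_factor(correlation_matrix(data))
share = eigenvalue / len(loadings)  # variance explained by factor 1
```

District factor scores (and hence development rankings) then follow by projecting each standardized district row onto the loadings.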

  3. A factor analysis method applied in development field

    OpenAIRE

    Mădălina CĂRBUREANU

    2010-01-01

    Both processes, development and globalization, refer to nations' economies and people's welfare all over the world, and are subjects of present interest for different organizations. Economic reality and people's living status can be described through a batch of variables. The problem appears when the number of variables is significant and the need arises to handle this great volume of information. A solution to this problem can be the application of a factor ...

  4. Development of Rotor Diagnosis Method via Motor Current Signature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Seok; Huh, Hyung; Kim, Min Hwan; Jeong, Kyeong Hoon; Lee, Gyu Mhan; Park, Jin Ho; Park, Keun Bae; Lee, Cheol Kwon; Hur, S

    2006-01-15

    A study on motor current signature analysis has been performed to monitor a journal bearing fault due to increasing clearance. It is known that journal bearing clearance produces side band frequencies in the motor current, at the supply current frequency plus and minus the rotor rotational frequency. But the mere existence of the side band frequencies is not sufficient to diagnose whether the journal bearing is safe or not. Four journal bearing sets with different clearances were used to measure the side band frequency amplitude and the rotor vibration amplitude versus the journal bearing clearance. The side band frequency amplitude and the rotor vibration amplitude increase as the journal bearing clearance increases. This trend assures that the ASME OM vibration guideline can be applied to estimate the journal bearing clearance size. In this research, 2.5 times the reference side band amplitude is suggested as an indicator of a journal bearing fault. Further study is necessary to establish more specific quantitative relations between the side band frequency amplitude and the journal bearing clearance of a motor.
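Extracting the side band amplitudes at f_supply ± f_rotor amounts to evaluating the current spectrum at those two frequencies. The sketch below synthesizes a current signal with assumed frequencies (60 Hz supply, 15 Hz rotor) and an assumed fault-dependent side band amplitude, then recovers that amplitude with a single-bin DFT.

```python
import math

# Synthetic motor current: a 60 Hz supply component plus side bands at
# 60 +/- 15 Hz whose amplitude grows with journal bearing clearance.
fs, n = 1200.0, 1200            # sample rate [Hz], 1 s of data
f_supply, f_rotor = 60.0, 15.0  # assumed supply and rotor frequencies
sideband_amp = 0.05             # assumed fault-dependent amplitude

signal = [math.cos(2 * math.pi * f_supply * t / fs)
          + sideband_amp * math.cos(2 * math.pi * (f_supply - f_rotor) * t / fs)
          + sideband_amp * math.cos(2 * math.pi * (f_supply + f_rotor) * t / fs)
          for t in range(n)]

def amplitude(sig, freq):
    """Single-bin DFT amplitude at an exact-bin frequency."""
    re = sum(s * math.cos(2 * math.pi * freq * t / fs)
             for t, s in enumerate(sig))
    im = sum(s * math.sin(2 * math.pi * freq * t / fs)
             for t, s in enumerate(sig))
    return 2 * math.sqrt(re * re + im * im) / len(sig)

lower = amplitude(signal, f_supply - f_rotor)  # 45 Hz side band
upper = amplitude(signal, f_supply + f_rotor)  # 75 Hz side band
# Following the paper's suggestion, a fault could be flagged when the
# side band amplitude exceeds 2.5 times a healthy-bearing reference.
```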

  5. Further development of a static seismic analysis method for piping systems: The load coefficient method

    International Nuclear Information System (INIS)

    Currently the ASME Boiler and Pressure Vessel Code is considering a simplified static analysis method for seismic design of piping systems for incorporation into Appendix N of Section III, Division 1, of the Code. This proposed method, called the Load Coefficient Method, uses coefficients ranging from 0.4 to 1.0, times the peak value of the in-structure response spectra, with a static analysis technique to evaluate the response of piping systems to seismic events. The coefficient used is a function of the pipe support spacing and hence of the frequency response of the system; in general, the greater the support spacing, the lower the frequency and the spectral response, and hence the lower the coefficient. The results of the Load Coefficient Method static analyses have been compared to analyses using the Response Spectrum Modal Analysis Method. Reaction loads were also evaluated, with one important modification: a minimum support reaction load as a function of nominal pipe diameter has been established. This assures that lightly loaded supports, regardless of the analytical method used, will be designed to realistic loads, eliminating the potential for under-designed supports. With respect to the accelerations applicable to in-line components, a factor of 0.9 times the square root of the sum of squares (SRSS) of the horizontal floor spectra peaks was determined to envelop the horizontal accelerations, and a coefficient of 1.2 was shown to envelop the vertical accelerations. Presented in this paper is the current form of the Load Coefficient Method, a summary of the results of over 2,700 benchmark analyses of piping system segments, which form the basis for the acceptance of the method, and an explanation of the use of the method.
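The arithmetic of the method is simple: an equivalent static load is the load coefficient times the peak spectral acceleration times the pipe mass, and in-line component accelerations combine the floor-spectrum peaks as described. The numbers below are illustrative only; coefficients and peaks for a real system come from the Code and the plant spectra.

```python
import math

# Load Coefficient Method sketch: equivalent static seismic load =
# load coefficient x peak in-structure spectral acceleration x mass.
def static_seismic_load(mass_kg, peak_accel_g, coefficient):
    g = 9.81  # m/s^2
    return coefficient * peak_accel_g * g * mass_kg  # Newtons

# Illustrative pipe segment: 50 kg tributary mass, 1.5 g spectral peak,
# coefficient 0.6 (within the 0.4 - 1.0 range cited).
load = static_seismic_load(mass_kg=50.0, peak_accel_g=1.5, coefficient=0.6)

# In-line component accelerations: 0.9 times the SRSS of the two
# horizontal floor-spectrum peaks, and 1.2 times the vertical peak.
peak_h1, peak_h2, peak_v = 1.5, 1.2, 1.0  # g, assumed spectrum peaks
a_horizontal = 0.9 * math.sqrt(peak_h1 ** 2 + peak_h2 ** 2)
a_vertical = 1.2 * peak_v
```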

  6. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    OpenAIRE

    Dr. Ismail Ipek; Dr. Ömer Faruk Sözcü

    2014-01-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. Principally, it starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future were discussed. Second, the importance of task analysis methods in rapid e-learning was considered, with learning technologi...

  7. Analysis of cultural development of Isfahan city Using Factor analysis method

    OpenAIRE

    Mohammadi, J.; M Izadi

    2013-01-01

    Extended abstract. 1. Introduction. Cultural spaces are considered as one of the main factors for development. Cultural development is a qualitative and valuable process; to assess it, quantitative indicators in cultural planning are used to pursue development objectives in the pattern of goods and services. The aim of the study is to determine and analyze the cultural development level and regional inequality of the different districts of Isfahan using the factor analysis technique. The statistical po...

  8. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of the solid residue left after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for analysis with an energy-dispersive X-ray fluorescence (EDXRF) spectrometer. Because of the different fuels used, the different compositions and the different locations where the solid residue forms, it was necessary to develop two methods. The first method is used for identifying the solid residue composition after fuel oil combustion (Method 1), while the second is used for identifying the solid residue composition after the combustion of solid fuels, i.e., coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRMs). The CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of each method was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
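    The linearity assessment mentioned above amounts to fitting a straight calibration line through the CRM responses and checking the coefficient of determination. A minimal sketch, with invented CRM concentrations and intensities standing in for real EDXRF data:

```python
def linear_calibration(conc, signal):
    """Least-squares calibration line signal = a*conc + b and its coefficient
    of determination R^2, the quantities behind a linearity assessment."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(conc, signal))
    ss_tot = sum((y - my) ** 2 for y in signal)
    return a, b, 1.0 - ss_res / ss_tot

# Invented CRM data: certified analyte content (wt%) vs net X-ray intensity
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
counts = [410.0, 805.0, 1620.0, 3190.0, 6400.0]
a, b, r2 = linear_calibration(conc, counts)
print(a, b, r2)  # slope, intercept, R^2 close to 1 for a linear method
```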

  9. Methodological aspects of development of new instrumental methods of analysis and improvement of known ones

    International Nuclear Information System (INIS)

    Consideration is given to the possibilities of instrumental methods of analysis, such as the method of precision registration of natural isotope ratios of light elements from the gaseous phase; the method of piezoquartz microweighing; probe methods of analysis in spark mass spectroscopy; and extraction-atomic-emission spectroscopy with inductively coupled plasma. Further development of these methods and improvement of their analytical characteristics are predicted: increased sensitivity, accuracy and rapidity. Extension of their fields of application is forecast as well. 20 refs.; 7 figs.; 2 tabs

  10. A Product Analysis Method and its Staging to Develop Redesign Competences

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e. the company has had a product in production and on the market for some time, and now the time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product encompassing both a user-oriented and a technical perspective, as well as to synthesise solution proposals for the upgraded variant. In the course module Product Analysis and Redesign we have developed a product analysis method and a staging of it, which seems to be very productive. In this paper we present the product analysis method and its staging and we reflect on the students’ application of it. We conclude that the method is a valid contribution to develop the students’ redesign competences.

  11. Recent developments in quasi-Newton methods for structural analysis and synthesis

    Science.gov (United States)

    Kamat, M. P.; Hayduk, R. J.

    1981-01-01

    Unlike the Newton-Raphson method, quasi-Newton methods, by virtue of their update and step-length control procedures, are globally convergent and hence better suited for the solution of nonlinear problems of structural analysis and synthesis. Extension of quasi-Newton algorithms to large-scale problems has led to the development of sparse update algorithms and to economical strategies for evaluating sparse Hessians. Ill-conditioning problems have led to the development of self-scaled variable metric and conjugate gradient algorithms, as well as the use of singular perturbation theory. This paper emphasizes the effectiveness of such quasi-Newton algorithms for nonlinear structural analysis and synthesis.
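    A minimal illustration of the quasi-Newton idea, building the tangent from successive residuals instead of re-deriving it at every step, is the one-dimensional secant iteration below, applied to the equilibrium of a hardening spring (the stiffness and load values are invented):

```python
def secant_solve(residual, u0, u1, tol=1e-10, max_iter=50):
    """One-dimensional quasi-Newton (secant) solve of residual(u) = 0: the
    tangent stiffness is updated from successive residuals rather than
    re-derived, the core idea behind quasi-Newton structural solvers."""
    r0, r1 = residual(u0), residual(u1)
    for _ in range(max_iter):
        if abs(r1) < tol:
            break
        k = (r1 - r0) / (u1 - u0)   # secant update of the tangent
        u0, r0 = u1, r1
        u1 = u1 - r1 / k
        r1 = residual(u1)
    return u1

# Hardening spring: internal force k1*u + k3*u**3 must balance external load f
k1, k3, f = 100.0, 5.0, 130.0
u = secant_solve(lambda v: k1 * v + k3 * v**3 - f, 0.0, 1.0)
print(u)  # equilibrium displacement, residual driven below tol
```

In multiple dimensions the same role is played by BFGS-type matrix updates; the scalar secant is just the smallest member of that family.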

  12. Development of pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    A three-dimensional direct response matrix method using a Monte Carlo calculation has been developed. The direct response matrix is formalized by four subresponse matrices in order to respond to a core eigenvalue k and thus can be recomposed at each outer iteration in core analysis. The subresponse matrices can be evaluated by ordinary single fuel assembly calculations with the Monte Carlo method in three dimensions. Since these subresponse matrices are calculated for the actual geometry of the fuel assembly, the effects of intra- and inter-assembly heterogeneities can be reflected in global partial neutron current balance calculations in core analysis. To verify this method, calculations for heterogeneous systems were performed. The results obtained using this method agreed well with those obtained using direct calculations with a Monte Carlo method. This means that this method accurately reflects the effects of intra- and inter-assembly heterogeneities and can be used for core analysis. A core analysis method, in which neutronic calculations using this direct response matrix method are coupled with thermal-hydraulic calculations, has also been developed. As it requires neither the diffusion approximation nor a homogenization process for lattice constants, a precise representation of the effects of neutronic heterogeneities is possible. Moreover, the fuel rod power distribution can be directly evaluated, which enables accurate evaluations of core thermal margins. A method of reconstructing the response matrices according to the condition of each node in the core has been developed. The test revealed that the neutron multiplication factors and the fuel rod neutron production rates could be reproduced by interpolating the elements of the response matrix. A coupled analysis of neutronic calculations using the direct response matrix method and thermal-hydraulic calculations for an ABWR quarter core was performed, and it was found that the thermal power and coolant

  13. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Spectral history methods for a pin-by-pin core analysis method using the three-dimensional direct response matrix have been developed. The direct response matrix is formalized by four sub-response matrices in order to respond to a core eigenvalue k and thus can be recomposed at each outer iteration in the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One of the methods is to evaluate the nodal burn-up spectrum obtained using the out-going neutron current. The other is to correct the fuel rod neutron production rates using a pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be reduced by half during burn-up, and the nodal neutron production rate errors can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. Core analysis with the DRM method was carried out for an ABWR quarter core and it was found that both thermal power and coolant-flow distributions converged smoothly. (authors)

  14. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method was based on THERP and ASEP, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were interactively undertaken to verify the usability and applicability of the standard method.

  15. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method was based on THERP and ASEP, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were interactively undertaken to verify the usability and applicability of the standard method.

  16. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    Science.gov (United States)

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate pre-service science and technology teachers' opinions about the methods that develop reflective thinking, and we determine their level of reflective thinking. This is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. Questions used in the statistical analysis of…

  17. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    Full Text Available The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. Principally, it starts with defining task analysis and how to select tasks for analysis and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future were discussed. Second, the importance of task analysis methods in rapid e-learning was considered, with learning technologies as asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies were defined and clarified with examples, that is, all steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches were discussed, such as m-learning and other delivery systems. As a result, the concept of task analysis, rapid e-learning development strategies and the essentials of online course design were discussed, alongside learner interface design features for learners and designers.

  18. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  19. Development of spectrofluorimetric and HPLC methods for In vitro analysis of repaglinide

    Directory of Open Access Journals (Sweden)

    Kaushal N

    2010-01-01

    Full Text Available Spectrofluorimetric and high-performance liquid chromatography methods for estimation of repaglinide were developed. These methods were validated for estimation of repaglinide in tablets as well as in receptor fluid obtained during in vitro permeation studies. Repaglinide was observed to exhibit emission and excitation wavelengths, respectively, at 379 nm and 282 nm with linearity in the concentration range of 5-80 μg/ml. High-performance liquid chromatography analysis of repaglinide yielded retention time of 6.14 min with linearity ranging from 0.1-1.2 μg/ml concentration. Spectrofluorimetric analysis of repaglinide in tablets yielded results comparable to high performance liquid chromatography.

  20. Development of a transient boiling transition analysis method based on a film flow model

    International Nuclear Information System (INIS)

    A new single-channel, transient boiling transition (BT) prediction method based on a film flow model has been developed for a core thermal-hydraulic code. This method can predict the onset and location of dryout and rewetting under transient conditions mechanistically, based on the dryout criterion and with consideration of the spacer effect. The developed method was applied to the analysis of steady-state and transient BT experiments using BWR fuel bundle mock-ups for verification. Comparisons between calculated results and experimental data showed that the developed method tended to predict the occurrence of rewetting earlier; however, the onset time of BT and the maximum rod surface temperature were well predicted, within 0.6 s and 20°C, respectively. Moreover, it was confirmed that consideration of the spacer effect on the liquid film flow rate on the rod surface was required to predict dryout phenomena accurately under transient conditions. (author)

  1. Development of best estimate analysis methods in Canada to allow quantification of safety margins

    International Nuclear Information System (INIS)

    The paper presents an outline, from the regulator's perspective, of the current situation in Canada with the development of best estimate and uncertainty assessment (BE+UA) methods intended for application in licensing safety analysis. Reasons, incentives and expectations related to the development and application of the best estimate safety analysis methodology are explored in some detail. Difficulties in attaining acceptance of this methodology for licensing applications are also discussed. Maintenance of adequate safety margins is a firmly established principle in Canadian regulatory practice. Safety analysis is performed to demonstrate that margins are present and sufficient. For a variety of reasons, some of the operating CANDU reactors have recently been experiencing difficulties in compellingly demonstrating adequate margins. The industry sees application of BE+UA safety analysis as a potential way of recovering or improving safety margins and removing operational constraints. The operating power reactors in Canada have been licensed, as anywhere in the world, with the use of deterministic, conservative safety analysis methods. Until now, the conservative approach has remained the cornerstone of safety analyses. It has been recognized, however, by both the industry and the regulator, that BE+UA methods have reached sufficient maturity to allow more accurate and realistic modelling of accident transients, thus presenting an opportunity to better quantify safety margins. It is expected that in many cases a BE+UA analysis will be able to show larger margins than was possible to demonstrate using the conservative approach. If the BE+UA analyses predict more benign consequences of the analysed events, this would facilitate resolving some of the currently outstanding safety issues and lessen economic penalties on utilities. As a consequence, the industry has started several projects aimed at the development and application of BE+UA methods and has

  2. A Product Analysis Method and its Staging to Develop Redesign Competences

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e. the company has had a product in production and on the market for some time, and now the time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product encompassing both a user-oriented and a technical perspective, as well as to synthesise solution proposals for the upgraded variant. In the course module Product Analysis and Redesign we have developed a product analysis method and a staging of it, which seems to be very productive. In this paper we present the product analysis method and its staging and we reflect on the students’ application of it. We conclude that the method is a valid contribution to develop the students’ redesign competences.

  3. Development of methods of the Fractal Dimension estimation for the ecological data analysis

    CERN Document Server

    Jura, Jakub; Mironovová, Martina

    2015-01-01

    This paper deals with estimating the fractal dimension of hydrometeorological variables such as air temperature or humidity at different sites in a landscape (to be further evaluated from the land-use point of view). Three algorithms and methods for estimating the fractal dimension of a hydrometeorological time series were developed. The first results indicate that the developed methods are usable for the analysis of hydrometeorological variables and for testing their relation to the autoregulation functions of an ecosystem
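    The abstract does not name the three algorithms developed; one standard estimator of the fractal dimension of a time series, shown here purely as a plausible example, is Higuchi's method: the mean curve length L(k) at scale k behaves as k**(-D), so D is the slope of log L(k) against log(1/k).

```python
import math
import random

def higuchi_fd(x, kmax=8):
    """Higuchi estimate of the fractal dimension of a time series."""
    n = len(x)
    logk, logl = [], []
    for k in range(1, kmax + 1):
        lm = []
        for m in range(k):
            n_m = (n - 1 - m) // k          # increments in the m-th subseries
            if n_m < 1:
                continue
            length = sum(abs(x[m + (i + 1) * k] - x[m + i * k])
                         for i in range(n_m))
            lm.append(length * (n - 1) / (n_m * k * k))  # Higuchi normalisation
        logk.append(math.log(1.0 / k))
        logl.append(math.log(sum(lm) / len(lm)))
    # least-squares slope of log L(k) against log(1/k) is the dimension
    mk, ml = sum(logk) / len(logk), sum(logl) / len(logl)
    num = sum((p - mk) * (q - ml) for p, q in zip(logk, logl))
    den = sum((p - mk) ** 2 for p in logk)
    return num / den

# A smooth ramp is one-dimensional; white noise approaches dimension 2
random.seed(0)
ramp = [0.1 * i for i in range(500)]
noise = [random.random() for _ in range(500)]
print(higuchi_fd(ramp))    # 1.0 up to rounding: a straight line is one-dimensional
print(higuchi_fd(noise))   # close to 2 for uncorrelated noise
```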

  4. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    Science.gov (United States)

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. PMID:27006240

  5. Development of regression model for uncertainty analysis by response surface method in HANARO

    International Nuclear Information System (INIS)

    The feasibility of uncertainty analysis with a regression model in a reactor physics problem was investigated. A regression model was developed using the response surface method as an alternative to a MCNP/ORIGEN2 code system, the uncertainty analysis tool for fission-produced molybdenum production. It was shown that the development of a regression model for the reactor physics problem was possible by introducing the burnup parameter. The most important parameter affecting the uncertainty of the 99Mo yield ratio in the regression model was the fuel thickness. These results agree well with those of the crude Monte Carlo method for each parameter. The regression model developed in this research was shown to be suitable as an alternative model, because its coefficient of determination was 0.99
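    A minimal sketch of the response surface idea: on a coded two-level full factorial design, the columns are mutually orthogonal, so the regression coefficients of a first-order model with interaction reduce to signed averages and no matrix solve is needed. The factors and response values below are invented for illustration, not taken from the HANARO study:

```python
def factorial_rsm(design, y):
    """First-order response surface with interaction, fitted on a coded
    two-level (+1/-1) full factorial design by signed averaging."""
    n = len(y)
    b0 = sum(y) / n
    b1 = sum(x1 * yi for (x1, _), yi in zip(design, y)) / n
    b2 = sum(x2 * yi for (_, x2), yi in zip(design, y)) / n
    b12 = sum(x1 * x2 * yi for (x1, x2), yi in zip(design, y)) / n
    return b0, b1, b2, b12

# Hypothetical 2^2 design: coded fuel thickness (x1) and a second coded
# parameter (x2) versus a simulated yield-ratio response
design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
y = [0.90, 1.10, 0.94, 1.18]
b0, b1, b2, b12 = factorial_rsm(design, y)

def predict(x1, x2):
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

print(b0, b1, b2, b12)
print(predict(1, 1))  # a saturated model reproduces the corner observation
```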

  6. Development of an adjoint sensitivity method for site characterization, uncertainty analysis, and code calibration/validation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, A.H.

    1991-09-01

    The adjoint method is applied to groundwater flow-mass transport coupled equations in variably saturated media. The sensitivity coefficients derived by this method can be calculated by a single execution for each performance measure regardless of the number of parameters in question. The method provides an efficient and effective way to rank the importance of the parameters, so that data collection can be guided in support of site characterization programs. The developed code will facilitate the sensitivity/uncertainty analysis in both model prediction and model calibration/validation. 13 refs., 1 tab.

  7. Development of an adjoint sensitivity method for site characterization, uncertainty analysis, and code calibration/validation

    International Nuclear Information System (INIS)

    The adjoint method is applied to groundwater flow-mass transport coupled equations in variably saturated media. The sensitivity coefficients derived by this method can be calculated by a single execution for each performance measure regardless of the number of parameters in question. The method provides an efficient and effective way to rank the importance of the parameters, so that data collection can be guided in support of site characterization programs. The developed code will facilitate the sensitivity/uncertainty analysis in both model prediction and model calibration/validation. 13 refs., 1 tab
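    The economy described in the abstract, namely that one extra (adjoint) solve yields the sensitivity of a performance measure to every parameter at once, can be sketched on a toy two-cell flow model; the matrix, loads and parameter values below are invented:

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

# Two-cell flow model: conductances k1, k2 define the system matrix K(p)
def K(k1, k2):
    return [[k1 + k2, -k2],
            [-k2, k2]]

k1, k2 = 2.0, 1.0
f = [1.0, 0.5]          # sources
c = [0.0, 1.0]          # performance measure J = u[1] (head in cell 2)

u = solve2(K(k1, k2), f)
lam = solve2(K(k1, k2), c)  # one adjoint solve (K symmetric) for all parameters

# dK/dk1 and dK/dk2 follow directly from the definition of K
dK_dk1 = [[1.0, 0.0], [0.0, 0.0]]
dK_dk2 = [[1.0, -1.0], [-1.0, 1.0]]

def sens(dK):
    # dJ/dp = -lambda^T (dK/dp) u   when f does not depend on p
    return -sum(lam[i] * dK[i][j] * u[j] for i in range(2) for j in range(2))

print(sens(dK_dk1), sens(dK_dk2))  # sensitivities of J to k1 and k2
```

Analytically, u[1] = 1.5/k1 + 0.5/k2 for this model, so the adjoint sensitivities -1.5/k1**2 and -0.5/k2**2 can be checked by hand.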

  8. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    Science.gov (United States)

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study. PMID:26656945

  9. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    Science.gov (United States)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.
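    For contrast with the expensive baseline the paper mentions, here is the brute-force Monte Carlo estimate of a natural-frequency cumulative distribution for a single spring-mass oscillator with random measured properties; all the statistical scatter values are invented:

```python
import bisect
import math
import random

def frequency_samples(n=20000, seed=1):
    """Monte Carlo samples of the natural frequency (Hz) of a one-DOF
    oscillator whose measured stiffness and mass are random. This is the
    costly simulation that probabilistic CMS methods aim to avoid."""
    random.seed(seed)
    return sorted(math.sqrt(random.gauss(1.0e6, 5.0e4) /
                            random.gauss(10.0, 0.3)) / (2.0 * math.pi)
                  for _ in range(n))

def empirical_cdf(freqs, f_hz):
    """P(natural frequency <= f_hz) from the sorted samples."""
    return bisect.bisect_right(freqs, f_hz) / len(freqs)

freqs = frequency_samples()
f_nominal = math.sqrt(1.0e6 / 10.0) / (2.0 * math.pi)   # about 50.3 Hz
print(empirical_cdf(freqs, f_nominal))                  # near 0.5 by symmetry
```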

  10. Development of load-dependent Ritz vector method for structural dynamic analysis of large space structures

    Science.gov (United States)

    Ricles, James M.

    1990-01-01

    The development and preliminary assessment of a method for dynamic structural analysis based on load-dependent Ritz vectors are presented. The vector basis is orthogonalized with respect to the mass and structural stiffness so that the equations of motion can be uncoupled and efficient analysis of large space structures performed. A series of computer programs was developed based on the algorithm for generating the orthogonal load-dependent Ritz vectors. Transient dynamic analysis performed on the Space Station Freedom using the software was found to provide solutions that require a smaller number of vectors than the modal analysis method. Error norms based on the participation of the mass distribution of the structure and the spatial distribution of the structural loading, respectively, were developed in order to provide an indication of vector truncation. These norms are computed before the transient analysis is performed. An assessment of these norms through a convergence study of the structural response was performed. The results from this assessment indicate that the error norms can provide a means of judging the quality of the vector basis and the accuracy of the transient dynamic solution.

  11. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    International Nuclear Information System (INIS)

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs

  12. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs.

  13. Radiochemical method development

    International Nuclear Information System (INIS)

    The authors have developed methods for chemical characterization of the environment under a multitask project that focuses on improvement of radioanalytical methods with an emphasis on faster and cheaper routine methods. The authors have developed improved methods for separation of environmental levels of technetium-99, radium, and actinides from soil and water; separation of actinides from soil and water matrix interferences; and isolation of strontium. They are also developing methods for simultaneous detection of multiple isotopes (including nonradionuclides) by using a new instrumental technique, inductively coupled plasma-mass spectrometry (ICP-MS). The new ICP-MS methods have greater sensitivity and efficiency and could replace many radiometric techniques. They are using flow injection analysis to integrate and automate the separation methods with the ICP-MS methodology. The final product of all activities will be methods that are available (published in the U.S. Department of Energy's analytical methods compendium) and acceptable for use in regulatory situations

  14. REQUIREMENT ANALYSIS METHOD OF ECOMMERCE WEBSITES DEVELOPMENT FOR SMALLMEDIUM ENTERPRISES, CASE STUDY: INDONESIA

    Directory of Open Access Journals (Sweden)

    Veronica S. Moertini

    2014-03-01

    Full Text Available Along with the growth of the Internet, the trend shows that e-commerce has been growing significantly in the last several years. This means business opportunities for small-medium enterprises (SMEs), which are recognized as the backbone of the economy. SMEs may develop and run small to medium size e-commerce websites as the solution to specific business opportunities. Certainly, the websites should be developed accordingly to support business success. In developing the websites, the key elements of the e-commerce business model that are necessary to ensure success should be resolved at the requirement stage of the development. In this paper, we propose an enhancement of the requirement analysis methods found in the literature such that it includes activities to resolve these key elements. The method has been applied in three case studies based on the situation in Indonesia, and we conclude that it is suitable for adoption by SMEs.

  15. Development of a Probabilistic Tsunami Hazard Analysis Method and Application to an NPP in Korea

    International Nuclear Information System (INIS)

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, the evaluation of the tsunami return period is a major task; here, the tsunami return period was evaluated with an empirical method using historical tsunami records and tidal gauge records. To perform the tsunami fragility analysis, a procedure was established and target equipment and structures for the tsunami fragility assessment were selected. A sample fragility calculation was performed for equipment in a nuclear power plant. For the system analysis, the accident sequence of a tsunami event was developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) was determined. For the application to a real nuclear power plant, the Ulchin 56 NPP, which is located on the east coast of the Korean peninsula, was selected. Through this study, the whole tsunami PSA (Probabilistic Safety Assessment) working procedure was established and an example calculation was performed for one nuclear power plant in Korea.
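The hazard-times-fragility convolution at the core of a tsunami PSA can be sketched as follows; the hazard curve and the lognormal fragility parameters below are illustrative assumptions, not values from this study.

```python
import math

# Hypothetical annual exceedance frequencies for tsunami run-up heights (1/yr)
# and a lognormal fragility curve for a safety-related component.
hazard = [(2.0, 1e-2), (4.0, 1e-3), (6.0, 1e-4), (8.0, 1e-5)]  # (height m, freq)

def fragility(h, median=5.0, beta=0.5):
    """Lognormal conditional failure probability given run-up height h (m)."""
    return 0.5 * (1.0 + math.erf(math.log(h / median) / (beta * math.sqrt(2))))

# Convolve: CDF contribution = (frequency of events in each height bin)
#           x P(failure | run-up height of that bin)
cdf = 0.0
for i, (h, f_exceed) in enumerate(hazard):
    f_next = hazard[i + 1][1] if i + 1 < len(hazard) else 0.0
    f_bin = f_exceed - f_next          # frequency of events in this height bin
    cdf += f_bin * fragility(h)

print(f"tsunami-induced CDF ~ {cdf:.2e} /yr")
```

A real analysis would integrate over a continuous hazard curve and many components, but the discrete sum shows how hazard and fragility analyses feed the system analysis.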

  16. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum detailed separation and analysis of acidic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Green, J.B.; Yu, S.K.T.; Green, J.A.; Doughty, D.A.; Vogh, J.W.; Grigsby, R.D.

    1989-10-01

    An HPLC method for the fractionation of whole acid concentrates into nominal compound-class subfractions is described. The method utilizes silica columns and gradient elution with eluents containing a strong base, tetramethylammonium hydroxide. The performance of the method is evaluated through analysis of subfractions obtained from a coal liquid, Wilmington, CA, petroleum and Cerro Negro heavy oil. Methods developed specifically for the analysis of whole acid concentrates and subfractions are described in detail. These include: (1) an infrared method for determination of total hydroxyl and carboxyl groups after their conversion to trifluoroacetate esters and 2,2,2-trifluoroethyl esters, respectively, (2) an NMR method for functional group analysis based on methylation of acidic groups with {sup 13}C-enriched methyl iodide, (3) a nonaqueous titration procedure employing the potassium salt of dimethyl sulfoxide as titrant for acidic compounds, (4) GC/MS analysis of hydroxyaromatic compounds after their conversion to trifluoroacetate esters, and (5) probe microdistillation/high-resolution mass spectrometric analysis of acid fractions exhibiting low volatility. 146 refs., 38 figs., 27 tabs.

  17. Development of TRU waste mobile analysis methods for RCRA-regulated metals

    International Nuclear Information System (INIS)

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Glow-discharge mass spectrometry (GD-MS), laser-induced breakdown spectroscopy (LIBS), dc-arc atomic-emission spectroscopy (DC-ARC-AES), laser-ablation inductively-coupled-plasma mass spectrometry (LA-ICP-MS), and energy-dispersive x-ray fluorescence (EDXRF) were identified as potential solid-sample analytical techniques for mobile characterization of TRU waste. Each technology developer was provided with surrogate TRU waste samples in order to develop an analytical method. Following successful development of the analytical methods, five performance evaluation samples were distributed to each of the researchers in a blind round-robin format. Results of the round robin were compared to known values and to Transuranic Waste Characterization Program (TWCP) data quality objectives. Only two techniques, DC-ARC-AES and EDXRF, were able to complete the entire project. Method development for GD-MS and LA-ICP-MS was halted due to the stand-down at the CMR facility. Results of the round-robin analysis are given for the EDXRF and DC-ARC-AES techniques. While DC-ARC-AES met several of the data quality objectives, the performance of the EDXRF technique far surpassed that of DC-ARC-AES. EDXRF is a simple, rugged, field-portable instrument that appears to hold great promise for mobile characterization of TRU waste. The performance of this technique needs to be tested on real TRU samples in order to assess interferences from actinide constituents. In addition, mercury and beryllium analysis will require another analytical technique, because the EDXRF method failed to meet the TWCP data quality objectives for these elements. Mercury analysis is easily accomplished on solid samples by cold vapor atomic fluorescence (CVAFS). Beryllium can be analyzed by any of a variety of emission techniques.

  18. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh

    2012-05-01

    Full Text Available Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method for quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution using a Knauer C18 column (5 µm, 4.6 × 150 mm). The mobile phase, consisting of acetonitrile and ammonium acetate buffer, was set at a flow rate of 1.5 ml/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 125–2000 ng/ml. For all quality control (QC) standards in the intraday and interday assays, the accuracy and precision ranges were -0.96 to 6.30 and 0.86 to 13.74, respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedure. Conclusion: The developed method is rapid and sensitive and can be used for the routine analysis of sirolimus, such as in dissolution and stability assays of pre- and post-marketed dosage forms.
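The linearity check behind a standard curve of this kind (a least-squares fit and r² over the analytical range) can be sketched as follows; the peak-area values are invented for illustration and are not data from the paper.

```python
# Hypothetical calibration data over the reported analytical range (ng/ml);
# the peak areas are illustrative, not from the paper.
conc = [125, 250, 500, 1000, 1500, 2000]
area = [10.2, 20.5, 41.3, 82.0, 124.1, 165.8]

n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
slope = sxy / sxx
intercept = my - slope * mx

# Coefficient of determination r^2 for the linearity criterion (> 0.99)
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, area))
ss_tot = sum((y - my) ** 2 for y in area)
r2 = 1.0 - ss_res / ss_tot

# Back-calculate an unknown sample concentration from its peak area
unknown = (90.0 - intercept) / slope
print(f"slope={slope:.4f}, r2={r2:.4f}, unknown ~ {unknown:.0f} ng/ml")
```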

  19. Development of prediction method of void fraction distribution in fuel assemblies for use in safety analysis

    International Nuclear Information System (INIS)

    The establishment of a code system for BWR safety analysis is now in progress at the Institute of Nuclear Safety (INS), in order to predict the onset of boiling transition (BT) in nuclear fuel assemblies in any thermal-hydraulic condition without relying on thermal-hydraulic characteristic data provided by licensees. A prediction method for the void fraction distribution across the cross section of BWR fuel assemblies has been developed based on a multi-dimensional two-fluid model. Lift forces acting on bubbles and void diffusion, which cannot be handled with one-dimensional analysis, were considered. Comparisons between calculated results and experimental data obtained from thermal-hydraulic tests of PWR and BWR mock-up fuel assemblies showed good agreement. The lift force models are still empirical and further studies are needed, but the calculations showed the possibility of applying these models to multi-dimensional gas-liquid two-phase flow analysis. (author)

  20. Recent developments of nanoparticle-based enrichment methods for mass spectrometric analysis in proteomics

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In proteome research, rapid and effective separation strategies are essential for successful protein identification due to the broad dynamic range of proteins in biological samples. Some important proteins are expressed in ultra-low abundance, making a pre-concentration procedure before mass spectrometric analysis a prerequisite. The main purpose of enrichment is to isolate target molecules from complex mixtures to reduce sample complexity and facilitate the subsequent analysis steps. The introduction of nanoparticles into this field has accelerated the development of enrichment methods. In this review, we mainly focus on recent developments in the use of different nanomaterials for pre-concentration of low-abundance peptides/proteins, including those containing post-translational modifications such as phosphorylation and glycosylation, prior to mass spectrometric analysis.

  1. Development of Finite Elements for Two-Dimensional Structural Analysis Using the Integrated Force Method

    Science.gov (United States)

    Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.

    1996-01-01

    The Integrated Force Method has been developed in recent years for the analysis of structural mechanics problems. This method treats all independent internal forces as unknown variables that can be calculated by simultaneously imposing equations of equilibrium and compatibility conditions. In this paper a finite element library for analyzing two-dimensional problems by the Integrated Force Method is presented. Triangular- and quadrilateral-shaped elements capable of modeling arbitrary domain configurations are presented. The element equilibrium and flexibility matrices are derived by discretizing the expressions for potential and complementary energies, respectively. The displacement and stress fields within the finite elements are independently approximated. The displacement field is interpolated as it is in the standard displacement method, and the stress field is approximated by using complete polynomials of the correct order. A procedure that uses the definitions of stress components in terms of an Airy stress function is developed to derive the stress interpolation polynomials. Such derived stress fields identically satisfy the equations of equilibrium. Moreover, the resulting element matrices are insensitive to the orientation of local coordinate systems. A method is devised to calculate the number of rigid body modes, and the present elements are shown to be free of spurious zero-energy modes. A number of example problems are solved by using the present library, and the results are compared with corresponding analytical solutions and with results from the standard displacement finite element method. The Integrated Force Method not only gives results that agree well with analytical and displacement method results but also outperforms the displacement method in stress calculations.
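The simultaneous imposition of equilibrium and compatibility on unknown internal forces, the central idea of the Integrated Force Method, can be illustrated with the smallest possible redundant system; the spring stiffnesses and load below are arbitrary illustrative values, not from the paper.

```python
import numpy as np

# Two parallel springs (stiffnesses k1, k2) carrying a load P at a common node.
# The IFM treats both spring forces as primary unknowns and solves equilibrium
# and compatibility simultaneously: [B]{F} = {P},  [C]{F} = {0}.
k1, k2, P = 2000.0, 1000.0, 300.0          # N/mm, N/mm, N (illustrative)

B = np.array([[1.0, 1.0]])                 # equilibrium: F1 + F2 = P
C = np.array([[1.0 / k1, -1.0 / k2]])      # compatibility: F1/k1 - F2/k2 = 0

S = np.vstack([B, C])                      # combined governing matrix
rhs = np.array([P, 0.0])
F = np.linalg.solve(S, rhs)

print("spring forces:", F)                 # load splits in the stiffness ratio
```

The one-element compatibility row plays the role of the Airy-stress-function-derived conditions in the finite element library: it supplies exactly the equations (here one) needed beyond equilibrium to make the force unknowns solvable.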

  2. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others]

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  3. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  4. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering; Behravesh, M.M. [Electric Power Research Institute, Palo Alto, CA (United States); Henry, G. [EPRI NDE Center, Charlotte, NC (United States)

    1999-09-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  5. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    International Nuclear Information System (INIS)

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  6. Recent developments in methods of chemical analysis in investigations of firearm-related events.

    Science.gov (United States)

    Zeichner, Arie

    2003-08-01

    A review of recent (approximately the last ten years) developments in the methods used for chemical analysis in investigations of firearm-related events is provided. This review discusses: examination of gunshot (primer) residues (GSR) and gunpowder (propellant) residues on suspects and their clothing; detection of firearm imprints on the hands of suspects; identification of bullet entry holes and estimation of shooting distance; linking weapons and/or fired ammunition to the gunshot entries; and estimation of the time since discharge. PMID:12811451

  7. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  8. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others]

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  9. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  10. Development of the analysis methods for eddy current signal of D-probe

    International Nuclear Information System (INIS)

    The eddy current testing (ECT) method is very useful for detecting flaws and defects in the steam generator (SG) tubes of nuclear power plants (NPPs) during in-service inspection. In spite of recent technical improvements, the typical ECT method has some shortcomings, such as low resolution in the detection of small defects, which causes difficulty in the analysis of defect signals. The new diagnostic eddy current probe (D-probe), developed by KAERI, has the dual functions of crack detection and quantitative 3-dimensional profile measurement. It can measure shape changes and provides information on the axial/circumferential location and magnitude of a developing defect. The distribution of defects and the shape changes on the SG tube surface can be acquired simultaneously from the analysis of inspection signals with the D-probe. Stress corrosion cracking of SG tubes in operating NPPs is supposed to be related to the residual stress existing in regions of local geometric change such as the expansion transition, bends, dents and bulges. Therefore the type and quantitative size of a geometric anomaly in a tube is very important information for nondestructive inspection activities, and it can provide an early warning of defects in SG tubes. In this study, a personal-computer-based software program was developed for the analysis of the advanced D-probe ECT signals. The functions of this software program include the 3-dimensional quantitative evaluation and visualization of geometric anomalies in an SG tube: their type, location, magnitude and distribution.

  11. Development of a Bayesian method for the analysis of inertial confinement fusion experiments on the NIF

    CERN Document Server

    Gaffney, Jim A; Sonnad, Vijay; Libby, Stephen B

    2013-01-01

    The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters that are only known with limited reliability. These parameters, combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating underlying microphysics models. The result is framed as a modified $\chi^{2}$ analysis which is easy to implement in existing analyses, and quite portable. We present a first application to a recent convergent ablator experiment performed at the NIF, and investigate the effect of variations in all physical dimensions of the target (very difficult to do using other methods). We show that for well characterised targets in which dimensions vary at the 0.5% level there is little effect, but 3% variations change the results of inferences dramatically.

  12. Development and validation of HPLC method for quantitative analysis of triamcinolone in biodegradable microparticles

    Directory of Open Access Journals (Sweden)

    A. A. Silva-Júnior

    2009-01-01

    Full Text Available

    A simple, rapid, selective and specific high performance liquid chromatography (HPLC) method for quantitative analysis of triamcinolone in polylactide-co-glycolide acid (PLGA) microparticles was developed. The chromatographic parameters were: reversed-phase C18 column, 250 mm × 4.6 mm, with particle size 5 µm. The column oven was thermostated at 35 °C ± 2 °C. The mobile phase was methanol/water 45:55 (v/v) and elution was isocratic at a flow rate of 1 mL.min-1. The determinations were performed using a UV-Vis detector at 239 nm. The injected sample volume was 10 µL. The standard curve was linear (r2 > 0.999) in the concentration range 100-2500 ng.mL-1. The method showed adequate precision, with a relative standard deviation (RSD) smaller than 3%. The accuracy was analyzed by adding a standard drug, and good recovery values were obtained for all drug concentrations used. The method showed specificity and selectivity, with linearity in the working range and good precision and accuracy, making it very suitable for quantitation of triamcinolone in PLGA microparticles. Keywords: triamcinolone; HPLC analytical method; PLGA microparticles; analytical method validation.

  13. Development of a Bayesian method for the analysis of inertial confinement fusion experiments on the NIF

    International Nuclear Information System (INIS)

    The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters which, when combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating underlying microphysics models. The result is framed as a modified χ2 analysis which is easy to implement in existing analyses, and quite portable. We present a first application to a recent convergent ablator experiment performed at the National Ignition Facility (NIF), and investigate the effect of variations in all physical dimensions of the target (very difficult to do using other methods). We show that for well characterized targets in which dimensions vary at the 0.5% level there is little effect, but 3% variations change the results of inferences dramatically. Our Bayesian method allows particular inference results to be associated with prior errors in microphysics models; in our example, tuning the carbon opacity to match experimental data (i.e. ignoring prior knowledge) is equivalent to an assumed prior error of 400% in the tabop opacity tables. This large error is unreasonable, underlining the importance of including prior knowledge in the analysis of these experiments. (paper)
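The modified χ² idea described in this record — adding Gaussian prior penalty terms for uncertain experimental parameters to an ordinary data-misfit χ² — can be sketched as follows; the toy model, data and prior values are purely illustrative, not from the NIF analysis.

```python
# Minimal sketch of a "modified chi-squared": experimental nuisance parameters
# enter the fit with Gaussian prior terms, so tuning a parameter far from its
# prior is explicitly penalized. All numbers are illustrative.
def modified_chi2(theta, nuisance, data, model, sigma_d, priors):
    chi2 = sum((d - model(theta, nuisance, i)) ** 2 / sigma_d ** 2
               for i, d in enumerate(data))
    # prior penalty: (x - x0)^2 / sigma_x^2 for each nuisance parameter
    chi2 += sum((x - x0) ** 2 / sx ** 2
                for x, (x0, sx) in zip(nuisance, priors))
    return chi2

# Toy linear model: signal = theta * i * scale, where 'scale' is a target
# dimension known to ~0.5% a priori (cf. the 0.5% variations in the abstract).
model = lambda theta, nu, i: theta * i * nu[0]
data = [0.0, 2.01, 4.02, 6.03]
priors = [(1.0, 0.005)]

# Brute-force scan over a small grid of (physics parameter, scale) pairs.
best = min(((t, s, modified_chi2(t, [s], data, model, 0.05, priors))
            for t in (1.9, 2.0, 2.1) for s in (0.99, 1.0, 1.01)),
           key=lambda r: r[2])
print("best (theta, scale):", best[:2])
```

The tight prior keeps the scale near 1.0, so the physics parameter absorbs the misfit — the same mechanism by which, in the record, ignoring prior knowledge is equivalent to assuming a huge prior error on the opacity.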

  14. Analysis and development of methods of correcting for heterogeneities to cobalt-60: computing application

    International Nuclear Information System (INIS)

    The purpose of this work is to analyze the influence of inhomogeneities of the human body on dose determination in Cobalt-60 radiation therapy. The first part is dedicated to the physical characteristics of inhomogeneities and to the conventional methods of correction. New methods of correction are proposed based on the analysis of the scatter. This analysis allows the physical characteristics of the inhomogeneities, and the corresponding modifications of the dose, to be taken into account with greater accuracy: ''the differential TAR method'' and ''the Beam Subtraction Method''. The second part is dedicated to the computer implementation of the second method of correction for routine application in hospital.
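As background for the conventional corrections this record improves upon, the classical ratio-of-TAR heterogeneity correction can be sketched as follows; the toy TAR model, attenuation coefficient and densities are illustrative assumptions, not clinical data or values from this work.

```python
import math

# Classical ratio-of-TAR correction for Co-60: scale the geometric depth to a
# radiological (density-weighted) depth, then take the ratio of tissue-air
# ratios. All numbers here are illustrative.
def tar(depth_cm):
    """Toy tissue-air ratio: unity after buildup, quasi-exponential falloff."""
    mu = 0.06  # effective attenuation per cm of water (illustrative)
    return math.exp(-mu * max(depth_cm - 0.5, 0.0))

def effective_depth(layers):
    """Radiological depth: geometric thickness scaled by relative density."""
    return sum(thickness * rel_density for thickness, rel_density in layers)

# 4 cm tissue, 6 cm lung (relative density 0.3), point 2 cm beyond the lung
layers = [(4.0, 1.0), (6.0, 0.3), (2.0, 1.0)]
d_geom = sum(t for t, _ in layers)          # 12 cm geometric depth
d_eff = effective_depth(layers)             # 7.8 cm radiological depth

correction = tar(d_eff) / tar(d_geom)       # > 1: lung attenuates less
print(f"correction factor = {correction:.3f}")
```

Such a correction handles primary attenuation only; the scatter-based methods proposed in the record address precisely what this simple ratio ignores.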

  15. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report

    International Nuclear Information System (INIS)

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth', to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks that grow until a stable leak forms. Calculating the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, PROST was restructured, including possibilities to combine damage mechanisms during a calculation. Furthermore, several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that with the provided flexible probabilistic analysis method, quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as
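A leak-probability estimate of the kind PROST performs can be illustrated with a toy Monte Carlo over Paris-law fatigue crack growth; the wall thickness, stress range, stress-intensity expression and material constants below are invented for illustration and are not PROST's models.

```python
import math
import random

# Toy Monte Carlo leak-probability sketch: sample an initial crack depth and a
# Paris-law coefficient, grow the crack over load cycles, and count wall
# penetrations ("leak"). All parameter values are illustrative assumptions.
random.seed(0)
WALL = 8.0           # wall thickness (mm), assumed
CYCLES = 200_000     # number of load cycles, assumed
DSIGMA = 120.0       # cyclic stress range (MPa), assumed

def delta_k(a):
    """Simplified stress intensity factor range for a shallow surface crack."""
    return 1.12 * DSIGMA * math.sqrt(math.pi * a * 1e-3)   # MPa*sqrt(m)

leaks = 0
N = 2000
for _ in range(N):
    a = random.lognormvariate(math.log(1.0), 0.4)     # initial depth (mm)
    C = random.lognormvariate(math.log(1e-11), 0.3)   # Paris coefficient
    m = 3.0                                           # Paris exponent
    for _ in range(0, CYCLES, 1000):                  # 1000-cycle blocks
        a += 1000 * C * delta_k(a) ** m * 1e3         # growth in mm per block
        if a >= WALL:                                 # wall penetrated: leak
            leaks += 1
            break

print(f"estimated leak probability: {leaks / N:.3f}")
```

PROST additionally tracks the penetrated crack until break and applies leak detection boundaries; this sketch stops at first wall penetration.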

  16. Progress report on development of intermediate fidelity full assembly analysis methods.

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R.; Fanning, T. H. (Nuclear Engineering Division)

    2011-09-30

    While high-fidelity modeling capabilities for various physics phenomena are being pursued under advanced modeling and simulation initiatives of the DOE Office of Nuclear Energy, they generally rely on high-performance computation facilities and are too expensive to be used for parameter-space exploration or design analysis. One-dimensional system codes have been used for a long time and have reached a degree of maturity, but their validity is limited to specific applications. Thus, an intermediate-fidelity (IF) modeling method is pursued in this work for a fast-running, modest-fidelity, whole-core transient analysis capability. The new approach is essential for design scoping and engineering analyses and could lead to improvements in the design of new generations of reactors and to the reduction of uncertainties in safety analysis. This report summarizes the initial effort on the development of the intermediate-fidelity full-assembly modeling method. The requirements and desired merits of the IF approach have been defined. A three-dimensional momentum source model has been developed to model the anisotropic flow in a wire-wrapped rod bundle without the need to resolve the geometric details. It has been confirmed that the momentum source model works well if its region of influence is accurately imposed. The validity of the model is further verified by mesh and parameter sensitivity studies. The developed momentum source model can, in principle, be applied to any wire-wrapped bundle geometry and any flow regime, while the modeling strategy can be applied to other conditions with complex or distorted geometry, such as flow in blocked channels.

  17. The development of mini gamma calorimeter. Analysis of the calorimeter characteristic using analytical method

    International Nuclear Information System (INIS)

    The development of a mini gamma calorimeter: analysis of the calorimeter characteristics using an analytical method. To increase the capability of gamma calorimeters, especially to obtain a new type of calorimeter that can be used in a high-power reactor, it is necessary to innovate on the existing calorimeter model. The basic idea of the innovation is to eliminate the absorber material, which restricts the performance of the old calorimeter. As the first step, the characteristics of this mini calorimeter without absorber are analyzed by an analytical method under static conditions. The analysis was performed for several combinations of geometries and dimensions of the active parts as well as of the gas isolation. The calculation results showed that the sensitivity (the principal characteristic) of the calorimeter, 30 °C per W/g, is an acceptable value, and that an active length of 2 cm with a thermocouple (active part) diameter of 1 mm is the optimum geometry. According to these results, it can be concluded that the proposed mini gamma calorimeter is feasible to fabricate

  18. CZECHOSLOVAK FOOTPRINTS IN THE DEVELOPMENT OF METHODS OF THERMOMETRY, CALORIMETRY AND THERMAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pavel Holba

    2012-07-01

    A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in underpinning special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as Vladimír Šatava and his role in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) published by noteworthy postwar Czechoslovak scholars and scientists and by their disciples in 1950-1980. The 227 itemized references with titles show rich scientific productivity, revealing that many of these works were ahead of their time even in an international context.

  19. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    This study intends to develop a more sophisticated tool that advances the current event tree method used in all PSA, and focuses on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload must therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To realize scenario generation as a technical tool, a simulation model combining AI techniques with a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, was added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)

  20. Chemical sensors and the development of potentiometric methods for liquid media analysis

    International Nuclear Information System (INIS)

    Aspects of applying indirect potentiometric determination to chemical analysis are considered. Among them are the standard and modified addition and subtraction methods, the multiple addition method, and potentiometric titration using ion-selective electrodes as indicators. These methods significantly extend the capabilities of ion-selective potentiometric analysis. Conditions for the applicability of the above-mentioned methods to various samples (Cd, REE, Th, iodides and others) are discussed using all available ion-selective electrodes as examples. 162 refs., 2 figs., 5 tabs
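The single known-addition method mentioned above has a short closed form: spiking a sample (volume V0, unknown concentration Cx) with a standard (Vs, Cs) shifts the electrode potential by ΔE = S·log10(C2/C1), which can be inverted for Cx. A minimal sketch, assuming an ideal Nernstian electrode with slope S ≈ 59.16 mV per decade for a monovalent ion; values are illustrative:

```python
import math

def known_addition(E1_mV, E2_mV, V0_mL, Vs_mL, Cs, slope_mV=59.16):
    """Single known-addition potentiometry: estimate the sample
    concentration Cx from the potential change after spiking the sample
    (V0, Cx) with a standard (Vs, Cs). Assumes an ideal Nernstian
    response E = const + slope*log10(C); illustrative only."""
    dE = E2_mV - E1_mV
    r = 10 ** (dE / slope_mV)          # concentration ratio after/before spike
    # C2/C1 = (Cx*V0 + Cs*Vs) / ((V0 + Vs)*Cx)  =>  solve for Cx
    return Cs * Vs_mL / ((V0_mL + Vs_mL) * r - V0_mL)
```

For example, a 15.4 mV rise after adding 5 mL of 1e-2 M standard to 50 mL of sample corresponds to roughly a 1e-3 M sample.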

  1. Method Development of Cadmium Investigation in Rice by Radiochemical Neutron Activation Analysis

    International Nuclear Information System (INIS)

    A radiochemical neutron activation analysis method for the determination of cadmium was investigated. The chemical separation of cadmium utilized ion exchange chromatography on a strongly basic anion-exchange resin, BIO-RAD 1X8 (chloride form). An adsorbing medium of 2M HCl was found to be the most suitable among the concentrations attempted (2, 4, 6, 8 and 10M HCl), and the eluent for desorption of the cadmium from the column was 8M NH3 solution. A chemical yield of 95% was found. The method was evaluated by analyzing certified reference materials with 0.5 μg/g (SRM 1577b, Bovine Liver) and 2.48 μg/g (SRM 1566b, Oyster Tissue) cadmium. The agreement of the results with certified values is within 92% for Bovine Liver and 96% for Oyster Tissue. The method developed was applied to determine the cadmium concentrations in contaminated Thai rice. It was found that the cadmium concentrations ranged from 7.4 to 578.9 ppb

  2. Analysis and development of stochastic multigrid methods in lattice field theory

    International Nuclear Information System (INIS)

    We study the relation between the dynamical critical behavior and the kinematics of stochastic multigrid algorithms. The scale dependence of acceptance rates for nonlocal Metropolis updates is analyzed with the help of an approximation formula. A quantitative study of the kinematics of multigrid algorithms in several interacting models is performed. We find that for a critical model with Hamiltonian H(Φ), absence of critical slowing down can only be expected if the expansion of H(Φ+ψ) in terms of the shift ψ contains no relevant term (mass term). The prediction of this rule was verified in a multigrid Monte Carlo simulation of the Sine-Gordon model in two dimensions. Our analysis can serve as a guideline for the development of new algorithms: we propose a new multigrid method for nonabelian lattice gauge theory, the time slice blocking. For SU(2) gauge fields in two dimensions, critical slowing down is almost completely eliminated by this method, in accordance with the theoretical prediction. The generalization of the time slice blocking to SU(2) in four dimensions is investigated analytically and by numerical simulations. Compared to two dimensions, the local disorder in the four-dimensional gauge field leads to kinematical problems. (orig.)
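The role of the mass term in the acceptance rate can be illustrated numerically: for a constant shift ψ applied to a block of n sites, ΔH contains (m²/2)·Σ[(φ+ψ)² − φ²], whose n·ψ² part grows with block size and suppresses acceptance. The sketch below uses i.i.d. Gaussian fields as a crude stand-in for a sampled field configuration; all values are illustrative, not from the paper.

```python
import math
import random

def metropolis_accept_rate(mass2, block_size, shift, n_trials=2000, seed=1):
    """Average Metropolis acceptance min(1, exp(-dH)) for a constant shift
    applied to a block of `block_size` sites, where dH keeps only the mass
    term (mass2/2)*sum((phi+shift)**2 - phi**2). The i.i.d. unit-Gaussian
    'field' is a crude illustrative stand-in, not a proper free-field sample."""
    random.seed(seed)
    acc = 0.0
    for _ in range(n_trials):
        phi = [random.gauss(0.0, 1.0) for _ in range(block_size)]
        dH = 0.5 * mass2 * sum((p + shift) ** 2 - p ** 2 for p in phi)
        acc += 1.0 if dH <= 0 else math.exp(-dH)
    return acc / n_trials
```

Comparing small and large blocks at fixed shift shows the expected scale dependence: acceptance drops as the block grows, which is exactly why a relevant mass term spoils scale-independent multigrid updating.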

  3. Development of Reliability Analysis Method for Nuclear Power Plant Protection System

    International Nuclear Information System (INIS)

    The paper presents a method and procedures for the reliability analysis of safety-related nuclear power plant systems. The analysis provides an appropriate model to represent the system that will facilitate the application of reliability engineering techniques during the design, construction and operating stages of a plant's life. The analysis assists in selecting design alternatives with high reliability and high safety potential during the early design phase. It also ensures that all conceivable failure modes and their effects on operational success of the system have been considered. The output of the analysis is used as input to determine and update the test interval of the system

  4. Development of thermal analysis method for the near field of HLW repository using ABAQUS

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-10-01

    An appropriate tool is needed to evaluate the thermo-mechanical stability of a high level radioactive waste (HLW) repository. In this report a thermal analysis methodology for the near field of an HLW repository is developed using ABAQUS, a multi-purpose FEM code that has been used in many engineering areas. The main contents of this methodology development are the structural and material modelling to simulate a repository; the setup of side conditions, e.g. boundary conditions, load conditions, and initial conditions; and the procedure for selecting proper material parameters. In addition, interface programs were developed for the effective production of input data and for effective change of the model size in sensitivity analyses for disposal concept development. The results of this work will be applied to evaluating the thermal stability of an HLW repository and used as the main input data for its mechanical analysis. (author). 20 refs., 15 figs., 5 tabs.

  5. Development of a forensically useful age prediction method based on DNA methylation analysis.

    Science.gov (United States)

    Zbieć-Piekarska, Renata; Spólnicka, Magdalena; Kupiec, Tomasz; Parys-Proszek, Agnieszka; Makowska, Żanetta; Pałeczka, Anna; Kucharczyk, Krzysztof; Płoski, Rafał; Branicki, Wojciech

    2015-07-01

    Forensic DNA phenotyping needs to be supplemented with age prediction to become a relevant source of information on human appearance. Recent progress in analysis of the human methylome has enabled selection of multiple candidate loci showing linear correlation with chronological age. Practical application in forensic science depends on successful validation of these potential age predictors. In this study, eight DNA methylation candidate loci were analysed using convenient and reliable pyrosequencing technology. A total of 41 CpG sites were investigated in 420 samples collected from men and women aged from 2 to 75 years. The study confirmed correlation of all the investigated markers with human age. The five most significantly correlated CpG sites, in ELOVL2 on 6p24.2, C1orf132 on 1q32.2, TRIM59 on 3q25.33, KLF14 on 7q32.3 and FHL2 on 2q12.2, were chosen to build a prediction model. This restriction allowed the technical analysis to be simplified without lowering the prediction accuracy significantly. Model parameters for a discovery set of 300 samples were R² = 0.94 and a standard error of the estimate of 4.5 years. An independent set of 120 samples was used to test the model performance. Mean absolute deviation for this testing set was 3.9 years. The proportion of predictions correct to within ±5 years reached a very high level of 86.7% in the age category 2-19 and gradually decreased to 50% in the age category 60-75. The prediction model was deterministic for individuals belonging to these two extreme age categories. The developed method was implemented in a freely available online age prediction calculator. PMID:26026729
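The model-building step, a multivariate linear regression of age on methylation levels at a handful of CpG sites, can be sketched on synthetic data as below. The coefficients and noise level are invented; only the structure (an OLS fit and mean absolute deviation as the accuracy measure) mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a discovery set: methylation levels at five CpG
# sites vs. chronological age. Weights are invented for illustration; the
# real model uses sites in ELOVL2, C1orf132, TRIM59, KLF14 and FHL2.
n = 300
age = rng.uniform(2, 75, n)
true_w = np.array([0.9, -0.5, 0.6, 0.4, 0.7]) / 100.0
X = 0.5 + np.outer(age, true_w) + rng.normal(0, 0.02, (n, 5))

# Ordinary least squares: age ~ intercept + five methylation predictors
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, age, rcond=None)

pred = A @ coef
mad = np.mean(np.abs(pred - age))   # mean absolute deviation, years
print(f"MAD on training data: {mad:.1f} years")
```

In practice the fitted coefficients would then be frozen and applied to an independent test set, as the authors did with their 120-sample set.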

  6. Principles and methods of neutron activation analysis (NAA) in improved water resources development

    International Nuclear Information System (INIS)

    The methods of neutron activation analysis (NAA) as it applies to water resources exploration, exploitation and management has been reviewed and its capabilities demonstrated. NAA has been found to be superior and offer higher sensitivity to many other analytical techniques in analysis of water. The implications of chemical and element concentrations (water pollution and quality) determined in water on environmental impact assessment to aquatic life and human health are briefly highlighted

  7. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    Science.gov (United States)

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  8. Development of Design Analysis Methods for C/SiC Composite Structures

    Science.gov (United States)

    Sullivan, Roy M.; Mital, Subodh K.; Murthy, Pappu L. N.; Palko, Joseph L.; Cueno, Jacques C.; Koenig, John R.

    2006-01-01

    The stress-strain behavior at room temperature and at 1100 C (2000 F) was measured for two carbon-fiber-reinforced silicon carbide (C/SiC) composite materials: a two-dimensional plain-weave quasi-isotropic laminate and a three-dimensional angle-interlock woven composite. Micromechanics-based material models were developed for predicting the response properties of these two materials. The micromechanics based material models were calibrated by correlating the predicted material property values with the measured values. Four-point beam bending sub-element specimens were fabricated with these two fiber architectures and four-point bending tests were performed at room temperature and at 1100 C. Displacements and strains were measured at various locations along the beam and recorded as a function of load magnitude. The calibrated material models were used in concert with a nonlinear finite element solution to simulate the structural response of these two materials in the four-point beam bending tests. The structural response predicted by the nonlinear analysis method compares favorably with the measured response for both materials and for both test temperatures. Results show that the material models scale up fairly well from coupon to subcomponent level.

  9. Shlaer-Mellor object-oriented analysis and recursive design, an effective modern software development method for development of computing systems for a large physics detector

    International Nuclear Information System (INIS)

    After evaluation of several modern object-oriented methods for development of the computing systems for the PHENIX detector at RHIC, we selected the Shlaer-Mellor Object-Oriented analysis and Recursive Design methods as the most appropriate for the needs and development environment of a large nuclear or high energy physics detector. This paper discusses our specific needs and environment, our method selection criteria, and major features and components of the Shlaer-Mellor method. (author)

  10. Pathways to lean software development: An analysis of effective methods of change

    Science.gov (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  11. Development and validation of HPLC method for analysis of dexamethasone acetate in microemulsions

    Directory of Open Access Journals (Sweden)

    Maria Cristina Cocenza Urban

    2009-03-01

    A simple, rapid, accurate and sensitive method was developed for quantitative analysis of dexamethasone acetate in microemulsions using high performance liquid chromatography (HPLC) with UV detection. The chromatographic parameters were: a stainless steel Lichrospher 100 RP-18 column (250 mm x 4 mm i.d., 5 μm particle size) at 30 ± 2 °C; an isocratic mobile phase of methanol:water (65:35, v/v) at a flow rate of 1.0 mL·min⁻¹; and a UV-Vis detector set at 239 nm. Samples were prepared with methanol and the injected volume was 20 μL. The analytical curve was linear (r² = 0.9995) over a wide concentration range (2.0-30.0 μg·mL⁻¹). The presence of the components of the microemulsion did not interfere with the results of the analysis. The method showed adequate precision, with a relative standard deviation (RSD) smaller than 3%. Accuracy was analyzed by standard addition of the drug, and good recovery values were obtained at all drug concentrations used. The HPLC method developed in this study showed specificity and selectivity, with linearity in the working range and good precision and accuracy, making it very suitable for quantification of dexamethasone in microemulsions. The analytical procedure is reliable and offers advantages in terms of speed and low cost of reagents.
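The validation statistics reported above (calibration linearity, r², RSD) are straightforward to compute; a minimal sketch with invented peak-area data, not the published dexamethasone measurements:

```python
import statistics

def calibration(concs, responses):
    """Least-squares calibration line (response vs. concentration) and the
    coefficient of determination r^2 used as the linearity criterion."""
    mx, my = statistics.fmean(concs), statistics.fmean(responses)
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, responses))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(concs, responses))
    ss_tot = sum((y - my) ** 2 for y in responses)
    return slope, intercept, 1 - ss_res / ss_tot

def rsd_percent(values):
    """Relative standard deviation (%), the usual precision criterion."""
    return 100 * statistics.stdev(values) / statistics.fmean(values)
```

With replicate injections, `rsd_percent` below 3% would satisfy the precision criterion quoted in the abstract.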

  12. Development of LC-MS/MS method for analysis of polyphenolic compounds in juice, tea and coffee samples

    Science.gov (United States)

    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation “dilute and shoot” approach, and LC-MS/MS triple qu...

  13. Development of CFD analysis method based on droplet tracking model for BWR fuel assemblies

    International Nuclear Information System (INIS)

    It is well known that the minimum critical power ratio (MCPR) of a boiling water reactor (BWR) fuel assembly depends on the spacer grid type. Recently, improvement of the critical power is being studied by using a spacer grid with mixing devices to which various types of flow deflectors are attached. In order to predict the critical power of the improved BWR fuel assembly, we have developed an analysis method based on consideration of the detailed thermal-hydraulic mechanism of the annular mist flow regime in the subchannels for an arbitrary spacer type. The proposed method is based on a computational fluid dynamics (CFD) model with a droplet tracking model for analyzing the vapor-phase turbulent flow in which droplets are transported in the subchannels of the BWR fuel assembly. We adopted the general-purpose CFD software Advance/FrontFlow/red (AFFr) as the base code, a commercial software package created as part of a Japanese national project. AFFr employs a three-dimensional (3D) unstructured grid system for application to complex geometries. In the present paper, AFFr was first applied to single-phase gas flows. The calculated results were compared with experiments using a round cellular spacer in one subchannel to investigate the influence of the choice of turbulence model. Analyses using the large eddy simulation (LES) and re-normalisation group (RNG) k-ε models were carried out. The results of both models show that the calculated velocity distribution and velocity fluctuation distribution downstream of the spacer reproduce the experimental results qualitatively. However, the velocity distribution analyzed by the LES model is better than that by the RNG k-ε model. The velocity fluctuation near the fuel rod, which is important for droplet deposition on the rod, is also simulated well by the LES model. Then, to examine the effect of the spacer shape on the analytical result, the gas flow analyses with the RNG k-ε model were performed

  14. Level set method for computational multi-fluid dynamics: A review on developments, applications and analysis

    Indian Academy of Sciences (India)

    Atul Sharma

    2015-05-01

    The functions, conservation equations and subsidiary equations of the Level Set Method (LSM) are presented. After the mathematical formulation, improvements in the numerical methodology for LSM are reviewed for advection schemes, reinitialization methods, hybrid methods, adaptive-grid LSM, dual-resolution LSM, sharp-interface LSM, conservative LSM, parallel computing, and the extension from two to multiple fluids/phases as well as to various types of two-phase flow. In the second part of this article, LSM-based Computational Multi-Fluid Dynamics (CMFD) applications and analysis are reviewed for four different types of multi-phase flow: separated and parallel internal flow, drop/bubble dynamics during jet break-up, drop impact dynamics on a solid or liquid surface, and boiling. In the last twenty years, LSM has established itself as a method which is easy to program and is accurate as well as computationally efficient.
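The core idea of LSM, advecting a signed-distance function whose zero level is the interface, fits in a few lines. A 1-D first-order upwind sketch, illustrative only and not taken from any of the reviewed CMFD codes:

```python
import numpy as np

# 1-D level set advection: the interface is the zero level of phi,
# transported with constant speed u by first-order upwind differencing.
nx, dx, u, dt = 200, 0.01, 1.0, 0.005      # CFL = u*dt/dx = 0.5
x = np.arange(nx) * dx
phi = x - 0.5                              # signed distance, interface at x = 0.5
for _ in range(100):                       # advance to t = 0.5
    phi_new = phi.copy()
    phi_new[1:] = phi[1:] - u * dt * (phi[1:] - phi[:-1]) / dx  # upwind (u > 0)
    phi_new[0] = phi[0] - u * dt           # inflow boundary keeps unit slope
    phi = phi_new
i = int(np.argmin(np.abs(phi)))            # grid point nearest the zero level
# the interface has moved from x = 0.5 toward x = 0.5 + u*t = 1.0
```

In multi-dimensional two-phase solvers the speed field comes from the flow solution, and the reinitialization and conservative variants surveyed above exist precisely because phi drifts away from a signed-distance function under such advection.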

  15. A Product Analysis Method and Its Staging to Develop Redesign Competences

    Science.gov (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  16. Development of analysis method for determination of mineral oil contamination in cardboard

    OpenAIRE

    Greye, Jovin

    2016-01-01

    ABSTRACT Since mineral oil contamination migrating from food packaging into food has become a public health concern, several laboratories are investigating possibilities to develop a simple and affordable analytical method for measuring mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH) in cardboard. The present study investigated, on behalf of Metsä Board Oy (Finland), an efficient analytical method for the determination of mineral oil in cardboard by using ...

  17. Analysis of scenario development methods and practice of high level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Scenario development is the key step in HLW geological disposal safety assessment. The features, events and processes (FEPs) must first be considered; the FEPs can then be sorted and grouped to form scenarios. Introducing the methods by which FEP lists are established and sorted is very useful, and has reference value, for developing the FEPs of HLW geological disposal in China, which is at a conceptual and planning stage. (authors)

  18. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  19. Development of conjugate methods with gas chromatography for inorganic compounds analysis

    International Nuclear Information System (INIS)

    The application of gas chromatography combined with mass spectrometry or with nuclear methods for the analysis of inorganic compounds is studied. The advantages of the use of a gas chromatograph coupled with a quadrupole mass spectrometer or with a high resolution radiation detector, are discussed. We also studied the formation and solvent extraction of metal chelates; an aliquot of the organic phase was directly injected into the gas chromatograph and the eluted compounds were detected by mass spectrometry or, when radioactive, by nuclear methods. (author)

  20. Development and Validation of a Triple Quad LC/MS Method for Fiber Dye Analysis

    Science.gov (United States)

    Connolly-Ingram, Ceirin M.

    This study aims to determine whether the analysis of dyed fiber through high-performance liquid chromatography (HPLC) with triple-quadrupole mass spectrometry (MS) can be used as a reliable alternative to the current chemical techniques used to differentiate dyes. Other methods of analysis involving HPLC and MS have proven capable of distinguishing chemically different dyes within a few dye classifications, but none have provided a complete alternative to the currently accepted technique of thin layer chromatography (TLC). In theory, HPLC-triple quad MS is capable of providing more reproducible and reliable data than conventional TLC methods, with a much greater depth of measurable information with which to characterize dye components. In this study, dyes will be extracted from various types of fibers, including commonly worn types like cotton, polyester, nylon, and wool, and dyes from most of the eight different dye classes will be examined.

  1. Development of a new service-oriented modelling method for information systems analysis and design

    OpenAIRE

    Gustiené, Prima

    2010-01-01

    This thesis presents a new modelling method for information systems analysis and design, where the concept of service and the principles of service orientation are used for integrated modelling and reasoning about information systems architectures across organisational and technical systems boundaries. The concept of service enables cohesion of the intersubjective and objective modelling traditions by using a single type of diagram that facilitates detection of semantic inconsistency, incompl...

  2. Development and application of neutron transport methods and uncertainty analysis for reactor core calculations. Final report

    International Nuclear Information System (INIS)

    This report documents the research and development goals reached within the reactor safety research project RS1503 'Development and Application of Neutron Transport Methods and Uncertainty Analyses for Reactor Core Calculations'. The superordinate goal of the project is the development, validation, and application of neutron transport methods and uncertainty analyses for reactor core calculations. These calculation methods will mainly be applied to problems related to the core behaviour of light water reactors and innovative reactor concepts. The contributions of this project towards achieving this goal are the further development, validation, and application of deterministic and stochastic calculation programmes and of methods for uncertainty and sensitivity analyses, as well as the assessment of artificial neural networks, for providing a complete nuclear calculation chain. This comprises processing nuclear basis data, creating multi-group data for diffusion and transport codes, obtaining reference solutions for stationary states with Monte Carlo codes, performing coupled 3D full core analyses in diffusion approximation and with other deterministic and also Monte Carlo transport codes, and implementing uncertainty and sensitivity analyses with the aim of propagating uncertainties through the whole calculation chain, from fuel assembly, spectral and depletion calculations to coupled transient analyses. This calculation chain shall be applicable to light water reactors and also to innovative reactor concepts, and therefore has to be extensively validated with the help of benchmarks and critical experiments.

  3. Personnel planning in general practices: development and testing of a skill mix analysis method.

    NARCIS (Netherlands)

    Eitzen-Strassel, J. von; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; Bakker, D.H. de

    2014-01-01

    Background: General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  4. Development of unstructured grid methods for steady and unsteady aerodynamic analysis

    Science.gov (United States)

    Batina, John T.

    1990-01-01

    The current status of the development of unstructured grid methods in the Unsteady Aerodynamics Branch at NASA-Langley is described. These methods are being developed for steady and unsteady aerodynamic applications. The flow solvers that were developed for the solution of the unsteady Euler and Navier-Stokes equations are highlighted and selected results are given which demonstrate various features of the capability. The results demonstrate 2-D and 3-D applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement which thus verifies the accuracy.

  5. Development of flow network analysis code for block type VHTR core by linear theory method

    International Nuclear Information System (INIS)

    A VHTR (Very High Temperature Reactor) is a high-efficiency nuclear reactor capable of generating hydrogen owing to the high temperature of its coolant. A PMR (Prismatic Modular Reactor) type core consists of hexagonal prismatic fuel blocks and reflector blocks. The flow paths in the prismatic VHTR core consist of coolant holes, bypass gaps and cross gaps. Complicated flow paths are formed in the core since the coolant holes and bypass gaps are connected by the cross gaps. Distributed coolant is mixed in the core through the cross gaps, so the flow characteristics cannot be modeled as a simple parallel pipe system. It requires a lot of effort and takes a very long time to analyze the core flow with CFD analysis. Hence, it is important to develop a code for VHTR core flow which can predict the core flow distribution quickly and accurately. In this study, a steady-state flow network analysis code was developed using a flow network algorithm. The developed flow network analysis code was named the FLASH code, and it was validated with experimental data and CFD simulation results. (authors)
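The linear-theory approach named above can be sketched for the simplest network, parallel channels sharing one pressure drop: the quadratic loss law ΔP = K·q² is linearized about the previous iterate, the resulting linear system is re-solved, and successive iterates are averaged to damp oscillation. The loss coefficients below are illustrative, not VHTR data, and a real core network also couples channels through cross gaps.

```python
def split_parallel_flows(Q_total, K, iters=50):
    """Linear-theory iteration for a parallel flow network: each channel i
    obeys dP = K[i]*q_i**2 and all channels share the same pressure drop.
    The quadratic law is linearized as dP = (K[i]*|q_i|)*q_i, the linear
    system is solved for dP from continuity, and iterates are averaged."""
    n = len(K)
    q = [Q_total / n] * n                                   # uniform initial guess
    for _ in range(iters):
        g = [1.0 / (K[i] * abs(q[i])) for i in range(n)]    # linearized conductances
        dP = Q_total / sum(g)                               # continuity: sum(g_i*dP) = Q
        q = [(g[i] * dP + q[i]) / 2.0 for i in range(n)]    # damped update
    return q, dP
```

At convergence each channel satisfies q_i = sqrt(dP/K_i), so for Q = 3 with K = [1, 4] the split is q = [2, 1] at dP = 4; the averaging step is what prevents the plain linearized iteration from oscillating between two states.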

  6. Development of Multi-Disciplinary Finite Element Method Analysis Courses at California State University, Los Angeles

    Science.gov (United States)

    McKinney, John; Wu, Chivey

    1998-01-01

    The NASA Dryden Flight Research Center (DFRC) Partnership Awards Grant to California State University, Los Angeles (CSULA) has two primary goals that help to achieve NASA objectives. The overall objectives of the NASA Partnership Awards are to create opportunities for joint university/NASA-sponsored research and related activities. One of the goals of the grant is to have university faculty researchers participate in and contribute to the development of NASA technology that supports NASA goals for research and development (R&D) in aeronautics and astronautics. The other goal is technology transfer in the other direction, where NASA-developed technology is made available to the general public and, more specifically, targeted to industries that can profit from utilization of government-developed technology. This year's NASA Dryden Partnership Awards grant to CSULA, entitled "Computer Simulation of Multi-Disciplinary Engineering Systems", has two major tasks that satisfy overall NASA objectives. The first task conducts basic and applied research that contributes to technology development at the Dryden Flight Research Center. The second part of the grant provides for dissemination of NASA-developed technology by using the teaching environment created in the CSULA classroom. The second task and how it is accomplished is the topic of this paper. The NASA STARS (Structural Analysis Routines) computer simulation program is used at the Dryden center to support flight testing of high-performance experimental aircraft and to conduct research and development of new and advanced aerospace technology.

  7. Recent status of developments in analytical methods for safeguards environmental samples. Focused on particle analysis

    International Nuclear Information System (INIS)

    In order to contribute to the strengthened safeguards system based on the 93+2 Program of the IAEA, we are developing bulk and particle analysis techniques for ultra-trace amounts of nuclear materials (uranium, plutonium, etc.) in environmental samples. The isotope ratios of nuclear materials in environmental samples taken from the inside and the outside of nuclear facilities are analyzed to detect undeclared activities. In the bulk analysis, after chemical treatment of each sample, the nuclear materials in it are quantitatively and qualitatively analyzed. The data are obtained as representative values for each sample. The particle analysis can provide more detailed information on the isotope ratios in the sample, because the isotope ratios of nuclear materials are measured for individual particles. Up to now, the isotope ratio measurement of uranium in a single particle with a diameter as small as 1 μm has become possible by secondary ion mass spectrometry (SIMS). In order to avoid sample contamination by the target elements from ambient air, sample preparation and analysis are carried out in a clean chemistry laboratory (Clean Laboratory for Environmental Analysis and Research), the construction of which was completed in April 2001. (author)

  8. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the methods are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200°C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  9. Development of a Bayesian method for the analysis of inertial confinement fusion experiments on the NIF

    OpenAIRE

    Gaffney, Jim A; Clark, Dan; Sonnad, Vijay; Libby, Stephen B.

    2013-01-01

    The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters that are only known with limited reliability. These parameters, combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating un...

  10. Developing A New Sampling and Analysis Method for Hydrazine and Monomethyl Hydrazine

    Science.gov (United States)

    Allen, John R.

    2002-01-01

    Solid phase microextraction (SPME) will be used to develop a method for detecting monomethyl hydrazine (MMH) and hydrazine (Hz). A derivatizing agent, pentafluorobenzoyl chloride (PFBCl), is known to react readily with MMH and Hz. The SPME fiber can either be coated with PFBCl and introduced into a gaseous stream containing MMH, or PFBCl and MMH can react first in a syringe barrel, after which a short equilibration period is allowed and an SPME fiber is used to sample the resulting solution. These methods were optimized and compared. Because Hz and MMH can degrade the SPME fiber, letting the reaction occur first gave better results. Only MMH could be detected using either of these methods. Future research will concentrate on constructing calibration curves and determining the detection limit.

  11. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...

  12. Development of model-free analysis method on quasi-elastic neutron scattering and application to liquid water

    International Nuclear Information System (INIS)

    In general, the analysis of quasi-elastic neutron scattering (QENS) spectra requires a mathematical model at some point in the process, and hence the obtained result is model dependent. Model-dependent analysis may lead to misunderstandings caused by inappropriate initial models, or may miss an unexpected relaxation phenomenon. We have developed an analysis method for processing QENS data without a specific model, which we call mode-distribution analysis. In this method, we suppose that all modes can be described as combinations of relaxations that obey the exponential law. By this method, we obtain a distribution function B(Q,Γ), which we call the mode-distribution function, representing the number of relaxation modes and the distributions of the relaxation times in those modes. We report the first application to experimental data for liquid water. In addition to the two known modes, the existence of a relaxation mode of water molecules with an intermediate time scale has been discovered. (author)
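The decomposition idea can be sketched on a toy relaxation curve. This is not the authors' implementation: the rate grid, the synthetic data and the projected-gradient solver are invented for illustration, and the published method works on S(Q,E) spectra rather than time-domain decays.

```python
import numpy as np

# Sketch of the mode-distribution idea: describe a measured curve as a
# non-negative combination of exponential modes,
#     I(t) = sum_k B_k * exp(-Gamma_k * t),
# and recover the weights B_k on a fixed grid of relaxation rates Gamma_k.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
gammas = np.array([0.2, 0.5, 1.0, 2.0, 5.0])   # candidate relaxation rates
A = np.exp(-np.outer(t, gammas))               # design matrix of modes

true_B = np.array([0.0, 0.7, 0.0, 0.3, 0.0])   # two genuine modes
data = A @ true_B + 1e-3 * rng.standard_normal(t.size)

# Non-negative least squares via projected gradient descent
AtA, Atb = A.T @ A, A.T @ data
B = np.zeros(gammas.size)
step = 1.0 / np.linalg.norm(AtA, 2)            # safe (1/Lipschitz) step size
for _ in range(50000):
    B = np.maximum(0.0, B - step * (AtA @ B - Atb))
# B now approximates the mode-distribution weights over the rate grid
```

The non-negativity constraint is what distinguishes this from an ordinary least-squares fit; it keeps the recovered distribution physically interpretable.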

  13. Development of an evaluation method for the quality of NPP MCR operators' communication using Work Domain Analysis (WDA)

    International Nuclear Information System (INIS)

    Research highlights: → No evaluation method is available for operators' communication quality in NPPs. → To model this evaluation method, the Work Domain Analysis (WDA) method was found. → This proposed method was applied to NPP MCR operators. → The quality of operators' communication can be evaluated with the proposed method. - Abstract: As work demands have evolved, industry has computerized them, making systems more complex; this field is now known as that of Complex Socio-Technical Systems. Communication failures are a problem in Complex Socio-Technical Systems and have been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the fact that there have been many studies on the severity of communication failures, there is no evaluation method for operators' communication quality in Nuclear Power Plants (NPPs). Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. To develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, including the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the important points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, to apply the proposed method, nine teams working in NPPs participated in a field simulation. The results of this evaluation reveal that operators' communication quality improved as a greater proportion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating the communication quality in any complex system.

  14. Development of a calculation method for one dimensional kinetic analysis in fission reactors, with feedback effects

    International Nuclear Information System (INIS)

    The methodology used in the WIGLE3 computer code is studied. This methodology has been applied to the steady-state and transient solutions of the one-dimensional, two-group diffusion equations in slab geometry, for axial-type problem analysis. Representative reactor models, considering non-boiling heat transfer, are also studied based on the WIGLE3 computer code. A steady-state program for control rod bank position search, CITER 1D, has been developed. Some criticality research on the proposed system has been done using different control rod bank initial positions, time steps and convergence parameters. (E.G.)
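The type of steady-state problem such codes solve can be illustrated with a one-group, one-dimensional diffusion eigenvalue calculation. This is a generic textbook sketch, not the WIGLE3 algorithm (which is two-group and includes feedback); the mesh size and cross sections below are illustrative numbers only.

```python
import numpy as np

# One-group, 1-D slab diffusion eigenvalue problem solved by finite
# differences and power iteration. All data are illustrative.

n, L = 50, 100.0                        # interior mesh points, slab width (cm)
h = L / (n + 1)
D, sig_a, nu_sig_f = 1.0, 0.01, 0.012   # diffusion, absorption, production

# Loss operator: -D d2/dx2 + sig_a, with zero-flux boundary conditions
M = np.zeros((n, n))
for i in range(n):
    M[i, i] = 2.0 * D / h**2 + sig_a
    if i > 0:
        M[i, i - 1] = -D / h**2
    if i < n - 1:
        M[i, i + 1] = -D / h**2

phi, k = np.ones(n), 1.0
for _ in range(500):                    # power iteration for k_eff and flux
    phi_new = np.linalg.solve(M, nu_sig_f * phi / k)
    k *= (nu_sig_f * phi_new).sum() / (nu_sig_f * phi).sum()
    phi = phi_new
# analytic estimate: k = nu_sig_f / (sig_a + D * (pi / L)**2), about 1.09
```

A two-group solver adds a second flux vector and a scattering source coupling the groups, but the outer power iteration has the same shape.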

  15. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    Directory of Open Access Journals (Sweden)

    Deborah Alcici Salomão

    2015-12-01

    Full Text Available This study examines consumer protection in arbitration, especially under the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper concludes by providing suggestions specific to consumer protection in arbitration based on this analysis.

  16. Development of a preparation and staining method for fetal erythroblasts in maternal blood : Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ

    1998-01-01

    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi w

  17. Analysis of laser-fission-fusion systems: Development of methods for nuclear and thermohydrodynamic calculations

    International Nuclear Information System (INIS)

    A set of computer programs has been developed for the calculation of laser-driven fission-fusion microexplosions. Both nuclear and thermohydrodynamic processes are considered, as well as their coupling effects, without so far taking the laser interaction into account; it is instead simulated by a boundary pressure pulse that can be varied parametrically. Three different systems (BERTA, NORMA-CLARA, NORMA-LIBERTAS) have been developed, based on different approaches. BERTA is an integrated code which takes into account both nuclear and hydrodynamic processes in a coupled but simplified way. NORMA calculates in detail the thermo-hydrodynamic evolution under given boundary pressure conditions and nuclear energy generation. CLARA is a discrete-ordinates, time-dependent neutron transport code which works directly coupled with NORMA. LIBERTAS is a Monte Carlo time-dependent neutron transport code, also coupled to NORMA, which can be of interest for the analysis of anomalous or stochastic situations. (orig.)

  18. Method development approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-01

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar, as cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting on both retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, keeping the oven temperature and backpressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). The complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  19. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal tests as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured, and for many substructures a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. 
These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of
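The Monte Carlo route to a response CDF can be sketched generically. This is not the PDS method itself: the single spring-mass "structure" and its input distributions below are invented, standing in for the synthesized substructure sample models.

```python
import numpy as np

# Propagate random input properties through a (trivially simple) structural
# model and build the empirical CDF of a response variable: the natural
# frequency of a spring-mass oscillator. All distribution parameters are
# illustrative.

rng = np.random.default_rng(42)
n = 20000
k = rng.normal(1.0e6, 5.0e4, n)        # stiffness samples (N/m)
m = rng.normal(10.0, 0.5, n)           # mass samples (kg)
f = np.sqrt(k / m) / (2.0 * np.pi)     # response variable: frequency (Hz)

f_sorted = np.sort(f)                  # empirical CDF support

def prob_below(x):
    """Empirical CDF: P(natural frequency <= x)."""
    return np.searchsorted(f_sorted, x, side="right") / n
```

Reading probabilities directly off the sorted samples is the brute-force alternative to the response-surface reliability method mentioned above, which fits a surrogate model when full Monte Carlo is too expensive.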

  20. Development of an analysis method for horizontal-axis wind turbines using computational fluid dynamics (CFD)

    International Nuclear Information System (INIS)

    In this paper we describe different approaches to solving computational fluid dynamics problems using the finite element method, and we survey the issues that must be addressed when choosing a path toward a code that handles boundary-layer and turbulence problems in order to simulate transport and fluid-handling equipment. In principle, turbulent flow is governed by the equations of fluid dynamics. The nonlinearity of the Navier-Stokes equations makes an analytical solution possible only in a few very specific cases, and for high Reynolds numbers the flow equations become more complex, so it is necessary to use models that depend on certain settings, usually obtained experimentally. Powerful numerical techniques for resolving these equations exist, such as direct numerical simulation (DNS) and large eddy simulation (LES), and their use in solving flow-machine problems is discussed. (author)

  1. Alternative method of highway traffic safety analysis for developing countries using Delphi technique and Bayesian network.

    Science.gov (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occur in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited to generate highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development and that has excellent highway safety records in its database, was chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized to model highway traffic accidents and forecast accident rates in the countries studied. PMID:27183516
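The combination of elicited probabilities and a Bayesian network can be illustrated with a minimal discrete network queried by enumeration. The causes, structure and numbers below are invented for illustration; they are not the paper's model.

```python
# Delphi-style expert-elicited probabilities fill the priors and the
# conditional probability table (CPT); the network is queried by summing
# the joint distribution over the hidden variable.

p_speeding = 0.30                        # elicited prior P(speeding)
p_bad_road = 0.40                        # elicited prior P(poor road)

# Elicited CPT: P(accident | speeding, poor road)
p_acc = {(True, True): 0.25, (True, False): 0.10,
         (False, True): 0.08, (False, False): 0.01}

def p_joint(s, r, a):
    """Joint probability of one assignment of the three variables."""
    ps = p_speeding if s else 1.0 - p_speeding
    pr = p_bad_road if r else 1.0 - p_bad_road
    pa = p_acc[(s, r)] if a else 1.0 - p_acc[(s, r)]
    return ps * pr * pa

# Posterior P(speeding | accident), marginalizing the road condition
num = sum(p_joint(True, r, True) for r in (True, False))
den = sum(p_joint(s, r, True) for s in (True, False) for r in (True, False))
posterior = num / den        # observing an accident raises belief in speeding
```

Real accident networks have many more nodes, but inference still reduces to sums of products of elicited CPT entries (or to sampling when enumeration is infeasible).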

  2. A Review of the Use of Contingent Valuation Methods in Project Analysis at the Inter-American Development Bank

    OpenAIRE

    Sergio Ardila; Ricardo Quiroga; William J. Vaughan

    1998-01-01

    This paper (ENV-126) was originally presented at a National Science Foundation Workshop on Alternatives to Traditional Contingent Valuation Methods in Environmental Valuation, held at Vanderbilt University, Nashville Tennessee, on October 15-16, 1998. This paper reviews the past ten years of the Inter-American Development Bank's experience with stated preference methods, concentrating on their use in the cost-benefit analysis of projects supplying sewer service and improving ambient water qua...

  3. Development and validation of a GC–FID method for quantitative analysis of oleic acid and related fatty acids

    OpenAIRE

    Honggen Zhang; Zhenyu Wang; Oscar Liu

    2015-01-01

    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography–flame ionizati...

  4. Development of Translational Methods in Spectral Analysis of Human Infant Crying and Rat Pup Ultrasonic Vocalizations for Early Neurobehavioral Assessment

    OpenAIRE

    Philip Sanford Zeskind; McMurray, Matthew S.; Kristin Ann Garber; Juliana Miriam Neuspiel; Elizabeth Thomas Cox; Grewen, Karen M.; Mayes, Linda C.; Johns, Josephine M.

    2011-01-01

    The purpose of this article is to describe the development of translational methods by which spectrum analysis of human infant crying and rat pup ultrasonic vocalizations (USVs) can be used to assess potentially adverse effects of various prenatal conditions on early neurobehavioral development. The study of human infant crying has resulted in a rich set of measures that has long been used to assess early neurobehavioral insult due to non-optimal prenatal environments, even among seemingly he...

  5. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    The field of environmentally sustainable architecture has been under development since the late 1960s, when mankind first started to notice the consequences of industrialisation and modern lifestyles. The energy crises of 1973 and 1979, and global climatic changes ascribed to global warming, have caused an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands on the energy consumption of buildings. The publications in the field refer to many different approaches to environmentally sustainable architecture, such as: ecological, green, bio-climatic, sustainable, passive, low-energy and environmental architecture. This PhD project sets out to gain a better understanding of environmentally sustainable architecture and the methodical approaches applied in the development of this type of...

  6. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation

    International Nuclear Information System (INIS)

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th-generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutron flux, the solution of this equation, depends on energy, angle and space. The different variables are discretized successively: the energy with a multigroup approach, considering the different quantities to be constant within each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), which subdivides a cell into sub-cells, or by order refinement (p-refinement), which increases the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are then used to lead the refinement process. Whereas the first requires strong regularity hypotheses (an analytical solution), the second assumes only the minimal hypotheses required for the solution to exist. The two estimators are compared on benchmarks where the analytic solution is known by the method of manufactured solutions, so that the behaviour of the solution with regard to regularity can be studied. This leads to an hp-refinement method using the two estimators. A comparison is then made with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. 
    Actually, as the
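Estimator-driven h-refinement can be caricatured in one dimension for a known function with a sharp layer. The indicator and tolerances below are invented and far simpler than the transport-equation estimators discussed above; the sketch only shows the split-where-the-indicator-is-large loop.

```python
import numpy as np

# Toy h-refinement loop: split any cell whose local error indicator exceeds
# a tolerance. The "solution" is a known 1-D function with a sharp internal
# layer; the indicator is its deviation from the linear interpolant on the
# cell, a crude stand-in for a real a-posteriori estimator.

f = lambda x: np.tanh(50.0 * (x - 0.5))        # sharp layer at x = 0.5

def indicator(a, b):
    """Max deviation of f from its linear interpolant at 3 interior points."""
    xs = a + np.array([0.25, 0.5, 0.75]) * (b - a)
    lin = f(a) + (xs - a) * (f(b) - f(a)) / (b - a)
    return float(np.max(np.abs(f(xs) - lin)))

cells = [(0.0, 1.0)]
for _ in range(30):                            # refinement sweeps
    new_cells, done = [], True
    for a, b in cells:
        if indicator(a, b) > 1e-3 and (b - a) > 1e-6:
            m = 0.5 * (a + b)
            new_cells += [(a, m), (m, b)]      # h-refinement: split the cell
            done = False
        else:
            new_cells.append((a, b))
    cells = new_cells
    if done:
        break
# the mesh is now strongly graded toward the layer at x = 0.5
```

A p-refinement variant would instead raise the local polynomial order on smooth cells; the regularity question discussed in the abstract is precisely the question of which of the two pays off.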

  7. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)
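The Monte Carlo inference step can be caricatured as follows. This is a one-dimensional toy, not the published implementation: the image sizes, thresholds and simulated deficit are invented, and real SPET analysis is three-dimensional with spatial smoothing.

```python
import numpy as np

# z-score an individual "scan" against a control group, cluster the
# supra-threshold voxels, and compare the largest cluster with its Monte
# Carlo null distribution.

rng = np.random.default_rng(1)
n_vox, n_ctrl, n_mc = 500, 20, 500
controls = rng.normal(100.0, 10.0, (n_ctrl, n_vox))
patient = rng.normal(100.0, 10.0, n_vox)
patient[200:220] -= 35.0                 # simulated perfusion deficit

mu, sd = controls.mean(0), controls.std(0, ddof=1)

def max_cluster(mask):
    """Length of the longest run of True voxels (a 1-D 'cluster')."""
    best = run = 0
    for v in mask:
        run = run + 1 if v else 0
        best = max(best, run)
    return best

observed = max_cluster((patient - mu) / sd < -2.5)

# Null distribution: largest cluster when the "patient" is just another
# control subject
null = np.array([max_cluster((rng.normal(100.0, 10.0, n_vox) - mu) / sd < -2.5)
                 for _ in range(n_mc)])
p_value = (np.sum(null >= observed) + 1) / (n_mc + 1)
```

Thresholding on cluster size rather than on single voxels is what gives such methods their robustness to isolated noise voxels, at the cost of the operator-chosen parameters the abstract discusses.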

  8. METHODS AND MODELS FOR ANALYSIS OF THE ORGANIZATIONAL ECONOMICS ACTIVITY USED FOR DEVELOPMENT OF INFORMATICS SYSTEMS

    Directory of Open Access Journals (Sweden)

    TEODORA VĂTUIU

    2014-10-01

    Full Text Available The study of organizational activity, and the highlighting of problem situations that require specific solutions, requires a detailed analysis of the models defined for the real system of economic companies, regarded not as a sum of assets but as organizations in which activities are related into processes. In addition to the usual approach of using modeling languages in the development of information systems, in this paper we intend to present some examples that demonstrate the usefulness of a standard modeling language (UML) for analyzing organizational activities and reporting problem situations that may occur in the management of data recorded on primary documents, or in processes that bring together activities. Examples that have been focused on a travel agency can be extrapolated to any other organization, and the diagrams can be used in different contexts, depending on the complexity of the activities identified.

  9. Comparative analysis of methods used to define eustatic variations in outcrop: Late Cambrian interbasinal sequence development

    Energy Technology Data Exchange (ETDEWEB)

    Osleger, D. (Univ. of California, Riverside (United States)); Read, J.F. (Virginia Polytechnic Inst. and State Univ., Blacksburg (United States))

    1993-03-01

    Interbasinal correlation of Late Cambrian cyclic carbonates from the Appalachian and Cordilleran passive margins, the Texas craton, and the southern Oklahoma aulacogen defines six major third-order depositional sequences. Graphic correlation of biostratigraphically-constrained strata was used to establish equivalency of stratigraphic sequences between the individual sections. Relatively isochronous biomere boundaries were used as time datums for lithostratigraphic correlation. Although the individual sections are composed of different types of meter-scale cycles and component lithofacies that reflect the various environmental settings of the localities, the overall upward-shallowing character of individual sequences is evident. The sequences are: late Cedaria, mid-Crepicephalus, late Crepicephalus, Aphelaspis to earliest Elvinia, Elvinia to early Saukia, and Saukia to the Cambrian-Ordovician boundary. Interbasinal correlation of stratigraphic sequences permits an evaluation of quantitative techniques for determining accommodation history. Correlation of Fischer plots of cyclic successions from separate basins supports a eustatic control of Late Cambrian sequence development. R2/R3 curves derived from subsidence analysis of the Late Cambrian sections provide good resolution of the second- and third-order scales of accommodation change, and interbasinal correlations of R2/R3 curves also support eustatic control on sequence development. Comparing the accommodation curves and subsidence analysis with paleobathymetric trends of Late Cambrian cyclic strata suggests that the curves may approximate the form of the eustatic sea-level signal. A composite eustatic sea-level curve for Late Cambrian time in North America was created by qualitatively combining the accommodation curves defined by the different techniques for each of the four localities. 129 refs., 16 figs., 3 tabs.
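Fischer plots, used above to compare accommodation histories between basins, are simple to construct: cumulative departure from mean cycle thickness is graphed against cycle number. The thickness values below are invented for illustration.

```python
import numpy as np

# Fischer plot ordinate: cumulative departure from mean cycle thickness.
# Rising limbs suggest thicker-than-average cycles (increasing
# accommodation); falling limbs suggest the reverse. Thicknesses in metres.

thick = np.array([2.1, 2.5, 3.0, 3.4, 2.8, 2.2, 1.6, 1.2, 1.5, 2.0])
departure = np.cumsum(thick - thick.mean())
# plotting `departure` against cycle number gives the Fischer plot
```

By construction the curve returns to zero at the last cycle, which is why only the shape of the limbs, not the endpoint, carries the accommodation signal.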

  10. Development of breached pin performance analysis code SAFFRON (System of Analyzing Failed Fuel under Reactor Operation by Numerical method)

    Energy Technology Data Exchange (ETDEWEB)

    Ukai, Shigeharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1995-03-01

    On the assumption of fuel pin failure, the breached pin performance analysis code SAFFRON was developed to evaluate fuel pin behavior in relation to the delayed neutron signal response during operational modes beyond cladding failure. The following characteristic behaviors of a breached fuel pin are modeled with a 3-dimensional finite element method: pellet swelling by the fuel-sodium reaction, fuel temperature change, and the resultant cladding breach extension and release of delayed neutron precursors into the coolant. In particular, a practical algorithm for the numerical procedure in the finite element method was originally developed in order to solve the 3-dimensional non-linear contact problem between the pellet, swollen due to the fuel-sodium reaction, and the breached cladding. (author).

  11. A development and integration of the concentration database for relative method, k0 method and absolute method in instrumental neutron activation analysis using Microsoft Access

    International Nuclear Information System (INIS)

    Instrumental Neutron Activation Analysis (INAA) is often used at the National University of Malaysia, especially by students of the Nuclear Science Program, to determine and calculate the concentration of an element in a sample. The lack of a database service means users take longer to calculate the concentration of an element in a sample, because they depend on software developed by foreign researchers, which is costly. To overcome this problem, a study was carried out to build INAA database software. The objective of this study was to build database software that helps users of INAA with the relative method and the absolute method for calculating element concentrations in a sample, using Microsoft Excel 2010 and Microsoft Access 2010. The study also integrates k0 data, k0 Concent and k0-Westcott to execute and complete the system. After the integration, a study was conducted to test the effectiveness of the database software by comparing the concentrations from the experiments with those in the database. The Triple Bare Monitors Zr-Au and Cr-Mo-Au were used in Abs-INAA as monitors to determine the thermal to epithermal neutron flux ratio (f). The calculations involved in determining the concentration use the net peak area (Np), the measurement time (tm), the irradiation time (tirr), the k-factor (k), the thermal to epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (εp). For the Com-INAA database, reference material IAEA-375 Soil was used to calculate the concentrations of elements in the sample; CRMs and SRMs are also used in this database. After the INAA database integration, a verification process was carried out to examine the effectiveness of Abs-INAA by comparing sample concentrations in the database with those from the experiment. The concentration values from the INAA database software agreed with the experimental values with high accuracy and precision.
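
    The relative (comparator) method this record implements can be illustrated with a hedged sketch. The simplified form below compares decay-corrected specific count rates of sample and standard measured under identical conditions, so the flux-ratio and efficiency terms listed in the record cancel; all numbers, and the parameter set itself, are illustrative assumptions, not the software's actual procedure.

    ```python
    import math

    # Hedged sketch of the relative method in INAA: the concentration in a
    # sample is the known standard concentration scaled by the ratio of
    # decay-corrected net peak areas (Np) per unit mass. Illustrative only.

    def corrected_rate(np_counts, t_measure, t_decay, half_life):
        """Net count rate corrected back to the end of irradiation."""
        lam = math.log(2) / half_life
        return (np_counts / t_measure) * math.exp(lam * t_decay)

    def relative_method(np_sample, m_sample, td_sample,
                        np_std, m_std, td_std,
                        t_measure, half_life, c_std):
        a_sample = corrected_rate(np_sample, t_measure, td_sample, half_life)
        a_std = corrected_rate(np_std, t_measure, td_std, half_life)
        return c_std * (a_sample / m_sample) / (a_std / m_std)

    # Illustrative numbers; equal decay times make the decay factors cancel,
    # so c = 40.0 * (5000/8000) = 25.0 (same mass for sample and standard).
    c = relative_method(np_sample=5000, m_sample=0.10, td_sample=3600,
                        np_std=8000, m_std=0.10, td_std=3600,
                        t_measure=1800, half_life=9000, c_std=40.0)
    ```

    The absolute and k0 methods replace the standard-comparison step with the flux and efficiency parameters (f, α, εp) the record enumerates.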

  12. Development of a UV/Vis spectrophotometric method for analysis of total polyphenols from Caesalpinia peltophoroides Benth

    Directory of Open Access Journals (Sweden)

    Fernanda G. Bueno

    2012-01-01

    Caesalpinia peltophoroides is a domesticated tree found in Brazil. It was necessary to develop an analytical method to determine the content of total polyphenols (TP) in this herbal drug. The pre-analytical method was standardized for analysis time, wavelength, and the best standard to use; the optimum conditions were 30 min, 760 nm, and pyrogallol, respectively. Under these conditions, validation by UV/Vis spectrophotometry proved reliable for TP of the crude extract and semipurified fractions from C. peltophoroides. Standardization is required for every herbal drug, and this method proved to be linear, precise, accurate, reproducible, robust, and easy to perform.
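
    The quantitation step behind such a UV/Vis method can be sketched as a linear standard curve: fit absorbance at 760 nm against pyrogallol concentration, then read the sample's total-polyphenol content off the curve. The data points below are illustrative assumptions, not the paper's measurements.

    ```python
    # Minimal standard-curve calibration: ordinary least squares fit of
    # absorbance vs pyrogallol concentration, then inverse prediction of
    # a sample's TP content in pyrogallol equivalents. Illustrative data.

    def linear_fit(x, y):
        """Return (slope, intercept) from ordinary least squares."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    conc = [10, 20, 40, 60, 80]               # pyrogallol standards, µg/ml
    absorb = [0.12, 0.24, 0.48, 0.72, 0.96]   # absorbance at 760 nm

    slope, intercept = linear_fit(conc, absorb)
    sample_absorbance = 0.60
    tp = (sample_absorbance - intercept) / slope  # µg/ml pyrogallol equiv.
    ```

    Linearity of this curve is one of the validation characteristics the abstract reports.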

  13. Development of numerical analysis methods for natural circulation decay heat removal system applied to a large scale JSFR

    International Nuclear Information System (INIS)

    A decay heat removal system utilizing passive natural circulation is applied to a large scale Japan Sodium-cooled Fast Reactor. In preparation for future licensing, a one-dimensional flow network method and a three-dimensional numerical analysis method were developed to evaluate core cooling capability and thermal transients under decay heat removal modes after reactor trip. The one-dimensional method was applied to a water test simulating the primary system of the reactor, while the three-dimensional method was applied to the water test and to a sodium test focusing on the decay heat removal system. The numerical results of both methods agree well with the test results. The thermal-hydraulic behavior under a typical decay heat removal mode of the reactor has then been predicted by the three-dimensional method. (author)

  14. Development and validation of an HPLC method for analysis of etoricoxib in human plasma

    Directory of Open Access Journals (Sweden)

    Mandal U

    2006-01-01

    A simple high-performance liquid chromatographic method for the determination of etoricoxib in human plasma has been developed. A 1 ml aliquot of plasma was taken, and 10 ml of internal standard was added and mixed. Saturated borate solution (0.3 ml) was added and mixed for 1 minute, followed by liquid-liquid extraction with ethyl acetate. The organic layer was separated and evaporated to dryness under a nitrogen atmosphere at low temperature (below 50°). The residue was reconstituted with 150 µl of mobile phase. During the whole procedure the samples were protected from light. The assay was performed on a Hypersil BDS C18 column (150x4.6 mm, 5 µm particle size), using 10 mM ammonium acetate buffer:acetonitrile (65:35 v/v) as mobile phase with ultraviolet detection at 235 nm. The lower limit of detection was 10 ng/ml and the lower limit of quantitation was 20 ng/ml. Maximum between-run precision was 7.94%. Mean extraction recovery was found to be 79.53 to 85.70%. A stability study showed that after three freeze-thaw cycles the loss in three quality control samples was less than 10%. Samples were stable at room temperature for 12 h and at -20° for 3 months. Before injection onto the HPLC system, the processed samples were stable for at least 8 h. The method was used to perform a bioequivalence study in humans.
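
    Two of the figures of merit this record reports, extraction recovery and between-run precision, reduce to simple arithmetic. The sketch below shows the usual definitions (recovery as measured over nominal concentration, precision as percent coefficient of variation); the numbers are illustrative assumptions, not the study's data.

    ```python
    import statistics

    # Hedged sketch of two bioanalytical validation metrics:
    # extraction recovery of a spiked QC sample, and between-run
    # precision as %CV (relative standard deviation). Illustrative data.

    def recovery_percent(measured, nominal):
        return 100.0 * measured / nominal

    def cv_percent(values):
        """Coefficient of variation (relative standard deviation), in %."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Spiked QC sample: nominal 200 ng/ml, measured across extraction runs
    measured_runs = [162.0, 168.0, 158.0, 171.0, 165.0]
    recoveries = [recovery_percent(m, 200.0) for m in measured_runs]
    mean_recovery = statistics.mean(recoveries)  # ~82% here
    precision = cv_percent(measured_runs)        # between-run %CV
    ```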

  15. Development of A Conservative Method for A Feedwater Pipe Break Analysis of An Integral Type Reactor

    International Nuclear Information System (INIS)

    The development of advanced small and medium sized multipurpose nuclear power plants is in the spotlight, and some of them are ready for construction. SMART, an integral pressurized water reactor, is one of these advanced small sized nuclear reactors. The basic design of SMART was completed at the Korea Atomic Energy Research Institute, and a new phase to test and verify the SMART design is currently underway in Korea. The results of these tests and verifications will be fed back into the SMART design for further improvement of its safety and reliability. In the integral type reactor, design basis events can be mitigated by a reactor protection system or engineered safety features. The consequences of design basis events must be less than the established acceptance limits and provide an acceptable margin to protect health and safety. Design basis events are divided into general categories corresponding to their effect on the plant. One of these categories is a decrease in heat removal by the secondary system, which includes a turbine trip, a main steam isolation valve closure, a loss of the primary component cooling system, and a feedwater pipe break. The feedwater pipe break accident is one of the most important accidents for the safety of the integral type reactor. A decrease in the feedwater supply to the steam generators causes a decrease in the heat extraction rate from the reactor coolant system, resulting in an increase of the primary coolant temperature and pressure, while the nuclear power decreases due to reactivity feedback. A sensitivity analysis was performed to find the parameters that seriously affect the integral reactor's feedwater pipe break accident. According to these parametric analysis results, the power level, the initial system pressure, the moderator reactivity coefficient and the break size are the major parameters for the maximum system pressure. 
The detailed

  16. Development and validation of a GC-FID method for quantitative analysis of oleic acid and related fatty acids

    Institute of Scientific and Technical Information of China (English)

    Honggen Zhang; Zhenyu Wang; Oscar Liu

    2015-01-01

    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography-flame ionization detection (GC-FID) method was developed for direct quantitative analysis of oleic acid and related fatty acids in oleic acid USP-NF material. Fifteen fatty acids were separated using a DB-FFAP (nitroterephthalic acid modified polyethylene glycol) capillary GC column (30 m × 0.32 mm i.d.) with a total run time of 20 min. The method was validated in terms of specificity, linearity, precision, accuracy, sensitivity, and robustness. The method can be routinely used for the purpose of oleic acid USP-NF material analysis.

  17. Development and validation of a GC–FID method for quantitative analysis of oleic acid and related fatty acids

    Directory of Open Access Journals (Sweden)

    Honggen Zhang

    2015-08-01

    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids, normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography–flame ionization detection (GC–FID) method was developed for direct quantitative analysis of oleic acid and related fatty acids in oleic acid USP-NF material. Fifteen fatty acids were separated using a DB-FFAP (nitroterephthalic acid modified polyethylene glycol) capillary GC column (30 m×0.32 mm i.d.) with a total run time of 20 min. The method was validated in terms of specificity, linearity, precision, accuracy, sensitivity, and robustness. The method can be routinely used for the purpose of oleic acid USP-NF material analysis.

  18. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Many decision-making situations today affect both humans and the environment. In practice, many such decisions are made without an overall view, prioritising one or the other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective, and vice versa. This report was prepared within a major project aimed at developing a framework in which both the environmental and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. Safety risks have to be analysed in order to be successfully avoided, and one way of doing this is to use risk analysis methods. There is an abundance of existing methods to choose from, and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects.

  19. Development of soil-structure interaction analysis method (II) - Volume 1

    International Nuclear Information System (INIS)

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strain; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents: a state-of-the-art review and database construction on past research; a theoretical review of soil-structure interaction analysis technology; proposal of a preferred technology and estimation of its domestic applicability; and proposed guidelines for safety evaluation and analysis schemes

  20. Development of soil-structure interaction analysis method (II) - Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others [Seoul National Univ., Seoul (Korea, Republic of)

    1994-02-15

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strain; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents: a state-of-the-art review and database construction on past research; a theoretical review of soil-structure interaction analysis technology; proposal of a preferred technology and estimation of its domestic applicability; and proposed guidelines for safety evaluation and analysis schemes.

  1. Development of evaluation method for the quality of NPP MCR operators' communication using work domain analysis (WDA)

    International Nuclear Information System (INIS)

    The evolution of work demands has driven industry toward computerization, which makes systems complex and complicated; this field is called complex socio-technical systems. Communication failure is one problem of complex socio-technical systems, and it has been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite many studies on the severity of communication failure, there is no evaluation method for operators' communication quality in NPPs. Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. In order to develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, such as the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing an evaluation method for the quality of NPP MCR operators' communication. In order to apply the proposed method, nine teams working in NPPs participated in a field simulation. Evaluation results reveal that operators' communication quality was higher when a larger portion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating communication quality in any complex system. In order to verify that the proposed method is meaningful for evaluating communication quality, the evaluation results were further investigated against objective performance measures. This further investigation also supports the idea that the proposed method can be used in evaluating communication quality.

  2. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles, measuring a variety of constructs published throughout the peer-reviewed medical education literature, indicates significant errors in the translation of exploratory factor analysis best practices into current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, chiefly reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and on response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review the available evidence. 
Finally, editors and reviewers are encouraged to recognize
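
    One of the factor-retention best practices such reviews recommend over the default eigenvalue-greater-than-one rule is Horn's parallel analysis. A hedged sketch (with simulated two-factor data as an illustrative assumption): retain a factor only while the observed correlation-matrix eigenvalue exceeds the mean eigenvalue obtained from same-sized random data.

    ```python
    import numpy as np

    # Hedged sketch of Horn's parallel analysis for factor retention:
    # compare observed eigenvalues of the item correlation matrix with
    # the mean eigenvalues of random normal data of the same shape.

    def parallel_analysis(data, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        n, p = data.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        rand = np.zeros(p)
        for _ in range(n_iter):
            r = rng.standard_normal((n, p))
            rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
        rand /= n_iter
        return int(np.sum(obs > rand)), obs, rand

    # Illustrative data: two latent factors driving six observed items
    rng = np.random.default_rng(1)
    f = rng.standard_normal((300, 2))
    loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                         [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
    items = f @ loadings.T + 0.5 * rng.standard_normal((300, 6))
    n_factors, _, _ = parallel_analysis(items)
    ```

    With this strong two-factor structure, the procedure recovers two factors; reporting the retention criterion used is exactly the kind of methods transparency the review calls for.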

  3. Computational Methods Development at Ames

    Science.gov (United States)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and flow physics studies. The presentation gives historical precedents to the above research and speculates on its future course.

  4. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    Science.gov (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the OCRA methods (OCRA index and OCRA checklist), when computing the final indices (OCRA index or checklist score) in the case of more than one repetitive task, a "traditional" procedure has already been proposed, the results of which could be defined as a "time-weighted average". This approach appears appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or for shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 or more hours), the "time-weighted average" approach could result in an underestimation of the exposure level (as it practically flattens peaks of high exposure). For those scenarios an alternative approach based on the "most stressful task as minimum" might be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the OCRA method of more detailed duration multipliers (practically a different DuM for each one-hour step of duration of the repetitive task), it is now possible to define a procedure to compute the complex OCRA Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (rotations every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration, and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all the examined repetitive tasks. The procedure is based on the following formula: Complex OCRA Multitask Index = ocra1(Dum1) + (Δocra1 × K), where 1,2,3,...,N = repetitive tasks ordered by OCRA index values (1 = highest; N = lowest) computed considering the respective real duration multipliers (Dum(i)); 
ocra1 = ocra index of
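
    The bounded "most stressful task as minimum" logic described above can be sketched numerically. The interpolation weight K below (the time-weighted mix of all tasks' indices relative to the most stressful one) is an assumption for illustration, since the abstract is truncated before fully defining its terms; the bounds, however, follow the text: the result lies between the most stressful task's index at its own duration and its index at the total daily duration.

    ```python
    # Hedged sketch of a complex OCRA multitask index: interpolate between
    # the most stressful task's index at its real duration (lower bound)
    # and its index at the total daily duration (upper bound), weighted by
    # how much the remaining tasks contribute. K's definition here is an
    # illustrative assumption, not the authors' exact formula.

    def complex_ocra(task_indices_total, fractions, index_at_own_duration):
        """
        task_indices_total: each task's OCRA index computed as if it lasted
            the total daily duration, sorted so index 0 is the most stressful.
        fractions: fraction of the total repetitive time spent on each task.
        index_at_own_duration: most stressful task's index at its real duration.
        """
        ocra1_total = task_indices_total[0]
        k = sum(i * f for i, f in zip(task_indices_total, fractions)) / ocra1_total
        delta = ocra1_total - index_at_own_duration
        return index_at_own_duration + delta * k

    score = complex_ocra(task_indices_total=[6.0, 4.0, 2.0],
                         fractions=[0.5, 0.3, 0.2],
                         index_at_own_duration=4.5)
    # score lies between 4.5 (lower bound) and 6.0 (upper bound)
    ```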

  5. Development of Optimized Core Design and Analysis Methods for High Power Density BWRs

    Science.gov (United States)

    Shirvan, Koroush

    temperature was kept the same for the BWR-HD and ABWR, which resulted in a 4 K cooler core inlet temperature for the BWR-HD, given that its feedwater makes up a larger fraction of total core flow. The stability analysis using the STAB and S3K codes showed satisfactory results for the hot channel, coupled regional out-of-phase and coupled core-wide in-phase modes. A RELAP5 model of the ABWR system was constructed and applied to six transients for the BWR-HD and ABWR. The ΔMCPRs during all the transients were found to be equal or less for the new design, and the core remained covered for both. The lower void coefficient along with the smaller core volume proved to be advantageous for the simulated transients. Helical Cruciform Fuel (HCF) rods were proposed in prior MIT studies to enhance the fuel surface to volume ratio. In this work, higher fidelity models (e.g. CFD instead of subchannel methods for the hydraulic behaviour) are used to investigate the resolution needed for accurate assessment of the HCF design. For neutronics, conserving the fuel area of cylindrical rods results in a different reactivity level with a lower void coefficient for the HCF design. In single-phase flow, for which experimental results existed, the friction factor is found to be sensitive to HCF geometry and cannot be calculated using current empirical models. A new approach for the analysis of flow crisis conditions for HCF rods, in the context of Departure from Nucleate Boiling (DNB) and dryout, using the two-phase interface tracking method was proposed and initial results are presented. It is shown that the twist of the HCF rods promotes detachment of vapour bubbles along the elbows, which indicates no possibility of an early DNB for the HCF rods and in fact a potential for a higher DNB heat flux. 
Under annular flow conditions, it was found that the twist suppressed the liquid film thickness on the HCF rods at the locations of the highest heat flux, which increases the possibility of reaching early dryout. It

  6. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor

    International Nuclear Information System (INIS)

    The purpose of this study is to develop the safety evaluation methods and analysis codes needed in the design and construction stage of the fast breeder reactor (FBR). In JFY 2012, the following results were obtained. As for the development of the safety evaluation methods needed for the safety examination conducted for the reactor establishment permission, development of the analysis codes, such as the core damage analysis code, was carried out following the planned schedule. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique of the event tree using the Continuous Markov chain Monte Carlo method (CMMC method) was studied. (author)

  7. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed, rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm, which is here further extended to include uncertainty in the qualification of the conditions under which the action is performed, and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. In the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action.
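
    The idea of unequally weighted CPC effects can be sketched in a crisp (non-fuzzy) form. This is not the paper's fuzzy procedure: the weighted geometric combination below, and all multipliers and weights, are illustrative assumptions showing only how a nominal failure probability might be adjusted by nine CPC ratings of unequal influence.

    ```python
    import math

    # Hedged, crisp sketch: adjust a nominal cognitive failure probability
    # by the nine CREAM common performance conditions (CPCs), each with its
    # own multiplier (>1 degrades performance, <1 improves it) and weight.
    # All numbers are illustrative assumptions.

    def action_failure_probability(nominal_cfp, cpc_multipliers, cpc_weights):
        """Weighted geometric combination of CPC multipliers applied to a
        nominal failure probability, capped at 1.0."""
        log_adjust = sum(w * math.log(m)
                         for m, w in zip(cpc_multipliers, cpc_weights))
        return min(1.0, nominal_cfp * math.exp(log_adjust))

    # Nine CPCs (adequacy of organisation, working conditions, ...):
    multipliers = [1.0, 2.0, 1.0, 0.5, 1.0, 1.0, 2.0, 1.0, 1.0]
    weights     = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]

    p = action_failure_probability(1e-3, multipliers, weights)
    # with equal weights this reduces to the plain product of multipliers
    ```

    Making some weights larger than others captures the paper's point that CPC effects on performance reliability need not be equal.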

  8. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    Science.gov (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-01

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. PMID:23277151
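
    LOD and LOQ figures of the kind reported in this record are commonly derived (e.g. in ICH-style validation) from the standard deviation of the response and the calibration slope: LOD = 3.3σ/S and LOQ = 10σ/S. The sketch below uses illustrative numbers, not the paper's data.

    ```python
    # Hedged sketch of a common LOD/LOQ derivation from calibration data:
    # sigma is the standard deviation of the response at low concentration,
    # slope is the calibration sensitivity. Values are illustrative.

    def lod_loq(sigma, slope):
        return 3.3 * sigma / slope, 10.0 * sigma / slope

    sigma = 1.2e3    # SD of peak-area response
    slope = 5.0e4    # peak area per (µg/ml)
    lod, loq = lod_loq(sigma, slope)
    # lod ≈ 0.079 µg/ml and loq = 0.24 µg/ml, the same order of magnitude
    # as the limits reported in the record
    ```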

  9. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model Originating

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  10. Statistical Methods for Analysis of High Throughput Experiments in Early Drug Development

    OpenAIRE

    Khamiakova, Tatsiana

    2013-01-01

    Introduction: Advances in biotechnology and the ability to obtain molecular profiles of biological samples, and in particular, the transcriptomic data, have been transforming the way biomedical research and early drug development are carried out for more than a decade (Clarke et al., 2004; Chengalvala et al., 2007; Hughes et al., 2011). In view of increasing costs of the drug development and nevertheless a large number of drugs which fail the clinical trials either due to the lack of efficacy...

  11. Further development of the ultrasonic testing method for improving the detection and analysis of corrosion cracks

    International Nuclear Information System (INIS)

    Defect detection and analysis can be improved by applying the ultrasonic multifrequency technique, based on the principle that each defect type reveals two characteristic domains during ultrasonic testing, i.e. scattering and reflection. The frequency range ft, i.e. the range where the transition from scattering to reflection occurs, is a major characteristic value identifying the defect size, which in the case of corrosion cracks is directly proportional to the crack depth. (orig.)

  12. Analysis of Indonesian Agroindustry Competitiveness in Nanotechnology Development Perspective Using SWOT-AHP Method

    OpenAIRE

    Nurul Taufiqu Rochman; Gumbira-Sa’id, E.; Arief Daryanto; Nunung Nuryartono

    2011-01-01

    The application of nanotechnology opens vast opportunities for increasing the competitiveness of the national agroindustries. In this study, five agroindustries that could potentially apply nanotechnology were reviewed and analyzed using a SWOT-AHP (strength, weakness, opportunity, threat, and analytic hierarchy process) approach to determine the competitive position of each industry. Criteria were analyzed based on internal factors that have the potential to be strengths and weaknesses, and ext...
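
    The AHP half of a SWOT-AHP analysis boils down to deriving priority weights from pairwise comparison matrices on Saaty's 1-9 scale. A minimal sketch using the geometric-mean (row) approximation of the priority vector; the 3×3 matrix is an illustrative assumption, not the study's data.

    ```python
    from math import prod

    # Hedged sketch of the AHP prioritization step: each entry [i][j] says
    # how much more important criterion i is than j (reciprocal matrix),
    # and the normalized row geometric means approximate the priority
    # vector. Matrix values are illustrative.

    def ahp_weights(matrix):
        """Geometric-mean approximation of the AHP priority vector."""
        n = len(matrix)
        geo = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
        total = sum(geo)
        return [g / total for g in geo]

    pairwise = [[1.0,   3.0,   5.0],
                [1/3.0, 1.0,   2.0],
                [1/5.0, 1/2.0, 1.0]]
    weights = ahp_weights(pairwise)  # sums to 1, ordered most to least important
    ```

    In SWOT-AHP, such weight vectors are computed for the SWOT groups and for the factors within each group, and their products give the overall factor priorities.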

  13. A Mixed-Method Analysis of the Alignment of Title I Achieving Schools' Professional Development to NCLB Professional Development Provisions

    Science.gov (United States)

    Quinn, Kerry

    2013-01-01

    The improvement of schools has been a central discussion among educators, legislators, and stakeholders, with professional development being acknowledged as a fundamental topic for the success of the education system. Studying the alignment of professional development programs provided at Title I Achieving schools to NCLB research based…

  14. Development of ginger-flavoured soya milk ice cream : Comparison of data analysis methods

    Directory of Open Access Journals (Sweden)

    Wiwat Wangcharoen

    2012-12-01

    Sucrose at concentrations of 6 and 7% (w/w) and ginger extract at concentrations of 4 and 5% (w/w) were added to ginger-flavoured soya milk ice cream recipes to determine consumer acceptability. One hundred consumers were asked to taste four samples of ginger-flavoured soya milk ice cream and to rate the intensity of ginger flavour, sweetness, richness and smoothness, together with their indication of the ideal intensity of these attributes, as well as their preference for each sample, using a 10 cm line scale. Ideal ratio scores, principal component analysis, and analysis of variance with mean comparison were used for data analysis. The recipe with 7% sucrose and 4% ginger extract was close to the ideal product (p>0.05) and had the highest preference mean score (p<0.05). The total phenolic content of this recipe was 91.6 ± 6.8 mg gallic acid equivalent per 100 g, and antioxidant capacity values, including ferric reducing/antioxidative power (FRAP), 1,1-diphenyl-2-picryl-hydrazyl (DPPH) and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS), were 37.9 ± 3.7, 13.4 ± 1.2 and 49.0 ± 5.1 mg vitamin C equivalent per 100 g, respectively.

  15. Spatial analysis method of assessing water supply and demand applied to energy development in the Ohio River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, A.D.

    1979-08-01

    The focus of the study is on water availability for energy development in the Ohio River Basin; however, the techniques developed are applicable to water supply investigations for other regions and uses. The study assesses the spatial association between water supply and demand for future energy development in the Basin. The problem is the development of a method that accurately portrays the actual spatial coincidence of water availability and use within a basin. The issues addressed involve questions of scale and methods used to create a model distribution of streamflow and to compare it with projected patterns of water requirements for energy production. The analysis procedure involves the compilation of streamflow data and calculation of 7-day/10-year low-flow estimates within the Basin. Low-flow probabilities are based on historical flows at gaging stations and are adjusted for the effects of reservoir augmentation. Once streamflow estimates have been determined at gaging stations, interpolation of these values is made between known data points to enable direct comparison with projected energy water-use data. Finally, a method is devised to compare the patterns of projected water requirements with the model distribution of streamflow, in sequential downstream order.
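
    The 7-day/10-year low-flow (7Q10) computation mentioned above can be illustrated with a minimal sketch; taking the 10th percentile of annual 7-day minima as the 10-year recurrence value is a deliberate simplification of a proper low-flow frequency analysis:

```python
import numpy as np

def seven_q_ten(daily_flow, days_per_year=365):
    """7Q10 sketch: the 10-year recurrence is approximated here by the
    10th percentile of the annual minimum 7-day moving-average flows."""
    flows = np.asarray(daily_flow, dtype=float)
    n_years = len(flows) // days_per_year
    annual_minima = []
    for y in range(n_years):
        year = flows[y * days_per_year:(y + 1) * days_per_year]
        # 7-day moving average, then the annual minimum of that average
        ma7 = np.convolve(year, np.ones(7) / 7.0, mode="valid")
        annual_minima.append(ma7.min())
    return float(np.percentile(annual_minima, 10))
```

A constant record of 100 units/day over ten years yields a 7Q10 of 100, as expected.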

  16. Methodical Approaches to Analysis and Forecasting of the Development of the Fuel and Energy Complex and Gas Industry in the Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov

    2014-12-01

    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, in which the interests of all economic entities intertwine. To ensure economic growth, a region must maintain an internal balance of energy resources that accounts for the regional specifics of economic growth and energy security. The study examines the state of this equilibrium as expressed by the region's fuel and energy balance (FEB). The aim of the research is the development of a fuel and energy balance that makes it possible to determine exactly how many, and which, resources are insufficient to support the regional development strategy and what resources need to be brought in. The energy balance brings all issues of regional development into focus, so the FEB is necessary both as a mechanism for analysing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, of energy threats, and of ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that developing the fuel and energy balance of a region has to go beyond the energy sector itself, involving the analysis of other sectors of the economy as well as systems such as banking, the budget, legislation and taxation. Given the complexity of the problems discussed, there is a clear need to develop an appropriate forecasting and analytical system that allows regional authorities to make evidence-based predictions of the consequences of management decisions, to carry out multivariant scenario studies of the development of the fuel and energy complex and the gas industry in particular, to use project-based management methods, and to harmonize state strategic regulation with market mechanisms in the operational directions of development of the fuel and energy complex and the gas industry in the regional economy.

  17. Development of a cell formation heuristic by considering realistic data using principal component analysis and Taguchi's method

    Science.gov (United States)

    Kumar, Shailendra; Sharma, Rajiv Kumar

    2015-12-01

    Over the last four decades numerous cell formation algorithms have been developed and tested, yet the problem remains of research interest to this day. Appropriate formation of manufacturing cells is the first step in designing a cellular manufacturing system. In cellular manufacturing, consideration of manufacturing flexibility and production-related data is vital for cell formation, but considering such realistic data makes the cell formation problem very complex and tedious, which has led to highly advanced and complex cell formation methods. In this paper an effort has been made to develop a simple and easy to understand and implement manufacturing cell formation heuristic that considers a number of production and manufacturing flexibility-related parameters. The heuristic minimizes inter-cellular movement cost/time. Further, the proposed heuristic is modified for the application of principal component analysis and Taguchi's method. A numerical example illustrates the approach. A refinement in the results is observed with the adoption of principal component analysis and Taguchi's method.

  18. Developments in Surrogating Methods

    Directory of Open Access Journals (Sweden)

    Hans van Dormolen

    2005-11-01

    Full Text Available In this paper, I would like to talk about the developments in surrogating methods for preservation. My main focus will be on the technical aspects of preservation surrogates. This means that I will tell you something about my job as Quality Manager Microfilming for the Netherlands’ national preservation program, Metamorfoze, which is coordinated by the National Library. I am responsible for the quality of the preservation microfilms, which are produced for Metamorfoze. Firstly, I will elaborate on developments in preservation methods in relation to the following subjects: · Preservation microfilms · Scanning of preservation microfilms · Preservation scanning · Computer Output Microfilm. In the closing paragraphs of this paper, I would like to tell you something about the methylene blue test. This is an important test for long-term storage of preservation microfilms. Also, I will give you a brief report on the Cellulose Acetate Microfilm Conference that was held in the British Library in London, May 2005.

  19. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country

    Science.gov (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, it is increasingly common to introduce the Japanese engineering educational model in developing countries to improve the quality of education and research activity. A naive implementation of such a model in a different culture and educational system may lead to several problems. In this paper, we evaluated the Project Based Learning (PBL) class, developed at Waseda University in Japan and employed in the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as: non-homogeneous student backgrounds, disconnection from the students' research, weak adaptation of learning styles, and irregular course conduction. To solve these difficulties at E-JUST, we proposed: the introduction of groupware, project theme choice based on student motivation, and curriculum modification.

  20. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    OpenAIRE

    Deborah Alcici Salomão

    2015-01-01

    This study examines consumer protection in arbitration, especially through the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper conclud...

  1. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    OpenAIRE

    Hansen, Hanne Tine Ring

    2007-01-01

    The field of environmentally sustainable architecture has been under development since the late 1960s, when mankind first started to notice the consequences of industrialisation and modern lifestyles. The energy crises of 1973 and 1979, and global climatic changes ascribed to global warming, have caused an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands for the energy consumption of ...

  2. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation between the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation between the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models, under the assumption of a free-air anomaly consisting of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano-Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except for the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from the previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density in La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
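
    Nettleton's minimum-correlation idea can be illustrated with a small sketch. The profile data here are synthetic, and 0.04193 mGal per metre per g/cm3 is the standard simple Bouguer slab factor (2*pi*G):

```python
import numpy as np

def nettleton_density(free_air_mgal, elev_m, densities):
    """Nettleton's method (sketch): choose the slab density whose Bouguer
    anomaly is least correlated with topography."""
    h = np.asarray(elev_m, dtype=float)
    faa = np.asarray(free_air_mgal, dtype=float)
    best_rho, best_corr = None, np.inf
    for rho in densities:
        # Simple Bouguer slab correction in mGal (rho in g/cm^3, h in m)
        bouguer = faa - 0.04193 * rho * h
        c = abs(np.corrcoef(bouguer, h)[0, 1])
        if c < best_corr:
            best_rho, best_corr = rho, c
    return best_rho

# Synthetic profile generated with a "true" density of 2.7 g/cm^3 plus noise
h = np.linspace(0.0, 500.0, 50)
faa = 0.04193 * 2.7 * h + np.random.default_rng(1).normal(0.0, 0.5, 50)
rho_hat = nettleton_density(faa, h, np.arange(2.0, 3.2, 0.1))
```

On this synthetic profile the grid search recovers a density close to the 2.7 g/cm3 used to generate it.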

  3. Development of LC-MS/MS Methods for the Analysis of Chiral and Achiral Pharmaceuticals and Metabolites in Aqueous Environmental Matrices

    OpenAIRE

    Barclay, Victoria K.H.

    2012-01-01

    This thesis describes the development of liquid chromatography tandem mass spectrometry (LC-MS/MS) methods for the trace analysis of active pharmaceutical ingredients (APIs) and their metabolites in aqueous environmental matrices. The research was focused on the development of chiral LC-MS/MS methods for the analysis of fluoxetine and metoprolol, as well as their chiral metabolites in environmental water samples. A method was also developed for the achiral compounds, diazepam and nordiazepam....

  4. Stakeholder Analysis as a tool for working with Social Responsibility: Developing a stakeholder analysis method for ISO 26000

    OpenAIRE

    Weinestedt, Henrik

    2009-01-01

    This thesis aims to develop a stakeholder analysis method for the upcoming standard on social responsibility (SR) ISO 26000. The goal is a method that, by adhering to three criteria regarding comprehensibility, flexibility and ease of use, can be used and applied by organizations regardless of type (corporation, NGO, municipality etc.) and value chain size. The method consists of six different steps which, when completed, will produce a situation map of the organization's key SR...

  5. Method development for the analysis of nitrotoluenes, nitramines and other organic compounds in ammunition waste water

    International Nuclear Information System (INIS)

    Gas chromatography and high performance liquid chromatography were used to determine explosives and their by- and degradation products near the former ammunition plant Elsnig in Saxony. Previously developed enrichment procedures using liquid/liquid and solid-phase extraction were used to investigate ground and surface water samples. Mono-, di- and trinitrotoluenes as well as aminonitro- and chlorinated nitroaromatics were identified and quantified using GC/MS, the electron capture detector (ECD) and the nitrogen-phosphorus detector (NPD). In addition, some nitrophenols were identified in ground water. RDX, which is difficult to determine by GC, was quantified using high performance liquid chromatography, with identification performed from the UV spectra using a photodiode array detector. (orig.)

  6. Development of partitioning method

    International Nuclear Information System (INIS)

    A literature survey was carried out on the amount of natural resources, the behavior in the reprocessing process, and methods for the separation and recovery of the platinum group elements and technetium contained in spent fuel. The essential results are described below. (1) The platinum group elements contained in spent fuel are quantitatively limited compared with the total demand for them in Japan, and the estimated separation and recovery cost is rather high. Nevertheless, development of these techniques is considered very important because Japan's supply of these elements comes almost entirely from foreign resources. (2) For recovery of these elements, studies of recovery from undissolved residue and from high level liquid waste (HLLW) also seem to be required. (3) The following separation and recovery techniques are considered to be effective: lead extraction, liquid metal extraction, solvent extraction, ion exchange, adsorption, precipitation, distillation, electrolysis, or their combination. (4) Each of these methods, however, has both advantages and disadvantages, so the development of such processes largely depends on future work. (author) 94 refs

  7. Development of neutron multiplication analysis method for a subcritical system by reaction rate distribution measurement

    International Nuclear Information System (INIS)

    Basic experiments for ADSR are performed in KUCA to study the nuclear characteristics needed to establish a new neutron source for research. Usually, nuclear reactors are operated in a critical state. Even when they are operated in a subcritical state, they are very close to the critical state, and there is no problem in using the effective multiplication factor keff, obtained by solving the homogeneous neutron balance equation without an external source, to express the subcriticality. However, ADSR are operated in a subcritical state, and experiments fairly far from the critical state may be performed to investigate their nuclear properties. In subcritical systems, the neutron flux distribution produced by an external source depends on the energy and position of the external source, so the multiplication rate of fission neutrons and the effectiveness of the external source also depend on the position of the external source. The effective multiplication factor keff cannot take such effects into account. For a subcritical system, the neutron multiplication, defined as the ratio of the total neutrons produced in the system by either fission or the external source to those produced by the external source only, can be a good measure of the efficiency of the system in producing neutrons with a specific spectrum, which is one of the final goals of the 'Neutron Factory' project. Unlike the theoretical neutron multiplication definition based on the one-point reactor approximation, which depends only on the subcriticality of the system, the method considered in this study takes into account the effect of the neutron source position and energy, which plays an important role in the level of neutron multiplication for a given subcritical system.
In this study, the value of neutron multiplication will be evaluated by utilizing the reaction rate distribution of a KUCA A-core experiment which is analyzed in a subcritical system combined with
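
    The one-point approximation that this study goes beyond can be sketched as a geometric series; the source multiplication factor value below is illustrative only:

```python
def neutron_multiplication(k_src):
    """One-point sketch: each source neutron yields 1 + k + k^2 + ... neutrons
    over successive fission generations, so M = 1 / (1 - k) for subcritical k < 1."""
    if not 0.0 <= k_src < 1.0:
        raise ValueError("requires 0 <= k < 1 (subcritical system)")
    return 1.0 / (1.0 - k_src)

m = neutron_multiplication(0.95)  # illustrative deeply driven subcritical core
```

The position- and energy-dependent method described in the record replaces this single k with values inferred from the measured reaction rate distribution.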

  8. Development of partitioning method

    International Nuclear Information System (INIS)

    A partitioning method has been developed under the concepts of separation of nuclides in high level nuclear fuel reprocessing liquid waste according to their half lives and radioactive toxicity and of disposal of them by suitable methods. In the partitioning process, which has been developed in JAERI, adoption of solvent extraction process with DIDPA (di-isodecyl phosphoric acid) has been studied for actinides separation. The present paper mainly describes studies on back extraction behavior of Np(IV), Pu(IV) and U(VI) in DIDPA. Most experiments were carried out according to following procedure. These actinides were extracted from 0.5 M nitric acid with DIDPA, where nitric acid concentration in HLW is expected to be adjusted to this value prior to actinides extraction in the partitioning process, and back-extracted with various reagents such as oxalic acid. The experimental results show that distribution ratios of Np(IV) and Pu(IV) can be reduced to less than unity with 1 M oxalic acid and those of U(VI) and Np(IV) with 5 M phosphoric acid. From results of these studies and previous research on Am and Cm, following possibilities were confirmed ; U, Pu, Np, Am and Cm, which are major actinides in HLW, can be extracted simultaneously with DIDPA, and they can be removed from DIDPA with various reagents. (nitric acid for Am and Cm, oxalic acid for Np and Pu, and phosphoric acid for U respectively). (author)

  9. Development of an offline bidimensional high-performance liquid chromatography method for analysis of stereospecific triacylglycerols in cocoa butter equivalents.

    Science.gov (United States)

    Kadivar, Sheida; De Clercq, Nathalie; Nusantoro, Bangun Prajanto; Le, Thien Trung; Dewettinck, Koen

    2013-08-21

    Acyl migration is a serious problem in enzymatic modification of fats and oils, particularly in production of cocoa butter equivalent (CBE) through enzymatic acidolysis reaction, which leads to the formation of non-symmetrical triacylglycerols (TAGs) from symmetrical TAGs. Non-symmetrical TAGs may affect the physical properties of final products and are therefore often undesired. Consequently, an accurate method is needed to determine positional isomer TAGs during the production of CBE. A bidimentional high-performance liquid chromatography (HPLC) method with combination of non-aqueous reversed-phase HPLC and silver ion HPLC joining with an evaporative light scattering detector was successfully developed for the analysis of stereospecific TAGs. The best separation of positional isomer standards was obtained with a heptane/acetone mobile-phase gradient at 25 °C and 1 mL/min. The developed method was then used in multidimensional determination of the TAG positional isomers in fat and oil blends and successfully identified the TAGs and possible isomers in enzymatically acidolyzed CBE. PMID:23931630

  10. Development of synthetic velocity - depth damage curves using a Weighted Monte Carlo method and Logistic Regression analysis

    Science.gov (United States)

    Vozinaki, Anthi Eirini K.; Karatzas, George P.; Sibetheros, Ioannis A.; Varouchakis, Emmanouil A.

    2014-05-01

    Damage curves are the most significant component of the flood loss estimation models. Their development is quite complex. Two types of damage curves exist, historical and synthetic curves. Historical curves are developed from historical loss data from actual flood events. However, due to the scarcity of historical data, synthetic damage curves can be alternatively developed. Synthetic curves rely on the analysis of expected damage under certain hypothetical flooding conditions. A synthetic approach was developed and presented in this work for the development of damage curves, which are subsequently used as the basic input to a flood loss estimation model. A questionnaire-based survey took place among practicing and research agronomists, in order to generate rural loss data based on the responders' loss estimates, for several flood condition scenarios. In addition, a similar questionnaire-based survey took place among building experts, i.e. civil engineers and architects, in order to generate loss data for the urban sector. By answering the questionnaire, the experts were in essence expressing their opinion on how damage to various crop types or building types is related to a range of values of flood inundation parameters, such as floodwater depth and velocity. However, the loss data compiled from the completed questionnaires were not sufficient for the construction of workable damage curves; to overcome this problem, a Weighted Monte Carlo method was implemented, in order to generate extra synthetic datasets with statistical properties identical to those of the questionnaire-based data. The data generated by the Weighted Monte Carlo method were processed via Logistic Regression techniques in order to develop accurate logistic damage curves for the rural and the urban sectors. A Python-based code was developed, which combines the Weighted Monte Carlo method and the Logistic Regression analysis into a single code (WMCLR Python code). Each WMCLR code execution
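
    The final curve-fitting step can be illustrated with a hedged sketch: a two-parameter logistic curve fitted with SciPy to hypothetical depth-damage pairs, standing in for the questionnaire-plus-Monte-Carlo data:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, k, x0):
    """Damage fraction as a logistic function of floodwater depth."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical expert-survey data: depth (m) vs mean damage fraction
depth = np.array([0.1, 0.3, 0.5, 1.0, 1.5, 2.0, 3.0])
damage = np.array([0.02, 0.08, 0.20, 0.50, 0.75, 0.90, 0.98])

# Fit the steepness k and the half-damage depth x0
(k, x0), _ = curve_fit(logistic, depth, damage, p0=[2.0, 1.0])
```

The fitted curve then serves as a synthetic depth-damage function for the loss estimation model; a second predictor such as flow velocity would add a term to the logistic exponent.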

  11. Digital representation of meso-geomaterial spatial distribution and associated numerical analysis of geomechanics: methods, applications and developments

    Institute of Scientific and Technical Information of China (English)

    YUE Zhongqi

    2007-01-01

    This paper presents the author's efforts over the past decade to establish a practical approach to the digital representation of the distribution of different minerals, particles, and components of geomaterial in the meso-scale range (0.1 to 500 mm). The primary goal of the approach is to provide a possible solution to two intrinsic problems of the current mainstream methods for geomechanics: (1) the constitutive models and parameters of soils and rocks cannot be given accurately in geomechanical prediction; and (2) there are numerous constitutive models of soils and rocks in the literature. These problems are possibly caused by the homogenization or averaging method used in analyzing laboratory test results to establish the constitutive models and parameters. The averaging method assumes that the test samples can be represented by a homogeneous medium, ignoring the fact that geomaterial samples consist of a number of materials and components whose properties may differ significantly. In the proposed approach, digital image processing methods are used as measurement tools to construct a digital representation of the actual spatial distribution of the different materials and components in geomaterial samples. The digital data are further processed to automatically generate meshes or grids for numerical analysis. These meshes or grids can be easily incorporated into existing numerical software packages for further mechanical analysis and failure prediction of the geomaterials under external loading. The paper presents case studies to illustrate the proposed approach. Further discussion is given on how the proposed approach can be used to develop geomechanics by taking into account geomaterial behavior at the micro-scale, meso-scale and macro-scale levels. A literature review of the related developments is given by examining the SCI papers in the database of Science Citation

  12. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, with a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed phase C/sub 18/ column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL.min/sup -1/. The method was linear in the concentration range 0.15-20 microg.mL/sup -1/. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 microg.mL/sup -1/, respectively. The accuracy of the method was 98.68%. This method can be applied for bioequivalence studies and therapeutic drug monitoring as well as for the routine analysis of cephradine. (author)
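
    LOD and LOQ figures in such validations are commonly derived from the calibration line via the ICH 3.3*sigma/S and 10*sigma/S rules; a sketch with hypothetical calibration data (not the study's):

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style estimates from the calibration line: LOD = 3.3*sigma/S and
    LOQ = 10*sigma/S, where S is the slope and sigma the residual std dev."""
    conc = np.asarray(conc, dtype=float)
    resp = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, resp, 1)
    sigma = (resp - (slope * conc + intercept)).std(ddof=2)  # 2 fitted params
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical peak areas over a 0.15-20 microgram/mL calibration range
c = np.array([0.15, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = 50.0 * c + np.array([0.3, -0.2, 0.4, -0.5, 0.2, -0.3, 0.1])
lod, loq = lod_loq(c, area)
```

By construction LOQ/LOD equals 10/3.3, about 3, which matches the roughly three-fold gap between the 0.05 and 0.150 values quoted in the record.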

  13. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
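
    The eigenvalue-based instability criterion can be paired with plain Monte Carlo sampling, a simpler stand-in for the fast probability integration and adaptive importance sampling the paper proposes; the damping distribution below is an assumption for illustration:

```python
import numpy as np

def instability_probability(n_samples=10000, seed=0):
    """Monte Carlo sketch for x'' + c x' + k x = 0 with uncertain damping c:
    estimate P(instability) = P(some eigenvalue has a positive real part)."""
    rng = np.random.default_rng(seed)
    c = rng.normal(0.5, 0.3, n_samples)  # hypothetical damping uncertainty
    k = 4.0                              # fixed stiffness
    unstable = 0
    for ci in c:
        A = np.array([[0.0, 1.0], [-k, -ci]])  # first-order state-space form
        if np.real(np.linalg.eigvals(A)).max() > 0.0:
            unstable += 1
    return unstable / n_samples
```

For this system instability occurs exactly when c < 0, so the estimate should approach the normal tail probability of roughly 0.048.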

  14. Generic Analysis Methods for Gas Turbine Engine Performance: The development of the gas turbine simulation program GSP

    NARCIS (Netherlands)

    Visser, W.P.J.

    2015-01-01

    Numerical modelling and simulation have played a critical role in the research and development towards today’s powerful and efficient gas turbine engines for both aviation and power generation. The simultaneous progress in modelling methods, numerical methods, software development tools and methods,

  15. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  16. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lim, J.M.; Ji, Y.Y.; Lee, H.; Park, J.H.; Jang, M.; Chung, K.H.; Kang, M.J.; Choi, G.S. [Korea Atomic Energy Research Institute (Korea, Republic of)

    2014-07-01

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment, and concern with radioactivity from these materials has been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide on the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., {sup 238}U, {sup 235}U, {sup 232}Th, {sup 226}Ra, and {sup 40}K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of {sup 238}U, {sup 235}U, {sup 232}Th, and {sup 226}Ra. In contrast, a measurement technique using ICP-MS allows the radioactivity in many samples to be measured in a short time with a high degree of accuracy and precision. In this study, a method was developed for the rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO{sub 2} fusion and Fe co-precipitation. A magnetic
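
    The conversion underlying this kind of rapid assay, from an ICP-MS mass measurement to activity, is A = lambda*N = ln(2)/T_half * (m/M) * N_A; a sketch for 238U with an assumed 1 mg sample:

```python
import math

N_A = 6.02214076e23  # Avogadro constant, atoms per mole

def activity_bq(mass_g, molar_mass_g, half_life_s):
    """Activity of a pure nuclide sample: A = ln(2)/T_half * number of atoms."""
    n_atoms = mass_g / molar_mass_g * N_A
    return math.log(2.0) / half_life_s * n_atoms

# 1 mg of 238U measured by ICP-MS; half-life 4.468e9 years
t_half_s = 4.468e9 * 365.25 * 24 * 3600.0
a_bq = activity_bq(1.0e-3, 238.0508, t_half_s)  # about 12.4 Bq
```

The long half-life is exactly why mass spectrometry beats decay counting here: the 2.5e18 atoms in a milligram produce only a dozen decays per second.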

  17. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    International Nuclear Information System (INIS)

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment, and concern with radioactivity from these materials has been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide on the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., 238U, 235U, 232Th, 226Ra, and 40K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of 238U, 235U, 232Th, and 226Ra. In contrast, a measurement technique using ICP-MS allows the radioactivity in many samples to be measured in a short time with a high degree of accuracy and precision. In this study, a method was developed for the rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO2 fusion and Fe co-precipitation. A magnetic sector field ICP-MS (SPECTRO MS) was used for a rapid

  18. Development of an evaluation method for seismic isolation systems of nuclear power facilities. Seismic design analysis methods for crossover piping system

    International Nuclear Information System (INIS)

    This paper provides seismic design analysis methods suitable for crossover piping systems, which connect a seismically isolated building and a non-isolated building in a seismically isolated nuclear power plant. Through a numerical study focused on the main steam crossover piping system, seismic response spectrum analysis applying the ISM (Independent Support Motion) method with SRSS combination, or the CCFS (Cross-oscillator, Cross-Floor response Spectrum) method, was found to be quite effective for the seismic design of multiply supported crossover piping systems. (author)
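
    The SRSS combination used with the ISM method is simply the square root of the sum of squares of the responses from independently excited support groups; a minimal sketch with hypothetical stress contributions:

```python
import math

def srss(responses):
    """SRSS combination: square root of the sum of squares of the responses
    from independently excited support groups (ISM method)."""
    return math.sqrt(sum(r * r for r in responses))

# Hypothetical piping stress contributions (MPa) from two support groups
total = srss([30.0, 40.0])  # combined response
```

SRSS is appropriate when the support-group excitations are statistically independent; an absolute-sum rule would instead give a more conservative 70 MPa here.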

  19. Development of partitioning method

    International Nuclear Information System (INIS)

    A partitioning method has been developed under the concepts of separating radioactive nuclides from high-level waste according to their half-lives and radioactive toxicity, and of disposing of the waste safely. A partitioning test using about 18 liters (~220 Ci) of the fuel reprocessing waste prepared at PNC was started in October 1982. In this test the behavior of radioactive nuclides was clarified. The present paper describes the chemical behavior of non-radioactive elements contained in the high-level liquid waste during extraction with di-isodecyl phosphoric acid (DIDPA). Distribution ratios of most metal ions for DIDPA were less than 0.05, except that those of Mo, Zr and Fe were higher than 7. Ferric ion could not be back-extracted with 4 M HNO3, but could with 0.5 M (COOH)2. In the extraction with DIDPA, the third phase, which causes clogging of the settler banks or the flow paths in a mixer-settler, was formed when the ferric ion concentration was over 0.02 M. This unfavorable phenomenon, however, was found to be suppressed by diluting the ferric ion concentration to lower than 0.01 M or by reducing ferric ion to ferrous ion. (author)
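The reported distribution ratios translate directly into single-stage extraction fractions; a small sketch (the phase ratio and the example D values are illustrative, chosen to match the "D &lt; 0.05" and "D &gt; 7" regimes quoted above):

```python
def distribution_ratio(conc_org, conc_aq):
    """D = equilibrium concentration in the organic phase / aqueous phase."""
    return conc_org / conc_aq

def fraction_extracted(D, phase_ratio=1.0):
    """Fraction of a metal transferred to the organic phase in one contact,
    for a given organic/aqueous volume ratio."""
    return D * phase_ratio / (1.0 + D * phase_ratio)

f_low = fraction_extracted(0.05)   # ~0.048: barely extracted (most metals)
f_high = fraction_extracted(7.0)   # 0.875: strongly extracted (Mo, Zr, Fe)
```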

  20. Development of numerical analysis method of flow-acoustic resonance in stub pipes of safety relief valves

    International Nuclear Information System (INIS)

    Boiling water reactors (BWRs) have a steam dryer in the upper part of the pressure vessel to remove moisture from the steam. The steam dryer in the Quad Cities Unit 2 nuclear power plant was damaged by high-cycle fatigue due to acoustic-induced vibration during extended power uprate operation. The principal source of the acoustic-induced vibration was flow-acoustic resonance at the stub pipes of the safety relief valves (SRVs) in the main steam lines (MSLs). The acoustic wave generated at the SRV stub pipes propagates throughout the MSLs and eventually reaches and damages the steam dryer. Therefore, for power uprate operation of BWRs, it has been required to predict the flow-acoustic resonance at the SRV stub pipes. The purpose of this article was to propose a numerical analysis method for evaluating the flow-acoustic resonance in the SRV stub pipes. The proposed method is based on the finite difference lattice Boltzmann method (FDLBM). So far, the FDLBM has been applied to flow-acoustic simulations of laminar flows around simple geometries at low Reynolds numbers. In order to apply the FDLBM to flow-acoustic resonance simulations of turbulent flows around complicated geometries at high Reynolds numbers, we developed a computationally efficient model by introducing a new function into the governing equation. The proposed method was compared with the conventional FDLBM in a cavity-driven flow simulation. The proposed method was validated by comparisons with the experimental data from the 1/10-scale test of a BWR-5 under atmospheric conditions. 
The following three results were obtained; the first is that the proposed method can reduce the computing time by 30% compared with the conventional FDLBM; the second is that the proposed method successfully simulated the flow-acoustic resonance in the SRV stub pipes of the BWR-5, and the pressure fluctuations of the simulation results agreed well with those of the experimental data; and the third is the mechanism of the
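Although the article simulates the resonance with the FDLBM, the flow-acoustic resonance frequencies of a closed side branch such as an SRV stub pipe are classically approximated by quarter-wave modes; a sketch with an assumed sound speed and stub length (not values from the test):

```python
def quarter_wave_frequencies(sound_speed, stub_length, n_modes=3):
    """Acoustic resonance frequencies of a closed side branch (stub pipe):
    f_n = (2n - 1) * c / (4 L), n = 1, 2, ..."""
    return [(2 * n - 1) * sound_speed / (4.0 * stub_length)
            for n in range(1, n_modes + 1)]

# Hypothetical: sound speed ~480 m/s in steam, 1.2 m stub
freqs = quarter_wave_frequencies(480.0, 1.2)   # fundamental ~100 Hz
```

Resonance is excited when the shear-layer (vortex-shedding) frequency at the stub mouth, which scales with the main-steam flow velocity, locks onto one of these modes — the reason the problem appears at uprated (higher-flow) operation.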

  1. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Directory of Open Access Journals (Sweden)

    H. Apel

    2015-08-01

    Full Text Available Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median

  2. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Science.gov (United States)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. 
This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
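The peak-over-threshold estimation used for the pluvial hazard leads to return levels via the generalized Pareto distribution; a sketch with hypothetical threshold, scale, shape and exceedance-rate parameters (not fitted to the Can Tho data):

```python
import math

def gpd_return_level(threshold, scale, shape, exceed_rate, return_period):
    """T-year return level from a peak-over-threshold / GPD model:
    x_T = u + (sigma/xi) * ((lambda * T)**xi - 1),
    where lambda (exceed_rate) is the mean number of exceedances per year."""
    if abs(shape) < 1e-9:                       # exponential (shape -> 0) limit
        return threshold + scale * math.log(exceed_rate * return_period)
    return threshold + (scale / shape) * ((exceed_rate * return_period) ** shape - 1.0)

# Hypothetical daily-rainfall POT model: threshold 50 mm, scale 15 mm,
# shape 0.1, on average 3 threshold exceedances per year
x100 = gpd_return_level(50.0, 15.0, 0.1, 3.0, 100.0)   # 100-year rainfall, mm
```

In the paper's setting such return levels feed the stochastic rain storm generator, whose storms then drive the 2-D inundation model.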

  3. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education

    Science.gov (United States)

    Wetzel, Angela Payne

    2011-01-01

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  4. Development and validation of methods for the analysis of reprocessing solvents: role of CETAMA Working Group 24

    International Nuclear Information System (INIS)

    An interlaboratory comparison has been organized by CETAMA Working Group 24 'Organic analysis' for the validation of an analysis method for the malonamide DMDOHEMA by gas chromatography coupled with a flame ionization detector (GC-FID). This compound is being studied as a new extraction solvent for nuclear reprocessing processes. Most of the results for DMDOHEMA showed an agreement between the laboratory values and the reference value of better than 10%. Repeatability and reproducibility of the method were evaluated by robust statistics, and detection limits were also estimated from laboratory data. The description of the method has been published in the CETAMA ANASOR collection book. (authors)
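Robust statistics of the kind used to evaluate interlaboratory results are often based on median/MAD z-scores (as in ISO 13528); a sketch with invented laboratory values:

```python
import statistics

def robust_z_scores(values):
    """Robust z-scores: z = (x - median) / (1.483 * MAD).
    The factor 1.483 makes the MAD consistent with the standard
    deviation for normally distributed data."""
    med = statistics.median(values)
    mad = statistics.median([abs(x - med) for x in values])
    return [(x - med) / (1.483 * mad) for x in values]

# Hypothetical interlaboratory DMDOHEMA results (g/L); the last lab is an outlier
results = [0.52, 0.50, 0.55, 0.51, 0.49, 0.68]
z = robust_z_scores(results)
# usual convention: |z| <= 2 satisfactory, |z| >= 3 unsatisfactory
```

The median/MAD pair resists the outlier that would inflate a plain mean/standard-deviation score, which is why robust statistics are preferred for small interlaboratory datasets.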

  5. DEVELOPMENT OF METHOD OF QUALITATIVE ANALYSIS OF BIRD CHERRY FRUIT FOR INCLUSION IN THE MONOGRAPH OF STATE PHARMACOPOEIA OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Lenchyk L.V.

    2016-06-01

    Full Text Available Introduction. Bird cherry Padus avium Mill, Rosaceae, is widespread in Ukraine, especially in forests and forest-steppe areas. Bird cherry fruits have long been used in medicine and is a valuable medicinal raw materials. They stated to posess astringent, anti-inflammatory, phytoncidal properties. Bird cherry fruits are included in the USSR Pharmacopoeia IX ed., The State Pharmacopoeia of the Russian Federation, The State Pharmacopoeia of Republic of Belarus. In Ukraine there are no contemporary normative documents for this medicinal plant material, therefore it is the actual to develop projects in the national monographs "dry bird cherry fruit" and "fresh bird cherry fruit" to be included in the State Pharmacopoeia of Ukraine. According to European Pharmacopoeia recommendation method of thin-layer chromatography (TLC is prescribed only for the identification of the herbal drug. The principles of thin-layer chromatography and application of the technique in pharmaceutical analysis are described in State Pharmacopoeia of Ukraine. As it is effective and easy to perform, and the equipment required is inexpensive, the technique is frequently used for evaluating medicinal plant materials and their preparations. The TLC is aimed at elucidating the chromatogram of the drug with respect to selected reference compounds that are described for inclusion as reagents. Aim of this study was to develop methods of qualitative analysis of bird cherry fruits for a monograph in the State Pharmacopoeia of Ukraine (SPU. Materials and Methods. The object of our study was dried bird cherry fruits (7 samples and fresh bird cherry fruits (7 samples harvested in 2013-2015 in Kharkiv, Poltava, Luhansk, Sumy, Lviv, Mykolaiv regions and the city Mariupol. Samples were registered in the department of SPU State Enterprise "Pharmacopeia center". In accordance with the Ph. Eur. and SPU requirements in "identification C" determination was performed by TLC. TLC was performed on

  6. Probabilistic methods for structural response analysis

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.
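The core idea — propagating uncertain primitive variables to a distribution function of the structural response — can be illustrated with plain Monte Carlo sampling on a toy linear response (the load/stiffness statistics and the failure limit are invented, and the real methods aim to be far more efficient than brute-force sampling):

```python
import random
import statistics

def response(load, stiffness):
    """Hypothetical linear structural response (a stand-in for an FE solve)."""
    return load / stiffness

random.seed(0)
# Uncertain primitive variables: load ~ N(100, 10) kN, stiffness ~ N(50, 5) kN/mm
samples = [response(random.gauss(100, 10), random.gauss(50, 5))
           for _ in range(20000)]

mean_resp = statistics.fmean(samples)                    # ~2.0 mm
# failure probability estimate: P(response > 2.5 mm limit)
p_fail = sum(s > 2.5 for s in samples) / len(samples)
```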

  7. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  8. Forces in bolted joints: analysis methods and test results utilized for nuclear core applications (LWBR Development Program)

    International Nuclear Information System (INIS)

    Analytical methods and test data employed in the core design of bolted joints for the LWBR core are presented. The effects of external working loads, thermal expansion, and material stress relaxation are considered in the formulation developed to analyze joint performance. Extensions of these methods are also provided for bolted joints having both axial and bending flexibilities, and for the effect of plastic deformation on internal forces developed in a bolted joint. Design applications are illustrated by examples.
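The classical joint-stiffness formulation behind such analyses apportions an external working load between the bolt and the clamped members; a sketch with hypothetical stiffnesses and preload (a textbook model, not the LWBR-specific formulation):

```python
def bolt_force(preload, external_load, k_bolt, k_clamped):
    """Classical bolted-joint model: while the joint stays clamped, only a
    fraction of the external load reaches the bolt:
    F_b = F_preload + C * F_ext, with stiffness ratio C = kb / (kb + kc)."""
    C = k_bolt / (k_bolt + k_clamped)
    return preload + C * external_load

def separation_load(preload, k_bolt, k_clamped):
    """External load at which the clamped members unload completely
    (joint separation), F_sep = F_preload / (1 - C)."""
    C = k_bolt / (k_bolt + k_clamped)
    return preload / (1.0 - C)

# Hypothetical: 10 kN preload, bolt stiffness 200 kN/mm, members 600 kN/mm
fb = bolt_force(10.0, 8.0, 200.0, 600.0)     # 10 + 0.25 * 8 = 12 kN
fsep = separation_load(10.0, 200.0, 600.0)   # 10 / 0.75 = 13.33 kN
```

Thermal expansion and stress relaxation, treated in the abstract above, effectively modify the preload term over the joint's life.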

  9. Development for ultra-trace analysis method of U and Pu in safeguards environmental samples at the clean facility

    International Nuclear Information System (INIS)

    Based on the strengthened safeguards program of the IAEA to detect undeclared nuclear activities and nuclear materials, a method for precise and accurate isotope ratio determination of uranium and plutonium in environmental samples (cotton swipes) has been developed at JAERI. The samples should be treated in a clean environment in order to secure analytical reliability by eliminating external contamination from samples containing trace amounts of uranium and plutonium. Since measurement by ICP-MS is favorable for bulk analysis from the viewpoints of analytical capacity and operational simplicity, we have studied sample preparation procedures for trace amounts of uranium and plutonium to be applied to ICP-MS. Up to the present, interfering factors involved in the analytical processes and the ICP-MS measurement of uranium and plutonium have been examined. As a result, uranium isotope measurement on more than 100 pg became possible at the JAERI clean facility by diminishing the uranium blank introduced over the entire sample treatment procedure. In addition, the estimation of the plutonium recovery yield and the uranium decontamination factor suggested the possibility of plutonium isotope measurement on more than 100 fg. (author)

  10. Further development of the absorption method for preparing CO2 samples for radiocarbon analysis by liquid scintillation counting

    International Nuclear Information System (INIS)

    The CO2 absorption method for preparing samples for radiocarbon analysis by liquid scintillation counting has been successfully employed by several laboratories over many years. The main advantage of the method is its simplicity. Although the method allows for only relatively small sample sizes, implying relatively low figures of merit, adequate accuracy can be attained for a conventional dating limit up to 37 000 years - more than adequate for hydrological applications. The method reported by Aravena, Qureshi et al., and further developed by Nair et al., involves bubbling the CO2 sample gas through a liquid cocktail containing the scintillator and the alkaline absorber to the point of saturation. It relies on maintaining constant conditions to achieve reproducibility. This process ensures that a maximum amount of sample material is loaded into the cocktail. The sample is then transferred to the counting vial. We report here on further improvements which have considerably simplified the preparation method and have improved its accuracy. The CO2 sample gas is expanded from the purification line to standard pressure (610 torr) into a 1 litre measuring volume which is then isolated. 10 ml of Carbo-Sorb® E is pipetted into a counting vial which is attached to the system through a flexible connection. The vial is briefly pumped to remove air. The CO2 sample is then quantitatively frozen into a small trap and pumped to high vacuum to remove residual non-condensable gas. The low-volume (∼60 ml) section of the system is isolated and the CO2 allowed to sublime whilst the vial is shaken and cooled in a water bath. Absorption proceeds to a final pressure of some 40 torr. The counting vial is removed from the vacuum system, 10 ml of Permafluor® E+ is added, and the vial is capped and shaken well before counting. The absorption is not taken to saturation, which sacrifices some sensitivity. The advantage is that it allows the amount of sample gas to be measured accurately. The pressure in
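Measuring the sample amount from pressure in a known volume is an ideal-gas calculation; a sketch using the 610 torr / 1 litre figures quoted above and an assumed room temperature:

```python
R = 8.314            # J/(mol K)
TORR_TO_PA = 133.322

def moles_of_gas(pressure_torr, volume_l, temp_k):
    """Ideal-gas estimate of the gas sample amount, n = PV / RT."""
    p = pressure_torr * TORR_TO_PA   # Pa
    v = volume_l * 1e-3              # m^3
    return p * v / (R * temp_k)

# 610 torr of CO2 in the 1 L measuring volume, at an assumed 25 C
n_co2 = moles_of_gas(610.0, 1.0, 298.15)   # ~0.033 mol, i.e. ~1.4 g CO2
```

Quantifying the CO2 this way, instead of absorbing to saturation, is what allows the improved method to normalize the count rate per amount of carbon accurately.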

  11. Development of partitioning method

    International Nuclear Information System (INIS)

    Spent fuels from nuclear power stations contain many useful elements, which can be utilized as heat and irradiation sources, radioisotopes, elemental resources, etc. Their recovery from spent fuel and their effective use have the advantage of not only converting radioactive waste into beneficial resources but also promoting rationalization of the management and disposal of radioactive wastes. In the present study, published literature related to the recovery and utilization of useful elements in spent fuel was surveyed, the present state and trends of their research and development were analyzed, and their future prospects were conjectured. Research and development on the recovery and utilization of useful elements is being continued mainly in the USA, Europe and Japan. A transportable food irradiator with Cs-137 and an electric power source with Sr-90 for remote weather stations are typical examples of major past applications. However, research and development on recovery and utilization is currently not very active, and further efforts are to be expected hereafter. The present study was conducted under the auspices of the Science and Technology Agency of Japan. (author)

  12. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Opp, Daniel; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2014-06-15

    Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion—previously demonstrated for VMAT—is studied for static gantry IMRT. The method's accuracy is improved for the voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region-of-interest is propagated according to the predefined motion kernel throughout time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference in dose-to-moving voxel vs dose-to-static voxel in-phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low dose voxels into high dose regions, causing appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameters optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with the gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period

  13. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    International Nuclear Information System (INIS)

    Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion—previously demonstrated for VMAT—is studied for static gantry IMRT. The method's accuracy is improved for the voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region-of-interest is propagated according to the predefined motion kernel throughout time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference in dose-to-moving voxel vs dose-to-static voxel in-phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low dose voxels into high dose regions, causing appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameters optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with the gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period was 5
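The per-voxel bookkeeping behind the motion-perturbation approach — accumulate dose along a voxel trajectory through time-resolved dose grids, then apply the moving-minus-static difference as a correction — can be sketched on toy data (a simplified illustration of the idea, not the authors' algorithm; all numbers are invented):

```python
def dose_to_moving_voxel(dose_series, trajectory):
    """Accumulate dose to a voxel moving through time-resolved 3D dose grids.
    dose_series[t] maps an (x, y, z) voxel index to the dose at time step t;
    trajectory[t] is the voxel's position at step t."""
    return sum(dose_series[t].get(trajectory[t], 0.0)
               for t in range(len(trajectory)))

def perturbed_patient_dose(planned_dose, phantom_static, phantom_moving):
    """Apply the motion-induced dose difference measured in the phantom
    as a correction to the corresponding planned patient-voxel dose."""
    return planned_dose + (phantom_moving - phantom_static)

# Toy time-resolved grids: 3 time steps, 2 voxels of interest
grids = [{(0, 0, 0): 1.0, (1, 0, 0): 0.2},
         {(0, 0, 0): 1.0, (1, 0, 0): 0.4},
         {(0, 0, 0): 1.0, (1, 0, 0): 0.6}]

static_dose = dose_to_moving_voxel(grids, [(0, 0, 0)] * 3)               # 3.0
moving_dose = dose_to_moving_voxel(grids,
                                   [(0, 0, 0), (1, 0, 0), (0, 0, 0)])    # 2.4
corrected = perturbed_patient_dose(2.9, static_dose, moving_dose)        # ~2.3
```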

  14. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons

  15. The Development of a SPME-GC/MS Method for the Analysis of VOC Emissions from Historic Plastic and Rubber Materials

    OpenAIRE

    Curran, K.; Underhill, M.; Gibson, L. T.; Strlic, M.

    2015-01-01

    Analytical methods have been developed for the analysis of VOC emissions from historic plastic and rubber materials using SPME-GC/MS. Parameters such as analysis temperature, sampling time and choice of SPME fibre coating were investigated and sampling preparation strategies explored, including headspace sampling in vials and in gas sampling bags. The repeatability of the method was evaluated. It was found that a 7 d accumulation time at room temperature, followed by sampling using a DVB/CAR/...
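Repeatability of such a method is typically reported as a percent relative standard deviation over replicate measurements; a sketch with invented peak areas:

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation (%), a common repeatability metric."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical replicate SPME-GC/MS peak areas for one VOC emission marker
areas = [1.02e6, 0.98e6, 1.05e6, 1.00e6, 0.95e6]
rsd = percent_rsd(areas)   # a few percent here; lower is more repeatable
```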

  16. Harmony Analysis on Economic Development and Safety Production Level in China’s Mining Industry Based on Identity-difference-opposition Dynamic Associated Method

    OpenAIRE

    Tan Haixia; Chen Lin; Wang Hongtu

    2013-01-01

    In order to investigate whether the levels of development of the mining economy and the national economy, and of the safety of mining production and its economic development, are asynchronous, and to explore the underlying causes of inharmonious development, this study used the identity-difference-opposition analysis method to analyze and evaluate the harmony of the economic development state of the mining industry relative to that of industrial and t...

  17. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions. PMID:18976857
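Principal component analysis of the kind applied here groups metals that share a source by their loadings on common components; a toy sketch in which two synthetic "metals" are driven by one shared factor (this illustrates the technique only — it is not the Shanghai dataset):

```python
import numpy as np

def pca_explained(data):
    """PCA via eigen-decomposition of the correlation matrix.
    Rows are samples, columns are variables (metal concentrations)."""
    corr = np.corrcoef(data, rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)
    order = np.argsort(eigval)[::-1]          # largest components first
    return eigval[order] / eigval.sum(), eigvec[:, order]

rng = np.random.default_rng(42)
common = rng.normal(size=200)                 # shared (e.g. anthropogenic) factor
data = np.column_stack([
    common + 0.1 * rng.normal(size=200),      # "metal 1", source-driven
    common + 0.1 * rng.normal(size=200),      # "metal 2", same source
    rng.normal(size=200),                     # "metal 3", independent (geogenic)
])
explained, loadings = pca_explained(data)
# PC1 captures the shared source: the two correlated metals load together on it
```

In the study's interpretation, metals loading on one component with anthropogenic tracers are assigned a human source, while those aligning with parent-material indicators are assigned a natural source.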

  18. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the...

  19. Development of techniques using DNA analysis method for detection/analysis of radiation-induced mutation. Development of an useful probe/primer and improvement of detection efficacy

    International Nuclear Information System (INIS)

    Previously, it was demonstrated that detection of centromeres became easy and reliable through fluorescent staining by the FISH method using a probe of the sequence preserved in α-satellite DNA. It was, however, found inappropriate to detect dicentrics based on the relative amount of probe DNA on each chromosome. A probe which allows homogeneous detection of α-satellite DNA on each chromosome was therefore constructed. A presumed kinetochore-specific sequence, the CENP-B box, was amplified by the PCR method and the product DNA was used as a probe. However, the variation in the amount of probe DNA among chromosomes was decreased by only about 20%. Then, a program for image processing of the results obtained from FISH using α-satellite DNA as a marker for the centromere was constructed. When compared with detection of abnormal chromosomes stained by the conventional method, the calculation efficacy for detection of centromeres alone was improved by the use of this program. The calculation to discriminate normal from abnormal chromosomes was still complicated, and the detection efficacy was little improved. Chromosomal abnormalities in lymphocytes were used to detect the effects of radiation. In this method, it is necessary to shift cells into metaphase. Mutations induced by radiation might often be repaired during this shift. To exclude this possibility, DNA extraction was conducted at a low temperature immediately after exposure to 137Cs, and a rapid genome detection method was established using the genomic DNA. As the model genomes, the following three were used: 1) long-chain repeated sequences widely dispersed over chromosomes, 2) cluster genes, 3) single-copy genes. The effects of radiation were detectable at 1-2 Gy for the long repeated sequences and at 7 Gy for the cluster genes, respectively, whereas no significant effects were observed at any dose tested for the single-copy genes. Amplification was marked in the cells exposed at 1-10 Gy (peak at 4 Gy), suggesting that these regions had

  20. Development of partitioning method

    International Nuclear Information System (INIS)

    The present paper describes the examination of the possibility of improving the denitration and extraction processes by adding oxalic acid in the partitioning process, which has been developed for the purpose of separating high-level liquid waste (HLW) into a few groups of elements. First, the effect of oxalic acid in the denitration of HLW was examined to reduce the amount of precipitate formed during denitration. As a result, it was found possible to reduce the precipitation of molybdenum, zirconium and tellurium. However, some elements precipitated at any concentration of oxalic acid. The addition of oxalic acid increased the amounts of precipitates of neodymium, which was representative of transuranic elements, and strontium, which is a troublesome element because of its heat generation. In the extraction process with DIDPA (diisodecyl phosphoric acid), oxalic acid was expected to prevent the third-phase formation caused by iron by forming a complex with iron. However, the result showed that oxalic acid did not suppress the extraction of iron; the addition of oxalic acid had no effect on preventing third-phase formation. The influence of the presence of iron on the oxalate precipitation of rare earths was also examined in the present study. (author)

  1. Development of an improved method for quantitative analysis of skin blotting: Increasing reliability and applicability for skin assessment

    OpenAIRE

    Ogai, Kazuhiro; Matsumoto, Masaru; Minematsu, T; Kitamura, Keiichiro; Kobayashi, M.; Sugama, Junko; Sanada, Hiromi

    2015-01-01

    Objective A novel skin assessment tool named 'skin blotting' has recently been developed, which can easily predict skin status to avoid its deterioration. The aim of this study was to propose a normalization method for skin blotting to compensate for the individual differences that can hamper quantitative comparisons and clinical applications. Methods To normalize individual differences, we utilized total protein as a 'normalizer' with calibration curves. For evaluation, we performed a ...
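The proposed normalization — dividing the analyte signal by total protein quantified from a calibration curve — might be sketched as follows; the linear calibration and all numbers are illustrative assumptions, not values from the study.

```python
def normalized_signal(analyte_signal, total_protein_signal, slope, intercept):
    """Convert the total-protein signal to a protein amount via a linear
    calibration curve (signal = slope * protein + intercept), then express
    the analyte signal per unit of total protein."""
    total_protein = (total_protein_signal - intercept) / slope
    return analyte_signal / total_protein

# Illustrative calibration: signal = 2.0 * protein + 0.5
value = normalized_signal(analyte_signal=30.0, total_protein_signal=10.5,
                          slope=2.0, intercept=0.5)
```

Two subjects with different skin barrier permeability would then be compared on this protein-normalized scale rather than on raw blot intensity.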

  2. Methods Of Complex Analysis And Conformity Assessment Of Formation Of Regional Retail Networks To The Principles Of Sustainable Development

    OpenAIRE

    Zoryana Gerasymchuk; Victor Korsak

    2014-01-01

    A methodical approach is proposed for assessing the conformity of the formation of regional retail sales networks (RRSN) to the principles of sustainable development, which includes: selection of goals, objectives, directions and a scorecard for assessment of the territorial retail network; calculation of integral indices of RRSN and of socio-economic and environmental security in the region; calculation of the integral index of sustainability of RRSN; assessment of the balance or imbalance of RRSN development and regional ...

  3. An analysis of used and under-development methods of fast-reactor core subassemblies monitoring in the USSR

    International Nuclear Information System (INIS)

    It is reported that, owing to the design features of the Soviet fast breeder reactors, local methods of single subassembly (S/A) monitoring are not used and integral methods are preferred. In the BN-1600 reactor design now under development, a version with larger-size S/As with open heads, in which the state of each S/A is monitored, is being considered. Methods for monitoring S/As in operating fast reactors and in those being designed are considered in the paper

  4. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  5. LC-MS/MS method development for quantitative analysis of acetaminophen uptake by the aquatic fungus Mucor hiemalis.

    Science.gov (United States)

    Esterhuizen-Londt, Maranda; Schwartz, Katrin; Balsano, Evelyn; Kühn, Sandra; Pflugmacher, Stephan

    2016-06-01

    Acetaminophen is a pharmaceutical frequently found in surface water as a contaminant. Bioremediation, in particular mycoremediation, of acetaminophen is a method to remove this compound from waters. Owing to the lack of a quantitative analytical method for acetaminophen in aquatic organisms, the present study aimed to develop an LC-MS/MS method for the determination of acetaminophen in the aquatic fungus Mucor hiemalis. The method was then applied to evaluate the uptake of acetaminophen by M. hiemalis cultured in pellet morphology. The method was robust, sensitive and reproducible, with a lower limit of quantification of 5 pg acetaminophen on column. It was found that M. hiemalis internalizes the pharmaceutical and bioaccumulates it with time. Therefore, M. hiemalis was deemed a suitable candidate for further studies to elucidate its pharmaceutical tolerance and longevity in mycoremediation applications. PMID:26950900

  6. Development of the safety analysis method based on the 3-D core kinetics coupled with thermal-hydraulics code

    International Nuclear Information System (INIS)

    In the present non-LOCA safety analysis of the Pressurized Water Reactor (PWR), the plant transient, core response and fuel behavior are independently calculated by different analysis codes to estimate plant safety. Therefore these results often involve large unquantified conservatism due to additional safety margins in the initial/boundary conditions of each calculation and simplistic approximations of the complicated interactions between core neutronics and plant thermal-hydraulics during the transient. Recently, best-estimate 3-D core transient analysis codes have been widely developed in the area of nuclear reactor accident analysis to understand the actual physical phenomena and to quantify the conservatism in the current safety analysis. Evaluating safety margins appropriately contributes to safer plant design and more efficient plant operation. Mitsubishi Heavy Industries (MHI) has developed SPARKLE, a 3-D core kinetics code coupled with thermal-hydraulics, and plans to apply it to commercial licensing in the near future. This paper presents the features of the SPARKLE code and the results of its application to representative accident events. (author)

  7. Quick methods for radiochemical analysis

    International Nuclear Information System (INIS)

    Quick methods for radiochemical analysis, of adequate precision for the assay of a limited number of biologically important radionuclides, are important in the development of effective monitoring programs, particularly those that would be applied in emergency situations following an accidental release of radioactive substances. Methods of this type have been developed in a number of laboratories and are being altered and improved from time to time. The Agency invited several Member States to provide detailed information on any such procedures that might have been developed in their laboratories. From the information thus obtained a number of methods have been selected that appear to meet the criteria of speed and economy in the use of materials and equipment, and seem to be eminently suitable for those radionuclides that might be of major interest from the point of view of assessing the potential dose to persons following a serious dispersal of contamination. Refs, figs and tabs

  8. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare these results with those of the MASTER code. We improved the parallel computing module and developed an error analysis module for the MCNAP code. We analyzed the mechanism of error propagation through depletion computations and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, accounted for the variation of material compositions due to control-rod operation, and performed depletion analysis for the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed the core reactivity coefficients (MTC and FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  9. Development of radiochemical method of analysis of binding of tritium labeled drotaverine hydrochloride with human blood serum albumin

    International Nuclear Information System (INIS)

    Full text: Albumin, being the basic functional carrier of numerous endogenous and exogenous substances, is the most important protein of blood plasma. In diseases connected with liver dysfunction, metabolites accumulated in the blood reduce the binding ability of albumin. The aim of the present research was the development of a radiochemical method for determining the ability of albumin to bind the tritium-labeled preparation drotaverine hydrochloride (no-spa). We developed a micromethod for determining the binding ability of albumin that allows analysis of 20 µl of blood serum. The method consists of incubation of tritium-labeled drotaverine hydrochloride with blood serum in vitro, subsequent fractionation of the serum proteins by gel filtration on a microcolumn with Sephadex G-25, and direct measurement of the radioactivity bound to the protein fraction of the blood serum. The method was tested on a series of blood sera from a control group of healthy people and on a series of blood sera from patients with hepatitis B. We obtained quantitative characteristics of the binding of drotaverine hydrochloride to the albumin of patients with hepatitis B. It was preliminarily established that the binding ability of the serum albumin of children with various forms of acute viral hepatitis tends to decrease in comparison with the control group. The advantages of the developed radiochemical method are its high precision and its high sensitivity in detecting impairment of the binding ability of albumin. The use of tritium-labeled drotaverine hydrochloride allows direct measurement of the level of binding of the preparation to albumin
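The direct radioactivity measurement after gel filtration yields a bound fraction by a simple count ratio; a minimal sketch with illustrative counts, not values from the study:

```python
def fraction_bound(cpm_protein, cpm_free):
    """Fraction of labeled drug bound to serum protein, from scintillation
    counts (cpm) of the protein fraction vs. the free fraction eluted
    after gel filtration."""
    return cpm_protein / (cpm_protein + cpm_free)

# Illustrative counts only:
bound = fraction_bound(cpm_protein=750.0, cpm_free=250.0)
```

A lowered bound fraction relative to healthy controls would indicate the impaired albumin binding ability discussed above.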

  10. THE DEVELOPMENT OF METHOD FOR MINT AND TURMERIC ESSENTIAL OILS IDENTIFICATION AND QUANTITATIVE ANALYSIS IN COMPLEX DRUG

    OpenAIRE

    Smalyuh, O. G.

    2015-01-01

    Introduction. The aim of our study was to develop a method for the identification and assay of the essential oils of mint and turmeric in a complex medicinal product in capsule form.         Materials and method. The paper used samples of turmeric and mint essential oils and of the complex drug, in the form of capsules containing oil of peppermint, oil of Curcuma longa, and a mixture of extracts of sandy everlasting (Helichrysum arenarium (L.) Moench), marigold (Calendula officinalis L.), wild carrot (Daucus carota) and...

  11. Development of an RGB color analysis method for controlling uniformity in a long-length GdBCO coated conductor

    Science.gov (United States)

    Kim, Tae-Jin; Lee, Jae-Hun; Lee, Yu-Ri; Moon, Seung-Hyun

    2015-12-01

    Reactive co-evaporation-deposition and reaction (RCE-DR) is a very productive GdBa2Cu3O7-x (GdBCO) coated conductor (CC) fabrication process, which involves the fast phase conversion of an amorphous film formed by co-evaporation of three metal sources, Gd, Ba and Cu, and thus reduces the time and cost of fabricating a GdBCO CC. We routinely use a quartz crystal microbalance (QCM) to measure and control the evaporation rates of each metal source so as to keep a constant nominal composition of the superconducting (SC) layer. However, in the case of kilometre-long GdBCO CC fabrication, the evaporation rates measured by QCM do not exactly reflect the deposition rates onto the substrate as the source levels decrease, and thus an RGB color analysis method for quality control was designed. With this RGB color analysis method, it is possible to measure the composition of the converted SC layer very close to the actual composition, even in real time. We set up the RGB color analysis program by establishing a database in which RGB color values are matched to the composition of the SC layer, and as a result of applying the program to the RCE-DR process, we could fabricate high-quality GdBCO CC with an average critical current of 561 A cm-1 and 95% uniformity along a 1 km length.
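The database lookup from measured RGB color to SC-layer composition could be as simple as a nearest-neighbor match against calibrated entries; the entries and the matching rule below are illustrative assumptions, not the actual calibration used in the paper.

```python
import math

# Hypothetical calibration entries: measured (R, G, B) of the converted
# SC layer -> (Gd, Ba, Cu) atomic ratios. Illustrative values only.
CALIBRATION_DB = [
    ((112, 84, 60), (1.0, 2.0, 3.0)),  # nominal GdBCO color
    ((125, 90, 55), (1.1, 2.0, 2.9)),  # slightly Gd-rich film
    ((100, 80, 70), (0.9, 2.1, 3.1)),  # slightly Cu-rich film
]

def estimate_composition(rgb):
    """Return the composition whose calibrated RGB is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, comp = min(CALIBRATION_DB, key=lambda entry: dist(entry[0], rgb))
    return comp
```

In real-time use, each camera frame along the tape would be averaged to one RGB triple and looked up this way, flagging drift from the nominal composition.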

  12. COMBINATION OF INDEPENDENT COMPONENT ANALYSIS, DESIGN OF EXPERIMENTS AND DESIGN SPACE FOR A NOVEL METHODOLOGY TO DEVELOP CHROMATOGRAPHIC METHODS

    OpenAIRE

    Rozet, Eric; Debrus, Benjamin; Lebrun, Pierre; Boulanger, B.; Hubert, Philippe

    2012-01-01

    As defined by ICH [1] and FDA, Quality by Design (QbD) stands for “a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management”. A risk–based QbD–compliant approach is proposed for the robust development of analytical methods. This methodology based on Design of Experiments (DoE) to study the experimental domain models the retention times at the beginning, t...

  13. Development of a Direct Headspace Collection Method from Arabidopsis Seedlings Using HS-SPME-GC-TOF-MS Analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Saito

    2013-04-01

    Full Text Available Plants produce various volatile organic compounds (VOCs), which are thought to be a crucial factor in their interactions with harmful insects, plants and animals. The composition of VOCs may differ when plants are grown under different nutrient conditions, e.g., macronutrient-deficient conditions. However, in plants, the relationships between macronutrient assimilation and VOC composition remain unclear. In order to identify the kinds of VOCs that can be emitted when plants are grown under various environmental conditions, we established a convenient method for VOC profiling in Arabidopsis thaliana (Arabidopsis) involving headspace solid-phase microextraction-gas chromatography-time-of-flight-mass spectrometry (HS-SPME-GC-TOF-MS). We grew Arabidopsis seedlings in an HS vial to directly perform HS analysis. To maximize the analytical performance for VOCs, we optimized the extraction method and the analytical conditions of HS-SPME-GC-TOF-MS. Using the optimized method, we conducted VOC profiling of Arabidopsis seedlings grown under two different nutritional conditions, nutrition-rich and nutrition-deficient. The VOC profiles clearly showed a distinct pattern with respect to each condition. This study suggests that HS-SPME-GC-TOF-MS analysis has immense potential to detect changes in the levels of VOCs not only in Arabidopsis, but in other plants grown under various environmental conditions.

  14. Developing a New Sampling And Analysis Method For Hydrazine And Monomethyl Hydrazine: Using a Derivatizing Agent With Solid Phase Microextraction

    Science.gov (United States)

    Allen, John

    2001-01-01

    Solid phase microextraction (SPME) will be used to develop a method for detecting monomethyl hydrazine (MMH) and hydrazine (Hz). A derivatizing agent, pentafluorobenzoyl chloride (PFBCl), is known to react readily with MMH and Hz. The SPME fiber can either be coated with PFBCl and introduced into a gaseous stream containing MMH, or PFBCl and MMH can react first in a syringe barrel and, after a short equilibration period, SPME is used to sample the resulting solution. These methods were optimized and compared. Because Hz and MMH can degrade the SPME fiber, letting the reaction occur first gave better results. Only MMH could be detected using either of these methods. Future research will concentrate on constructing calibration curves and determining the detection limit.

  15. DEVELOPMENT AND VALIDATION OF HPTLC METHOD FOR SIMULTANEOUS ANALYSIS OF LOPINAVIR AND RITONAVIR IN THEIR COMBINED TABLET DOSAGE FORM

    Directory of Open Access Journals (Sweden)

    Mardia RB

    2012-03-01

    Full Text Available A method for the simultaneous quantification of Lopinavir and Ritonavir in tablets by HPTLC was developed and validated. The chromatograms were developed using a mobile phase of chloroform:1,4-dioxane (7:3 v/v) on a pre-coated silica gel GF aluminum TLC plate and quantified by densitometric absorbance mode at 210 nm. The Rf values for lopinavir and ritonavir were 0.74 and 0.58, respectively. The linearity of the method was found to be within the concentration range of 160-960 ng/spot for Lopinavir and 40-240 ng/spot for Ritonavir. The lower limits of detection and quantification were 9.56 ng/spot and 28.96 ng/spot for Lopinavir and 6.82 ng/spot and 20.66 ng/spot for Ritonavir. The method was also validated for precision, specificity and recovery. This developed method was used to analyze a fixed-dose tablet (Lopimune, Cipla Ltd) sample of Lopinavir and Ritonavir.
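Limits of detection and quantification like those reported are conventionally derived from the calibration data via the ICH 3.3σ/S and 10σ/S rules; a sketch under that assumption (the paper's actual derivation may differ, and the numbers below are illustrative):

```python
def detection_limits(sigma, slope):
    """ICH Q2(R1) convention: LOD = 3.3*sigma/S and LOQ = 10*sigma/S,
    where sigma is the standard deviation of the response (e.g. of the
    blank or of the calibration intercept) and S the calibration slope."""
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    return lod, loq

# Illustrative response SD and slope, not values from the paper:
lod, loq = detection_limits(sigma=2.9, slope=1.0)
```

Note that by construction LOQ/LOD is always 10/3.3 ≈ 3.03, which is roughly the ratio seen in the reported pairs.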

  16. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward

    2011-01-01

    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable.

  17. Analysis of Scientific and Methodical Approaches to Portfolio Investment as a Tool of Financial Provision of Sustainable Economic Development

    Directory of Open Access Journals (Sweden)

    Leus Daryna V.

    2013-12-01

    Full Text Available The article analyses scientific and methodical approaches to portfolio investment. It develops recommendations on specifying the categorical apparatus of portfolio investment in the context of the differentiation of strategic (direct) and portfolio investments as alternative approaches to the conduct of investment activity. It identifies the composition and functions of the objects and subjects of portfolio investment under conditions of globalisation of the world financial markets. It studies the main postulates of portfolio theory and justifies the necessity of identifying the place, role and functions of the subjects of portfolio investment in order to ensure sustainable development of the economy. It proposes, as one way of further developing portfolio theory, a separate direction in the financial provision of the economy that takes account of ecological and social components: socially responsible investment.

  18. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second-generation high temperature superconducting wire program was development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y{sub 1}Ba{sub 2}Cu{sub 3}O{sub 7} (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in the understanding of a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  19. Electroporation-based methods for in vivo, whole mount and primary culture analysis of zebrafish brain development

    Directory of Open Access Journals (Sweden)

    Jesuthasan Suresh

    2007-03-01

    Full Text Available Abstract Background Electroporation is a technique for the introduction of nucleic acids and other macromolecules into cells. In chick embryos it has been a particularly powerful technique for the spatial and temporal control of gene expression in developmental studies. Electroporation methods have also been reported for Xenopus, zebrafish, and mouse. Results We present a new protocol for zebrafish brain electroporation. Using a simple set-up with fixed spaced electrodes and microinjection equipment, it is possible to electroporate 50 to 100 embryos in 1 hour with no lethality and consistently high levels of transgene expression in numerous cells. Transfected cells in the zebrafish brain are amenable to in vivo time lapse imaging. Explants containing transfected neurons can be cultured for in vitro analysis. We also present a simple enzymatic method to isolate whole brains from fixed zebrafish for immunocytochemistry. Conclusion Building on previously described methods, we have optimized several parameters to allow for highly efficient unilateral or bilateral transgenesis of a large number of cells in the zebrafish brain. This method is simple and provides consistently high levels of transgenesis for large numbers of embryos.

  20. Development of multi-dimensional analysis method for porous blockage in fuel subassembly. Numerical simulation for 4 subchannel geometry water test

    International Nuclear Information System (INIS)

    This investigation deals with porous blockage in a wire-spacer-type fuel subassembly of a Fast Breeder Reactor (FBR). A multi-dimensional analysis method for a porous blockage in a fuel subassembly is developed using the standard k-ε turbulence model with typical handbook correlations. The purpose of this analysis method is to evaluate the position and magnitude of the maximum temperature and to investigate the thermo-hydraulic phenomena in the porous blockage. Verification of the method was conducted based on the results of a 4-subchannel geometry water test. It was revealed that evaluation of the porosity distribution and the particle diameter in a porous blockage is important for predicting the temperature distribution. The method could simulate the spatial characteristics of the velocity and temperature distributions in the blockage and evaluate the pin surface temperature inside the porous blockage. Through this verification, it is shown that the multi-dimensional analysis method is useful for predicting the thermo-hydraulic field and the highest temperature in a porous blockage. (author)
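Handbook correlations for flow resistance through a particle bed, as used in porous-blockage models of this kind, are often of Ergun type; a minimal sketch assuming the standard Ergun form (the authors' exact correlation is not specified in the abstract):

```python
def ergun_pressure_gradient(eps, dp, mu, rho, u):
    """Ergun correlation for the pressure gradient (Pa/m) through a packed
    bed: eps = porosity, dp = particle diameter (m), mu = dynamic viscosity
    (Pa*s), rho = density (kg/m^3), u = superficial velocity (m/s).
    Sum of a viscous (Blake-Kozeny) and an inertial (Burke-Plummer) term."""
    viscous = 150.0 * mu * (1.0 - eps) ** 2 / (eps ** 3 * dp ** 2) * u
    inertial = 1.75 * rho * (1.0 - eps) / (eps ** 3 * dp) * u ** 2
    return viscous + inertial

# Illustrative parameters loosely resembling a liquid-metal blockage:
dpdx = ergun_pressure_gradient(eps=0.4, dp=1e-3, mu=2.5e-4, rho=850.0, u=0.1)
```

In a CFD implementation this gradient enters the momentum equations as a distributed sink over the blocked cells, which is why the abstract stresses that the assumed porosity and particle diameter dominate the predicted temperature field.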

  1. Development of a sample preparation method for the analysis of current-use pesticides in sediment using gas chromatography.

    Science.gov (United States)

    Wang, Dongli; Weston, Donald P; Ding, Yuping; Lydy, Michael J

    2010-02-01

    Pyrethroid insecticides have been implicated as the cause of sediment toxicity to Hyalella azteca in both agricultural and urban areas of California; however, for a subset of these toxic sediments (approximately 30%), the cause of toxicity remains unidentified. This article describes the analytical method development for seven additional pesticides that are being examined to determine if they might play a role in the unexplained toxicity. A pressurized liquid extraction method was optimized to simultaneously extract diazinon, methyl parathion, oxyfluorfen, dicofol, fenpropathrin, pyraclostrobin, and indoxacarb from sediment, and the extracts were cleaned using a two-step solid-phase extraction procedure. The final extract was analyzed for the target pesticides by gas chromatography/nitrogen-phosphorus detector (GC/NPD), and gas chromatography/electron capture detector (GC/ECD), after sulfur was removed by shaking with copper and cold crystallization. Three sediments were used as reference matrices to assess method accuracy and precision. Method detection limits were 0.23-1.8 ng/g dry sediment using seven replicates of sediment spiked at 1.0 ng/g dry sediment. Recoveries ranged from 61.6 to 118% with relative standard deviations of 2.1-17% when spiked at 5.0 and 50 ng/g dry sediment. The three reference sediments, spiked with 50 ng/g dry weight of the pesticide mixture, were aged for 0.25, 1, 4, 7, and 14 days. Recoveries of the pesticides in the sediments generally decreased with increased aging time, but the magnitude of the decline was pesticide and sediment dependent. The developed method was applied to field-collected sediments from the Central Valley of California. PMID:19798461
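The detection-limit procedure described (seven replicates spiked at 1.0 ng/g) matches the common US EPA approach of multiplying the replicate standard deviation by the one-tailed Student's t at 99% confidence; a sketch under that assumption, with illustrative replicate values rather than the paper's data:

```python
import statistics

# One-tailed Student's t at 99% confidence for n - 1 = 6 degrees of freedom.
T_99_DF6 = 3.143

def method_detection_limit(replicates):
    """EPA-style MDL = t * s for seven replicate low-level spikes,
    where s is the sample standard deviation of the measured values."""
    assert len(replicates) == 7, "this t-value assumes seven replicates"
    return T_99_DF6 * statistics.stdev(replicates)

# Illustrative measured concentrations (ng/g) for a 1.0 ng/g spike:
spikes = [0.92, 1.05, 0.98, 1.10, 0.95, 1.02, 0.99]
mdl = method_detection_limit(spikes)
```

A tighter replicate spread drives the MDL down, which is consistent with the sub-ng/g limits reported for the optimized extraction and cleanup.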

  2. Development of CAPP code based on the finite element method for the analysis of VHTR cores - HTR2008-58169

    International Nuclear Information System (INIS)

    In this study, we developed a neutron diffusion equation solver based on the finite element method for the CAPP code. Three types of triangular finite elements and five types of rectangular finite elements, depending on the order of the shape functions, were implemented for 2-D applications. Ten types of triangular prismatic finite elements and seventeen types of rectangular prismatic finite elements were also implemented for 3-D applications. Two types of polynomial mapping from the master finite element to a real finite element were adopted for flexibility in dealing with complex geometry: linear mapping and iso-parametric mapping. In linear mapping, only the vertex nodes are used as mapping points. In iso-parametric mapping, all the nodal points in the finite element are used as mapping points, which enables the real finite elements to have curved surfaces. For the treatment of the spatial dependency of cross-sections in the finite elements, three types of polynomial expansion of the cross-sections were implemented: constant, linear, and iso-parametric cross-section expansions. The power method with the Wielandt acceleration technique was adopted as the outer iteration algorithm. The BiCGSTAB algorithm with an ILU (incomplete LU) decomposition pre-conditioner was used as the linear equation solver in the inner iteration. The neutron diffusion equation solver developed in this study was verified against two well-known benchmark problems, the IAEA PWR benchmark problem and the OECD/NEA PBMR400 benchmark problem. Results of the numerical tests showed that the solution converged to the reference solution as the finite elements were refined and as the order of the finite elements increased. The numerical tests also showed that the higher-order finite element method is much more efficient than the lower-order finite element method or the finite difference method. (authors)
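The outer-iteration scheme named above, the power method with Wielandt acceleration, can be illustrated on a small matrix. This shifted-inverse sketch is the generic textbook form, not the CAPP implementation: a real solver would replace the dense inverse with the preconditioned BiCGSTAB inner solve, and the operator would be the discretized diffusion system rather than a 2x2 matrix.

```python
import numpy as np

def wielandt_power(A, shift, tol=1e-12, max_iter=500):
    """Shifted-inverse (Wielandt) power iteration: iterating with
    (A - shift*I)^-1 converges to the eigenvalue of A nearest the shift,
    far faster than plain power iteration when the shift is close."""
    n = A.shape[0]
    M = np.linalg.inv(A - shift * np.eye(n))  # dense inverse: demo only
    x = np.ones(n)
    mu = 0.0
    for _ in range(max_iter):
        y = M @ x
        mu_new = np.linalg.norm(y)
        y = y / mu_new
        if abs(mu_new - mu) < tol:
            x = y
            break
        mu, x = mu_new, y
    # Recover the eigenvalue of A itself via the Rayleigh quotient.
    return (x @ (A @ x)) / (x @ x)

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # eigenvalues 5 and 2
lam = wielandt_power(A, shift=4.5)      # shift near the dominant eigenvalue
```

In k-eigenvalue calculations the shift is typically updated each outer iteration to track the current k-effective estimate, which is where the acceleration comes from.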

  3. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    IAEA defined safety culture as follows: 'Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis' with his internal psychological, situational, and behavioral context model. Based on these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. We then modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria for obtaining nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea

  4. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    IAEA defined safety culture as follows: 'Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. Also, the celebrated behavioral scientist Cooper defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis' with his internal psychological, situational, and behavioral context model. Based on these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. We then modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria for obtaining nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full-scope main control room simulator of an NPP in Korea.

  5. Method development and validation for the analysis of a new anti-cancer infusion solution via HPLC.

    Science.gov (United States)

    Donnarumma, Fabrizio; Schober, Margot; Greilberger, Joachim; Matzi, Veronika; Lindenmann, Jörg; Maier, Alfred; Herwig, Ralf; Wintersteiger, Reinhold

    2011-01-01

    A fast and simple HPLC method has been developed and validated for the quantification of a completely new anti-cancer drug during the manufacturing process. The combination of four compounds, including α-ketoglutaric acid, hydroxymethylfurfural, N-acetyl-L-methionine and N-acetyl-L-selenomethionine, administered intravenously, is still in the test phase but has already shown promising results in cancer therapy. HPLC separation was achieved on an RP-18 column with a gradient system. However, the highly different concentrations of the compounds required a variation of the detection wavelength within one run. In order to produce a chromatogram in which the peaks were comparable on a similar range scale, detection at the absorption maxima of the two most concentrated components was avoided. After optimization of the gradient program it was possible to detect all four substances within 14 min in spite of their strongly different chemical structures. The developed method was validated for accuracy, repeatability, reproducibility and robustness with respect to temperature and buffer pH. Linearity as well as the limits of detection and quantification were determined. This HPLC method was found to be precise, accurate and reproducible and can easily be used for in-line process control during the manufacture of the anti-tumour infusion solution. PMID:21246718

  6. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Through visual assessment by three-dimensional (3D) brain image analysis methods using a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Through quantitative local abnormality assessment using this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). Using twenty-five cases with DAT (mean age, 68.9 years), all of whom were diagnosed as probable Alzheimer's disease based on the National Institute of Neurological and Communicative Disorders and Stroke-Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-stereotactic surface projections (SSP) program was compared with the data of 20 age-matched control cases. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. We assessed the extent of the abnormal region in each segment (the proportion of coordinates with a Z-value exceeding the threshold, among all coordinates within the segment) and its severity (the average Z-value of the coordinates exceeding the threshold). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping the distribution of abnormalities in the brain and changes in that distribution. (author)
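The extent and severity metrics defined above can be sketched in a few lines. The segment here is a hypothetical array of Z-values and the threshold is an assumed value, not the study's actual data:

```python
import numpy as np

# Extent: fraction of a segment's coordinates whose Z-value exceeds the
# threshold. Severity: mean Z-value of those above-threshold coordinates.
# The segment array and threshold below are hypothetical.

def extent_and_severity(z_values, threshold=2.0):
    above = z_values[z_values > threshold]
    extent = above.size / z_values.size
    severity = float(above.mean()) if above.size else 0.0
    return extent, severity

segment = np.array([0.5, 1.2, 2.5, 3.1, 0.8, 2.2, 4.0, 1.9])
print(extent_and_severity(segment))
```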

  7. Development of methods for the application of neutron activation analysis to the determination of trace elements in food

    International Nuclear Information System (INIS)

    For the determination of trace elements in foodstuffs with the aid of neutron activation analysis, the separation of volatile radionuclides after digestion of the sample is of special interest for radiochemical processing. A distillation procedure was developed that gives reproducible results, although optimal conditions were not found for all of the volatile radionuclides studied. The required selective separation of Br-82 from the distillate was best achieved by applying an ion-exchange column chromatography technique. The computer programs for the evaluation of complex gamma spectra have been developed further. The automatic peak search and peak-area determination are based on a computer program using the correlation technique and are carried out with a mini-computer coupled to a multi-channel gamma spectrometer. The results, presented in 3 earlier reports relating to this research program, reveal the advantages and disadvantages of the individual steps of the radiochemical separation scheme. Before neutron activation analysis can be introduced on a routine basis, some aspects of the radiochemical process remain to be tested; these studies will be published in a fourth and final report. (orig.)

  8. Numerical stability analysis and rapid algorithm for calculations of fully developed laminar flow through ducts using time-marching method

    Directory of Open Access Journals (Sweden)

    Tsugio Fukuchi

    2013-03-01

    Full Text Available In the numerical calculation of parabolic partial differential equations using explicit finite-difference methods (FDMs), the most fundamental finite-difference scheme is the time-forward, centered-space scheme. Under this scheme, several theoretical stability analysis methods have been established to avoid oscillations and divergences in numerical solutions. In solving elliptic partial differential equations using FDMs, the point successive over-relaxation method is one means to rapidly obtain steady state solutions, a major concern being the derivation of the optimum relaxation parameter. These problems have been solved theoretically over regular calculation domains. To be able to use FDMs freely over irregular calculation domains, the preceding two problems need to be reinvestigated within a new theoretical framework. A theoretical approach is of great importance; nevertheless, it becomes so cumbersome that numerical experimentation is the more realistic approach. A numerical approach, numerical stability analysis, is proposed and both problems are solved using similar algorithms. Although only the parabolic and elliptic partial differential equations associated with Poiseuille flows are investigated here, the conclusions have a wider generality.
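The kind of numerical stability experiment advocated above can be illustrated for the simplest case: the time-forward, centered-space (FTCS) scheme applied to the 1-D diffusion equation, whose known stability limit is r = dt/dx² ≤ 0.5. This toy demonstration is not the paper's algorithm, just the standard textbook setting it generalizes:

```python
import numpy as np

# Numerical stability experiment for the FTCS scheme on u_t = u_xx with
# Dirichlet boundaries. A tiny sawtooth perturbation seeds the
# highest-frequency mode; if r exceeds 0.5, that mode grows step by step.

def ftcs_amplifies(r, steps=200, n=21):
    x = np.linspace(0.0, 1.0, n)
    u = np.sin(np.pi * x) + 1e-6 * (-1.0) ** np.arange(n)  # smooth + sawtooth seed
    for _ in range(steps):
        u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return bool(np.max(np.abs(u)) > 1.0)   # growth beyond O(1) signals instability

print(ftcs_amplifies(0.4), ftcs_amplifies(0.6))  # stable below 0.5, unstable above
```

The amplification factor of the sawtooth mode is 1 - 4r, so for r = 0.6 it is -1.4 and the perturbation grows by 40% per step, while for r = 0.4 all modes decay.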

  9. Development of CAD based on ANN analysis of power spectra for pneumoconiosis in chest radiographs: effect of three new enhancement methods

    OpenAIRE

    Okumura, Eiichiro; Kawashita, Ikuo; Ishida, Takayuki

    2014-01-01

    We have been developing a computer-aided detection (CAD) scheme for pneumoconiosis based on a rule-based plus artificial neural network (ANN) analysis of power spectra. In this study, we have developed three enhancement methods for the abnormal patterns to reduce false-positive and false-negative values. The image database consisted of 2 normal and 15 abnormal chest radiographs. The International Labour Organization standard chest radiographs with pneumoconiosis were categorized as subcategor...

  10. Development and validation of a cleanup method for hydrocarbon containing samples for the analysis of semivolatile organic compounds

    International Nuclear Information System (INIS)

    Samples obtained from the Hanford single shell tanks (SSTs) are contaminated with normal paraffin hydrocarbon (NPH), introduced as hydrostatic fluid during the sampling process or native to the tank waste. The contamination is usually high enough that a dilution of up to several orders of magnitude may be required before the sample can be analyzed by the conventional gas chromatography/mass spectrometry methodology. This can prevent detection and measurement of organic constituents that are present at lower concentration levels. To eliminate or minimize this problem, a sample cleanup method has been developed and validated and is presented in this document.

  11. SIFT-MS and FA-MS methods for ambient gas phase analysis: developments and applications in the UK

    OpenAIRE

    Smith, D.; Spanel, P

    2015-01-01

    Selected ion flow tube mass spectrometry, SIFT-MS, a relatively new gas/vapour phase analytical method, is derived from the much earlier selected ion flow tube, SIFT, used for the study of gas phase ion-molecule reactions. Both the SIFT and SIFT-MS techniques were conceived and developed in the UK, the former at Birmingham University, the latter at Keele University along with the complementary flowing afterglow mass spectrometry, FA-MS, technique. The focus of this short review is largely to ...

  12. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for assessing the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of the existing HRA methods through an example application. In order to redress the bias toward the quantitative aspect of the HRA method, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for a practical HEA method. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, to determine whether IAD can be readily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  13. Development and application of RP-HPLC methods for the analysis of transition metals and their radioactive isotopes in radioactive waste

    International Nuclear Information System (INIS)

    A major criterion in the final disposal of nuclear waste is to keep possible changes in the geosphere due to the introduction of radioactive waste as small as possible and to prevent any escape into the biosphere in the long term. The Federal Office for Radiation Protection (BfS) has therefore established limit values for a number of nuclides. Verifying these limits has to date involved laborious wet chemical analysis. In order to accelerate quantification there is a need to develop rapid multielement methods. HPLC methods represent a starting point for this development. Chemical separation is necessary to quantify β-emitters via their radioactive radiation since they are characterized by a continuous energy spectrum. A method for quantifying transition metals and their radioactive isotopes from radioactive waste has been created by using a chelating agent to select the analytes and RP-HPLC to separate the complexes formed. In addition to separating the matrix, complexation on a precolumn has the advantage of enriching the analytes. The subject of this thesis is the development and application of the method including studies of the mobile and stationary phase, as well as the optimization of all parameters, such as pH value, sample volume etc., which influence separation, enrichment or detection. The method developed was successfully tested using cement samples. It was also used for investigations of ion exchange resins and for trace analysis in calcium fluoride. Furthermore, the transferability of the method to actinides was examined by using a different complexing agent. (orig.)

  14. The application and development of k0-standardization method of neutron activation analysis at Dalat research reactor

    International Nuclear Information System (INIS)

    In recent years the k0-NAA method has been applied and developed at the 500 kW Dalat research reactor. This work includes the establishment of a PC database of k0-NAA-related nuclear parameters (e.g., radionuclides produced, half-lives, k0-factors, Q0, Ēr, Eγ), which can be accessed through k0-NAA software or manually; the detection efficiency calibration of the gamma spectrometers used in k0-NAA; the determination of reactor neutron spectrum parameters such as the α and f factors and the neutron fluxes in the irradiation channels; and the validation of the developed k0-NAA procedure by analysing several SRMs, namely Coal Fly Ash (NIST-1633b), Bovine Liver (NIST-1577b) and IAEA-Soil7. The analytical results showed that the deviations between experimental and certified values were mostly less than 15%, with most Z-scores lower than 2. The k0-NAA procedure established at the Dalat research reactor is regarded as a reliable standardization method of NAA and as available for practical applications, in particular for airborne particulate and crude oil samples. (author)
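The deviation and Z-score statistics quoted above can be computed as follows. The measured value, certified value and uncertainties are illustrative numbers, not data from the paper:

```python
# Relative deviation and Z-score of an analytical result against a certified
# reference value; the combined uncertainty is taken in quadrature.

def deviation_pct(measured, certified):
    return 100.0 * (measured - certified) / certified

def z_score(measured, certified, u_measured, u_certified):
    return (measured - certified) / (u_measured**2 + u_certified**2) ** 0.5

m, c = 10.8, 10.0   # hypothetical measured vs certified concentration (mg/kg)
print(round(deviation_pct(m, c), 1), round(z_score(m, c, 0.4, 0.3), 2))
```

A result would pass the validation criteria quoted above when the deviation is below 15% and |Z| < 2.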

  15. Advanced probabilistic methods development

    Science.gov (United States)

    Wirsching, P. H.

    1987-01-01

    Advanced structural reliability methods are utilized on the Probabilistic Structural Analysis Methods (PSAM) project to provide a tool for analysis and design of space propulsion system hardware. The role of the effort at the University of Arizona is to provide reliability technology support to this project. PSAM computer programs will provide a design tool for analyzing uncertainty associated with thermal and mechanical loading, material behavior, geometry, and the analysis methods used. Specifically, reliability methods are employed to perform sensitivity analyses, to establish the distribution of a critical response variable (e.g., stress, deflection), to perform reliability assessment, and ultimately to produce a design which will minimize cost and/or weight. Uncertainties in the design factors of space propulsion hardware are described by probability models constructed using statistical analysis of data. Statistical methods are employed to produce a probability model, i.e., a statistical synthesis or summary of each design variable in a format suitable for reliability analysis and ultimately, design decisions.
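A minimal Monte Carlo sketch of the reliability assessment idea described above, assuming lognormal probability models for stress and strength. This is an illustrative choice of distributions and parameters, not the PSAM formulation:

```python
import numpy as np

# Monte Carlo estimate of a failure probability: stress and strength are
# drawn from assumed lognormal models and failure is the event
# stress > strength. All parameter values are illustrative only.

rng = np.random.default_rng(0)
n = 200_000
stress   = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
strength = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=n)   # MPa
p_fail = float(np.mean(stress > strength))
print(p_fail)
```

For these assumed numbers the analytic answer follows from the normality of log(strength/stress), giving a failure probability on the order of 1e-3; the sampled estimate should be close to that.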

  16. Development of an optimized method for the detection of airborne viruses with real-time PCR analysis

    Directory of Open Access Journals (Sweden)

    Legaki Euaggelia

    2011-07-01

    Full Text Available Abstract Background Airborne viruses remain one of the major public health issues worldwide. Detection and quantification of airborne viruses are essential in order to provide information for public health risk assessment. Findings In this study, an optimized new, simple, low-cost method for sampling of airborne viruses using Low Melting Agarose (LMA) plates and a conventional microbial air sampling device has been developed. The use of LMA plates permits direct nucleic acid extraction of the captured viruses without the need for any preliminary elution step. Molecular detection and quantification of airborne viruses are performed using the real-time quantitative (RT-)PCR (Q(RT-)PCR) technique. The method has been tested using Adenoviruses (AdVs) and Noroviruses (NoVs) GII as representative DNA and RNA viruses, respectively. Moreover, the method has been tested successfully in outdoor experiments, by detecting and quantifying human adenoviruses (HAdVs) in the airborne environment of a wastewater treatment plant. Conclusions The great advantage of LMA is that nucleic acid extraction is performed directly on the LMA plates, while the eluted nucleic acids are totally free of inhibitory substances. Coupled with QPCR, the whole procedure can be completed in less than three (3) hours.
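Quantification against a real-time PCR standard curve, as used in Q(RT-)PCR, typically follows the log-linear relation Cq = slope·log10(copies) + intercept. A sketch with illustrative slope and intercept values, not the authors' calibration:

```python
# Convert a quantification cycle (Cq) to a copy number via a standard curve.
# A slope of about -3.32 corresponds to ~100% amplification efficiency;
# the intercept is the Cq expected for a single copy. Values are illustrative.

def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    return 10 ** ((cq - intercept) / slope)

print(copies_from_cq(31.36))   # a Cq about two slope-units below the intercept
```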

  17. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    Science.gov (United States)

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen

    2014-01-01

    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column of 25 cm × 0.46 cm, 10 μm, dimensions. The mobile phase was a (70:30 v/v) mixture of 0.1 g/L triethylamine and acetonitrile, delivered at a flow rate of 1 mL/min, with detection at 205 nm. Results were evaluated through statistical parameters which confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is significantly important in permitting the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10000 ng/mL for both drugs. The proposed method was checked in bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent outcomes were witnessed. PMID:25114921

  18. Development of energy-dispersive X-ray fluorescence as a mobile analysis method for hazardous metals in transuranic waste

    International Nuclear Information System (INIS)

    Energy-dispersive X-ray fluorescence (EDXRF) is a widely applied technique for both laboratory and field-based characterization of metals in complex matrices. Here an EDXRF method is described for analysis of 13 hazardous (RCRA) metals in Portland cement, a typical matrix for transuranic (TRU) waste from US Department of Energy (DOE) sites. Samples are analyzed as homogeneous powders prepared by simple drying, mixing, and milling. Analyses are performed using a commercial EDXRF spectrometer equipped with an X-ray tube, a high-resolution Si(Li) detector, and fundamental parameters software for data reduction. The spectrometer is rugged and suitable for use in either mobile or fixed-base laboratories. Standardization is accomplished using fundamental parameters techniques for several prepared standards which bracket the expected range of metal concentrations, and typical standardization uncertainties are < 10%. Detection limits range from 2 to 20 ppm and meet required action levels with a few exceptions, including Be, Hg and V. Accuracy, evaluated from a series of unknown quality control samples, ranges from 85 to 102%, whereas the total method uncertainty is typically < 10%. Consequently, this simple, rapid, and inexpensive technique can provide quantitative characterization of virtually all of the RCRA metals in TRU waste cement samples.

  19. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard

    2006-03-01

    Full Text Available Abstract Introduction Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movement. In the long-term future it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still no accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  20. Development and application of a method for the analysis of 9 mycotoxins in maize by HPLC-MS/MS.

    Science.gov (United States)

    Wang, Yutang; Xiao, Chunxia; Guo, Jing; Yuan, Yahong; Wang, Jianguo; Liu, Laping; Yue, Tianli

    2013-11-01

    A reliable and sensitive liquid chromatography/tandem mass spectrometry (LC-MS/MS) method was developed for the simultaneous determination of aflatoxins (AFB1, AFB2, AFG1, and AFG2), ochratoxin A (OTA), deoxynivalenol (DON), zearalenone (ZEA), fumonisin B1 (FB1), and T2-toxin in maize. The samples were first extracted using acetonitrile:water:acetic acid (79:20:1), and then further cleaned up using an OASIS HLB cartridge. Optimum conditions for the extraction and chromatographic separation were investigated. The mean recoveries of mycotoxins in spiked maize ranged from 68.3% to 94.3%. Limits of detection and quantification ranged from 0.01 to 0.64 μg/kg and from 0.03 to 2.12 μg/kg, respectively. The LC-MS/MS method was also successfully applied to 60 maize samples collected from Shaanxi Province of China. Twenty-four of the 60 samples (40%) were contaminated with at least 1 of these 9 mycotoxins. Occurrence rates were 6.7%, 1.7%, 3.3%, 6.7%, 1.7%, 23.3%, and 3.3% for AFB1, AFB2, OTA, ZEA, DON, FB1, and T2-toxin, respectively. The results demonstrated that the procedure was suitable for the simultaneous determination of these mycotoxins in the maize matrix. PMID:24245893

  1. Development of a best estimate analysis method on two-phase flow thermal-hydraulics for reduced-moderation water reactors

    International Nuclear Information System (INIS)

    The prediction performance of thermal-hydraulic analyses of two-phase flow in light-water reactors has been verified using operation data from current BWRs and PWRs. In general, system analysis codes (e.g., TRAC, RELAP) and subchannel analysis codes (e.g., COBRA, NASCA) are used as best estimate methods in the thermal design of nuclear reactor cores. These codes need many constitutive equations and empirical correlations derived from experimental data to predict two-phase flow thermal-hydraulics precisely. The Japan Atomic Energy Research Institute (JAERI) is now developing a reduced-moderation water reactor (RMWR), an advanced BWR-type reactor. The feasibility conditions of the RMWR core lie outside the operating envelope of current BWR cores, and there are no experimental data on two-phase flow thermal-hydraulics of the RMWR. It is therefore very difficult to obtain highly precise predictions using the conventional best estimate methods. The authors accordingly investigated the analytical procedures and best estimate methodologies for the thermal design of the RMWR core and developed new analysis codes. This paper describes the developed best estimate analysis methods, consisting of experiments and analysis codes. The RMWR is a light water-cooled breeder reactor aiming at effective utilization of uranium resources, multiple recycling of plutonium, high burnup and a long operation cycle. In order to get 0.1 or more conversion ratios, it is expected that the volume ratio of water and fuel must be decreased to about 0.25 or less. As a best estimate analysis method for the thermal design of the RMWR core, a combined method consisting of the subchannel analysis code NASCA and the three-dimensional two-phase flow structure analysis code TPFIT was proposed, and the prediction accuracy for the feasibility of the RMWR was improved using the present combined method. From the results of the present study it was concluded that the presently

  2. New developments for the analysis of archaeological and artistic artifacts by optical and ion beam methods at LAMFI

    International Nuclear Information System (INIS)

    Full text: Since 2005, the analysis of artistic and cultural heritage objects at LAMFI-USP (Laboratorio de Analises de Materiais com Feixes Ionicos), initially restricted to ion beam methods, has been growing steadily. Since then, alternative methodologies and procedures have been incorporated to better characterize these objects, which possess distinctive physical characteristics as well as high cultural and monetary value. The examinations were expanded to other non-destructive analytical techniques such as portable XRF (X-ray fluorescence) analysis, X-ray radiography, and visible, UV (ultraviolet) and IR (infrared) light imaging, which are helping to better understand these art objects, particularly paintings, where the techniques help to assess the state of conservation and to reveal underlying drawings, aiding understanding of the artist's creative process. The external beam arrangement at LAMFI was recently updated for simultaneous PIXE (Particle induced X-ray emission), RBS (Rutherford back scattering), PIGE (Particle induced gamma-ray emission) and IBL (Ion beam luminescence) analysis in open air. The new setup comprises a 2π star-like detector assembly with 7 collimated telescopes: two openings have laser beams for optical alignment of the target, 2 are used for X-ray detectors, 1 for a particle detector, 1 for an optical spectrometer, and 1 for an image. The particle and X-ray detector telescopes can be evacuated to reduce signal losses. The 2 telescopes with the X-ray detectors have absorbers to selectively filter low-energy X-rays, optimizing the PIXE detection limits. The beam exit window is made of an 8 μm aluminum foil, allowing the integrated beam charge to be monitored by measuring the Al gamma rays with a NaI detector. The geometry and materials of the assembly have been carefully designed to shield the X-ray detectors from the X-rays produced at the beam exit window as well as to reduce the detection of Ar Kα from the in-air beam path. The new

  3. Development and validation of a method for removal of normal paraffin hydrocarbon in radioactive waste samples prior to analysis of semivolatile organic compounds

    International Nuclear Information System (INIS)

    A method has been developed at Pacific Northwest Laboratory (PNL) to remove normal paraffin hydrocarbon (NPH) from radioactive waste samples prior to gas chromatography/mass spectrometry analysis of semivolatile components. The effectiveness of the cleanup procedure was demonstrated for all the EPA semivolatile target list compounds. Blanks and spiked actual waste samples were utilized in the development and validation study. Approximately 95% of the NPH was removed from the single-shell tank samples. The recoveries were good for most of the target compounds. Results were compared with those obtained using EPA method 3630; the recoveries were much better for the PNL-developed method. (author). 4 refs., 3 figs., 6 tabs

  4. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    This work presents a low-cost, simple and new methodology for the direct determination of uranium in different uranium matrices, an organic phase (UO2(NO3)2.2TBP, uranyl nitrate-TBP complex) and an aqueous phase (UO2(NO3)2, NTU, uranyl nitrate), based on Fourier Transform Infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to establish whether a developed methodology is fully fit for its intended purpose, and it is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO2(NO3)2.2TBP in hexane/embedded in KBr) was linear (r=0.9989) over the range of 1.0 g L-1 to 14.3 g L-1, with LD of 92.1 mg L-1 and LQ of 113.1 mg L-1, precise (RSD < 1.6% and p-value < 0.05), and accurate (recovery of 100.1% - 102.9%). The method for uranium in the aqueous phase (UO2(NO3)2/embedded in KBr) was linear (r=0.9964) over the range of 5.4 g L-1 to 51.2 g L-1, with LD of 835 mg L-1 and LQ of 958 mg L-1, precise (RSD < 1.0% and p-value < 0.05), and accurate (recovery of 99.1% - 102.0%). The FTIR method is robust with respect to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results across all three methods. The statistical tests (Student's t and Fisher's) showed that the techniques are equivalent. (author)
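The linearity/LD/LQ evaluation described above commonly uses the formulas LD = 3.3·s/S and LQ = 10·s/S, where s is the residual standard deviation of the calibration line and S its slope. A sketch with illustrative calibration data, not the authors' measurements:

```python
import numpy as np

# Fit a calibration line and derive detection/quantitation limits from the
# residual scatter. Concentrations and responses below are illustrative.

conc = np.array([1.0, 3.0, 5.0, 8.0, 11.0, 14.3])       # g/L
resp = np.array([0.10, 0.31, 0.49, 0.81, 1.09, 1.44])   # instrument response

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
s = residuals.std(ddof=2)          # residual standard deviation (n - 2 dof)
lod = 3.3 * s / slope
loq = 10.0 * s / slope
print(slope > 0, lod < loq)
```

The correlation coefficient r reported above measures the same linearity; s captures the scatter around the fitted line that drives LD and LQ.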

  5. Analysis and development of methods for the recovery of tri-n-butylphosphate (TBP)-30%v/v-degraded dodecane

    International Nuclear Information System (INIS)

    Tri-n-butyl phosphate associated with an inert hydrocarbon is the main solvent used in the reprocessing of irradiated nuclear fuel from pressurized water reactors. The combined action of radiation and nitric acid causes severe damage to the solvent during the reprocessing steps. Recovery of the solvent is thus of great importance, since it decreases the amount of waste and improves the process economy. A comparative analysis of several recovery methods for this solvent was carried out: alkaline washing, adsorption with resins, adsorption with aluminium oxide, adsorption with activated carbon and adsorption with vermiculite. Some modifications of the analytical 95Zr test were made, and two new parameters (degradation grade and recovery efficiency) were mathematically defined. Using this modified 95Zr test, the residence time and the ratio of degraded solvent to recuperator were determined. After laboratory tests, vermiculite combined with activated carbon was employed for the treatment of 50 liters of tri-n-butyl phosphate (30% V/V)-dodecane degraded by hydrolysis. Further analyses were performed to check the potential of these solids for this solvent recovery. (Author)

  6. Development of Methods for Determination of Aflatoxins.

    Science.gov (United States)

    Xie, Lijuan; Chen, Min; Ying, Yibin

    2016-12-01

    Aflatoxins can cause damage to the health of humans and animals. Several institutions around the world have established regulations to limit the levels of aflatoxins in food, and numerous analytical methods have been extensively developed for aflatoxin determination. This review covers the currently used analytical methods for the determination of aflatoxins in different food matrices, which includes sampling and sample preparation, sample pretreatment methods including extraction methods and purification methods of aflatoxin extracts, separation and determination methods. Validation for analysis of aflatoxins and safety considerations and precautions when doing the experiments are also discussed. PMID:25840003

  7. HPLC method development for the simultaneous analysis of amlodipine and valsartan in combined dosage forms and in vitro dissolution studies

    Directory of Open Access Journals (Sweden)

    Mustafa Çelebier

    2010-12-01

    Full Text Available A simple, rapid and reproducible HPLC method was developed for the simultaneous determination of amlodipine and valsartan in their combined dosage forms, and for drug dissolution studies. A C18 column (ODS 2, 10 μm, 200 x 4.6 mm) and a mobile phase of phosphate buffer (pH 3.6, 0.01 mol L-1):acetonitrile:methanol (46:44:10 v/v/v) were used for separation and quantification. Analyses were run at a flow rate of 1 mL min-1 and at ambient temperature. The injection volume was 20 μL and the ultraviolet detector was set at 240 nm. Under these conditions, amlodipine and valsartan eluted at 7.1 min and 3.4 min, respectively. Total run time was shorter than 9 min. The developed method was validated according to the literature and found to be linear within the range 0.1 - 50 μg mL-1 for amlodipine, and 0.05 - 50 μg mL-1 for valsartan. The developed method was applied successfully to quality control assays of amlodipine and valsartan in their combination drug product and to in vitro dissolution studies.

  8. Recent development of positron annihilation methods

    CERN Document Server

    Doyama, M

    2002-01-01

    When a positron enters a solid or liquid, it moves through the matter until it annihilates with an electron, emitting two gamma rays of about 511 keV each in opposite directions. Positron annihilation experiments have developed along three lines: the angular correlation between the two gamma rays, energy analysis of the emitted gamma rays, and positron lifetime measurement. The angular correlation between the two gamma rays is determined with position-sensitive gamma-ray detectors. The energy analysis is performed by S-W analysis and the Coincidence Doppler Broadening (CDB) method. Positron lifetimes are determined by the gamma-gamma lifetime measurement method, the beta+ -gamma lifetime measurement method, and other methods using the photomultiplier waveform and the timing and frequency of the gamma rays. Positron beams are applied to positron scattering, positron diffraction, low energy positron diffraction (LEPD), PELS, LEPSD, PAES, the positron re-emission imaging microscope (PRIM) and positron channeling. The example of the CDB method...

  9. Hydrophilic interaction liquid chromatography in analysis of granisetron HCl and its related substances. Retention mechanisms and method development.

    Science.gov (United States)

    Maksić, Jelena; Tumpa, Anja; Stajić, Ana; Jovanović, Marko; Rakić, Tijana; Jančić-Stojanović, Biljana

    2016-05-10

    In this paper, the separation of granisetron and its two related substances in HILIC mode is presented. Separation was performed on a silica column derivatized with sulfoalkylbetaine groups (ZIC-HILIC). First, the retention mechanisms were assessed by following the retention factors of the substances over a wide range of acetonitrile content (80-97%), at a constant aqueous buffer concentration (10 mM) and a constant pH of 3.0. Further, in order to develop an optimal HILIC method, Design of Experiments (DoE) methodology was applied. For the optimization, a 3² full factorial design was employed. The influence of acetonitrile content and ammonium acetate concentration was investigated while the pH of the aqueous phase was kept at 3.3. The adequacy of the obtained mathematical models was confirmed by ANOVA. The optimization goals (α > 1.15 and minimal run time) were accomplished with 94.7% acetonitrile in the mobile phase and 70 mM ammonium acetate in the aqueous phase. The optimal point lay in the middle of the defined Design Space. In the next phase, robustness was experimentally tested by a Rechtschaffen design. The investigated factors and their levels were: acetonitrile content (±1%), ammonium acetate molarity in the aqueous phase (±2 mM), pH of the aqueous phase (±0.2) and column temperature (±4°C). The validation scope included selectivity, linearity, accuracy and precision, as well as determination of the limit of detection (LOD) and limit of quantification (LOQ) for the related substances. The validation acceptance criteria were met in all cases. Finally, the proposed method can be utilized for the estimation of granisetron HCl and its related substances in tablets and parenteral dosage forms, as well as for monitoring degradation under various stress conditions. PMID:26895494
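The optimization step described above — a 3² full factorial with a quadratic model checked by ANOVA — can be sketched as follows. This is a minimal illustration, not the paper's calculation: the two coded factors stand in for acetonitrile content and ammonium acetate concentration, and the nine response values are hypothetical.

```python
# Hypothetical sketch of a 3^2 full-factorial response-surface fit:
# two factors at three coded levels (-1, 0, +1), nine runs, and a full
# quadratic model fitted by ordinary least squares. The response values
# are illustrative only (e.g., a selectivity-like quantity).
import itertools

import numpy as np

levels = (-1, 0, 1)
design = np.array(list(itertools.product(levels, levels)), dtype=float)  # 9 runs

# Illustrative measured responses for the 9 runs (not real data)
y = np.array([1.10, 1.14, 1.16, 1.12, 1.17, 1.19, 1.13, 1.18, 1.21])

# Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(f1, f2):
    """Predict the response at coded factor settings (f1, f2)."""
    return float(np.dot(coef, [1.0, f1, f2, f1 * f2, f1**2, f2**2]))

print(predict(1, 1))  # fitted response at the (+1, +1) corner
```

With the fitted coefficients in hand, the optimum inside the design space is found by examining the quadratic surface (or simply evaluating `predict` on a fine grid of coded settings).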

  10. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  11. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam

    2000-08-01

    Though the use of the integrated force method for linear investigations is well recognised, no efforts had been made to extend this method to nonlinear structural analysis. This paper presents attempts to use the method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given, and typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used in these problems are developed, and the structures are then analysed. The results of the analysis are compared with those of the displacement method. It is demonstrated that the integrated force method is equally viable and efficient compared to the displacement method.

  12. Development of bio-analytical methods for the quantitative and qualitative analysis of labelled peptides and proteins via hyphenation of chromatography and mass spectrometry

    OpenAIRE

    Holste, Angela Sarah

    2014-01-01

    This PhD thesis was a Cotutelle between the Université de Pau et des Pays de l’Adour (UPPA) in Pau, France and the Christian-Albrechts University (CAU) in Kiel, Germany. In the course of this international collaboration, bio-analytical methods for the quantitative and qualitative analysis of labelled peptides and proteins were developed, which were based on the hyphenation of chromatography with mass spectrometry. Peptides and protein digests were lanthanide labelled using DOTA-based comp...

  13. Validation of a Non-Targeted LC-MS Approach for Identifying Ancient Proteins: Method Development on Bone to Improve Artifact Residue Analysis

    OpenAIRE

    Andrew Barker; Jonathan Dombrosky; Dale Chaput; Barney Venbles; Steve Wolverton; Stevens, Stanley M.

    2015-01-01

    Identification of protein residues from prehistoric cooking pottery using mass spectrometry is challenging because proteins are removed from original tissues, are degraded from cooking, may be poorly preserved due to diagenesis, and occur in a palimpsest of exogenous soil proteins. In contrast, bone proteins are abundant and well preserved. This research is part of a larger method-development project for innovation and improvement of liquid chromatography – mass spectrometry analysis of prote...

  14. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This can be due, in part, to different gas chromatography (GC) conditions, to other steps involved in the method, and to soil properties. In addition, there are differences in the interpretation of the GC results, which affects the assessment of remediation effectiveness at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the greatest impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied to fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time would offer many benefits, including improved laboratory reporting times and overall improved clean up
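The screening step above — a two-level fractional factorial over six GC factors — can be sketched as follows. The generator choice (a 2^(6-2) design with E = ABC and F = BCD) is an illustrative assumption, not taken from the paper; the factor names follow the abstract.

```python
# Hedged sketch of two-level fractional-factorial screening for six
# factors: a 2^(6-2) design built from a full 2^4 base design, with the
# fifth and sixth factor columns generated as E = ABC and F = BCD.
import itertools

factors = ["inj_volume", "inj_temp", "oven_ramp",
           "det_temp", "flow_rate", "solvent_ratio"]

runs = []
for a, b, c, d in itertools.product((-1, 1), repeat=4):
    e = a * b * c  # generator E = ABC (assumed for illustration)
    f = b * c * d  # generator F = BCD (assumed for illustration)
    runs.append(dict(zip(factors, (a, b, c, d, e, f))))

print(len(runs))  # 16 runs instead of the 2**6 = 64 of a full design

def main_effect(name, response):
    """Contrast for one factor: mean response at +1 minus mean at -1."""
    hi = [r for run, r in zip(runs, response) if run[name] == 1]
    lo = [r for run, r in zip(runs, response) if run[name] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

Ranking the absolute main effects then points to the factors worth carrying into the central composite optimization, which is how the abstract's two winners (carrier gas flow rate and oven program) would be identified.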

  15. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes from the phenomenological and experimental points of view in the ATLAS experiment at the LHC, in order to understand the importance of Sudakov logarithms in the electroweak corrections to the inclusive jet process and in the associated production of weak vector bosons and jets at the LHC. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets, using the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. This study shows that, for the inclusive jet process, a partial cancellation of the virtual weak corrections (due to weak bosons in loops) by the real electroweak corrections occurs, so that the Bloch-Nordsieck violation is reduced for this process. We then participated in the measurement of the differential cross-sections for these processes in the ATLAS experiment at 7 TeV; in particular, we were involved in technical aspects of the measurement such as the study of the QCD background to the W+jets process in the muon channel. We then combined the different measurements in this channel to compare their behaviour. Several effects appear to give the electroweak corrections their relative importance: we observe an increase, with the transverse momentum of the jets, of the relative contribution of weak-boson-plus-jets processes to the inclusive jet process when electroweak bosons are explicitly required in the final state. This is currently only a preliminary study and aims at showing that such an analysis can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter. This has allowed the development of a new way to detect problematic events using well-known theorems from statistics. This new method is able to detect bursts of noise.

  16. Development of a reference method for speciation analysis of methylmercury in fish by HPLC-ICPMS and species specific isotope dilution

    International Nuclear Information System (INIS)

    LNE, the French National Metrology Institute, has for many years applied isotope dilution ICPMS to certify reference materials in the fields of environmental monitoring, health and food. The present work describes the method developed for the analysis of methylmercury in fish products. An HPLC was coupled to an ICPMS after extraction of the mercury species by microwave heating, and double IDMS was used for quantification. Sources of bias were estimated and taken into account in the uncertainty budget, established following GUM recommendations. Fresh and lyophilized CRMs were analyzed for method validation. (author)

  17. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state {alpha}-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest-α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.

  18. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    International Nuclear Information System (INIS)

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.
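The arithmetic underlying such a TGA-based estimate can be sketched as follows, assuming the guest's weight-loss step gives the guest's mass fraction of the dried complex, which molar masses then convert to a guest:host molar ratio. The example mass fraction below is hypothetical, not the paper's data.

```python
# Minimal sketch: guest:host stoichiometric ratio from a TGA guest
# mass-loss fraction. Molar masses are standard values; the 0.182 mass
# fraction is an illustrative number, not measured data.
M_ALPHA_CD = 972.84        # g/mol, alpha-cyclodextrin (host)
M_BENZYL_ALCOHOL = 108.14  # g/mol, example guest

def stoichiometric_ratio(guest_mass_frac, m_guest, m_host=M_ALPHA_CD):
    """Guest:host molar ratio from the guest's mass fraction of the complex."""
    host_mass_frac = 1.0 - guest_mass_frac
    return (guest_mass_frac / m_guest) / (host_mass_frac / m_host)

# A guest weight-loss step of ~18.2% of the complex mass corresponds
# to a molar ratio of about 2 guests per alpha-CD
print(round(stoichiometric_ratio(0.182, M_BENZYL_ALCOHOL), 2))
```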

  19. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Biological samples present a range of complexities, from homogeneous purified proteins to multicomponent mixtures, and accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25-5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for the presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is independent of amino acid sequence and thus applicable to complex samples of unknown composition. Compared to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  20. Development of computation mechanics analysis method taking microscopic structure of nuclear power materials in consideration and the optimal design method for structure constitution

    International Nuclear Information System (INIS)

    When materials are subjected to neutron irradiation, their characteristics deteriorate through the formation of dislocation loops, voids, helium bubbles, segregation, precipitation and so on. In the structural design of the core of nuclear fusion reactors, in order to determine the applicable temperature limit of structural materials, elucidation of the helium embrittlement mechanism and development of materials with excellent resistance to helium embrittlement are important. In this paper, an example is shown of analyzing the form of bubbles at grain boundaries and the effect that the work hardening index of a material exerts on its stress-strain curves. The formulation of a finite element method for evaluating grain boundary helium embrittlement is explained. Helium embrittlement is assumed to arise because bubbles exist at grain boundaries and deformation concentrates near the grain boundaries perpendicular to the tensile stress axis. As results of the calculation, the effects that bubble size, bubble density and the work hardening index exert on the stress-strain curves are reported. (K.I.)

  1. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Keeping a company among the top performing players in its market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by customers and competition, but also on its ability to protect its own strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges; it therefore needs to know, as early as possible, how to react to the strategies of others in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal interconnections among the factors governing a company's activity.

  2. Development of liquid chromatography methods coupled to mass spectrometry for the analysis of substances with a wide variety of polarity in meconium.

    Science.gov (United States)

    Meyer-Monath, Marie; Chatellier, Claudine; Cabooter, Deirdre; Rouget, Florence; Morel, Isabelle; Lestremau, Francois

    2015-06-01

    Meconium is the first fecal excretion of newborns. This complex accumulative matrix allows the exposure of the fetus to xenobiotics during the last 6 months of pregnancy to be assessed. To determine the possible effects of fetal exposure to micropollutants in this matrix, robust and sensitive analytical methods must be developed. This article describes the development of liquid chromatography methods coupled to triple quadrupole mass spectrometry for relevant pollutants. The 28 selected target compounds had different physico-chemical properties, from very polar (glyphosate) to non-polar molecules (pyrethroids). Tests were performed with three different types of columns: reversed phase, ion exchange and HILIC. As a single method could not accommodate the simultaneous analysis of all compounds, three columns were selected and suitable chromatographic methods were optimized. Similar results were observed for the separation of the target compounds dissolved in either meconium extract or solvent for the reversed phase and ion exchange columns. For HILIC, however, the matrix had a significant influence on the peak shape and robustness of the method. Finally, the analytical methods were applied to "real" meconium samples. PMID:25863396

  3. Development of a data base system and concentration calculation for neutron activation analysis as per the k0 method

    International Nuclear Information System (INIS)

    One of the most important nuclear analytical techniques is neutron activation analysis, which is used to determine which elements, and in what proportions, are present in a sample. As a sample passes through the procedures of the technique, information is generated at each phase of the process in a dispersed form; it is therefore necessary to organize this information properly for its better use.

  4. A CASE STUDY ANALYSIS OF CLT METHODS TO DEVELOP GRAMMAR COMPETENCY FOR ACADEMIC WRITING PURPOSES AT TERTIARY LEVEL

    Directory of Open Access Journals (Sweden)

    Almodad Biduk Asmani

    2013-09-01

    statistics, the numerical data show that there is no significant difference between the results of the two methods; consequently, each method has its own strengths and weaknesses. If one is to be implemented, it must be linked to the specific goals and purposes that each entails.

  5. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    Science.gov (United States)

    Khristova, R.; Vanmen, M.

    1986-01-01

    Based on considerations of principles and experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from one milligram down to tenths of a milligram, with a standard deviation of 0.04 mg. The specific method for the determination of molybdenum in molybdenite concentrates is presented.

  6. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    International Nuclear Information System (INIS)

    Based on considerations of principles and experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from one milligram down to tenths of a milligram, with a standard deviation of 0.04 mg. The specific method for the determination of molybdenum in molybdenite concentrates is presented.

  7. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor as environmental indicators, trace element concentrations in samples collected monthly at urban and rural sites were determined, and then statistics were calculated and factor analysis was carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor and functionally tested for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs
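The source-investigation step mentioned above — factor analysis of a samples-by-elements concentration matrix — can be sketched with principal component analysis, a common stand-in for factor analysis. Everything below is illustrative: the element names, the two synthetic "sources" (a crustal, Fe-rich factor and a marine Na/Cl factor) and the data are invented for the example.

```python
# Hedged sketch: PCA on a synthetic (samples x elements) concentration
# matrix built from two hypothetical emission sources plus small noise.
# The dominant eigenvalues of the correlation matrix indicate how many
# factors (sources) explain the observed variance.
import numpy as np

rng = np.random.default_rng(0)
elements = ["Fe", "Zn", "Pb", "Na", "Cl"]

# Two synthetic sources with fixed element "fingerprints"
crustal = rng.random(20)[:, None] * np.array([5.0, 1.0, 0.5, 0.1, 0.1])
marine = rng.random(20)[:, None] * np.array([0.1, 0.2, 0.1, 4.0, 6.0])
data = crustal + marine + 0.05 * rng.standard_normal((20, 5))

# Standardize each element, then eigendecompose the correlation matrix
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = z.T @ z / len(z)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

print(explained[:2])  # the two constructed sources dominate the variance
```

In a real study the loadings (`eigvecs` columns) would then be rotated and matched against known source profiles (soil, sea spray, traffic, etc.) to apportion the measured concentrations.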

  8. Development of methods for multiresidue analysis of rice post-emergence herbicides in loam soil and their possible applications to soils of different composition.

    Science.gov (United States)

    Niell, Silvina; Pareja, Lucia; Asteggiante, Lucía Geis; Roehrs, Rafael; Pizzutti, Ionara R; García, Claudio; Heinzen, Horacio; Cesio, María Verónica

    2010-01-01

    Two simple and straightforward sample preparation methods were developed for the multiresidue analysis of post-emergence herbicides, commonly used in rice crop cultivation, in loam soil. A number of soil extraction and cleanup strategies were evaluated, with instrumental analysis performed by HPLC with a diode array detector. The best compromise between recoveries (69-98%) and good repeatability (RSD) was obtained when pyrazosulfuron ethyl, propanil, and clomazone were analyzed simultaneously. Quinclorac and bispyribac sodium were also assayed, but their recoveries were below 50%. Both methods had an LOD of 0.7 microg/kg and could accurately determine the residues at the 2 microg/kg level. These two methods could not be applied directly to other soil types, as the recoveries strongly depended on soil composition. The developed methodologies were successfully applied in monitoring 87 real-world soil samples, in which only propanil (6 to 12 microg/kg) and clomazone (15 to 20 microg/kg) residues could be detected. PMID:20480886

  9. Firm Analysis by Different Methods

    OpenAIRE

    Píbilová, Kateřina

    2012-01-01

    This Diploma Thesis deals with an analysis of the company using selected methods. The external environment of the company is analysed using PESTLE analysis and Porter's five-factor model, and the internal environment by means of the Kralicek Quick test and fundamental analysis. A SWOT analysis relates the opportunities and threats of the external environment to the strengths and weaknesses of the company. A proposal for improving the company's economic management is designed on the basi...

  10. Development and validation of AccuTOF-DART™ as a screening method for analysis of bank security device and pepper spray components.

    Science.gov (United States)

    Pfaff, Allison M; Steiner, Robert R

    2011-03-20

    Analysis of bank security devices, containing 1-methylaminoanthraquinone (MAAQ) and o-chlorobenzylidenemalononitrile (CS), and pepper sprays, containing capsaicin, is a lengthy process with no specific screening technique to aid in identifying samples of interest. Direct Analysis in Real Time (DART™) ionization coupled with an Accurate Time of Flight (AccuTOF) mass detector is a fast, ambient ionization source that could significantly reduce time spent on these cases and increase the specificity of the screening process. A new method for screening clothing for bank dye and pepper spray, using AccuTOF-DART™ analysis, has been developed. Detection of MAAQ, CS, and capsaicin was achieved via extraction of each compound onto cardstock paper, which was then sampled in the AccuTOF-DART™. All results were verified using gas chromatography coupled with electron impact mass spectrometry. PMID:20643521

  11. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  12. Development methods of conflicts identification and evaluation

    OpenAIRE

    Podolchak, N. Y.; Kovalchuk, G. R.; Savchyn, O. I.

    2015-01-01

    A method has been elaborated for the quantitative evaluation of the level and structure of the interpersonal management conflicts that prevail over other conflicts in the functioning of machine-building enterprises. According to the reasons for their appearance, the investigated interpersonal management conflicts were divided into the following types: informational, behavioral, structural, conflicts of relationships and conflicts of values. The method was developed using conjoint analysis, which allows to e...

  13. Image analysis methods for assessment of H2O2 production and Plasmopara viticola development in grapevine leaves: application to the evaluation of resistance to downy mildew.

    Science.gov (United States)

    Kim Khiook, Ian Li; Schneider, Charles; Heloir, Marie-Claire; Bois, Benjamin; Daire, Xavier; Adrian, Marielle; Trouvelot, Sophie

    2013-11-01

    Grapevine downy mildew (Plasmopara viticola) causes severe damage and destroys the harvest in the absence of effective protection, so numerous fungicide treatments are generally necessary. To promote sustainable production, alternative protection strategies, including new antifungal molecules, resistant genotypes and elicitor-induced resistance, are under trial. To evaluate the relevance of these strategies, resistance tests are required. In this context, three image analysis methods were developed to read the results of tests assessing P. viticola sporulation and mycelial development, and H2O2 production in leaves. They have been validated using elicitors of plant defenses. These methods are reliable, innovative and rapid, and their modular design allows their further adaptation to other host-pathogen systems. PMID:23994353

  14. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
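The truncated sentence above refers to a test statistic T for comparing observed and predicted variation between duplicate results. A common form of such a statistic (an assumption here, not quoted from the paper) sums, over the duplicate pairs, each pair's squared difference divided by the combined variance of its two a priori uncertainty estimates, and compares T with a chi-squared distribution on as many degrees of freedom as there are pairs.

```python
# Hedged sketch of a duplicate-pair precision statistic: if the quoted
# uncertainties fully account for the scatter, T is chi-squared
# distributed with k degrees of freedom (k = number of pairs), so T/k
# should be near 1 on average.
def precision_T(pairs):
    """pairs: iterable of (x1, s1, x2, s2) duplicate results with uncertainties."""
    return sum((x1 - x2) ** 2 / (s1**2 + s2**2) for x1, s1, x2, s2 in pairs)

# Illustrative duplicate results (values and uncertainties are invented)
duplicates = [(10.2, 0.3, 10.5, 0.3),
              (4.8, 0.2, 4.9, 0.2),
              (7.1, 0.4, 6.5, 0.4)]
print(precision_T(duplicates))  # compare with chi-squared, 3 d.o.f.
```

A T far above the chi-squared expectation signals that the stated uncertainties underestimate the real variation between duplicates; a T far below it suggests they are overestimated.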

  15. Analysis and Development of Finite Element Methods for the Study of Nonlinear Thermomechanical Behavior of Structural Components

    Science.gov (United States)

    Oden, J. Tinsley

    1995-01-01

    Underintegrated methods are investigated with respect to their stability and convergence properties. The focus was on identifying regions where they work and regions where techniques such as hourglass viscosity and hourglass control can be used. Results obtained show that underintegrated methods typically lead to finite element stiffness matrices with spurious modes in the solution. However, problems exist (scalar elliptic boundary value problems) where underintegrated methods with hourglass control yield convergent solutions. Also, stress averaging in underintegrated stiffness calculations does not necessarily lead to stable or convergent stress states.

  16. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids

    DEFF Research Database (Denmark)

    Lue, Bena-Marie; Guo, Zheng; Xu, X.B.

    2008-01-01

    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with ELSD were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine-acetate) was found...

  17. Development and comparison of two multiresidue methods for the analysis of 17 mycotoxins in cereals by liquid chromatography electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Desmarchelier, Aurelien; Oberson, Jean-Marie; Tella, Patricia; Gremaud, Eric; Seefelder, Walburga; Mottier, Pascal

    2010-07-14

    Two multiresidue methods based on different extraction procedures have been developed and compared for the liquid chromatography electrospray ionization tandem mass spectrometry analysis of 17 mycotoxins including ochratoxin A, aflatoxins (B(1), B(2), G(1), and G(2)), zearalenone, fumonisins (B(1) and B(2)), T-2 toxin, HT-2 toxin, nivalenol, deoxynivalenol, 3- and 15-acetyldeoxynivalenol, fusarenon-X, diacetoxyscirpenol, and neosolaniol in cereal-based commodities. The extraction procedures considered were a QuEChERS-like method and one using accelerated solvent extraction (ASE). Both extraction procedures gave similar performances in terms of linearity (r(2) > 0.98) and precision (both RSD(r) and RSD(iR)), with the QuEChERS-like method offering higher sample throughput as compared to the ASE method. PMID:20527950

  18. Development of multidimensional liquid chromatographic methods hyphenated to mass spectrometry, preparation and analysis of complex biological samples

    OpenAIRE

    Delmotte, Nathanaël

    2007-01-01

    Immunoadsorbers based on monolithic epoxy-activated CIM disks have been developed in order to target biomarkers of heart diseases. The developed immunoadsorbers made it possible to selectively isolate myoglobin and NT-proBNP from human serum. Anti-NT-proBNP-CIM disks permitted quantitative isolation of NT-proBNP at concentrations down to 750 amol/µL in serum (R2 = 0.998). Six different restricted access materials have been evaluated with respect to their ability to remove hemoglobin from hemoly...

  19. Dissecting multiple sequence alignment methods : the analysis, design and development of generic multiple sequence alignment components in SeqAn

    OpenAIRE

    Rausch, T.

    2010-01-01

    Multiple sequence alignments are an indispensable tool in bioinformatics. Many applications rely on accurate multiple alignments, including protein structure prediction, phylogeny and the modeling of binding sites. In this thesis we dissected and analyzed the crucial algorithms and data structures required to construct such a multiple alignment. Based upon that dissection, we present a novel graph-based multiple sequence alignment program and a new method for multi-read alignments occurring i...

  20. Development of genetic diagnosing method for diabetes and cholecystitis based on gene analysis of CCK-A receptor

    Energy Technology Data Exchange (ETDEWEB)

    Kono, Akira [National Kyushu Cancer Center, Fukuoka (Japan)]

    2000-02-01

    Based on the gene analysis of the cholecystokinin type A receptor (CCKAR) from normal mouse and its sequence analysis in the previous year, a CCKAR knock-out gene which allows mRNA expression of the {beta}-galactosidase gene instead of the CCKAR gene was constructed. Since some abnormality in the CCKAR gene is thought to be a causal factor of diabetes and cholecystitis, a knock-out mouse that expressed LacZ but not CCKAR was constructed to investigate the correlation between the clinical features of diabetes and cholecystitis and CCKAR gene abnormalities. F2 mice that had mutations in the CCKAR gene were born according to Mendel's law. The expression of the CCKAR gene was investigated in detail based on the expression of the LacZ gene in various tissues of homo (-/-) and hetero (-/+) knockout mice. A comparative study of blood sugar level, blood insulin level, the formation of biliary calculus, etc. is underway with wild-type, hetero and homo knockout mice. (M.N.)

  1. Development of genetic diagnosing method for diabetes and cholecystitis based on gene analysis of CCK-A receptor

    International Nuclear Information System (INIS)

    Based on the gene analysis of the cholecystokinin type A receptor (CCKAR) from normal mouse and its sequence analysis in the previous year, a CCKAR knock-out gene which allows mRNA expression of the β-galactosidase gene instead of the CCKAR gene was constructed. Since some abnormality in the CCKAR gene is thought to be a causal factor of diabetes and cholecystitis, a knock-out mouse that expressed LacZ but not CCKAR was constructed to investigate the correlation between the clinical features of diabetes and cholecystitis and CCKAR gene abnormalities. F2 mice that had mutations in the CCKAR gene were born according to Mendel's law. The expression of the CCKAR gene was investigated in detail based on the expression of the LacZ gene in various tissues of homo (-/-) and hetero (-/+) knockout mice. A comparative study of blood sugar level, blood insulin level, the formation of biliary calculus, etc. is underway with wild-type, hetero and homo knockout mice. (M.N.)

  2. STABILITY INDICATING HPLC METHOD DEVELOPMENT: A REVIEW

    Directory of Open Access Journals (Sweden)

    Bhoomi P. Shah*, Suresh Jain, Krishna K. Prajapati and Nasimabanu Y. Mansuri

    2012-09-01

    Full Text Available High performance liquid chromatography is one of the most accurate methods widely used for the quantitative as well as qualitative analysis of drug product and is used for determining drug product stability. Stability indicating HPLC methods are used to separate various drug related impurities that are formed during the synthesis or manufacture of drug product. This article discusses the strategies and issues regarding the development of stability indicating HPLC system for drug substance. A number of key chromatographic factors were evaluated in order to optimize the detection of all potentially relevant degradants. The method should be carefully examined for its ability to distinguish the primary drug components from the impurities. New chemical entities and drug products must undergo forced degradation studies which would be helpful in developing and demonstrating the specificity of such stability indicating methods. At every stage of drug development practical recommendations are provided which will help to avoid failures.

  3. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Meera Jay Desai

    2004-12-19

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reversed-phase and polar organic modes for chiral LC-API/MS method development. Reversed-phase methods containing high proportions of water were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal-phase mode. Ethoxynonafluorobutane, a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  4. Development of an ionic liquid based dispersive liquid-liquid microextraction method for the analysis of polycyclic aromatic hydrocarbons in water samples.

    Science.gov (United States)

    Pena, M Teresa; Casais, M Carmen; Mejuto, M Carmen; Cela, Rafael

    2009-09-01

    A simple, rapid and efficient method, ionic liquid based dispersive liquid-liquid microextraction (IL-DLLME), has been developed for the first time for the determination of 18 polycyclic aromatic hydrocarbons (PAHs) in water samples. The chemical affinity between the ionic liquid (1-octyl-3-methylimidazolium hexafluorophosphate) and the analytes permits the extraction of the PAHs from the sample matrix also allowing their preconcentration. Thus, this technique combines extraction and concentration of the analytes into one step and avoids using toxic chlorinated solvents. The factors affecting the extraction efficiency, such as the type and volume of ionic liquid, type and volume of disperser solvent, extraction time, dispersion stage, centrifuging time and ionic strength, were optimised. Analysis of extracts was performed by high performance liquid chromatography (HPLC) coupled with fluorescence detection (Flu). The optimised method exhibited a good precision level with relative standard deviation values between 1.2% and 5.7%. Quantification limits obtained for all of these considered compounds (between 0.1 and 7 ng L(-1)) were well below the limits recommended in the EU. The extraction yields for the different compounds obtained by IL-DLLME, ranged from 90.3% to 103.8%. Furthermore, high enrichment factors (301-346) were also achieved. The extraction efficiency of the optimised method is compared with that achieved by liquid-liquid extraction. Finally, the proposed method was successfully applied to the analysis of PAHs in real water samples (tap, bottled, fountain, well, river, rainwater, treated and raw wastewater). PMID:19646707
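The enrichment factors (301-346) and extraction yields (90.3%-103.8%) reported above follow from straightforward ratios; the formulas below are the standard DLLME definitions, and the concentrations and volumes are made-up numbers for illustration, not values from the paper.

```python
def enrichment_factor(c_sedimented, c_initial):
    """EF = analyte concentration in the settled IL phase / initial concentration."""
    return c_sedimented / c_initial

def extraction_recovery(c_sedimented, v_sedimented, c_initial, v_sample):
    """ER (%) = fraction of total analyte transferred into the IL phase."""
    return 100.0 * (c_sedimented * v_sedimented) / (c_initial * v_sample)

# Illustrative values only: 10 mL water sample, 30 µL settled IL phase.
ef = enrichment_factor(c_sedimented=320.0, c_initial=1.0)
er = extraction_recovery(c_sedimented=320.0, v_sedimented=0.030,
                         c_initial=1.0, v_sample=10.0)
print(ef, er)  # 320.0 96.0
```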

  5. SUBSURFACE CONSTRUCTION AND DEVELOPMENT ANALYSIS

    International Nuclear Information System (INIS)

    The purpose of this analysis is to identify appropriate construction methods and develop a feasible approach for construction and development of the repository subsurface facilities. The objective of this analysis is to support development of the subsurface repository layout for License Application (LA) design. The scope of the analysis for construction and development of the subsurface Repository facilities covers: (1) Excavation methods, including application of knowledge gained from construction of the Exploratory Studies Facility (ESF). (2) Muck removal from excavation headings to the surface. This task will examine ways of preventing interference with other subsurface construction activities. (3) The logistics and equipment for the construction and development rail haulage systems. (4) Impact of ground support installation on excavation and other construction activities. (5) Examination of how drift mapping will be accomplished. (6) Men and materials handling. (7) Installation and removal of construction utilities and ventilation systems. (8) Equipping and finishing of the emplacement drift mains and access ramps to fulfill waste emplacement operational needs. (9) Emplacement drift and access mains and ramps commissioning prior to handover for emplacement operations. (10) Examination of ways to structure the contracts for construction of the repository. (11) Discussion of different construction schemes and how to minimize the schedule risks implicit in those schemes. (12) Surface facilities needed for subsurface construction activities

  6. Ethnographic Contributions to Method Development

    DEFF Research Database (Denmark)

    Leander, Anna

    2015-01-01

    Contrary to common assumptions, there is much to be learned about methods from constructivist/post-structuralist approaches to International Relations (IR) broadly speaking. This article develops this point by unpacking the contributions of one specific method—ethnography—as used in one subfield of IR—Critical Security Studies. Ethnographic research works with what has been termed a “strong” understanding of objectivity. When this understanding is taken seriously, it must lead to a refashioning of the processes of gathering, analyzing, and presenting data in ways that reverse many standard assumptions and instructions pertaining to “sound methods.” Both in the context of observation and in that of justification, working with “strong objectivity” requires a flexibility and willingness to shift research strategies that is at odds with the usual emphasis on stringency, consistency, and carefully...

  7. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
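The component-reliability quantity NESSUS estimates, the probability that a structural response exceeds its capacity, can be illustrated with a crude Monte Carlo sketch. The lognormal-resistance/normal-load model and its parameters below are assumptions for illustration only; NESSUS itself uses fast probability integration rather than brute-force sampling.

```python
import random

random.seed(1)

def failure_probability(n=100_000):
    """Crude Monte Carlo estimate of P(resistance < load)."""
    failures = 0
    for _ in range(n):
        resistance = random.lognormvariate(0.5, 0.1)  # capacity (assumed lognormal)
        load = random.gauss(1.2, 0.15)                # demand (assumed normal)
        if resistance < load:
            failures += 1
    return failures / n

pf = failure_probability()  # roughly 0.02 for these assumed parameters
```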

  8. [Development of Determination Method of Fluoroquinolone Antibiotics in Sludge Based on Solid Phase Extraction and HPLC-Fluorescence Detection Analysis].

    Science.gov (United States)

    Dai, Xiao-hu; Xue, Yong-gang; Liu, Hua-jie; Dai, Ling-ling; Yan, Han; Li, Ning

    2016-04-15

    Fluoroquinolone antibiotics (FQs), common pharmaceuticals and personal care products (PPCPs), are widespread in the environment. FQs contained in wastewater are ultimately enriched in sludge, posing a potential threat to subsequent sludge utilization. To optimize an analytical method applicable to the determination of FQs in sludge, the authors selected ofloxacin (OFL), norfloxacin (NOR), ciprofloxacin (CIP) and lomefloxacin (LOM) as the target FQs, and established a method based on cell lysis, FQ extraction with triethylamine/methanol/water solution, solid phase extraction (SPE) and HPLC-fluorescence detection (FLD) determination. After investigation, phosphoric acid-triethylamine was chosen as the buffer salt, and methanol was chosen as the organic mobile phase. A gradient fluorescence scanning strategy also proved necessary for optimal detection. Furthermore, through designed orthogonal experiments, the effects of the extraction materials, pH, and the eluents on the efficiency of SPE extraction were evaluated, and the optimal extraction conditions were determined. As a result, FQs in liquid samples could be analyzed using an HLB extraction cartridge, with recovery rates of the four FQs in the range of 82%-103%. For solid samples, the recovery rates of the four FQs reached 71%-101%. Finally, the adsorptivity of the sludge from the different tanks (anaerobic, anoxic and oxic) was investigated, showing a gradual decrease in adsorption capacity, although all adsorbed over 90% of the FQs. This also confirmed that the 50% removal of FQs observed in the domestic wastewater treatment plant was realized by sludge adsorption. PMID:27548982

  9. A novel time-course cDNA microarray analysis method identifies genes associated with the development of cisplatin resistance.

    Science.gov (United States)

    Whiteside, Martin A; Chen, Dung-Tsa; Desmond, Renee A; Abdulkadir, Sarki A; Johanning, Gary L

    2004-01-22

    In recent years, most cDNA microarray studies of chemotherapeutic drug resistance have not considered the temporal pattern of gene expression. The objective of this study was to examine systematically changes in gene expression of NCI-H226 and NCI-H2170 lung cancer cells treated weekly with IC10 doses of cisplatin. NCI-H226 lung cancer cells were treated weekly with an IC10 dose of cisplatin. Candidate genes with a fold change of 2.0 or more were identified from this study. A second experiment was conducted by exposing NCI-H2170 cells to cisplatin doses that were increased in week 4 and decreased in week 5. Overall, 44 genes were differentially expressed in both the NCI-H226 and NCI-H2170 cell lines. In the NCI-H2170 cell line, 24 genes had a twofold gene expression change from weeks 3 to 4. Real-time PCR found a significant correlation of the gene expression changes for seven genes of interest. This small time-ordered series identified novel genes associated with cisplatin resistance. This kind of analysis should be viewed as a first step towards building gene-regulatory networks. PMID:14737109
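The twofold-change screen described above reduces to a simple filter over per-week expression ratios; a minimal sketch with invented gene names and intensities (the real study used normalized microarray data and many more genes):

```python
def twofold_genes(expression, week_a, week_b, threshold=2.0):
    """Return genes whose week_b/week_a expression ratio changes at
    least `threshold`-fold in either direction."""
    hits = []
    for gene, weeks in expression.items():
        ratio = weeks[week_b] / weeks[week_a]
        if ratio >= threshold or ratio <= 1.0 / threshold:
            hits.append(gene)
    return sorted(hits)

# Hypothetical normalized intensities at weeks 3 and 4.
data = {
    "GENE_A": {3: 100.0, 4: 250.0},  # 2.5-fold up
    "GENE_B": {3: 80.0, 4: 90.0},    # ~1.1-fold, filtered out
    "GENE_C": {3: 120.0, 4: 50.0},   # 2.4-fold down
}
print(twofold_genes(data, week_a=3, week_b=4))  # ['GENE_A', 'GENE_C']
```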

  10. INTER-COUNTRY EFFICIENCY EVALUATION IN INNOVATION ACTIVITY ON THE BASIS OF METHOD FOR DATA ENVELOPMENT ANALYSIS AMONG COUNTRIES WITH DEVELOPED AND DEVELOPING ECONOMY, INCLUDING THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    I. V. Zhukovski

    2016-01-01

    Full Text Available The paper considers the problem of evaluating the efficiency of innovation activity in 63 countries with developed and developing economies using the data envelopment analysis method. The following results of innovation activity were used for calculation of an efficiency factor: export of high-technology products as a percentage of industrial product export, export of ICT services as a percentage of services export, and payments obtained from realization of intellectual property rights (in US dollars). A data envelopment analysis model with variable returns to scale, directed at maximization of the obtained results (an output-oriented VRS model), was used for the analysis. The evaluation has shown that countries such as the USA, Israel, Sweden and some others achieve maximum efficiency in transforming resources into innovation output. The analysis also revealed that the Republic of Belarus has potential for improving its innovation-output indices.
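The output-oriented VRS (BCC) model mentioned above can be written, for a decision-making unit $o$ with $m$ inputs $x_{io}$ and $s$ outputs $y_{ro}$ among $n$ units, roughly as follows; this is the standard textbook formulation, not transcribed from the paper:

```latex
\begin{align*}
\max_{\phi,\,\lambda}\ & \phi \\
\text{s.t. }\ & \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io}, && i = 1,\dots,m \\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge \phi\, y_{ro}, && r = 1,\dots,s \\
& \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0 ,
\end{align*}
```

where the convexity constraint $\sum_j \lambda_j = 1$ encodes variable returns to scale and $1/\phi$ gives the efficiency score (a unit is efficient when $\phi = 1$).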

  11. Assessment of neutronic parameter's uncertainties obtained within the reactor dosimetry framework: Development and application of the stochastic methods of analysis

    International Nuclear Information System (INIS)

    One of the main objectives of reactor dosimetry is the determination of the physical parameters characterizing the neutronic field in which the studied sample is irradiated. Knowledge of the associated uncertainties represents a significant stake for the nuclear industry, as shown by the high uncertainty value of 15% (k=1) commonly allowed for the calculated neutron flux (E > 1 MeV) on the vessel and internal structures. The study presented in this paper aims at determining and then reducing the uncertainties associated with the reactor dosimetry interpretation process. After a brief presentation of the interpretation process, identification and quantification of input data uncertainties are performed, in particular with regard to covariances. Uncertainty propagation is then carried out and analyzed by deterministic and stochastic methods on a representative case. Finally, a Monte Carlo sensitivity study based on Sobol indices is carried out on a case to derive the most penalizing input uncertainties. The paper concludes by raising improvement axes to be studied for the knowledge of input data. It highlights, for example, the need for realistic variance-covariance matrices associated with input data (cross section libraries, neutron computation code outputs, ...). Lastly, the methodology principle presented in this paper is general enough to be easily transposable to other measurement data interpretation processes. (authors)
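First-order Sobol indices of the kind used in such sensitivity studies can be estimated with a pick-freeze Monte Carlo scheme. The toy linear model, sample size, and uniform inputs below are illustrative assumptions; real dosimetry studies propagate far richer input distributions.

```python
import random

random.seed(42)

def model(x1, x2):
    # Toy response: x1 is deliberately more influential than x2.
    return 3.0 * x1 + 0.5 * x2

def sobol_first_order(n=20000):
    """First-order indices via pick-freeze: S_i = Cov(Y, Y_i) / Var(Y),
    where Y_i keeps input i from the first sample matrix and redraws
    the other inputs from the second."""
    a = [(random.random(), random.random()) for _ in range(n)]
    b = [(random.random(), random.random()) for _ in range(n)]
    y_a = [model(x1, x2) for x1, x2 in a]
    mean = sum(y_a) / n
    var = sum((y - mean) ** 2 for y in y_a) / n
    indices = []
    for i in range(2):
        y_mix = [model(a[k][0] if i == 0 else b[k][0],
                       a[k][1] if i == 1 else b[k][1]) for k in range(n)]
        cov = sum((y_a[k] - mean) * (y_mix[k] - mean) for k in range(n)) / n
        indices.append(cov / var)
    return indices

s1, s2 = sobol_first_order()
# Analytically S1 = 9/9.25 ~ 0.97 and S2 = 0.25/9.25 ~ 0.03 here,
# so x1 dominates, as the estimates should reflect.
```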

  12. Analysis of the mortality development of the population in the surroundings of Bohunice NPP using Fuzzy logic methods

    International Nuclear Information System (INIS)

    We study the vicinity of the Bohunice NPP, a circular area with a radius of 30 km, representing approximately 2 800 km2. This definition of the studied vicinity is required by the safety report of the Bohunice NPP. For the calculations we used the complete databases of the Register of deaths, the Register of municipalities and the Register of the age structure of the inhabitants of the Slovak Republic from 1993 to 1999, compiled by the Statistical Office of the Slovak Republic. We work with databases which do not contain personal identifications. We follow the evolution of mortality using the mortality indicators calculated by the WHO. According to literature sources and our own experience, a sum of at least three years is necessary for the calculation of stable demographic and epidemiological parameters; therefore we work with the method of short time series. The basic observed unit, represented by one value of the indicator, is one municipality. All our analyses are calculated from triennial sums of all indicators, i.e. we work with man-years. The present report is an adjusted extract from the Complex report on the situation of the environment and the health of the inhabitants in the vicinity of the Bohunice NPP in 1999, which was prepared by our society in March 2001. (authors)

  13. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)]

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)
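The density-based fat determination mentioned above is conventionally done with a two-compartment formula such as Siri's. The equation is standard; the sample density value below is invented for illustration.

```python
def siri_fat_percent(body_density_g_per_cm3):
    """Siri two-compartment model: %fat = 495/density - 450,
    with body density in g/cm^3 (typically from underwater weighing)."""
    return 495.0 / body_density_g_per_cm3 - 450.0

print(round(siri_fat_percent(1.060), 1))  # 17.0
```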

  14. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  15. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis.

    Science.gov (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet

    2016-02-01

    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over- or under-controlled eating behaviours seen in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time, and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary, as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies. PMID:26695383

  16. Optimisation of isolation methods for the azaspiracid group of marine biotoxins and the development of accurate and precise methods of analysis

    OpenAIRE

    Kilcoyle, J.

    2015-01-01

    The two main groups of biotoxins which affect the Irish shellfish industry are azaspiracids (AZAs) and the okadaic acid (OA) group (OA, DTX2, DTX1 and their esters) toxins. Since AZAs were first identified in 1998, well over 30 analogues have been reported. Structural and toxicological data have been described for AZA1–5 (isolated from shellfish). LC-MS/MS is the EU reference method for detection of the AZAs (AZA1, -2 and -3) and the OA group toxins in raw shellfish with the regulatory limit ...

  17. Optimisation of Isolation Methods for the AZA Group of Marine Biotoxins and the Development of Accurate and Precise Methods of Analysis

    OpenAIRE

    Kilcoyne, Jane

    2015-01-01

    The two main groups of biotoxins which affect the Irish shellfish industry are azaspiracids (AZAs) and the okadaic acid (OA) group (OA, DTX2, DTX1 and their esters) toxins. Since AZAs were first identified in 1998, well over 30 analogues have been reported. Structural and toxicological data have been described for AZA1–5 (isolated from shellfish). LC-MS/MS is the EU reference method for detection of the AZAs (AZA1, -2 and -3) and the OA group toxins in raw shellfish with the regulatory limit ...

  18. The development of sequential separation methods for the analysis of actinides in sediments and biological materials using anion-exchange resins and extraction chromatography

    International Nuclear Information System (INIS)

    New, quantitative methods for the determination of actinides have been developed for application to marine environmental samples (e.g., sediment and fish). The procedures include aggressive dissolution, separation by anion-exchange resin, separation and purification by extraction chromatography (e.g., TRU, TEVA and UTEVA resins) with measurement of the radionuclides by semiconductor alpha-spectrometry (SAS). Anion-exchange has proved to be a strong tool to treat large volume samples, and extraction chromatography shows an excellent selectivity and reduction of the amounts of acids. The results of the analysis of uranium, thorium, plutonium and americium isotopes by this method in marine samples (IAEA-384, -385 and -414) provided excellent agreement with the recommended values with good chemical recoveries. (author)

  19. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  20. Development of RP-HPLC method for Qualitative Analysis of Active Ingredient (Gallic acid) from Stem Bark of Dendrophthoe falcate Linn.

    Directory of Open Access Journals (Sweden)

    Hafsa Deshmukh

    2011-04-01

    Full Text Available A simple, precise and sensitive RP-HPLC method with UV detection at 271 nm was developed and validated for the qualitative determination of the active ingredient Gallic acid from stem bark of Dendrophthoe falcate Linn. Separation was performed on a ThermoMOS 2 HYPERSIL C18 column (250 mm × 4.6 mm, 5 µm ODS 3) using a mobile phase comprising 0.1% orthophosphoric acid : acetonitrile (400 cm3 : 600 cm3) with a flow rate of 1 ml/minute and a short run time of 13 minutes. The method was validated according to regulatory guidelines with respect to linearity, system suitability, precision, solution stability, accuracy, robustness, assay and recovery. Detector response was linear in the range of 0.04 to 0.16 mg/cm3. System suitability, precision, solution stability, accuracy, robustness, assay and recovery were assessed by calculating the % COV for all these parameters, which was less than two as expected. The recovery of the method for Gallic acid was found to be 98.94%, which shows that the method is accurate. The described method has the advantage of being rapid and easy, hence it can be applied for routine quality control analysis of Gallic acid from Dendrophthoe falcate Linn.
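The "% COV less than two" acceptance check used in the validation above is a routine coefficient-of-variation calculation; a sketch with hypothetical replicate peak areas (the numbers are invented, not from the paper):

```python
import statistics

def percent_cv(values):
    """Coefficient of variation (%): 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate peak areas from a precision run.
areas = [10250, 10310, 10280, 10295, 10330, 10270]
cv = percent_cv(areas)
print(cv < 2.0)  # True: within the <2% acceptance criterion
```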

  1. Development of plant dynamic analysis code for integrated self-pressurized water reactor (ISPDYN), and comparative study of pressure control methods

    International Nuclear Information System (INIS)

    This report describes the development of a plant dynamic analysis code (ISPDYN) for an integrated self-pressurized water reactor, and a comparative study of pressure control methods performed with this code. ISPDYN was developed for the integrated self-pressurized water reactor, one of the trial designs by JAERI. In the transient responses, the calculated results of ISPDYN are in good agreement with the DRUCK calculations. In addition, this report presents some sensitivity studies for selected cases. The computing time of this code is very short, about one-fifth of real time. The comparative study of the self-pressurized system with a forced-pressurized system by this code, for rapid load decrease and increase cases, has provided useful information. (author)

  2. Development of the Method of Bacterial Leaching of Metals out of Low-Grade Ores, Rocks, and Industrial Wastes Using Neutron Activation Analysis

    CERN Document Server

    Tsertsvadze, L A; Petriashvili, Sh G; Chutkerashvili, D G; Kirkesali, E I; Frontasyeva, M V; Pavlov, S S; Gundorina, S F

    2001-01-01

    The results of preliminary investigations aimed at the development of an economical and easy-to-apply technique of bacterial leaching of rare and valuable metals out of low-grade ores, complex composition ores, rocks, and industrial wastes in Georgia are discussed. The main groups of the microbiological community of the peat suspension used in the experiments on bacterial leaching are investigated, and the activity of particular microorganisms in the leaching of samples with different mineral compositions is assessed. The element composition of the primary and processed samples was investigated by the epithermal neutron activation analysis method, and the enrichment/subtraction level is estimated for various elements. The efficiency of the developed technique to purify wastes, extract some scarce metals, and enrich ores or rocks in some elements, e.g. Au, U, Th, Cs, Sr, Rb, Sc, Zr, Hf, Ta, Gd, Er, Lu, Ce, etc., is demonstrated.

  3. Development of the HS-SPME-GC-MS/MS method for analysis of chemical warfare agent and their degradation products in environmental samples.

    Science.gov (United States)

    Nawała, Jakub; Czupryński, Krzysztof; Popiel, Stanisław; Dziedzic, Daniel; Bełdowski, Jacek

    2016-08-24

    After World War II, approximately 50,000 tons of chemical weapons were dumped in the Baltic Sea by the Soviet Union under the provisions of the Potsdam Conference on Disarmament. These dumped chemical warfare agents still pose a major threat to the marine environment and to human life; therefore, continued monitoring of these munitions is essential. In this work, we present the application of new solid-phase microextraction fibers in the analysis of chemical warfare agents and their degradation products. It can be concluded that the best fiber for the analysis of sulfur mustard and its degradation products is butyl acrylate (BA), whereas for the analysis of organoarsenic compounds and chloroacetophenone, the best fiber is a co-polymer of methyl acrylate and methyl methacrylate (MA/MMA). In order to achieve the lowest LOD and LOQ, the samples should be divided into two subsamples: one analyzed using a BA fiber, and the second using a MA/MMA fiber. When fast analysis is required, the microextraction should be performed with a butyl acrylate fiber, because the extraction efficiency of organoarsenic compounds for this fiber is acceptable. Next, we elaborated the HS-SPME-GC-MS/MS method for the analysis of CWA degradation products in environmental samples using the laboratory-obtained fibers. The analytical method for the analysis of organosulfur and organoarsenic compounds was optimized and validated. The LODs for all target chemicals were between 0.03 and 0.65 ppb. The analytical method developed by us was then used for the analysis of sediment and pore water samples from the Baltic Sea. During these studies, 80 samples were analyzed. It was found that 25 sediment and 5 pore water samples contained CWA degradation products such as 1,4-dithiane, 1,4-oxathiane or triphenylarsine, the latter being a component of arsine oil. The obtained data is evidence that the CWAs present in the Baltic Sea have leaked into the general marine environment. PMID

  4. Development of a method to make use of sensitivity studies and its application to analysis of uncertainties in environmental loading on offshore structures

    Energy Technology Data Exchange (ETDEWEB)

    Carr, Peter (Atkins Oil and Gas Engineering Ltd., London (UK)); Birkinshaw, Malcolm (UKAEA Atomic Energy Research Establishment, Harwell (UK). Energy Technology Div.)

    1989-01-01

    By making use of existing sensitivity studies, our aim was to determine the relative importance of uncertainties in environmental loading parameters for fixed steel jackets. The technique of uncertainty tree analysis was used. This numerical method determines the probability distribution of a variable, given a series of equations linking it to a number of independent variables which have known probability distributions. The dependent variable was taken to be either base shear or overturning moment. The independent variables were wave height, period, hydrodynamic coefficients, current, current profile and other factors. The linking equations were developed empirically from the results of the existing sensitivity studies. Estimates for the input probability distributions were made using data in the general literature. The output of the analysis includes a measure of the relative importance of the various uncertainties, called the 'uncertainty index'. Uncertainty indices were developed for three of the platforms in the sensitivity studies: one southern North Sea and two northern North Sea platforms. The analysis for the southern North Sea platform was later extended to include a number of methodological uncertainties which had not been considered in the sensitivity studies, but the treatment is necessarily more approximate. The results indicate the overall importance of the methodological uncertainties. (author).
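
    The core of the uncertainty-tree technique (propagating assumed input distributions through empirical linking equations and ranking inputs by an "uncertainty index") can be sketched with a deliberately simplified stand-in. The linking equation, base shear proportional to drag coefficient times wave height squared, and the distributions are illustrative assumptions, not the paper's fitted relations; the crude variance-ratio index below is likewise only a proxy for the paper's measure:

```python
import random
import statistics

random.seed(0)

# Illustrative stand-in for an empirical linking equation.
def base_shear(cd, h):
    return cd * h ** 2

N = 20000
cd_samples = [random.lognormvariate(0.0, 0.15) for _ in range(N)]  # drag coefficient
h_samples = [random.lognormvariate(2.8, 0.10) for _ in range(N)]   # wave height (m)

total = [base_shear(c, h) for c, h in zip(cd_samples, h_samples)]

# Crude 'uncertainty index': output variance when only one input varies,
# as a fraction of the variance when all inputs vary together.
cd_med = statistics.median(cd_samples)
h_med = statistics.median(h_samples)
var_all = statistics.variance(total)
idx_cd = statistics.variance([base_shear(c, h_med) for c in cd_samples]) / var_all
idx_h = statistics.variance([base_shear(cd_med, h) for h in h_samples]) / var_all
print(f"uncertainty index: Cd {idx_cd:.2f}, wave height {idx_h:.2f}")
```

    Because wave height enters squared, its uncertainty dominates even though its relative spread is smaller, which is exactly the kind of ranking the uncertainty index is meant to expose.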

  5. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-01

    A platform analytical quality by design (AQbD) approach for method development is presented in this paper. This approach is not limited to method development following the same logical AQbD process; it is also exploited across a range of applications in method development with commonality in equipment and procedures. As demonstrated by the development process of three methods, the systematic strategy offers a thorough understanding of a method's scientific strength. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method of the same protein. In addition, the platform AQbD method development strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes, while in the commercial phase, high-quality data ensure timely data release, reduced regulatory risk, and lowered lab operational cost. Moreover, the large, reliable database and knowledge gained during AQbD method development provide strong justifications during regulatory filing for the selection of important parameters or parameter change needs for method validation, and help to justify the removal of unnecessary tests used for product specifications. PMID:27017571

  6. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum.

    Science.gov (United States)

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie

    2016-03-11

    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, such as the aerospace, gas and oil, nuclear and electronic industries, but their separation is extremely difficult due to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media with alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. This method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the nature of the BGE counter-ion modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0°C in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized

  7. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We discuss the evolution of the object-oriented software development process based on software patterns. For developing mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. On the basis of the analysis of requirement specifications for the business application domain, we present an analysis method and a basic role model of a software framework. We also elicit the analysis pattern of the framework architecture, and design basic role classes and their structure.

  8. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor (Annual safety research report, JFY 2010)

    International Nuclear Information System (INIS)

    The purposes of this study are to develop the safety evaluation methods and analysis codes needed in the design and construction stage of the fast breeder reactor (FBR). In JFY 2010, the following results were obtained. As for the development of the safety evaluation methods needed in the safety examination performed for the reactor establishment permission, development of analysis codes such as the core seismic analysis code, core safety analysis code and core damage analysis code was carried out according to the plan. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique for event trees using the Continuous Markov chain Monte Carlo (CMMC) method was studied, and the seismic PSA to evaluate residual risk was studied. (author)
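
    The report's CMMC technique is a continuous-time refinement of event-tree quantification; the underlying idea of estimating sequence frequencies by sampling branch outcomes can be sketched with a toy two-branch tree. All probabilities below are invented, not taken from the study:

```python
import random

random.seed(1)

# Toy event tree: initiating event -> reactor trip -> decay heat removal.
P_INIT = 1e-2       # initiating event probability per demand (illustrative)
P_TRIP_FAIL = 1e-3  # failure to trip
P_DHR_FAIL = 5e-3   # failure of decay heat removal

def one_history():
    """Sample one plant history; return True if it ends in core damage."""
    if random.random() >= P_INIT:
        return False                      # no initiating event
    if random.random() < P_TRIP_FAIL:
        return True                       # failure to trip -> damage (toy model)
    return random.random() < P_DHR_FAIL   # trip ok, DHR may still fail

N = 2_000_000
damage = sum(one_history() for _ in range(N))
print(f"estimated damage frequency: {damage / N:.2e}")
# Analytic value for comparison with the sampled estimate:
print(f"analytic: {P_INIT * (P_TRIP_FAIL + (1 - P_TRIP_FAIL) * P_DHR_FAIL):.2e}")
```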

  9. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor (Annual safety research report, JFY 2011)

    International Nuclear Information System (INIS)

    The purposes of this study are to develop the safety evaluation methods and analysis codes needed in the design and construction stage of the fast breeder reactor (FBR). In JFY 2011, the following results were obtained. As for the development of the safety evaluation methods needed in the safety examination performed for the reactor establishment permission, development of analysis codes such as the core seismic analysis code, core safety analysis code and core damage analysis code was carried out according to the plan. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique for event trees using the Continuous Markov chain Monte Carlo (CMMC) method was studied, and the seismic PSA to evaluate residual risk was studied. (author)

  10. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    International Nuclear Information System (INIS)

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R2 values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  11. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: fukusaki@bio.eng.osaka-u.ac.jp [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-08-26

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R2 values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  12. Development and validation of a LC-MS/MS method for quantitative analysis of uraemic toxins p-cresol sulphate and indoxyl sulphate in saliva.

    Science.gov (United States)

    Giebułtowicz, Joanna; Korytowska, Natalia; Sankowski, Bartłomiej; Wroczyński, Piotr

    2016-04-01

    p-Cresol sulphate (pCS) and indoxyl sulphate (IS) are uraemic toxins whose concentrations in serum correlate with the stage of renal failure. The aim of this study was to develop and validate a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the analysis of pCS and IS in saliva. This is the first time, to our knowledge, that such a method has been developed using saliva. Unstimulated, fasting saliva was collected from healthy volunteers in the morning and pooled for the validation assay. The method was validated for linearity, precision, accuracy, stability (freeze/thaw stability, stability in the autosampler, short- and long-term stability, stock solution stability), dilution integrity and matrix effect. The analysed validation criteria were fulfilled. No influence of salivary flow (pCS: p=0.678; IS: p=0.238) or of the type of swab in the Salivette device was detected. Finally, using the novel validated method, the saliva samples of healthy people (n=70) of various ages were analysed. We observed a tendency for the concentration of toxins in saliva to increase in the elderly. This could be a result of age-related diseases, e.g., diabetes, and of kidney function decline. We conclude that the novel LC-MS/MS method can be used for the determination of pCS and IS in human saliva. The results encourage the validation of saliva as a clinical sample for monitoring toxin levels in the organism. PMID:26838447

  13. Development of a non-chromatographic method for the speciation analysis of inorganic antimony in mushroom samples by hydride generation atomic fluorescence spectrometry

    Science.gov (United States)

    Sousa Ferreira, Hadla; Costa Ferreira, Sergio Luis; Cervera, M. Luisa; de la Guardia, Miguel

    2009-06-01

    A simple and sensitive method has been developed for the direct determination of toxic species of antimony in mushroom samples by hydride generation atomic fluorescence spectrometry (HG AFS). The determination of Sb(III) and Sb(V) was based on the efficiency of hydride generation employing NaBH4, with and without a previous KI reduction, using proportional equations corresponding to the two different measurement conditions. The extraction efficiency of total antimony and the stability of Sb(III) and Sb(V) in different extraction media (nitric, sulfuric, hydrochloric and acetic acid, methanol and ethanol) were evaluated. Results demonstrated that, based on the extraction yield and the stability of the extracts, 0.5 mol L⁻¹ H2SO4 proved to be the best extracting solution for the speciation analysis of antimony in mushroom samples. The limits of detection of the developed methodology were 0.6 and 1.1 ng g⁻¹ for Sb(III) and Sb(V), respectively. The relative standard deviation was 3.8% (14.7 ng g⁻¹) for Sb(V) and 5.1% (4.6 ng g⁻¹) for Sb(III). The recovery values obtained for Sb(III) and Sb(V) varied from 94 to 106% and from 98 to 105%, respectively. The method has been applied to determine Sb(III), Sb(V) and total Sb in five different mushroom samples; the Sb(III) content varied from 4.6 to 11.4 ng g⁻¹ and Sb(V) from 14.7 to 21.2 ng g⁻¹. The accuracy of the method was confirmed by the analysis of a certified reference material of tomato leaves.
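
    The proportional-equations step, with one measurement taken without KI pre-reduction (where Sb(V) responds only weakly) and one after reduction (where both species respond as Sb(III)), amounts to solving a 2×2 linear system. The sensitivity factors below are invented to illustrate the algebra; only the two concentrations echo values from the paper's reported range:

```python
# Hypothetical hydride-generation sensitivities (signal per ng g^-1);
# the paper's actual calibration factors are not given in the abstract.
k3_noKI, k5_noKI = 1.00, 0.05  # without KI: Sb(V) responds only weakly
k3_KI, k5_KI = 1.00, 1.00      # after KI reduction: both respond as Sb(III)

def speciate(sig_noKI, sig_KI):
    """Solve the two proportional equations for c(SbIII), c(SbV):
       sig_noKI = k3_noKI*c3 + k5_noKI*c5
       sig_KI   = k3_KI*c3   + k5_KI*c5
    """
    det = k3_noKI * k5_KI - k5_noKI * k3_KI
    c3 = (sig_noKI * k5_KI - k5_noKI * sig_KI) / det
    c5 = (k3_noKI * sig_KI - sig_noKI * k3_KI) / det
    return c3, c5

# Signals synthesised from c3 = 4.6 and c5 = 14.7 ng g^-1:
c3, c5 = speciate(1.00 * 4.6 + 0.05 * 14.7, 4.6 + 14.7)
print(f"Sb(III) = {c3:.1f} ng/g, Sb(V) = {c5:.1f} ng/g")
```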

  14. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.

    2013-07-15

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. In the process, models were provided and tested for the consideration of the damage mechanism 'stable crack growth', to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks growing until the formation of a stable leak; calculating the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. Within this work, a restructuring of PROST was performed, including possibilities to combine damage mechanisms during a calculation, and several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating previously treated cases. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that with the provided flexible probabilistic analysis method quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as

  15. Possibilities and limits of surface analysis methods

    International Nuclear Information System (INIS)

    The possibilities and limits of surface analysis methods are presented and illustrated by means of a selection of examples. An attempt is made to show how a systematic classification of all the methods can be built up. Some suitable methods are described in detail. The analysis examples are chosen with a view to contributing to the questions currently under study at the Institute for Reactor Development. (orig.)

  16. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    Science.gov (United States)

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal-polarity microemulsion electrokinetic chromatography (MEEKC) as the operative mode, which allowed good selectivity to be achieved in a low analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in an MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. The DS was identified by risk-of-failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, together with system suitability criteria, allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets. PMID

  17. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede

    2000-01-01

    Full Text Available This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of the sail shape using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously by virtue of cloth stretch and flexing: the sail shape determines the pressure distribution and, at the same time, the pressure distribution stretches and flexes the sail material, determining its shape. This characteristically non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. First, pressure loads for a known sail shape are computed (aerodynamic analysis). Second, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained through an iterative process involving both the aerodynamic and the structural analysis. When the solution converges, it is possible to make design modifications.
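
    The iterative aerodynamic/structural loop described above is a fixed-point iteration. A one-dimensional stand-in, with a single "camber" parameter and invented aerodynamic and structural maps plus under-relaxation for stability, shows the convergence pattern (the real solver iterates between a fluid-dynamic model and a finite element model instead of these toy functions):

```python
# Toy 1-D stand-in for the aeroelastic loop: "shape" is a single camber
# parameter, the aerodynamic "solver" maps shape to load, and the
# structural "solver" maps load back to shape.
def aero_load(camber):
    return 50.0 * camber            # pressure load grows with camber (illustrative)

def structural_shape(load):
    return 0.10 + 0.002 * load      # unloaded camber 0.10 plus elastic stretch

camber, relax = 0.10, 0.5
for it in range(100):
    new = structural_shape(aero_load(camber))
    if abs(new - camber) < 1e-10:
        break                       # aerodynamic and structural states agree
    camber += relax * (new - camber)  # under-relaxation for stability

print(f"converged camber = {camber:.6f} after {it} iterations")
```

    The exact fixed point of this toy system is 0.10/0.90 ≈ 0.111111, which the loop reaches in a few dozen iterations.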

  18. Method development and analysis of free HS and HS in proteoglycans from pre- and postmenopausal women: Evidence for biosynthetic pathway changes in sulfotransferase and sulfatase enzymes

    Science.gov (United States)

    Wei, Wei; Miller, Rebecca L.; Leary, Julie A.

    2013-01-01

    Heparan sulfate (HS) is one of the most complex and informative biopolymers found on the cell surface or in the extracellular matrix, as either free HS fragments or constituents of HS proteoglycans (HSPGs). Analysis of free HS and HSPG sugar chains in human serum at the disaccharide level has great potential for early disease diagnosis and prognosis; however, the low concentration of HS in human serum, together with the complexity of the serum matrix, limits the information available on HS. In this study, we present and validate a new sensitive method for in-depth compositional analysis of free HS and HSPG sugar chains. The protocol involves several steps, including weak anion exchange chromatography, ultrafiltration and solid phase extraction for enhanced detection prior to LC-MS/MS analysis. Using this protocol, a total of 51 serum samples from 26 premenopausal and 25 postmenopausal women were analyzed. Statistically significant differences in heparin/HS disaccharide profiles were observed. The proportions of N-acetylation and N-sulfation in both free HS and HSPG sugar chains were significantly different between pre- and postmenopausal women, indicating changes in the N-deacetylase/N-sulfotransferases (NDSTs), the enzymes involved in the initial step of the biosynthetic pathway. Differences in the proportion of 6-O-sulfation suggest that 6-O-sulfotransferase and/or 6-O-sulfatase enzymes may also be implicated. PMID:23659730

  19. Development and validation of a generic nontarget method based on liquid chromatography - high resolution mass spectrometry analysis for the evaluation of different wastewater treatment options.

    Science.gov (United States)

    Nürenberg, Gudrun; Schulz, Manoj; Kunkel, Uwe; Ternes, Thomas A

    2015-12-24

    A comprehensive workflow for using nontarget approaches as process evaluation tools was implemented, including data acquisition based on an LC-HRMS (QTOF) system using direct injection, and data post-processing for peak recognition in full-scan data. Both parts of the approach were not only developed and validated in a conventional way, using the suspect analysis of a set of spiked known micropollutants, but the nontarget analysis of a wastewater treatment plant (WWTP) effluent itself was also utilized in order to cover a more environmentally relevant range of analytes. Special focus was placed on minimizing false positive results (FPs) during peak recognition: the optimized data post-processing procedure reduced the percentage of FPs from 42% to 10-15%. Furthermore, the choice of a suitable chromatography for biologically treated wastewater was also discussed during method development. The workflow also paid attention to differences in the performance levels of the LC-HRMS system by implementing an adaption scheme for intensity variations when comparing different measurement dates or different instruments. The application of this workflow to wastewater samples from a municipal WWTP revealed that more than 91% of compounds were eliminated by the biological treatment step and that the resulting effluent contained 55% newly formed potential transformation products. PMID:26654253

  20. Development of an LC-MS/MS method for analysis of interconvertible Z/E isomers of the novel anticancer agent, Bp4eT.

    Science.gov (United States)

    Stariat, Ján; Kovaríková, Petra; Klimes, Jirí; Kalinowski, Danuta S; Richardson, Des R

    2010-05-01

    This study focused on the development of a liquid chromatography/tandem mass spectrometry (LC-MS/MS) method for the quantification of a novel potential anticancer agent, 2-benzoylpyridine 4-ethyl-3-thiosemicarbazone (Bp4eT), in aqueous media. Solid Bp4eT was found to consist predominantly of the Z isomer, while in aqueous media both isomers coexist. Sufficient separation of both isomers was achieved on a Synergi 4u Polar RP column with a mobile phase composed of 2 mM ammonium formate, acetonitrile, and methanol (30:63:7; v/v/v). Photodiode array analysis of the two isomers demonstrated different absorption spectra, which hindered UV-based quantification. However, an equal and reproducible response was found for both isomers using an MS detector, which enables the determination of the total content of Bp4eT (i.e., both E and Z isomeric forms) by summation of the peak areas of both isomers. 2-Hydroxy-1-naphthylaldehyde 4-methyl-3-thiosemicarbazone (N4mT) was selected as the internal standard. Quantification was performed in selected reaction monitoring mode using the main fragments of [M+H]+ (240 m/z for Bp4eT and 229 m/z for N4mT). The method was validated over 20-600 ng/ml. This procedure was applied to a preformulation study to determine the proper vehicle for parenteral administration. It was found that Bp4eT was poorly soluble in aqueous media; however, the solubility can be effectively improved using pharmaceutical cosolvents. In fact, a 1:1 mixture of PEG 300/0.14 M saline markedly increased solubility and may be a useful drug formulation for intravenous administration. This investigation further accelerates the development of novel anticancer thiosemicarbazones. The described methods will be useful for analogs currently under development that suffer from the same analytical issue. PMID:20127082
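
    Because both isomers give an equal MS response, the total Bp4eT content follows from the summed isomer peak areas ratioed to the internal standard and read off a calibration line. The calibration parameters and peak areas below are invented for illustration only:

```python
# Hypothetical calibration: (summed area / IS area) vs. concentration in ng/ml.
cal_slope, cal_intercept = 0.0042, 0.013

def total_bp4et(area_Z, area_E, area_IS):
    """Total Bp4eT (Z + E) in ng/ml from peak areas and the IS (N4mT) area."""
    ratio = (area_Z + area_E) / area_IS     # equal response lets areas be summed
    return (ratio - cal_intercept) / cal_slope

conc = total_bp4et(area_Z=98500, area_E=41300, area_IS=132000)
print(f"total Bp4eT = {conc:.0f} ng/ml")    # should fall within the 20-600 ng/ml range
```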

  1. Study on seismic equipment fragility analysis method

    International Nuclear Information System (INIS)

Vulnerable points of nuclear power plants can be found through seismic PSA, an effective method for evaluating the effect of earthquakes on nuclear power plants, and fragility analysis is one important step of seismic PSA. In this paper, the concept of seismic equipment fragility is introduced, the mathematical model of seismic fragility is given, the determination of equipment failure modes is discussed, and fragility analysis variables and methods (i.e., analysis-based and dynamic-testing-based methods) are studied in detail; finally, the median fragility, the distributions of randomness and uncertainty, and the HCLPF capacity can be calculated from these formulations. On the other hand, when developing seismic fragility, three types of information can be relied on: data from real earthquake experience, test data, and analysis data; these data need to be collected and compiled for the specific nuclear plant. (authors)
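As a hedged illustration of the quantities this record mentions (median fragility, randomness and uncertainty, HCLPF capacity), the standard lognormal fragility model commonly used in seismic PSA can be coded directly. This is the textbook formulation, not code or data from the paper; all numeric inputs are made up.

```python
import math
from statistics import NormalDist

# Standard lognormal fragility model (textbook seismic-PSA formulation):
#   P_f(a; Q) = Phi( (ln(a/Am) + beta_U * Phi^-1(Q)) / beta_R )
#   HCLPF     = Am * exp(-1.645 * (beta_R + beta_U))
# where Am is the median capacity, beta_R the randomness, beta_U the
# uncertainty, and Q the confidence level.

def hclpf(a_m, beta_r, beta_u):
    """High Confidence of Low Probability of Failure capacity."""
    return a_m * math.exp(-1.645 * (beta_r + beta_u))

def fragility(a, a_m, beta_r, beta_u, q=0.95):
    """Failure probability at ground acceleration a, confidence level q."""
    nd = NormalDist()
    return nd.cdf((math.log(a / a_m) + beta_u * nd.inv_cdf(q)) / beta_r)

# Illustrative component: median capacity 1.2 g, beta_R = 0.3, beta_U = 0.25
print(round(hclpf(1.2, 0.3, 0.25), 3))  # 0.486 (g)
```

At the median capacity and median confidence (q = 0.5), the model returns a failure probability of exactly 0.5, which is a convenient sanity check.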

  2. Scientific methods for developing ultrastable structures

    International Nuclear Information System (INIS)

    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 μrad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs

  3. Development and validation of a method for the analysis of hydroxyzine hydrochloride in extracellular solution used in in vitro preclinical safety studies.

    Science.gov (United States)

    Briône, Willy; Brekelmans, Mari; Eijndhoven, Freek van; Schenkel, Eric; Noij, Theo

    2015-11-10

In the process of drug development, preclinical safety studies must be performed that require analysis of the compound at very low concentrations, placing high demands on the performance of the analytical methods. In the current study, a UPLC-MS/MS method was developed and validated to quantify hydroxyzine hydrochloride in an extracellular solution used in a hERG assay at concentrations ranging from 0.01 to 10 μM (4.5 ng/ml to 4.5 μg/ml). Chromatographic separation was achieved isocratically on an Acquity BEH C18 analytical column. The assay was validated at concentrations of 0.11-1.1 ng/ml in end solution for hydroxyzine hydrochloride. Linearity was demonstrated over the concentration ranges of 0.06-0.17 ng/ml and 0.6-1.7 ng/ml in end solution, with a coefficient of correlation r > 0.99. The accuracy of the achieved concentration and the intra-run and inter-run precision of the method were well within the acceptance criteria (mean recovery of 80-120% and relative standard deviation ≤10.0%). The limit of quantification in extracellular solution was 0.09 ng/ml. Hydroxyzine hydrochloride in extracellular solution proved to be stable when stored refrigerated at 4-8 °C for at least 37 days, at room temperature for at least 16 days, and at +35 °C for at least 16 days. The analytical method was successfully applied in the hERG assay. PMID:26163869
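The acceptance criteria quoted in this record (correlation r > 0.99, mean recovery 80-120%, RSD ≤ 10%) can be checked mechanically once calibration data are in hand. The sketch below is illustrative only: the thresholds come from the abstract, but the helper names and the example data are assumptions.

```python
# Sketch: acceptance checks of the kind described in the validation
# (linearity r > 0.99, mean recovery 80-120%, RSD <= 10%).
from statistics import mean, stdev
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between nominal and measured concentrations."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) *
               sum((y - my) ** 2 for y in ys))
    return num / den

def passes_validation(nominal, measured):
    r = pearson_r(nominal, measured)
    recoveries = [100.0 * m / n for n, m in zip(nominal, measured)]
    rsd = 100.0 * stdev(recoveries) / mean(recoveries)
    return r > 0.99 and 80.0 <= mean(recoveries) <= 120.0 and rsd <= 10.0

# Made-up calibration run inside the stated 0.06-0.17 ng/ml range:
nominal  = [0.06, 0.08, 0.10, 0.13, 0.17]
measured = [0.058, 0.081, 0.099, 0.133, 0.168]
print(passes_validation(nominal, measured))  # True for this run
```

A run that misses any one criterion (for example, wildly scattered recoveries) would return False, which is the behavior a validation gate needs.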

  4. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli

    2013-01-01

Full Text Available A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besylate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combinations, which are found in various forms, especially in current pharmaceuticals, as three-component combinations: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved using an RP-CN column with acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7:13:80, v/v/v) as the mobile phase; the detector wavelength was set at 235 nm. The linear ranges were 0.1-18.5 μg/mL, 0.4-25.6 μg/mL, 0.3-15.5 μg/mL, and 0.3-22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate, as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and of plasma from patients taking drugs containing these substances.

  5. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma.

    Science.gov (United States)

    Kepekci Tekkeli, Serife Evrim

    2013-01-01

A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besylate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combinations, which are found in various forms, especially in current pharmaceuticals, as three-component combinations: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved using an RP-CN column with acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7:13:80, v/v/v) as the mobile phase; the detector wavelength was set at 235 nm. The linear ranges were 0.1-18.5 μg/mL, 0.4-25.6 μg/mL, 0.3-15.5 μg/mL, and 0.3-22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate, as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and of plasma from patients taking drugs containing these substances. PMID:23634320

  6. Research on Development Strategy of YO Company Based on SWOT Analysis Method%基于SWOT分析法的YO公司发展战略研究

    Institute of Scientific and Technical Information of China (English)

    李连璋

    2016-01-01

Current business operators attach great importance to strategic management; however, their decision-making model is largely imperative and visionary, leaving employees with little motivation to participate. Low-price, low-cost, multi-variety primitive means of competition can no longer meet the diverse needs of modern society. Based on the SWOT analysis method and the actual situation of the YO construction company, this paper studies the company's development strategy from multiple dimensions and proposes strategy optimization measures for YO covering product pricing, marketing channels, corporate culture, human resource management, and finance; a case study verifies that the development strategy is feasible.

  7. Validation of a Non-Targeted LC-MS Approach for Identifying Ancient Proteins: Method Development on Bone to Improve Artifact Residue Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Barker

    2015-09-01

Full Text Available Identification of protein residues from prehistoric cooking pottery using mass spectrometry is challenging because proteins are removed from original tissues, are degraded from cooking, may be poorly preserved due to diagenesis, and occur in a palimpsest of exogenous soil proteins. In contrast, bone proteins are abundant and well preserved. This research is part of a larger method-development project for innovation and improvement of liquid chromatography-mass spectrometry analysis of protein residues from cooking pottery; here we validate the potential of our extraction and characterization approach via application to ancient bone proteins. Because of its preservation potential for proteins, and given that our approach is destructive, ancient bone identified via skeletal morphology represents an appropriate verification target. Proteins were identified from zooarchaeological turkey (Meleagris gallopavo Linnaeus, Phasianidae), rabbit (Lagomorpha), and squirrel (Sciuridae) remains excavated from ancient pueblo archaeological sites in southwestern Colorado using a non-targeted LC-MS/MS approach. The data have been deposited to the ProteomeXchange Consortium with the dataset identifier PXD002440. Improvement of highly sensitive targeted LC-MS/MS approaches is an avenue for future method development related to the study of protein residues from artifacts such as stone tools and pottery.

  8. I. Developing Methods for the Analysis of Chemistry Students' Inscriptions, II. Exploring the Regioselectivity of 1,3-Dipolar Cycloadditions of Munchnones, III. Stereochemical Investigations of C-H Activation Reactions Involving Germylene and Stannylene/Aryl Iodide Reagents

    Science.gov (United States)

    Kiste, Alan L.

    2009-01-01

    I. Analyzing and comparing student-generated inscriptions in chemistry is crucial to gaining insight into students' understanding about chemistry concepts. Thus, we developed two methods of analyzing student-generated inscriptions: features analysis and thematic analysis. We have also demonstrated how these methods are able to discern differences…

  9. Chromolithic method development, validation and system suitability analysis of ultra-sound assisted extraction of glycyrrhizic acid and glycyrrhetinic acid from Glycyrrhiza glabra.

    Science.gov (United States)

    Gupta, Suphla; Sharma, Rajni; Pandotra, Pankaj; Jaglan, Sundeep; Gupta, Ajai Prakash

    2012-08-01

An ultrasound-assisted extraction and chromolithic LC method was developed for the simultaneous determination of glycyrrhizic acid (GA) and glycyrrhetinic acid (GL) from the root extract of Glycyrrhiza glabra using RPLC-PDA. The developed method was validated according to International Conference on Harmonisation guidelines. The method exhibited good linearity (r² > 0.9989) with high precision and achieved good accuracies of 97.5 to 101.3% in quantitative results. The method is more sensitive and faster (resolved within ten minutes) than earlier methods developed using normal LC columns. PMID:22978213

  10. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS

    International Nuclear Information System (INIS)

This research work focused on replacing the time-consuming soaking of cork stoppers, which is mainly used as a screening method for cork lots in connection with sensory analysis and/or analytical methods, for detecting releasable 2,4,6-trichloroanisole (TCA) in natural cork stoppers. Releasable TCA from whole cork stoppers was analysed using a microwave-assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise the MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that MAE at 40 °C for 120 min with 90 min of cooling time provides suitable conditions to avoid over-extraction of TCA from low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow measurement of almost the same amounts of releasable TCA as the soaking procedure in the relevant range of releasable TCA from one cork, so that the TCA level of cork stoppers can be evaluated. A stable isotope dilution assay (SIDA) with deuterium-labelled TCA (TCA-d5) was applied to optimise quantification of the released TCA, using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows measurement of releasable TCA from the whole cork stopper under improved conditions, with low solvent consumption and higher sample throughput.

  11. Development and validation of a hydrophilic interaction chromatography method coupled with a charged aerosol detector for quantitative analysis of nonchromophoric α-hydroxyamines, organic impurities of metoprolol.

    Science.gov (United States)

    Xu, Qun; Tan, Shane; Petrova, Katya

    2016-01-25

    The European Pharmacopeia (EP) metoprolol impurities M and N are polar, nonchromophoric α-hydroxyamines, which are poorly retained in a conventional reversed-phase chromatographic system and are invisible for UV detection. Impurities M and N are currently analyzed by TLC methods in the EP as specified impurities and in the United States Pharmacopeia-National Formulary (USP-NF) as unspecified impurities. In order to modernize the USP monographs of metoprolol drug substances and related drug products, a hydrophilic interaction chromatography (HILIC) method coupled with a charged aerosol detector (CAD) was explored for the analysis of the two impurities. A comprehensive column screening that covers a variety of HILIC stationary phases (underivatized silica, amide, diol, amino, zwitterionic, polysuccinimide, cyclodextrin, and mixed-mode) and optimization of HPLC conditions led to the identification of a Halo Penta HILIC column (4.6 × 150 mm, 5 μm) and a mobile phase comprising 85% acetonitrile and 15% ammonium formate buffer (100 mM, pH 3.2). Efficient separations of metoprolol, succinic acid, and EP metoprolol impurities M and N were achieved within a short time frame (<8 min). The HILIC-CAD method was subsequently validated per USP validation guidelines with respect to specificity, robustness, linearity, accuracy, and precision, and could be incorporated into the current USP-NF monographs to replace the outdated TLC methods. Furthermore, the developed method was successfully applied to determine organic impurities in metoprolol drug substance (metoprolol succinate) and drug products (metoprolol tartrate injection and metoprolol succinate extended release tablets). PMID:26580821

  12. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article studies audit sampling in the audit of financial statements. As an audit technique widely used in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. To be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise, the risk of applying it incorrectly would cause loss of reputation and discredit, litigation, and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows the advantages, disadvantages, threats, and opportunities of a subject. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity, and the users of the financial statements. The study shows that by applying the sampling method, the audit company and the audited entity both save time, effort, and money. The disadvantages of the method are the difficulty of applying it and of understanding its insights. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats, and opportunities must be understood by auditors.

  13. Development of a multianalyte method based on micro-matrix-solid-phase dispersion for the analysis of fragrance allergens and preservatives in personal care products.

    Science.gov (United States)

    Celeiro, Maria; Guerra, Eugenia; Lamas, J Pablo; Lores, Marta; Garcia-Jares, Carmen; Llompart, Maria

    2014-05-30

An effective, simple, and low-cost sample preparation method based on matrix solid-phase dispersion (MSPD) followed by gas chromatography-mass spectrometry (GC-MS) or gas chromatography-triple quadrupole mass spectrometry (GC-MS/MS) has been developed for the rapid simultaneous determination of 38 cosmetic ingredients: 25 fragrance allergens and 13 preservatives. All target substances are frequently used in cosmetics and personal care products, and they are subject to use restrictions or labeling requirements under the EU Cosmetic Directive. The extraction procedure was optimized on real, non-spiked rinse-off and leave-on cosmetic products by means of experimental designs. The final miniaturized process required only 0.1 g of sample and 1 mL of organic solvent, yielding a final extract ready for analysis. The micro-MSPD method was validated, showing satisfactory performance in GC-MS and GC-MS/MS analysis. The use of GC coupled to triple quadrupole mass detection allowed very low detection limits (low ng g⁻¹) to be reached while improving method selectivity. In an attempt to improve the chromatographic analysis of the preservatives, the inclusion of a derivatization step was also assessed. The proposed method was applied to a broad range of cosmetics and personal care products (shampoos, body milk, moisturizing milk, toothpaste, hand creams, gloss lipstick, sunblock, deodorants, and liquid soaps, among others), demonstrating the extended use of these substances. The concentration levels ranged from the sub-parts-per-million level to the parts-per-mille level. The number of target fragrance allergens per sample was quite high (up to 16). Several fragrances (linalool, farnesol, hexylcinnamal, and benzyl benzoate) were detected at levels >0.1% (1,000 μg g⁻¹). As regards preservatives, phenoxyethanol was the most frequently found additive, reaching quite high concentrations (>1,500 μg g⁻¹) in five cosmetic products. BHT was detected in eight

  14. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with the aim of answering questions such as: How do I begin planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency-sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion of correlated measurements and data reduction, before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression, and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for the selection of a statistical method, and a table gives an overview of the most important measures of performance when evaluating new measurement technology.
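One of the group-comparison settings such an overview covers is repeated measurements on the same subjects, where a paired test is appropriate because the two measurements per subject are correlated. The sketch below is not from the paper; the data and variable names are invented for illustration.

```python
# Sketch: paired t statistic for before/after bioimpedance measurements
# on the same subjects (a paired design, since measurements are correlated
# within subject). Data are made up for illustration.
from statistics import mean, stdev
from math import sqrt

def paired_t(before, after):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

before = [512.0, 498.0, 530.0, 505.0, 521.0]  # impedance magnitude, ohm
after  = [497.0, 488.0, 519.0, 501.0, 510.0]
t, dof = paired_t(before, after)
print(dof)  # 4
```

The t statistic would then be compared against the t distribution with n-1 degrees of freedom; a statistics package is normally used for the p-value rather than hand-coding it.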

  15. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

An assessment of thorium analytical methods for mixed power fuel, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC), and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis, and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography, and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography, and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, analytical methods without molecular reactions are in general better than those involving molecular reactions (author). 19 refs., 1 tab

  16. The development of a specific and sensitive LC-MS-based method for the detection and quantification of hydroperoxy- and hydroxydocosahexaenoic acids as a tool for lipidomic analysis.

    Directory of Open Access Journals (Sweden)

    Priscilla B M C Derogis

Full Text Available Docosahexaenoic acid (DHA) is an n-3 polyunsaturated fatty acid that is highly enriched in the brain, and the oxidation products of DHA are present, or increased, during neurodegenerative disease progression. The characterization of the oxidation products of DHA is critical to understanding the roles these products play in the development of such diseases. In this study, we developed a sensitive and specific analytical tool for the detection and quantification of the twelve major DHA hydroperoxide (HpDoHE) and hydroxide (HDoHE) isomers (at positions 4, 5, 7, 8, 10, 11, 13, 14, 16, 17, 19, and 20) in biological systems. HpDoHE were synthesized by photooxidation, and the corresponding hydroxides were obtained by reduction with NaBH4. The isolated isomers were characterized by LC-MS/MS, and unique, specific fragment ions were chosen to construct a selected reaction monitoring (SRM) method for the targeted quantitative analysis of each HpDoHE and HDoHE isomer. The detection limits of the LC-MS/MS-SRM assay were 1-670 pg for HpDoHE and 0.5-8.5 pg for HDoHE injected on column. Using this method, it was possible to detect the basal levels of HDoHE isomers in both rat plasma and brain samples. Therefore, the developed LC-MS/MS-SRM method can be used as an important tool to identify and quantify the hydro(per)oxy derivatives of DHA in biological systems and may be helpful for oxidative lipidomic studies.

  17. Development of sample preparation method for auxin analysis in plants by vacuum microwave-assisted extraction combined with molecularly imprinted clean-up procedure.

    Science.gov (United States)

    Hu, Yuling; Li, Yuanwen; Zhang, Yi; Li, Gongke; Chen, Yueqin

    2011-04-01

A novel sample preparation method for auxin analysis in plant samples was developed, based on vacuum microwave-assisted extraction (VMAE) followed by a molecularly imprinted clean-up procedure. The method consists of two steps. In the first, conventional solvent extraction is replaced by VMAE for the extraction of auxins from plant tissues. This step provides efficient extraction of 3-indoleacetic acid (IAA) from plants with dramatically decreased extraction time, and furthermore prevents auxins from degradation by creating a reduced-oxygen environment under vacuum conditions. In the second step, the raw VMAE extract is subjected to a clean-up procedure with magnetic molecularly imprinted polymer (MIP) beads. Owing to the high molecular recognition ability of the magnetic MIP beads for IAA and 3-indolebutyric acid (IBA), the two target auxins can be selectively enriched and the interfering substances eliminated by a magnetic separation procedure. Both the VMAE and the molecularly imprinted clean-up conditions were investigated. The proposed sample preparation method was coupled with high-performance liquid chromatography and fluorescence detection for the determination of IAA and IBA in peas and rice. The detection limits obtained for IAA and IBA were 0.47 and 1.6 ng/mL, and the relative standard deviations were 2.3% and 2.1%, respectively. The IAA contents in pea seeds, pea embryos, pea roots, and rice seeds were determined. Recoveries ranged from 70.0% to 85.6%. The proposed method was also applied to investigate the developmental profiles of IAA concentration in pea seeds and rice seeds during seed germination. PMID:20953778

  18. Scaling Internet Search Engines - Methods and Analysis

    OpenAIRE

    Risvik, Knut Magne

    2004-01-01

    This thesis focuses on methods and analysis for building scalable Internet Search Engines. In this work, we have developed a search kernel, an architecture framework and applications that are being used in industrial and commercial products. Furthermore, we present both analysis and design of key elements. Essential to building a large-scale search engine is to understand the dynamics of the content in which we are searching. For the challenging case of searching the web, there are multiple d...

  19. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicolog…

  20. Statistical methods for bioimpedance analysis

    OpenAIRE

    Christian Tronstad; Are Hugo Pripp

    2014-01-01

    This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements a...

  1. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2003-01-01

    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  2. Developments in gamma-ray spectrometry: systems, software, and methods-I. 5. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    International Nuclear Information System (INIS)

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques has been recently developed into a package suitable for public use. The methodology behind this approach was originally made available to the public as the RobFit command-line code, but it was extremely slow and difficult to use. Recent advances in microprocessor power and the development of a graphical user interface to make its use more intuitive have made this approach, which is quite computationally intensive, feasible for more routine applications. A brief description of some of the fundamental differences in the approach used by RobFit from the more common methods of nuclear spectral analysis involving local peak searches is presented here. Popular nuclear spectral analysis applications generally perform a peak search at their heart. The continuum in the neighborhood of each peak is estimated from local data and is subtracted from the data to yield the area and the energy of the peak. These are matched to a user-selected library of radionuclides containing the energies and areas of the most significant peaks, after accounting for the effects of detector efficiency and attenuation. With these codes, the energy-to-channel calibration, the peak width as a function of energy (or 'resolution calibration'), the detector intrinsic efficiency, and attenuation effects must usually be predetermined and provided as static input for the analysis. Most of these codes focus on regions of interest that represent many small pieces of the sample spectrum. In contrast, the RobFit approach works with an entire continuous spectrum to simultaneously determine the coefficients of all of the user-selected free variables that yield the best fit to the data. Peak searches are generally used only in interim steps to help suggest new radionuclides to include in the search library. 
Rather than first concentrating on the location of peaks, RobFit concentrates first on determining the continuum.

  3. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are the most widely used techniques for the analysis of triazine herbicides in environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods, such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace solid-phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others, have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method, and excellent results have been obtained for most of the developed methods combining GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC to the analysis of triazine herbicide residues in various samples. PMID:25849823

  4. Communication Error Analysis Method based on CREAM

    International Nuclear Information System (INIS)

Communication error has been considered a primary cause of many incidents and accidents in the nuclear industry. In order to prevent such accidents, an analysis method for communication errors is proposed. This study presents a qualitative method for analyzing communication errors, which focuses on finding the root cause of a communication error and predicting the types of communication error that could occur in nuclear power plants. We develop context conditions and antecedent-consequent links of influential factors related to communication error. A case study has been conducted to validate the applicability of the proposed method

  5. An Environment-based Education Approach to Professional Development: A Mixed Methods Analysis of the Creeks and Kids Watershed Workshop and Its Impact on K-12 Teachers

    Science.gov (United States)

    Austin, Tiffany Bridgette

    This research is an in-depth study of an environment-based education (EBE) professional development program titled "Creeks and Kids" that models how to employ thematic instruction about watersheds using the environment of a school and its community as a context to integrate teaching and learning about water across core subject areas. This case study investigates the EBE characteristics of the Creeks and Kids Workshop and explores how they adhere to the National Research Council's Standards for Professional Development for Teachers of Science. A mixed-methods analysis gathered qualitative data about the overall experience of teacher-participants during the Creeks and Kids Workshop and employed quantitative measures to identify evidence of success related to teachers' gains in knowledge, affect, confidence and intent to act to implement water-focused EBE curriculum in their classrooms. The findings of the study build upon existing research about what teachers need to implement EBE and their beliefs regarding what professional development should provide in relation to those needs. Qualitative results revealed that teachers need an EBE professional development program to include: 1) practical ways to integrate environmental education into their existing curricula and school settings; and, 2) direct experience with activities and field studies that are interdisciplinary, hands-on and inquiry-driven. Teacher-participants identified these characteristics as vital for them to effect a change in teaching practice and build their confidence to engage their students in EBE when they return to the classroom. Quantitative results revealed statistically significant gains across knowledge, affect, confidence and intent to act variables using the t-test statistic to compare means of participants' responses from the pre- to post-workshop questionnaires. The results of this study have broader implications for future educational research on: 1) the ways in which EBE professional
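    The pre- to post-workshop comparison described above rests on the paired t statistic. A minimal sketch, using invented questionnaire scores rather than the study's actual data:

```python
# Paired t-test on pre- vs post-workshop scores (illustrative data only;
# the actual Creeks and Kids questionnaire responses are not reproduced here).
import math

def paired_t(pre, post):
    """Return the paired t statistic and its degrees of freedom."""
    assert len(pre) == len(post)
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1

pre  = [3, 4, 2, 5, 3, 4]   # hypothetical pre-workshop confidence scores
post = [5, 5, 4, 7, 4, 6]   # hypothetical post-workshop scores
t, df = paired_t(pre, post)
print(round(t, 2), df)      # t = 7.91 with 5 degrees of freedom
```

A t value this far above the critical value for df = 5 would, as in the study, indicate a statistically significant pre-to-post gain.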

  6. Guidelines for Analysis of Pharmaceutical Supply System Planning in Developing Countries. Volume 7: Pharmaceutical Supply System Planning. International Health Planning Methods Series.

    Science.gov (United States)

    Schaumann, Leif; And Others

    Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this seventh of ten manuals in the International Health Planning Methods Series deals with pharmaceutical supply systems planning in developing countries. Following an…

  7. Development and using radio analytical methods for the analysis of migration forms of pollutants in the main river waters of Central Asia

    International Nuclear Information System (INIS)

    Full text: Solving the problems of monitoring, protection, and rational use of river waters requires studying the spatio-temporal distribution and migration of pollutants such as heavy metals (HM). It is also important to have exact information about the forms in which HM occur, because their fate, behavior, migration, and toxicity are connected with their physico-chemical forms. However, the insufficient sensitivity and accuracy of many physico-chemical methods of analysis makes it necessary to develop and apply highly sensitive, multi-element methods for determining the content and migration forms of HM in natural and waste waters. Migration forms of HM in the river waters were studied according to the following scheme: neutron activation analysis of the separated fractions of individual HM forms; experimental modeling using appropriate radionuclides; and thermodynamic modeling. A neutron activation method was developed and used to obtain quantitative data on the forms in which HM occur in water. Ultrafiltration and electrodialysis were used to fractionate and concentrate the individual HM forms before neutron activation analysis. Optimal conditions for separating the HM forms were established using the radionuclides 60Co, 51Cr and 124Sb in cationic and anionic forms. During 2003-2005 we studied the spatio-temporal variations of the content and phase distribution of Hg, Zn, Cd, Sb, Co, Th, Br, Cr, Au, La and Eu in the waters of the Amudarya, Syrdarya and Surkhandarya rivers. Average HM concentrations ranged from 4.1 mg/l for Fe down to 2 ng/l for Au. The suspended matter of the river waters is formed from mountain rock and soils in the river headwaters, and element concentrations in the suspended form do not exceed clarke values (average crustal abundances). Atmospheric precipitation plays the main role in the formation of the dissolved phase of the river water; this applies mainly to the technogenic elements (Hg, Cd, Zn, Sb, Cr, Se, V). Limits of determination of HM - 10

  8. Development, validation, and application of a method for the GC-MS analysis of fipronil and three of its degradation products in samples of water, soil, and sediment.

    Science.gov (United States)

    de Toffoli, Ana L; da Mata, Kamilla; Bisinoti, Márcia C; Moreira, Altair B

    2015-01-01

    A method for the identification and quantification of pesticide residues in water, soil, and sediment samples has been developed, validated, and applied to the analysis of real samples. Specificity was determined by the retention time and by the confirmation and quantification of analyte ions. Linearity was demonstrated over the concentration range of 20 to 120 µg L(-1), and the correlation coefficients varied between 0.979 and 0.996, depending on the analyte. The recovery rates for all analytes in the studied matrix were between 86% and 112%. The intermediate precision and repeatability were determined at three concentration levels (40, 80, and 120 µg L(-1)), with the relative standard deviation for the intermediate precision between 1% and 5.3% and the repeatability varying between 2% and 13.4% for individual analytes. The limits of detection and quantification for fipronil, fipronil sulfide, fipronil sulfone, and fipronil desulfinyl were 6.2, 3.0, 6.6, and 4.0 ng L(-1) and 20.4, 9.0, 21.6, and 13.0 ng L(-1), respectively. The method developed was applied to water, soil, and sediment samples containing 2.1 mg L(-1), 1.2%, and 5.3% of carbon, respectively. The recovery of pesticides in the environmental matrices varied from 88.26 to 109.63% for the lowest fortification level (40 and 100 µg kg(-1)), from 91.17 to 110.18% for the intermediate level (80 and 200 µg kg(-1)), and from 89.09 to 109.82% for the highest fortification level (120 and 300 µg kg(-1)). The relative standard deviation for the recovery of pesticides was under 15%. PMID:26357886
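    Detection and quantification limits of the kind quoted above are conventionally derived from the calibration data via the ICH-style relations LOD = 3.3·σ/S and LOQ = 10·σ/S, and fortification recoveries are simple ratios. A minimal sketch with invented calibration numbers (not the paper's values):

```python
# ICH-style LOD/LOQ and spike-recovery calculation (illustrative numbers only;
# sigma = residual SD of the calibration response, slope = calibration slope).
def lod_loq(sigma, slope):
    """Return (LOD, LOQ) as 3.3*sigma/S and 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

def recovery_pct(measured, spiked):
    """Recovery of a fortified sample as a percentage."""
    return 100.0 * measured / spiked

# Hypothetical calibration: sigma = 0.15 area units,
# slope = 0.08 area units per ng L-1.
lod, loq = lod_loq(sigma=0.15, slope=0.08)
print(round(lod, 2), round(loq, 2))               # 6.19 18.75 (ng L-1)
print(round(recovery_pct(36.2, 40.0), 1))         # 90.5 (%)
```

Note the fixed 10/3.3 ratio between LOQ and LOD, visible in the paper's own figures (e.g. 6.2 vs 20.4 ng L⁻¹ for fipronil).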

  9. Hybrid methods for cybersecurity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts, and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  10. Combination Clustering Analysis Method and its Application

    OpenAIRE

    Bang-Chun Wen; Li-Yuan Dong; Qin-Liang Li; Yang Liu

    2013-01-01

    Traditional clustering analysis methods cannot automatically determine the optimal number of clusters. In this study, we provide a new clustering analysis method, the combination clustering analysis method, to solve this problem. By analyzing 25 kinds of automobile data samples with the combination clustering analysis method, the correctness of the analysis result was verified. It showed that the combination clustering analysis method could objectively determine the number of clustering firs...
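    The abstract does not reproduce the combination algorithm itself, but the core idea of choosing the cluster count automatically can be sketched with a simple stand-in: a toy 1-D k-means plus an "elbow" rule over the within-cluster sum of squares (all data and names below are illustrative).

```python
# Toy 1-D k-means with an "elbow" rule for picking k automatically
# (a simple stand-in for automatic cluster-number selection; illustrative only).
import random

def kmeans_sse(data, k, iters=50, seed=0):
    """Run 1-D k-means and return the within-cluster sum of squared errors."""
    centers = random.Random(seed).sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            clusters[min(range(k), key=lambda j: abs(x - centers[j]))].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sum(min((x - c) ** 2 for c in centers) for x in data)

data = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]            # two obvious groups
sse = {k: kmeans_sse(data, k) for k in (1, 2, 3)}
# elbow: the k whose SSE drop relative to k-1 is proportionally largest
best_k = max((2, 3), key=lambda k: (sse[k - 1] - sse[k]) / sse[k - 1])
print(best_k)  # 2
```

Adding a cluster beyond the true count barely reduces the SSE, so the largest proportional drop marks the chosen k.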

  11. Development of unconventional forming methods

    Directory of Open Access Journals (Sweden)

    S. Rusz

    2012-10-01

    Full Text Available Purpose: The paper presents results of progress in the ECAP processing method for achieving UFG structure. The properties and microstructure are influenced by technological factors during application of the ECAP method. Design/methodology/approach: A summary of methods studied at the Department of Technology at the Machining Faculty of VŠB-TU Ostrava, through co-operation with the Institute of Engineering Materials and Biomaterials, Silesian University of Technology, is presented. Findings: Achievement of an ultra-fine grained structure in the initial material leads to a substantial increase of plasticity and makes it possible to form materials in conditions of a "superplastic state". Achievement of the required structure depends namely on the tool geometry, the number of passes through the matrix, the obtained deformation magnitude and strain rate, the process temperature, and lubrication conditions. High deformation at comparatively low homologous temperatures is an efficient method of producing ultra-fine grained solid materials. The new technologies that use severe plastic deformation comprise namely these techniques: High Pressure Torsion, Equal Channel Angular Pressing (ECAP), Cyclic Channel Die Compression (CCDC), Cyclic Extrusion Compression (CEC), Continuous Extrusion Forming (CONFORM), Accumulative Roll Bonding, and Constrained Groove Pressing. Research limitations/implications: Achieved hardness and microstructure characteristics will be determined by further research. Practical implications: The results may be utilized to relate the structure and properties of the investigated materials in a future manufacturing process. Originality/value: These results contribute to a complex evaluation of the properties of new metals after application of unconventional forming methods. The results of this paper are intended for research workers dealing with the process of severe plastic deformation.

  12. Developing Word Analysis Skills.

    Science.gov (United States)

    Heilman, Arthur W.

    The importance of word analysis skills to reading ability is discussed, and methodologies for teaching such skills are examined. It is stated that a child cannot become proficient in reading if he does not master the skill of associating printed letter symbols with the sounds they represent. Instructional procedures which augment the alphabet with…

  13. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were developed and input into the analysis. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. Total costs of each level of a standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio was calculated for each alternative standard. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis.
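    The cost-effectiveness ratio described above is an incremental cost per health effect averted. A minimal sketch with invented figures (not the EPA's numbers):

```python
# Incremental cost-effectiveness ratio (CER) for an alternative disposal
# standard versus a baseline (all figures hypothetical).
def cost_effectiveness(total_cost, baseline_cost, effects, baseline_effects):
    """Incremental cost per health effect averted over the analysis period."""
    averted = baseline_effects - effects
    return (total_cost - baseline_cost) / averted

# Hypothetical: a stricter standard costs $120M vs $100M and reduces expected
# fatal cancers plus genetic effects over 10,000 years from 12.0 to 8.0.
ratio = cost_effectiveness(120e6, 100e6, effects=8.0, baseline_effects=12.0)
print(ratio)  # 5000000.0 dollars per health effect averted
```

Alternatives are then ranked by this ratio: a lower value means each additional dollar buys more risk reduction.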

  14. Developments in geophysical exploration methods

    CERN Document Server

    1982-01-01

    One of the themes in current geophysical development is the bringing together of the results of observations made on the surface and those made in the subsurface. Several benefits result from this association. The detailed geological knowledge obtained in the subsurface can be extrapolated for short distances with more confidence when the geological detail has been related to well-integrated subsurface and surface geophysical data. This is of value when assessing the characteristics of a partially developed petroleum reservoir. Interpretation of geophysical data is generally improved by the experience of seeing the surface and subsurface geophysical expression of a known geological configuration. On the theoretical side, the understanding of the geophysical processes themselves is furthered by the study of the phenomena in depth. As an example, the study of the progress of seismic wave trains downwards and upwards within the earth has proved most instructive. This set of original papers deals with some of ...

  15. Method development in automated mineralogy

    OpenAIRE

    Sandmann, Dirk

    2015-01-01

    The underlying research that resulted in this doctoral dissertation was performed at the Division of Economic Geology and Petrology of the Department of Mineralogy, TU Bergakademie Freiberg between 2011 and 2014. It was the primary aim of this thesis to develop and test novel applications for the technology of ‘Automated Mineralogy’ in the field of economic geology and geometallurgy. A “Mineral Liberation Analyser” (MLA) instrument of FEI Company was used to conduct most analytical studies. T...

  16. Development of a method for screening spill and leakage of antibiotics on surfaces based on wipe sampling and HPLC-MS/MS analysis

    OpenAIRE

    Nygren, Olle; Lindahl, Roger

    2011-01-01

    A screening method for determination of spill and leakage of 12 different antibiotic substances has been developed. The method is based on wipe sampling where the sampling procedure has been simplified for screening purposes. After sample processing, the antibiotic substances are determined by liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS). Twelve antibiotic substances can be determined in the screening method: Cefadroxil, Cefalexin, Ciprofloxacin, Demeclocyklin HCl, D...

  17. Development of evaluation method for heat removal design of dry storage facilities. Pt. 4. Numerical analysis on vault storage system of cross flow type

    International Nuclear Information System (INIS)

    On the basis of the results of the heat removal test on a vault storage system of cross-flow type using a 1/5 scale model, an evaluation method for the heat removal design was established. It was composed of a numerical analysis of the convection of the air flow inside the whole facility and of the natural convection and detailed turbulent mechanism near the surface of the storage tube. In the former analysis, the calculated air temperature distribution in the storage area agreed with the test results within ±3 °C. Fine turbulence models were introduced in the latter analysis to predict the separation flow in the boundary layer near the surface of the storage tube and the buoyant flow generated by the heat from the storage tube. Furthermore, the heat removal characteristics of a designed full-scale storage facility, such as the flow pattern in the storage area and the temperature and heat transfer rate of the storage tubes, were evaluated using each of three methods: the established numerical analysis method, the experimental formula demonstrated in the heat removal test, and the conventional evaluation method applied in past heat removal designs. As a result, the safety margin and issues involved in the methods were clarified, and measures to make a design more rational were proposed. (author)

  18. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  19. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  20. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    This thesis presents analyses of sequence data, generated using next-generation sequencing (NGS) technologies, from either forensic (Chapter 1) or ancient (Chapters 2-5) materials. These chapters present projects very different in nature, reflecting the diversity of questions that have become possible to address in the ancient DNA field thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project, for which more data is currently being generated; therefore it should be interpreted as a preliminary report. In addition to the five chapters, an introduction and five appendices are included. Appended articles are included for the reader's interest; these represent the collaborations I have been part...

  1. Methods of quantitative fire hazard analysis

    International Nuclear Information System (INIS)

    Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period

  2. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  3. Analytical Eco-Scale for Assessing the Greenness of a Developed RP-HPLC Method Used for Simultaneous Analysis of Combined Antihypertensive Medications.

    Science.gov (United States)

    Mohamed, Heba M; Lamie, Nesrine T

    2016-09-01

    In the past few decades the analytical community has been focused on eliminating or reducing the usage of hazardous chemicals and solvents, in different analytical methodologies, that have been ascertained to be extremely dangerous to human health and environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative of conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment. PMID:27492952
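    The analytical Eco-Scale referred to above scores a method as 100 minus summed penalty points for reagents, energy, occupational hazard, and waste, with score bands for how green the method is. A minimal sketch (the penalty values below are hypothetical, not those assigned to the cited method):

```python
# Analytical Eco-Scale scoring: 100 minus the summed penalty points
# (penalty values here are hypothetical, for illustration only).
def eco_scale(penalties):
    """Return (score, verdict) from a dict of penalty points per category."""
    score = 100 - sum(penalties.values())
    if score > 75:
        verdict = "excellent green analysis"
    elif score > 50:
        verdict = "acceptable green analysis"
    else:
        verdict = "inadequate green analysis"
    return score, verdict

penalties = {"reagents": 8, "energy": 2, "hazard": 3, "waste": 5}  # hypothetical
print(eco_scale(penalties))  # (82, 'excellent green analysis')
```

Replacing a hazardous solvent lowers the reagent penalty and raises the score, which is how the greener method compares favorably with the conventional ones.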

  4. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-01

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSDHMF; GC-TOF-MS: 0.3, 1.2 and 0.9 ng mL(-1) for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey; white, demerara, brown and yellow table sugars; and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will contribute to characterising and quantifying their presence in the human diet. PMID:18976770

  5. Development of a versatile sample preparation method and its application for rare-earth pattern and Nd isotope ratio analysis in nuclear forensics

    International Nuclear Information System (INIS)

    An improved sample preparation procedure for trace-levels of lanthanides in uranium-bearing samples was developed. The method involves a simple co-precipitation using Fe(III) carrier in ammonium carbonate medium to remove the uranium matrix. The procedure is an effective initial pre-concentration step for the subsequent extraction chromatographic separations. The applicability of the method was demonstrated by the measurement of REE pattern and 143Nd/144Nd isotope ratio in uranium ore concentrate samples. (author)

  6. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and A Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ginting, Victor

    2014-03-15

    It was demonstrated that a posteriori analyses in general, and in particular those that use adjoint methods, can accurately and efficiently compute numerical error estimates and sensitivities for critical Quantities of Interest (QoIs) that depend on a large number of parameters. Activities included: analysis and implementation of several time integration techniques for solving systems of ODEs as typically obtained from spatial discretization of PDE systems; multirate integration methods for ordinary differential equations; formulation and analysis of an iterative multi-discretization Galerkin finite element method for multi-scale reaction-diffusion equations; investigation of an inexpensive postprocessing technique to estimate the error of finite element solutions of second-order quasi-linear elliptic problems measured in some global metrics; investigation of an application of residual-based a posteriori error estimates to the symmetric interior penalty discontinuous Galerkin method for solving a class of second-order quasi-linear elliptic problems; a posteriori analysis of explicit time integrations for systems of linear ordinary differential equations; derivation of accurate a posteriori goal-oriented error estimates for a user-defined quantity of interest for two classes of first- and second-order IMEX schemes for advection-diffusion-reaction problems; postprocessing of the finite element solution; and a Bayesian framework for uncertainty quantification of porous media flows.
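    As a much simpler cousin of the adjoint-based estimates described above, an a posteriori error estimate for a time integrator can be computed by comparing two step sizes (Richardson-style). The ODE below is illustrative, not from the project:

```python
# A posteriori error estimate for forward Euler on y' = -y, y(0) = 1,
# via step halving: for an order-1 method, err(h) ~ 2 * (y_{h/2} - y_h).
import math

def euler(f, y0, t_end, n):
    """Forward Euler with n uniform steps on [0, t_end]."""
    h, y = t_end / n, y0
    for _ in range(n):
        y += h * f(y)
    return y

f = lambda y: -y
y_h  = euler(f, 1.0, 1.0, 10)    # h = 0.1
y_h2 = euler(f, 1.0, 1.0, 20)    # h = 0.05
estimate = 2 * (y_h2 - y_h)      # estimated error of the coarse solution
actual   = math.exp(-1.0) - y_h  # true error (exact solution known here)
print(round(estimate, 4), round(actual, 4))  # 0.0196 0.0192
```

The estimate tracks the true error without knowing the exact solution, which is the practical point of a posteriori analysis; the adjoint machinery in the report extends this idea to user-defined quantities of interest.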

  7. ORIGINAL ARTICLE Development and validation of a normal-phase HPTLC method for the simultaneous analysis of Lamivudine and Zidovudine in fixed-dose combination tablets

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    A method for the simultaneous quantification of Lamivudine and Zidovudine in tablets by HPTLC was developed and validated. The chromatograms were developed using a mobile phase of toluene:ethyl acetate:methanol (4:4:2, v/v/v) on pre-coated silica gel GF aluminum TLC plates and quantified by densitometric absorbance at 276 nm. The Rf values were 0.41±0.03 and 0.60±0.04 for Lamivudine and Zidovudine, respectively. The linearity of the method was found to be within the concentration range of 50-250 ng/spot for ...

  8. Development of software safety analysis method for nuclear power plant I and C systems in requirement specification based on statechart and SCR

    International Nuclear Information System (INIS)

    In recent years, Instrumentation and Control (I and C) systems based on digital computer technology have been widely used throughout industry. Some industries, such as Nuclear Power Plants (NPPs), operate safety critical systems. Thus, a safety critical system must have sufficient quality to assure a safe and reliable design. In this work, a formal requirement analysis method for NPP I and C systems is proposed. This method uses the Statechart diagram, the Software Cost Reduction (SCR) formalism, and the ISO table newly suggested in this paper for checking the modeled systems formally. The combined method of utilizing Statechart, SCR and the ISO table has the advantage of checking the system easily, visually and formally. The method is applied to the Water Level Monitoring System (WLMS). As a result of the formal check, one reachability error was detected.

  9. Cochrane methods - twenty years experience in developing systematic review methods

    OpenAIRE

    Chandler, Jackie; Hopewell, Sally

    2013-01-01

    This year, The Cochrane Collaboration reached its 20th anniversary. It has played a pivotal role in the scientific development of systematic reviewing and in the development of review methods to synthesize research evidence, primarily from randomized trials, to answer questions about the effects of healthcare interventions. We introduce a series of articles, which form this special issue describing the development of systematic review methods within The Cochrane Collaboration. We also discu...

  10. Optimization and development of the instrumental parameters for a method of multielemental analysis by atomic emission spectroscopy, for the determination of Mg, Fe, Mn and Cr

    International Nuclear Information System (INIS)

    This study optimized the instrumental parameters of a sequential multielemental analysis method, based on atomic emission, for the determination of Mg, Fe, Mn and Cr. A two-level factorial design and the Simplex optimization method were used, permitting the determination of the four cations under the same instrumental conditions. The author studied an analytical system in which the relation between instrumental response and concentration was not linear, requiring adjustment of the calibration curves under homoscedastic and heteroscedastic conditions. (S. Grainger)

  11. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in its current designs could be better understood. However, such engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  12. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    Full Text Available At the present time risk management is an integral part of state policy in all countries with a developed market economy. Companies providing consulting services and implementing risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized part of a construction company's activity, which often leads to scandals and disapproval when projects are implemented unprofessionally. The authors present the results of an investigation into the modern understanding of existing methodology classifications and offer their own concept of a classification matrix of risk management methods. The developed matrix is based on an analysis of each method in the context of its incoming and outgoing (transformed) information, which may include different elements of the risk control stages. The offered approach thus allows the capabilities of each method to be analyzed.

  13. Automating Object-Oriented Software Development Methods

    OpenAIRE

    Tekinerdogan, Bedir; SAEKI, Motoshi; Sunyé, Gerson; van den Broek, E.; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development methods have been defined. Nevertheless, methods often introduce complexity of their own due to their large number of artifacts, method rules and complicated processes. We think that au...

  14. Automating Object-Oriented Software Development Methods

    OpenAIRE

    Tekinerdogan, Bedir; SAEKI, Motoshi; Sunyé, Gerson; van den Broek, E.; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development methods have been defined. Nevertheless, methods often introduce complexity of their own due to their large number of artifacts, method rules and complicated processes. We think that au...

  15. Regional Development Sustainability Analysis Concept

    Directory of Open Access Journals (Sweden)

    Janno Reiljan

    2014-08-01

    Full Text Available Problems associated with the qualitative analysis and quantitative measurement of sustainability, and opportunities for connecting the concept with the methodological basis of development assessment and with the essence of the subject that values sustainability, are dealt with. The goal of the article is to work out the basics for analyzing the regional development of a country within the terms and framework of the sustainability concept. The article starts by outlining the definition of sustainability, which is followed by an analysis of the nature of sustainability. The third subsection highlights the demands placed on the decision-making process in guaranteeing sustainability and then considers sustainability in a competitive environment. In the second part of the article the sustainable development conception is applied to regional development sustainability analysis.

  16. Regional Development Sustainability Analysis Concept

    OpenAIRE

    Janno Reiljan

    2014-01-01

    Problems associated with the qualitative analysis and quantitative measurement of sustainability, and opportunities for connecting the concept with the methodological basis of development assessment and with the essence of the subject that values sustainability, are dealt with. The goal of the article is to work out the basics for analyzing the regional development of a country within the terms and framework of the sustainability concept. The article starts by outlining the definition of sustainability, which is fol...

  17. Generalized analysis method for neutron resonance transmission analysis

    International Nuclear Information System (INIS)

    Neutron resonance densitometry (NRD) is a non-destructive analysis method, which can be applied to quantify special nuclear materials (SNM) in small particle-like debris of melted fuel that are formed in severe accidents of nuclear reactors such as the Fukushima Daiichi nuclear power plants. NRD uses neutron resonance transmission analysis (NRTA) to quantify SNM and neutron resonance capture analysis (NRCA) to identify matrix materials and impurities. To apply NRD for the characterization of arbitrary-shaped thick materials, a generalized method for the analysis of NRTA data has been developed. The method has been applied on data resulting from transmission through thick samples with an irregular shape and an areal density of SNM up to 0.253 at/b (≈100 g/cm2). The investigation shows that NRD can be used to quantify SNM with a high accuracy not only in inhomogeneous samples made of particle-like debris but also in samples made of large rocks with an irregular shape by applying the generalized analysis method for NRTA. (author)
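The core relation behind neutron resonance transmission analysis is the exponential attenuation law T = exp(-n·σ), where n is the areal density of the nuclide and σ its total cross section at the resonance energy. A minimal sketch of inverting this relation; the numerical transmission and cross-section values are illustrative assumptions, not data from the record.

```python
import math

def areal_density(transmission, sigma_barns):
    """Invert T = exp(-n * sigma) for the areal density n (atoms/barn).

    transmission -- measured neutron transmission at a resonance (0 < T < 1)
    sigma_barns  -- total cross section at that energy, in barns (assumed known)
    """
    return -math.log(transmission) / sigma_barns

# illustrative numbers only: T = 0.36 measured where sigma = 4.0 b
n = areal_density(0.36, 4.0)
```

In practice the analysis fits the full transmission spectrum over many resonances rather than a single point, which is what makes the generalized method robust for irregularly shaped samples.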

  18. Fourier methods for biosequence analysis.

    OpenAIRE

    Benson, D C

    1990-01-01

    Novel methods are discussed for using fast Fourier transforms for DNA or protein sequence comparison. These methods are also intended as a contribution to the more general computer science problem of text search. These methods extend the capabilities of previous FFT methods and show that these methods are capable of considerable refinement. In particular, novel methods are given which (1) enable the detection of clusters of matching letters, (2) facilitate the insertion of gaps to enhance seq...
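The basic FFT trick for sequence comparison is to encode each letter as a 0/1 indicator sequence and compute, via convolution, the number of matching letters at every alignment offset in O(n log n) time. The sketch below is a generic illustration of this idea, not the specific refinements of the paper.

```python
import cmath

def fft(a):
    """Recursive radix-2 Cooley-Tukey FFT (input length must be a power of 2)."""
    n = len(a)
    if n == 1:
        return a
    even, odd = fft(a[0::2]), fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(a):
    n = len(a)
    y = fft([x.conjugate() for x in a])
    return [x.conjugate() / n for x in y]

def match_counts(s, t):
    """Matching-letter counts for every alignment of t against s.

    Entry k corresponds to shifting t by k - len(t) + 1 relative to s;
    in particular entry len(t)-1 is the unshifted alignment.
    """
    n = 1
    while n < len(s) + len(t):
        n *= 2
    total = [0.0] * n
    for letter in set(s) | set(t):
        x = [1.0 if c == letter else 0.0 for c in s] + [0.0] * (n - len(s))
        # reverse t so that convolution computes cross-correlation
        y = [1.0 if c == letter else 0.0 for c in reversed(t)] + [0.0] * (n - len(t))
        X, Y = fft([complex(v) for v in x]), fft([complex(v) for v in y])
        Z = ifft([u * v for u, v in zip(X, Y)])
        total = [a + b.real for a, b in zip(total, Z)]
    return [round(v) for v in total]
```

For example, `match_counts("ACGT", "ACGT")[3]` is 4 (all four positions match with no shift). Detecting clusters of matches or inserting gaps, as the paper proposes, builds on this per-offset count.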

  19. Further developments in the study of harmonic analysis by the correlation and spectral density methods, and its application to the adult rabbit EEG

    International Nuclear Information System (INIS)

    The application of harmonic analysis to the brain's spontaneous electrical activity has been studied theoretically and practically in 30 adult rabbits chronically implanted with electrodes. Theoretically, an accurate energetic study of the signal can only be achieved by calculating the autocorrelation function and its Fourier transform, the power density spectrum. Secondly, a comparative study was made of analog methods using analog or hybrid devices and of the digital method with an analysis and computing program (covering the sampling rate, the delay, the integration period, and the problems raised by the amplification and sampling of biological signals). Data handling is discussed; the method mainly retains the study of variance, the calculation of the total energy carried by the signal and of the energies carried within each frequency band ΔF, their percentage of the total energy, and the relationships of these various values for various electroencephalographic states. Experimentally, the general aspect of the spontaneous electrical activity of the dorsal hippocampus and the visual cortex during variations in vigilance is accurately described by the calculation of the variance and by the position of the maxima of the power density spectra on the frequency axis, as well as by the energies carried in the 0-4, 4-8 and 8-12 Hz frequency bands. On the same theoretical basis, the analog and digital methods lead to similar results, the former being easier to operate, the latter more accurate. (author)
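The theoretical point in the record is the Wiener-Khinchin relation: the Fourier transform of the autocorrelation function equals the power density spectrum. A minimal numerical check of that identity on a toy signal (the sample values are arbitrary, and a naive DFT is used for clarity rather than an FFT):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(N^2), fine for small N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def circular_autocorr(x):
    """Circular autocorrelation r[lag] = (1/N) * sum_n x[n] x[n+lag]."""
    N = len(x)
    return [sum(x[n] * x[(n + lag) % N] for n in range(N)) / N for lag in range(N)]

# Wiener-Khinchin: DFT of the autocorrelation equals |X[k]|^2 / N
x = [0.0, 1.0, 0.5, -0.3, -1.0, 0.2, 0.8, -0.6]   # arbitrary toy signal
X = dft(x)
psd = [abs(Xk) ** 2 / len(x) for Xk in X]
acf_spectrum = [c.real for c in dft(circular_autocorr(x))]
```

The two lists agree to rounding error, which is why the study could compute band energies either from the spectrum directly or via the autocorrelation function.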

  20. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
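The simplest scheme exhibiting the properties the article emphasizes (local conservation and a discrete maximum principle via a monotone flux) is first-order upwind for scalar advection. A minimal sketch under the assumption of a periodic 1D mesh and positive wave speed:

```python
def upwind_step(u, c):
    """One first-order upwind finite-volume update for u_t + a u_x = 0
    with a > 0 on a periodic mesh. c = a*dt/dx is the CFL number; for
    c <= 1 each new cell value is a convex combination of old values,
    giving a discrete maximum principle, and the flux differencing
    telescopes so the scheme is locally conservative."""
    return [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]

u0 = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]  # square pulse initial data
u = u0
for _ in range(4):
    u = upwind_step(u, 0.5)  # CFL number 0.5, within the stability bound
```

After the steps the cell average sum is unchanged (conservation) and no new extrema appear, which is exactly the behavior the high-order TVD and limiter machinery in the article is designed to retain.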

  1. Development of RP-HPLC method for Qualitative Analysis of Active Ingredient (Gallic acid) from Stem Bark of Dendrophthoe falcate Linn.

    OpenAIRE

    Hafsa Deshmukh; Pradnya J. Prabhu

    2011-01-01

    A simple, precise and sensitive RP-HPLC method with UV detection at 271 nm was developed and validated for the qualitative determination of the active ingredient gallic acid from the stem bark of Dendrophthoe falcate Linn. Separation was performed on a ThermoMOS 2 HYPERSIL C18 column (250 mm × 4.6 mm, 5 µm ODS 3) using a mobile phase comprising 0.1% orthophosphoric acid : acetonitrile (400 cm3 : 600 cm3) at a flow rate of 1 ml/minute, with a short run time of 13 minutes. The method was validated accordi...

  2. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to people and property on the ground. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types and includes 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
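The Monte Carlo uncertainty analysis mentioned above can be sketched in miniature: sample uncertain inputs (breakup altitude, fragment mass) from assumed distributions and count the fraction of fragments that reach the ground. Every distribution and the survival rule below are toy assumptions for illustration only, not the models used in DRAPS or the NASA/ESA tools.

```python
import random

def survival_fraction(n_samples=10_000, seed=1):
    """Toy Monte Carlo sketch of debris survivability uncertainty analysis.

    All distributions and the demise rule are illustrative assumptions.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_samples):
        breakup_alt_km = rng.gauss(80.0, 3.0)    # breakup near 75-85 km
        mass_kg = rng.lognormvariate(1.0, 0.8)   # spread of fragment masses
        # toy rule: heavier fragments ablate more slowly and demise lower
        demise_alt_km = breakup_alt_km - 8.0 * mass_kg
        if demise_alt_km <= 0.0:
            survived += 1   # fragment reaches the ground
    return survived / n_samples

f = survival_fraction()
```

Real tools replace the toy demise rule with trajectory, aerodynamic and aerothermal ablation models per fragment, but the sampling structure is the same.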

  3. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette; Hansson, Christina; Lindeberg, Olle

    2008-01-01

    The development of method, tool and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call `cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach, focusing on shop-floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further...

  4. Report of Research Cooperation Sub-Committee 46 on research and development of methods for inelastic (EPICC: Elastic-PlastIC-Creep) structural analysis

    International Nuclear Information System (INIS)

    This report succeeds the preceding one, ''Verification and Qualification of Nonlinear Structural Analysis Computer Programs''. PNC (Power Reactor and Nuclear Fuel Development Corporation) decided to sponsor an extended research project on inelastic structural analysis for a period spanning September 1976 to May 1978. Responding to the PNC proposal, RC Sub-Committee 46 was formed in the Japan Society of Mechanical Engineers and began the cooperative work in October 1976. Besides the verification and/or qualification of available general-purpose computer programs, which were the major objectives of the previous contract, the Committee carried out research on topics in the following three fields of interest: 1. material data for use in inelastic analysis, 2. inelastic analysis procedures and computer program verification, 3. design codes and processing of computer solutions. This report summarizes the efforts of the first year of the Sub-Committee and consists of three parts, each corresponding to one of the research topics stated above: Part I, Inelastic constitutive equations for materials under high-temperature service conditions; Part II, EPICC standard benchmark test problem and solutions; Part III, Examination and development of postprocessors. Although the research is still at an intermediate stage, the main activities under way are: 1. evaluative review and nationwide collection of material data, and recommendation of tentative constitutive equations for elastic-plastic and creep analyses of the benchmark test problem; 2. revision and augmentation of the EPICC standard benchmark test problem and competitive and/or cooperative execution of solutions; 3. review of existing prototypical postprocessors, and development of a processor for piping design. (author)

  5. Development of Tsunami PSA method for Korean NPP site

    International Nuclear Information System (INIS)

    A methodology for tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, the evaluation of the tsunami return period is the major task; for this evaluation, numerical analysis and empirical methods can be applied. The methodology was applied to the Ulchin 5 and 6 nuclear power plant, located on the east coast of the Korean peninsula. Through this study, the whole tsunami PSA working procedure was established, and an example calculation was performed for an actual nuclear power plant in Korea.
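Once a return period has been estimated, the standard hazard quantity is the probability of at least one exceedance during the plant's exposure time under a Poisson occurrence model. A minimal sketch; the 1000-year return period and 40-year plant life are illustrative numbers, not values from the study.

```python
import math

def exceedance_probability(return_period_years, exposure_years):
    """Probability of at least one event exceeding the design level during
    the exposure time, assuming Poisson-distributed occurrences:
    P = 1 - exp(-t / T)."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# e.g. a tsunami with a 1000-year return period over a 40-year plant life
p = exceedance_probability(1000.0, 40.0)
```

For exposure times much shorter than the return period, this is close to the naive ratio t/T (here 0.04), but the Poisson form stays valid for long exposures as well.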

  6. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    OpenAIRE

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    A simple, rapid, and selective HPLC-UV method was developed for the determination of the antihypertensive drug substances amlodipine besylate (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used in combination. The combinations are found in various forms, especially in current pharmaceuticals as three-component mixtures: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation...

  7. Development of a Radial Deconsolidation Method

    Energy Technology Data Exchange (ETDEWEB)

    Helmreich, Grant W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Montgomery, Fred C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hunn, John D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  8. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    This thesis describes the way in which at IRI instrumental neutron activation analysis (INAA) has been developed into an automated system for routine analysis. The basis of this work are 20 publications describing the development of INAA since 1968. (Auth.)

  9. Computer modeling for neutron activation analysis methods

    International Nuclear Information System (INIS)

    Full text: The INP AS RU develops databases for neutron activation analysis: ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that analysis objects vary widely in their spectra and in the composition and concentration of elements, which makes it impossible to develop universal methods applicable to every analytical problem. The modeling is based on an algorithm that computes the irradiation time in the nuclear reactor needed to obtain the sample's total absorption and activity analytical peak areas within given errors. The analytical complex was tested on analyses involving few elements (determination of Fe and Zn in vegetation samples, and of Cu, Ag and Au in technological objects). At present, the complex is applied to multielemental analysis of sediment samples. In this work, modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology (Java software, database management systems (DBMS), internet technologies) are applied. References: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0. The 3rd Eurasian Conference on Nuclear Science and its Applications, 2004, pp. 270-271. 2. Gurvich L.G., Tillaev T., Umaraliev A. The information-analytical database on the element contents of natural objects. The 4th International Conference on Modern Problems of Nuclear Physics, Samarkand, 2003, p. 337. (authors)

  10. Development of a rapid method to determine Sr isotopes and its application for the analysis of 90Sr in milk

    International Nuclear Information System (INIS)

    A rapid method was developed to determine radioactive strontium isotopes in environmental samples after nuclear fallout, based on precipitation and extraction chromatography of strontium. With this method, 90Sr and 89Sr are determined, after separation, by liquid scintillation spectroscopy. The method was applied to determine the activity of 90Sr in milk samples in Austria in 1997, which is achievable only because of the very low detection limit of 0.01 Bq/l 90Sr in fresh milk. Furthermore, the variation of the 90Sr/137Cs ratio in milk in Austria in 1997 as a function of the 137Cs concentration in milk, and thus of the 137Cs deposition, is given, and the reasons for this distribution are discussed. (orig.)

  11. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    International Nuclear Information System (INIS)

    This study aimed to develop a specific method for the detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. mRNA of the 65-kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by the RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled antisense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5. The antisense RNA was synthesized from a template DNA containing about 580 bp (positions 194 to 762) of the hsp65 gene. Compared with the NBH method, the amount of probe required for detection by RPA was 1/30 or less, and the detection sensitivity of RPA was also 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, decay of the isotope in the probe used for the RPA method might affect the results; therefore 33P, whose decay energy is lower than that of 32P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective method for detecting these mRNAs, but it seems necessary to further improve the detection sensitivity for small amounts of test material. (M.N.)

  12. Development of a photometric measuring method for soot analysis in flames. Final report; Entwicklung eines photometrischen Messverfahrens zur Russanalyse in Flammen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Weichert, R.; Niemann, J.

    1995-12-31

    The photometric measuring method developed for soot analysis in flames meets the following specifications: determination of the volume concentration of soot particles from 2 × 10^-7 upwards by means of extinction measurements at three different wavelengths; determination of the particle size distribution of soot particles by means of scattered-light measurements in the range between 20 and 400 nm; contactless measurement on the particle collective in the flame; and no need to calibrate the photometric measuring method with particles of known size or concentration. (orig./SR)

  13. Development of an analysis method of material flow cost accounting using lean technique in food production: A case study of Universal Food Public Co., Ltd. (UFC)

    Directory of Open Access Journals (Sweden)

    Wichai Chattinnawat

    2015-06-01

    Full Text Available This research aims to apply the Lean technique in conjunction with Material Flow Cost Accounting (MFCA) analysis to the production process of canned sweet corn in order to increase process efficiency, eliminate waste and reduce the cost of production. The research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, integrated cost allocation based on activity types results in a higher proportion of negative product cost than that computed from conventional MFCA cost allocation. Thus, considering the types of activities and process efficiency has a great impact on the cost structure, especially on the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases as the proportion of positive product cost rises.
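The basic MFCA allocation behind the "positive" and "negative" product costs above splits the input cost between product output and material loss in proportion to their mass flows. A minimal sketch; the cost and mass figures are illustrative assumptions, and real MFCA also allocates energy and system costs per quantity centre.

```python
def mfca_split(input_cost, product_mass, loss_mass):
    """Allocate total input cost between positive (product) and negative
    (material-loss) product cost in proportion to mass flow - the basic
    MFCA allocation rule."""
    total = product_mass + loss_mass
    positive = input_cost * product_mass / total
    negative = input_cost * loss_mass / total
    return positive, negative

# illustrative: 1000 kg of corn input, 200 kg lost as waste, cost 100,000
pos, neg = mfca_split(100_000.0, 800.0, 200.0)   # -> (80000.0, 20000.0)
```

The paper's refinement weights this allocation by whether activities are value-adding, which shifts more cost into the negative category than the plain mass-proportional rule shown here.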

  14. Development of a spatial analysis method using ground-based repeat photography to detect changes in the alpine treeline ecotone, Glacier National Park, Montana, U.S.A.

    Science.gov (United States)

    Roush, W.; Munroe, J.S.; Fagre, D.B.

    2007-01-01

    Repeat photography is a powerful tool for detection of landscape change over decadal timescales. Here a novel method is presented that applies spatial analysis software to digital photo-pairs, allowing vegetation change to be categorized and quantified. This method is applied to 12 sites within the alpine treeline ecotone of Glacier National Park, Montana, and is used to examine vegetation changes over timescales ranging from 71 to 93 years. Tree cover at the treeline ecotone increased in 10 out of the 12 photo-pairs (mean increase of 60%). Establishment occurred at all sites, infilling occurred at 11 sites. To demonstrate the utility of this method, patterns of tree establishment at treeline are described and the possible causes of changes within the treeline ecotone are discussed. Local factors undoubtedly affect the magnitude and type of the observed changes, however the ubiquity of the increase in tree cover implies a common forcing mechanism. Mean minimum summer temperatures have increased by 1.5°C over the past century and, coupled with variations in the amount of early spring snow water equivalent, likely account for much of the increase in tree cover at the treeline ecotone. Lastly, shortcomings of this method are presented along with possible solutions and areas for future research. © 2007 Regents of the University of Colorado.
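The quantification step in such a photo-pair analysis reduces to comparing two classified grids: each cell of the registered before/after images is labeled tree or non-tree, and percent change in cover is computed from the counts. The grids below are toy assumptions, not the study's data or its GIS workflow.

```python
def tree_cover_change(before, after):
    """Percent change in tree-covered cells between two classified grids.

    before/after -- equal-size 2D grids of booleans (True = tree cover),
    e.g. produced by classifying a registered repeat-photo pair.
    """
    n_before = sum(cell for row in before for cell in row)
    n_after = sum(cell for row in after for cell in row)
    return 100.0 * (n_after - n_before) / n_before

# toy 2x2 grids: one tree-covered cell becomes two
before = [[True, False], [False, False]]
after = [[True, True], [False, False]]
change = tree_cover_change(before, after)   # -> 100.0
```

Classifying which changed cells represent establishment (new patches) versus infilling (growth adjacent to existing cover) requires a further connectivity analysis on the same grids.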

  15. Strategic Options Development and Analysis

    Science.gov (United States)

    Ackermann, Fran; Eden, Colin

    Strategic Options Development and Analysis (SODA) enables a group or individual to construct a graphical representation of a problematic situation, and thus explore options and their ramifications with respect to a complex system of goals or objectives. In addition the method aims to help groups arrive at a negotiated agreement about how to act to resolve the situation. It is based upon the use of causal mapping - a formally constructed means-ends network - as representation form. Because the picture has been constructed using the natural language of the problem owners it becomes a model of the situation that is ‘owned' by those who define the problem. The use of formalities for the construction of the model makes it amenable to a range of analyses as well as encouraging reflection and a deeper understanding. These analyses can be used in a ‘rough and ready' manner by visual inspection or through the use of specialist causal mapping software (Decision Explorer). Each of the analyses helps a group or individual discover important features of the problem situation, and these features facilitate agreeing a good solution. The SODA process is aimed at helping a group learn about the situation they face before they reach agreements. Most significantly the exploration through the causal map leads to a higher probability of more creative solutions and promotes solutions that are more likely to be implemented because the problem construction process is wider and more likely to include richer social dimensions about the blockages to action and organizational change. The basic theories that inform SODA derive from cognitive psychology and social negotiation, where the model acts as a continuously changing representation of the problematic situation - changing as the views of a person or group shift through learning and exploration. This chapter, jointly written by two leading practitioner academics and the original developers of SODA, Colin Eden and Fran Ackermann

  16. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  17. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and to achieve this role libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  18. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. The analytic hierarchy process (AHP) has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...
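The computational core of AHP is extracting a priority (weight) vector from a pairwise-comparison matrix, conventionally as its principal eigenvector. A minimal sketch using power iteration; the 3x3 matrix below is built from assumed weights to be perfectly consistent, which real expert judgments rarely are (a consistency-ratio check would then be needed).

```python
def ahp_priorities(M, iters=200):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; the normalized result is the AHP
    priority (weight) vector."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # renormalize so the weights sum to 1
    return w

# illustrative, perfectly consistent matrix built from weights 0.5 : 0.3 : 0.2
true_w = [0.5, 0.3, 0.2]
M = [[wi / wj for wj in true_w] for wi in true_w]
w = ahp_priorities(M)   # recovers [0.5, 0.3, 0.2]
```

In a risk-analysis setting the three criteria might be, say, asset value, threat likelihood and vulnerability; the fuzzy comprehensive method mentioned in the title would then aggregate fuzzy ratings using these weights.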

  19. Semantic Web Application Development Method and Example Analysis

    Institute of Scientific and Technical Information of China (English)

    李新龙; 黄映辉

    2013-01-01

    With the continued development of semantic Web technology, more and more attention has been drawn to semantic Web applications. However, little research has been done on semantic Web applications domestically, and development methods are lacking. Through a study of semantic Web applications, combined with a comparative analysis against conventional Web applications, this paper gives the definition, architecture and development method of semantic Web applications, and describes in detail the structural characteristics and construction process of a semantic Web application built on a three-layer architecture of data layer, logic layer and presentation layer. The proposed development method is then verified by constructing an example semantic Web application, achieving the expected results.

  20. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis, and to the ideal wavelengths and sources needed for this analysis. It employs deep-ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide-bandgap semiconductor lasers, incoherent wide-bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three goals achieved by this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). The method can be used in microscopes or macroscopes to provide measurement of Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably below 260 nm. In some variations the method produces incoherent radiation, while in other implementations it produces laser radiation. In some variations this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. The instrument irradiates a sample with deep UV radiation and then uses an improved filter for separating the wavelengths to be detected, providing a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  1. Analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical foods using a zero control reference material (ZRM) as a method development tool.

    Science.gov (United States)

    Chase, G W; Eitenmiller, R R; Long, A R

    1999-01-01

    A liquid chromatographic method is described for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food. The vitamins are extracted in isopropyl alcohol and hexane-ethyl acetate without saponification and quantitated by normal-phase chromatography with fluorescence detection. All-rac-alpha-tocopheryl acetate and retinyl palmitate are chromatographed isocratically with a mobile phase of 0.5% (v/v) and 0.125% (v/v) isopropyl alcohol in hexane, respectively. Recovery studies performed on a medical food zero control reference material (ZRM) fortified with the analytes averaged 99.7% (n = 25) for retinyl palmitate and 101% (n = 25) for all-rac-alpha-tocopheryl acetate. Coefficients of variation were 0.87-2.63% for retinyl palmitate and 1.42-3.20% for all-rac-alpha-tocopheryl acetate. The method provides a rapid, specific, and easily controlled assay for analysis of vitamin A and vitamin E in medical foods. Use of chlorinated solvents is avoided. PMID:10232898

  2. Research on the Method of Big Data Analysis

    OpenAIRE

    Li, Z. H.; H.F. Qin

    2013-01-01

    With the development of society, relational databases face great opportunities and challenges; how to store and how to analyze big data have become hot issues. This article starts from traditional data analysis, surveys its current situation, and identifies the trends in data analysis. Big data faces many issues, such as architecture, analysis techniques, storage, privacy and security. Regarding methods of analysis, the article mainly introduces the structu...

  3. On Software Development of Characteristic Set Method

    Institute of Scientific and Technical Information of China (English)

    WU Yong-wei; WANG Ding-kang; YANG Hong; LIN Dong-dai

    2002-01-01

    The characteristic set method of polynomial equation solving has spread widely, and in recent years its implementation in software has attracted much attention. Several packages for the method have been implemented in computer algebra systems such as REDUCE and Maple. In order to improve the efficiency of the method, we have developed a computer algebra system, "ELIMINO", written in the C language and implemented on the Linux operating system on a PC. The authors wish to share with the reader the knowledge and experience gained in the design and development of a software package for the characteristic set method.
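
    The core operation underlying the characteristic set (Wu) method is pseudo-division of polynomials. The ELIMINO code itself is not shown in this record; the following is only an illustrative sketch, in Python rather than C, of the classical pseudo-remainder computation for univariate polynomials with integer coefficients (all names are hypothetical):

```python
def trim(p):
    """Drop leading zero coefficients (lists are highest degree first)."""
    while len(p) > 1 and p[0] == 0:
        p = p[1:]
    return p

def pseudo_rem(f, g):
    """Classical pseudo-remainder prem(f, g): repeatedly cancel the leading
    term of f using lc(g)*f - lc(f)*x^shift*g, so no fractions appear."""
    f, g = trim(list(f)), trim(list(g))
    deg = lambda p: len(p) - 1
    e = deg(f) - deg(g) + 1
    while deg(f) >= deg(g) and f != [0]:
        shift = deg(f) - deg(g)
        lf, lg = f[0], g[0]
        gs = g + [0] * shift          # g shifted up by x^shift
        f = trim([lg * a - lf * b for a, b in zip(f, gs)])
        e -= 1
    # classical normalization: multiply by lc(g)^e if steps remain unused
    return [g[0] ** e * c for c in f] if e > 0 else f
```

    For example, prem(x^2 + x + 1, 2x + 1) = 3, and the pseudo-remainder of an exact multiple of the divisor is zero.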

  4. Evaluation methods of SWOT analysis

    OpenAIRE

    VANĚK, Michal; Mikoláš, Milan; Žváková, Kateřina

    2012-01-01

    Strategic management is an integral part of top management. By formulating the right strategy and subsequently implementing it, a managed organization can gain and retain a competitive advantage. In order to fulfil this expectation, the strategy also has to be supported with relevant findings of the strategic analyses performed. The best known and probably the most common of these is the SWOT analysis. In practice, however, the analysis is often reduced to a mere presentation of influence factors, whic...

  5. Semiquantitative fluorescence method for bioconjugation analysis.

    Science.gov (United States)

    Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser

    2014-01-01

    Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytical assays, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details on a versatile method to confirm quantum dot-protein conjugation, including the required materials and instrumentation, in order to perform a step-by-step semiquantitative analysis of the bioconjugation efficiency using fluorescence plate readings. Although the protocols to confirm QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QD-protein bioconjugates. PMID:25103803

  6. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education.

    Science.gov (United States)

    Bland, Andrew J; Tobbell, Jane

    2015-11-01

    Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to pre-constructed categories imposed by some questionnaire and interview methods may provide only partial understanding. This is problematic for understanding the mechanisms of learning in simulation-based education, as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures the interaction of individuals within the simulation experience, which can be analysed through multiple lenses, including context, and through the lens of both researcher and learner. The study utilised a grounded theory approach involving 31 undergraduate third-year student nurses. Data were collected and analysed through non-participant observation, digital recordings of simulation activity, and focus group deconstruction of the recorded simulations by the participants and researcher. Focus group interviews enabled further clarification. The method revealed multiple levels of dynamic data, leading to the conclusion that, in order to better understand how students learn through social and active learning strategies, dynamic data are required that enable researchers and participants to unpack what is happening as it unfolds in action. PMID:26302649

  7. Safety relief valve alternate analysis method

    International Nuclear Information System (INIS)

    An experimental test program was started in the United States in 1976 to define and quantify Safety Relief Valve (SRV) phenomena in General Electric Mark I suppression chambers. The testing considered several discharge devices and was used to correlate SRV load prediction models. The program was funded by utilities with Mark I containments and has resulted in a detailed SRV load definition as a portion of the Mark I containment program Load Definition Report (LDR). The US Nuclear Regulatory Commission (USNRC) has reviewed and approved the LDR SRV load definition. In addition, the USNRC has permitted calibration of structural models used for predicting torus response to SRV loads; model calibration is subject to confirmatory in-plant testing. The SRV methodology given in the LDR requires that transient dynamic pressures be applied to a torus structural model that includes a fluid added-mass matrix. Preliminary evaluations of torus response have indicated order-of-magnitude conservatisms with respect to test results, which could result in unrealistic containment modifications. In addition, structural response trends observed in full-scale tests between cold pipe, first valve actuation and hot pipe, subsequent valve actuation conditions have not been duplicated using current analysis methods. It was suggested by others that an energy approach using current fluid models be utilized to define loads. An alternate SRV analysis method is defined to correct suppression chamber structural response to a level that permits economical but conservative design. Simple analogs are developed for the purpose of correcting the analytical response obtained from LDR analysis methods. The analogs evaluated considered forced-vibration and free-vibration structural response. The corrected response correlated well with in-plant test response. The correlation of the analytical model at test conditions permits application of the alternate analysis method at design conditions. (orig./HP)
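
    The LDR analogs themselves are not given in this record, but a forced-vibration analog rests on standard single-degree-of-freedom theory. As a minimal, purely illustrative sketch (the function name is hypothetical, not from the report), the steady-state dynamic amplification factor of a damped oscillator under harmonic forcing is:

```python
import math

def dynamic_amplification(freq_ratio, damping_ratio):
    """Steady-state dynamic amplification factor of a damped SDOF
    oscillator under harmonic forcing, where freq_ratio r is the
    forcing frequency divided by the natural frequency."""
    r, z = freq_ratio, damping_ratio
    return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (2.0 * z * r) ** 2)
```

    At zero forcing frequency the factor is 1 (static response), while at resonance with 5% damping it reaches 1/(2·0.05) = 10, which is why calibrating structural models against measured response matters.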

  8. Development and validation of automatic HS-SPME with a gas chromatography-ion trap/mass spectrometry method for analysis of volatiles in wines.

    Science.gov (United States)

    Paula Barros, Elisabete; Moreira, Nathalie; Elias Pereira, Giuliano; Leite, Selma Gomes Ferreira; Moraes Rezende, Claudia; Guedes de Pinho, Paula

    2012-11-15

    An automated headspace solid-phase microextraction (HS-SPME) method combined with gas chromatography-ion trap/mass spectrometry (GC-IT/MS) was developed in order to quantify a large number of volatile compounds in wines, such as alcohols, esters, norisoprenoids and terpenes. The procedure was optimized for SPME fiber selection, pre-incubation temperature and time, extraction temperature and time, and salt addition. A central composite experimental design was used in the optimization of the extraction conditions. The volatile compounds showed optimal extraction using a DVB/CAR/PDMS fiber, incubation of 5 ml of wine with 2 g NaCl at 45 °C for 5 min, and subsequent extraction for 30 min at the same temperature. The method allowed the identification of 64 volatile compounds. The method was then validated successfully for the most significant compounds and applied to study the volatile composition of different white wines. PMID:23158309
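
    The central composite design mentioned above has a standard construction: 2^k factorial corners at ±1, 2k axial (star) points at ±α, and replicated centre points, with α = (2^k)^(1/4) for rotatability. A hedged sketch in coded units (the function name and the default centre-point count are illustrative, not taken from the paper):

```python
from itertools import product

def central_composite(k, n_center=3, alpha=None):
    """Coded design points of a rotatable central composite design:
    2**k factorial corners, 2*k axial points at +/-alpha, and
    n_center replicated centre runs."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25          # rotatability criterion
    corners = [list(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            point = [0.0] * k
            point[i] = s
            axial.append(point)
    centre = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centre
```

    For two factors (e.g. extraction temperature and time), this yields 4 corners, 4 star points at ±√2, and the centre replicates, from which a quadratic response surface can be fitted.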

  9. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Full Text Available Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis relies on quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. In addition, the analytic hierarchy process has been widely used in security assessment. A future research direction may be the development and application of soft computing techniques such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks and hybrid models. Hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed; the approach presented combines AHP and the fuzzy comprehensive method.
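
    The AHP step of such an approach derives priority weights from a reciprocal pairwise comparison matrix. A minimal sketch using the row geometric mean approximation to the principal eigenvector; the example matrix (three hypothetical criteria, e.g. confidentiality, integrity, availability, compared on Saaty's 1-9 scale) is illustrative, not from the paper:

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix via the row geometric mean method."""
    n = len(pairwise)
    gmeans = []
    for row in pairwise:
        prod = 1.0
        for a in row:
            prod *= a
        gmeans.append(prod ** (1.0 / n))   # geometric mean of the row
    total = sum(gmeans)
    return [g / total for g in gmeans]     # normalize to sum to 1

# Hypothetical comparison matrix: criterion 1 moderately-to-strongly
# preferred over 2 and 3; reciprocals fill the lower triangle.
matrix = [[1.0, 3.0, 5.0],
          [1 / 3, 1.0, 3.0],
          [1 / 5, 1 / 3, 1.0]]
weights = ahp_weights(matrix)
```

    The resulting weights sum to one and preserve the preference ordering; in a fuzzy comprehensive evaluation they would then weight the membership vectors of each criterion.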

  10. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method whose order is equal to the smaller order of the two original methods.
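
    The stated result can be checked numerically: alternating a convergent order-1 method with an order-2 method should yield an observed order equal to the smaller of the two. A sketch under the assumption of explicit Euler and Heun steps applied to y' = y on [0, 1] (all function names are illustrative):

```python
import math

def combined_method(f, y0, t_end, n):
    """Alternate an explicit Euler step (order 1) with a Heun step
    (order 2); the theory predicts the combination converges at order 1."""
    h = t_end / n
    t, y = 0.0, y0
    for i in range(n):
        if i % 2 == 0:                      # Euler step
            y = y + h * f(t, y)
        else:                               # Heun (explicit trapezoidal) step
            k1 = f(t, y)
            k2 = f(t + h, y + h * k1)
            y = y + 0.5 * h * (k1 + k2)
        t += h
    return y

def observed_order(solver, f, y0, t_end, exact, n=200):
    """Estimate the convergence order from global errors at h and h/2."""
    e1 = abs(solver(f, y0, t_end, n) - exact)
    e2 = abs(solver(f, y0, t_end, 2 * n) - exact)
    return math.log2(e1 / e2)
```

    Halving the step size roughly halves the global error, so the estimated order comes out close to 1, the smaller of the two component orders.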

  11. Development of Safety Analysis Technology for LMR

    International Nuclear Information System (INIS)

    In the safety analysis code system development area, the development of an analysis code for flow blockage was brought to completion through an integrated validation of MATRA-LMR-FB. The SSC-K safety analysis code has evolved through the addition of detailed reactivity models and a three-dimensional core T/H model, and through the development of its Windows version. A basic analysis module for SFR features has also been developed, incorporating a numerical method, best-estimate correlations, and a code structure module. For the analysis of the HCDA initiating phase, a sodium boiling model to be linked to SSC-K and a fuel transient performance/cladding failure model have been developed, together with a state-of-the-art study on molten fuel movement models. Scoping analysis models for the post-accident heat removal phase have been developed as well. In the safety analysis area, the safety criteria for the KALIMER-600 have been set up, and an internal flow channel blockage and local faults have been analyzed for the assembly safety evaluation, while key safety concepts of the KALIMER-600 have been investigated through analyses of ATWS as well as design basis accidents such as TOP and LOF, from which the inherent safety due to core reactivity feedback has been assessed. The HCDA analysis for the initiating phase and an estimation of the core energy release have followed, along with the setup of safety criteria and a T/H analysis for the core catcher. The thermal-hydraulic behaviors and the released radioactivity sources and dose rates in the containment have been analyzed for its performance evaluation. Finally, the display of a database of research products on the KALIMER Website and detailed process planning with status analysis have been achieved in the integrated technology development area.

  12. Protein crystallography. Methodological development and comprehensive analysis

    International Nuclear Information System (INIS)

    There have been remarkable developments in the methodology for protein structure analysis over the past few decades. Currently, single-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-SAD) is used as a general method for determining protein structure, while the sulfur single-wavelength anomalous diffraction method (S-SAD) using native protein is evolving as a next-generation method. In this paper, we look back on the early applications of multi-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-MAD) and introduce the study of ribosomal proteins as an example of the comprehensive analysis that took place in the 1990s. Furthermore, we refer to the current state of development of the S-SAD method as well as automatic structure determination. (author)

  13. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  14. Development of micro-PIXE method for geochemical analysis. Quantitative trace element analyses of minerals and single fluid inclusions by micro-PIXE

    International Nuclear Information System (INIS)

    This work was performed by the University of Tsukuba under contract with the Japan Nuclear Cycle Development Institute. In this study, quantitative trace element analytical techniques for minerals and single fluid inclusions by micro-PIXE have been developed to investigate the behavior of radioactive nuclides in rocks and the chemical composition of deep underground water retained in minerals as fluid inclusions. Using the developed methods, trace elements in glasses and minerals were determined with an average estimated relative error of ±10%. For single fluid inclusions, elements with concentrations of 10 to 1000 ppm were measured with an average estimated relative error of ±7%. For natural fluid inclusions with 30 μm radius and 20 μm depth in quartz, the total analytical errors were estimated to be ±40% relative for Ca, ±16% for Fe, ±13% for Zn, ±12% for Sr, and ±11% for Br and Rb, by considering uncertainties in microscopic measurements of inclusion depths. Detection limits of 4 to 46 ppm for elements of mass numbers 25-50 were achieved for analyses of a spherical fluid inclusion with 30-μm radius and 20-μm depth in quartz at an integrated charge of 1.0 μC. The trace element compositions of single fluid inclusions in hydrothermal quartz veins from Nagano and Nagasaki Prefectures were also determined using the developed method. High concentrations (wt.%) of Ca and Fe, and tens to thousands of ppm of Mn, Zn, Cu, Br, Rb, Sr, Pb, and Ge, were observed in the fluid inclusions, indicating higher metal contents in hydrothermal fluid released from granite. (author)

  15. Relativity Concept Inventory: Development, Analysis, and Results

    Science.gov (United States)

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  16. Developments of analysis method for tobacco flavors%烟用增香物质的分析技术进展

    Institute of Scientific and Technical Information of China (English)

    任志芹; 艾小勇; 张元; 王志; 付体鹏; 许成保; 张峰

    2014-01-01

    Public concern over the various flavors added to cigarettes has been increasing during the past few years. Tobacco flavors are indispensable in cigarette production and are also an important factor affecting the taste of a cigarette. Some of the flavors added to cigarettes have been identified as physiologically toxic. However, tobacco flavors are composed of complex chemicals, and for the most part different analytical methods are selected according to the varied nature of the flavors. In this paper, pretreatment, separation and detection methods for tobacco flavors are reviewed, and some new analytical methods are also briefly introduced. To date, mass spectrometry is the most popular method for analyzing the various flavors added to cigarettes because of its high sensitivity and reproducibility.

  17. Analysis of core-melt states for the development of detection methods for filling level change and deformation of the core in PWR-type reactors

    International Nuclear Information System (INIS)

    The project ''noninvasive status monitoring of nuclear reactors for detection of filling level changes and core deformation'' (NIZUK) aims to develop a measuring system for diagnosing the core status during severe accidents in PWR-type reactors. For the development of an appropriate measuring technology, knowledge of the processes during the in-vessel phase of the accident sequence is of primary importance. Based on an analysis of the accident sequence, nine in-vessel phases were defined that form the basis for the development of the measuring system. The differences between the individual core-melt states include different core geometries and a varying gamma radiation distribution at the outer surface of the reactor pressure vessel. In particular, the appearance of local flow-off paths during a late in-vessel phase requires that several measuring probes with gamma radiation sensors be installed around the reactor pressure vessel in order to detect the gamma radiation distribution at the outside. The definition of further core-melt states would be possible in the case of a re-flooding of the reactor pressure vessel; however, the increasing filling level would not significantly change the core deformation or the external gamma distribution.

  18. Developing a TQM quality management method model

    OpenAIRE

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This model describes the primary quality management methods which may be used to assess an organization's present strengths and weaknesses with regard to its use of quality management methods. This model ...

  19. Metallurgical and chemical characterization of copper alloy reference materials within laser ablation inductively coupled plasma mass spectrometry: Method development for minimally-invasive analysis of ancient bronze objects

    Energy Technology Data Exchange (ETDEWEB)

    Walaszek, Damian, E-mail: damian.walaszek@empa.ch [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Senn, Marianne [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Faller, Markus [Laboratory for Jointing Technology and Corrosion, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Philippe, Laetitia [Laboratory for Mechanics of Materials and Nanostructures, Swiss Federal Laboratories for Materials Science and Technology, Feuerwerkstrasse 39, CH-3602 Thun (Switzerland); Wagner, Barbara; Bulska, Ewa [University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Ulrich, Andrea [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland)

    2013-01-01

    The chemical composition of ancient metal objects provides important information for manufacturing studies and authenticity verification of ancient copper or bronze artifacts. Non- or minimally destructive analytical methods are preferred to mitigate visible damage. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) enables the determination of major elements as well as impurities down to low ppm levels; however, the accuracy and precision of the analysis strongly depend on the homogeneity of the reference materials used for calibration. Moreover, appropriate analytical procedures are required, e.g. in terms of ablation strategies (scan mode, spot size, etc.). This study reviews available copper alloy (certified) reference materials — (C)RMs — from different sources and contributes new metallurgical data on homogeneity and spatial elemental distribution. Investigations of the standards were performed by optical and scanning electron microscopy with X-ray spectrometry (SEM-EDX) for the following copper alloy and bronze (certified) reference materials: NIST 454, BAM 374, BAM 211, BAM 227, BAM 374, BAM 378, BAS 50.01-2, BAS 50.03-4, and BAS 50.04-4. Additionally, the influence of inhomogeneities on different ablation and calibration strategies is evaluated to define an optimum analytical strategy in terms of line scan versus single spot ablation, variation of spot size, and selection of the most appropriate RMs or minimum number of calibration reference materials. - Highlights: ► New metallographic data for copper alloy reference materials are provided. ► Influence of RM homogeneity on the quality of LA-ICPMS analysis was evaluated. ► Ablation and calibration strategies were critically discussed. ► An LA-ICPMS method is proposed for analyzing the most typical ancient copper alloys.

  20. Metallurgical and chemical characterization of copper alloy reference materials within laser ablation inductively coupled plasma mass spectrometry: Method development for minimally-invasive analysis of ancient bronze objects

    International Nuclear Information System (INIS)

    The chemical composition of ancient metal objects provides important information for manufacturing studies and authenticity verification of ancient copper or bronze artifacts. Non- or minimally destructive analytical methods are preferred to mitigate visible damage. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) enables the determination of major elements as well as impurities down to low ppm levels; however, the accuracy and precision of the analysis strongly depend on the homogeneity of the reference materials used for calibration. Moreover, appropriate analytical procedures are required, e.g. in terms of ablation strategies (scan mode, spot size, etc.). This study reviews available copper alloy (certified) reference materials — (C)RMs — from different sources and contributes new metallurgical data on homogeneity and spatial elemental distribution. Investigations of the standards were performed by optical and scanning electron microscopy with X-ray spectrometry (SEM-EDX) for the following copper alloy and bronze (certified) reference materials: NIST 454, BAM 374, BAM 211, BAM 227, BAM 374, BAM 378, BAS 50.01-2, BAS 50.03-4, and BAS 50.04-4. Additionally, the influence of inhomogeneities on different ablation and calibration strategies is evaluated to define an optimum analytical strategy in terms of line scan versus single spot ablation, variation of spot size, and selection of the most appropriate RMs or minimum number of calibration reference materials. - Highlights: ► New metallographic data for copper alloy reference materials are provided. ► Influence of RM homogeneity on the quality of LA-ICPMS analysis was evaluated. ► Ablation and calibration strategies were critically discussed. ► An LA-ICPMS method is proposed for analyzing the most typical ancient copper alloys

  1. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    International Nuclear Information System (INIS)

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of the schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz. a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences, and refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO43-) is consistent with the requirements of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
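
    A detection limit such as the 0.007 μM quoted above is conventionally derived from the calibration slope and the blank noise, e.g. LOD = 3·s(blank)/slope. A hedged sketch of that calculation on synthetic data (the numbers and helper names are illustrative, not the paper's):

```python
import statistics

def calibration_slope(conc, signal):
    """Ordinary least-squares slope and intercept of signal vs concentration."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sxy / sxx
    return slope, my - slope * mx

def detection_limit(blank_signals, slope, k=3.0):
    """LOD as k times the blank standard deviation divided by the slope."""
    return k * statistics.stdev(blank_signals) / slope

# Synthetic calibration standards and blank replicates (illustrative):
slope, intercept = calibration_slope([0, 1, 2, 3, 4], [0.1, 2.1, 4.1, 6.1, 8.1])
lod = detection_limit([0.10, 0.12, 0.08], slope)
```

    A steeper calibration slope or quieter blank lowers the detection limit, which is one motivation for comparing the two flow-cell geometries.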

  2. Development of instrumental methods of analysis of sulfur compounds in coal process streams. Twelfth quarterly technical progress report, July-September 1980

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J.

    1980-10-01

    In the extension and refinement of the thermodynamic survey and the construction of Pourbaix diagrams, when warranted by slow kinetics, specified moieties can appropriately be eliminated from consideration while assuming attainment of equilibrium between all other species. In the voltammetric methods development, it was found that tetrathionate can be quantitated by differential pulse polarography at the dropping mercury electrode. In the enthalpimetric methods development, sulfide can be determined by titration with standard thallous nitrate to a thermometric endpoint. The precipitated Tl2S does not interfere in the subsequent acidimetric titration of Bronsted-base moieties, such as sulfite, carbonate and hydroxide. Generally, liquefaction by-product water samples are characterized by more negative redox potentials than their gasification counterparts, reflecting a preponderance of sulfidic sulfur versus thiosulfate; in gasification by-product waters, the situation is reversed. Cartesian displays of pH-redox potential correlations were constructed, revealing distinct patterns which clearly illustrate the difference between liquefaction and gasification by-product water samples.

  3. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)

    2011-09-02

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with dual analytical line was developed and applied in the comparison of two different detection systems viz; a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO43-) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.

  4. Development, optimization and validation of an HPLC-ELSD method for the analysis of enzymatically generated lactulose and saccharide by-products.

    Science.gov (United States)

    Schmidt, Christian M; Zürn, Tanja; Thienel, Katharina J F; Hinrichs, Jörg

    2017-01-15

    The aim of this study was to develop an HPLC-ELSD method for the quantification of lactulose in complex sugar solutions. Lactulose is a well-known prebiotic and supports the alleviation of digestive disorders. The enzymatic generation of lactulose requires fructose as nucleophilic acceptor. By-products such as glucose and galactose are generated. Four amino-modified silica columns were tested and compared. The most suitable column based on peak resolution was used to optimize the method. Furthermore, sample preparation was optimized for the recovery of analytes. During the validation step, the following parameters were determined (e.g. for lactulose): recovery (106 ± 7%), precision (98%), correctness (99%), limit of detection (3.9 mg/L), limit of quantification (13.4 mg/L) and linearity (0.993). The validated method was applied to samples from an enzymatic process for the production of lactulose at the laboratory scale. A final lactulose concentration of 6.7 ± 0.4 g/L was determined. PMID:27542485
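
    Validation figures like the 3.9 mg/L LOD and 13.4 mg/L LOQ are typically derived from the calibration curve. A minimal sketch of the common ICH-style estimate follows (sigma taken as the residual standard deviation of the linear regression; the standard concentrations and peak areas below are invented for illustration, not the paper's data):

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH Q2(R1)-style estimates: sigma = residual standard deviation of
    a linear calibration; LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope.
    (A sketch only -- the paper's exact validation protocol is not given.)"""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical lactulose standards (mg/L) and ELSD peak areas:
conc = np.array([10, 25, 50, 100, 200, 400], float)
area = np.array([1.1e4, 2.6e4, 5.2e4, 1.05e5, 2.08e5, 4.15e5])
lod, loq = lod_loq(conc, area)
```

    By construction LOQ/LOD is always 10/3.3, so the two figures rise and fall together with calibration scatter.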

  5. An investigation of thermo-compressor design by analysis and experiment: Part 2. Development of design method by using comprehensive characteristic curves

    International Nuclear Information System (INIS)

    Highlights: ► Introducing two non-dimensional geometrical variables in the thermo-compressor. ► Non-dimensional volume of the mixing zone has not been investigated previously. ► Producing comprehensive graphs for design purposes on the basis of these two variables. ► Generating a correlation to express the performance as a function of these variables. ► Developing the design method of thermo-compressors for obtaining an optimized design. - Abstract: In the first part of this study, the numerical evaluation of the thermo-compressor performance was validated against experimental data and a precise CFD method for studying the internal flow was presented. The reliable simulation approach enables us to introduce a design methodology for geometry selection for a thermo-compressor under known operating conditions. In the current study, some important shape parameters of a conventional thermo-compressor are introduced and the influence of these parameters on the overall performance is investigated. In a further step, a wide range of geometrical parameters is explored using the validated numerical method, and a collection of design curves is produced. Finally, a practical relationship between characteristic parameters and non-dimensional geometrical parameters is presented. The objective of the present work is to develop design diagrams in order to establish a robust procedure for designing thermo-compressors, with the facilities one should expect from a standalone group of design charts. If the results of this procedure are carefully implemented to select shape parameters, a remarkable improvement in the overall performance of a thermo-compressor can be achieved.

  6. Selective spectroscopic methods for water analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, B.

    1997-06-24

    This dissertation explores in large part the development of a few types of spectroscopic methods in the analysis of water. Methods for the determination of some of the most important properties of water like pH, metal ion content, and chemical oxygen demand are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  7. Development and Validation of a Standardized Method for Contouring the Brachial Plexus: Preliminary Dosimetric Analysis Among Patients Treated With IMRT for Head-and-Neck Cancer

    International Nuclear Information System (INIS)

    Purpose: Although Radiation Therapy Oncology Group protocols have proposed a limiting dose to the brachial plexus for patients undergoing intensity-modulated radiotherapy for head-and-neck cancer, essentially no recommendations exist for the delineation of this structure for treatment planning. Methods and Materials: Using anatomic texts, radiologic data, and magnetic resonance imaging, a standardized method for delineating the brachial plexus on 3-mm axial computed tomography images was devised. A neuroradiologist assisted with identification of the brachial plexus and adjacent structures. This organ at risk was then contoured on 10 consecutive patients undergoing intensity-modulated radiotherapy for head-and-neck cancer. Dose-volume histogram curves were generated by applying the proposed brachial plexus contour to the initial treatment plan. Results: The total dose to the planning target volume ranged from 60 to 70 Gy (median, 70). The mean brachial plexus volume was 33 ± 4 cm3 (range, 25.1-39.4). The mean brachial plexus volumes receiving 50, 60, 66, and 70 Gy were 17 ± 3 cm3, 6 ± 3 cm3, 2 ± 1 cm3, and 0 ± 1 cm3, respectively. The maximal dose to the brachial plexus was 69.9 Gy (range, 62.3-76.9) and was ≥60 Gy, ≥66 Gy, and ≥70 Gy in 100%, 70%, and 30% of patients, respectively. Conclusions: This technique provides a precise and accurate method for delineating the brachial plexus organ at risk on treatment planning computed tomography scans. Our dosimetric analysis suggests that for patients undergoing intensity-modulated radiotherapy for head-and-neck cancer, the brachial plexus routinely receives doses in excess of historic and Radiation Therapy Oncology Group limits.
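
    The irradiated-volume figures above come from cumulative dose-volume histograms. As a generic illustration (the voxel doses and voxel volume below are synthetic, not patient data, and the planning system's own DVH algorithm is not described in the abstract), the volume receiving at least each threshold dose can be obtained by counting voxels:

```python
import numpy as np

def dvh_metrics(voxel_doses_gy, voxel_volume_cm3, thresholds_gy):
    """Cumulative DVH points for an organ at risk: absolute volume (cm3)
    receiving at least each threshold dose, plus the maximum voxel dose."""
    doses = np.asarray(voxel_doses_gy, float)
    volumes = {d: float((doses >= d).sum() * voxel_volume_cm3)
               for d in thresholds_gy}
    return volumes, float(doses.max())

# Synthetic brachial-plexus voxel doses (Gy) on a 3-mm grid (0.027 cm3 voxels):
rng = np.random.default_rng(0)
doses = rng.uniform(20, 72, size=1200)
vols, dmax = dvh_metrics(doses, 0.027, [50, 60, 66, 70])
```

    The cumulative curve is monotonically non-increasing, so V50 ≥ V60 ≥ V66 ≥ V70 always holds.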

  8. Analysis methods for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  9. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    覃卓敏

    2008-01-01

    From the macroscopic angle of function, methods of discourse analysis are classified in order to single out two important methods from pragmatics, through which discourse can be better understood.

  10. Statistical analysis and optimization methods

    Czech Academy of Sciences Publication Activity Database

    Halámek, Josef; Holík, M.; Jurák, Pavel; Kasal, Miroslav

    Liptovský Mikuláš : Vojenská Akadémia FZV, 2002 - (Puttera, J.), s. 286 - 289 ISBN 80-8040-180-2. [KTERP. Tatranské Zruby (SK), 24.04.2002-26.04.2002] R&D Projects: GA ČR GA102/02/1339 Institutional research plan: CEZ:AV0Z2065902 Keywords : scatter plot * confidence ellipses * graphical method Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering

  11. Method development for the determination of bromine in coal using high-resolution continuum source graphite furnace molecular absorption spectrometry and direct solid sample analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Éderson R.; Castilho, Ivan N.B. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Welz, Bernhard, E-mail: w.bernardo@terra.com.br [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Gois, Jefferson S. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Borges, Daniel L.G. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Carasek, Eduardo [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Andrade, Jailson B. de [Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil)

    2014-06-01

    This work reports a simple approach for Br determination in coal using direct solid sample analysis in a graphite tube furnace and high-resolution continuum source molecular absorption spectrometry. The molecular absorbance of the calcium mono-bromide (CaBr) molecule has been measured using the rotational line at 625.315 nm. Different chemical modifiers (zirconium, ruthenium, palladium and a mixture of palladium and magnesium nitrates) have been evaluated in order to increase the sensitivity of the CaBr absorption, and Zr showed the best overall performance. The pyrolysis and vaporization temperatures were 800 °C and 2200 °C, respectively. Accuracy and precision of the method have been evaluated using certified coal reference materials (BCR 181, BCR 182, NIST 1630a, and NIST 1632b) with good agreement (between 98 and 103%) with the informed values for Br. The detection limit was around 4 ng Br, which corresponds to about 1.5 μg g⁻¹ Br in coal, based on a sample mass of 3 mg. In addition, the results were in agreement with those obtained using electrothermal vaporization inductively coupled plasma mass spectrometry, based on a Student t-test at a 95% confidence level. A mechanism for the formation of the CaBr molecule is proposed, which might be considered for other diatomic molecules as well. - Highlights: • Bromine has been determined in coal using direct solid sample analysis. • Calibration has been carried out against aqueous standard solutions. • The coal samples and the molecule-forming reagent have been separated in order to avoid interferences. • The results make it possible to draw conclusions about the mechanisms of molecule formation.
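
    The conversion from the absolute detection limit (4 ng Br on a 3 mg sample) to a concentration limit is simple arithmetic, sketched here; 4/3 gives about 1.3 µg g⁻¹, so the abstract's "about 1.5 μg g⁻¹" presumably reflects a slightly higher effective LOD than the rounded 4 ng figure:

```python
def absolute_to_relative_lod(lod_ng, sample_mass_mg):
    """Convert an absolute detection limit (ng of analyte) into a
    concentration limit (ug/g) for a given solid sample mass, as in the
    abstract's 4 ng Br / 3 mg coal example."""
    return (lod_ng / 1000) / (sample_mass_mg / 1000)  # ug divided by g

print(absolute_to_relative_lod(4, 3))  # ~1.3 ug/g; abstract quotes "about 1.5"
```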

  12. Method development for the determination of bromine in coal using high-resolution continuum source graphite furnace molecular absorption spectrometry and direct solid sample analysis

    International Nuclear Information System (INIS)

    This work reports a simple approach for Br determination in coal using direct solid sample analysis in a graphite tube furnace and high-resolution continuum source molecular absorption spectrometry. The molecular absorbance of the calcium mono-bromide (CaBr) molecule has been measured using the rotational line at 625.315 nm. Different chemical modifiers (zirconium, ruthenium, palladium and a mixture of palladium and magnesium nitrates) have been evaluated in order to increase the sensitivity of the CaBr absorption, and Zr showed the best overall performance. The pyrolysis and vaporization temperatures were 800 °C and 2200 °C, respectively. Accuracy and precision of the method have been evaluated using certified coal reference materials (BCR 181, BCR 182, NIST 1630a, and NIST 1632b) with good agreement (between 98 and 103%) with the informed values for Br. The detection limit was around 4 ng Br, which corresponds to about 1.5 μg g⁻¹ Br in coal, based on a sample mass of 3 mg. In addition, the results were in agreement with those obtained using electrothermal vaporization inductively coupled plasma mass spectrometry, based on a Student t-test at a 95% confidence level. A mechanism for the formation of the CaBr molecule is proposed, which might be considered for other diatomic molecules as well. - Highlights: • Bromine has been determined in coal using direct solid sample analysis. • Calibration has been carried out against aqueous standard solutions. • The coal samples and the molecule-forming reagent have been separated in order to avoid interferences. • The results make it possible to draw conclusions about the mechanisms of molecule formation.

  13. Development of a nondestructive method for underglaze painted tiles--demonstrated by the analysis of Persian objects from the nineteenth century.

    Science.gov (United States)

    Reiche, Ina; Röhrs, Stefan; Salomon, Joseph; Kanngiesser, Birgit; Höhn, Yvonne; Malzer, Wolfgang; Voigt, Friederike

    2009-02-01

    The paper presents an analytical method developed for the nondestructive study of nineteenth-century Persian polychrome underglaze painted tiles. As an example, 9 tiles from French and German museum collections were investigated. Before this work was undertaken, little was known about the materials used in pottery at that time, although the broad range of colors and shades, together with their brilliant glazes, made these objects stand out when compared with Iranian ceramics of the preceding periods and suggested the use of new pigments, colorants, and glaze compositions. These materials are thought to be related to provenance and as such are appropriate criteria for art-historical attribution. The analytical method is based on the combination of different nondestructive spectroscopic techniques using microfocused beams, such as proton-induced X-ray emission/proton-induced gamma-ray emission, X-ray fluorescence, 3D X-ray absorption near edge structure, and confocal Raman spectroscopy, as well as visible spectroscopy. It was established to address the specific difficulties these objects and the technique of underglaze painting raise. The exact definition of the colors observed on the tiles using the Natural Color System helped to attribute them to different colorants. It was possible to establish the presence of Cr- and U-based colorants as new materials in nineteenth-century Persian tilemaking. The difference in glaze composition (Pb, Sn, Na, and K contents) as well as the use of B and Sn were identified as potential markers for different workshops. PMID:19030848

  14. Trace elements in e-liquids - Development and validation of an ICP-MS method for the analysis of electronic cigarette refills.

    Science.gov (United States)

    Beauval, N; Howsam, M; Antherieu, S; Allorge, D; Soyez, M; Garçon, G; Goossens, J F; Lo-Guidice, J M; Garat, A

    2016-08-01

    Electronic cigarette use has rapidly increased in recent years. In assessing their safety, and in view of coming regulations, trace elements (TE) are among the potentially toxic compounds required to be evaluated in electronic cigarette refill fluids ("e-liquids"). An analytical method using inductively coupled plasma with mass spectrometric detection (ICP-MS) was developed and rigorously validated in order to determine concentrations of 15 TE in 54 e-liquids from a French brand. Despite a significant matrix effect from the main e-liquid constituents, and difficulties related to the current lack of reference materials, our method demonstrated satisfactory linearity, precision and robustness, and permitted the quantification of low concentrations of these 15 elements: lower limits of quantification (LLQ) obtained were ≤4 ppb for all elements except Ni, Cu and Zn (16 ppb, 20 ppb and 200 ppb, respectively). TE concentrations in all tested samples could thus be assessed against prospective demand in light of current or future regulations. PMID:27058761

  15. Development of an improved method for trace analysis of quinolones in eggs of laying hens and wildlife species using molecularly imprinted polymers.

    Science.gov (United States)

    Blasco, Cristina; Picó, Yolanda

    2012-11-01

    A sensitive, selective, and efficient method was developed for simultaneous determination of 11 fluoroquinolones (FQs), ciprofloxacin, danofloxacin, difloxacin, enrofloxacin, flumequine, marbofloxacin, norfloxacin, ofloxacin, oxolinic acid, pipemidic acid, and sarafloxacin, in eggs by molecularly imprinted polymer (MIP) and column liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS). Samples were diluted with 50 mM sodium dihydrogen phosphate at pH 7.4, followed by purification with a commercial MIP (SupelMIP SPE-Fluoroquinolones). Recoveries for the 11 quinolones were in the range of 90-106% with intra- and interday relative standard deviations ranging from 1 to 6% and from 3 to 8%, respectively. Limits of detection (LODs) were 0.12-0.85 ng/g, and limits of quantification (LOQs) were 0.36 and 2.59 ng/g, whereas the decision limit (CC(α)) and detection capability (CC(β)) ranged from 0.46 to 3.35 ng/g and from 0.59 to 4.12 ng/g, respectively. The calculated relevant validation parameters are in an acceptable range and in compliance with the requirements of Commission Decision 2002/657/EC. Moreover, a comparison to two other sample treatments [solid-phase extraction (SPE) and solvent extraction] has been carried out. The method was applied to laying hen, Japanese quail, and black-headed gull eggs, in which FQs were not found. The method was also applied to study the depletion of sarafloxacin in eggs. PMID:23009602

  16. Development of a sensitive and reliable high performance liquid chromatography method with fluorescence detection for high-throughput analysis of multi-class mycotoxins in Coix seed.

    Science.gov (United States)

    Kong, Wei-Jun; Li, Jun-Yuan; Qiu, Feng; Wei, Jian-He; Xiao, Xiao-He; Zheng, Yuguo; Yang, Mei-Hua

    2013-10-17

    As an edible and medicinal plant, Coix seed is readily contaminated by more than one group of mycotoxins, resulting in potential risk to human health. A reliable and sensitive method has been developed to determine seven mycotoxins (aflatoxins B1, B2, G1, G2, zearalenone, α-zearalenol, and β-zearalenol) simultaneously in 10 batches of Coix seed marketed in China. The method is based on a rapid ultrasound-assisted solid-liquid extraction (USLE) using methanol/water (80/20) followed by immunoaffinity column (IAC) clean-up, on-line photochemical derivatization (PCD), and high performance liquid chromatography coupled with fluorescence detection (HPLC-FLD). Careful optimization of extraction, clean-up, separation and detection conditions was accomplished to increase sample throughput and to attain rapid separation and sensitive detection. Method validation was performed by analyzing samples spiked at three different concentrations for the seven mycotoxins. Recoveries were from 73.5% to 107.3%, with relative standard deviations (RSDs) lower than 7.7%. The intra- and inter-day precisions, expressed as RSDs, were lower than 4% for all studied analytes. Limits of detection and quantification ranged from 0.01 to 50.2 μg kg⁻¹, and from 0.04 to 125.5 μg kg⁻¹, respectively, which were below the tolerance levels for mycotoxins set by the European Union. Samples that tested positive were further analyzed by HPLC tandem electrospray ionization mass spectrometry for confirmatory purposes. This is the first application of USLE-IAC-HPLC-PCD-FLD for detecting the occurrence of multi-class mycotoxins in Coix seed. PMID:24091376

  17. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  18. Development of a Chemoenzymatic-like and Photoswitchable Method for the High-Throughput creation of Protein Microarrays. Application to the Analysis of the Protein/Protein Interactions Involved in the YOP Virulon from Yersinia pestis.

    Energy Technology Data Exchange (ETDEWEB)

    Camarero, J A

    2006-12-07

    Protein arrays are ideal tools for the rapid analysis of whole proteomes as well as for the development of reliable and cheap biosensors. The objective of this proposal is to develop a new ligand-assisted ligation method based on the naturally occurring protein trans-splicing process. This method has been used for the generation of spatially addressable arrays of multiple protein components by standard micro-lithographic techniques. Key to our approach is the use of the protein trans-splicing process. This naturally occurring process allows the development of a truly generic and highly efficient method for the covalent attachment of proteins through their C-terminus to any solid support. This technology has been used for the creation of protein chips containing several virulence factors from the human pathogen Y. pestis.

  19. Analysis and development of methods for the recovery of degraded tri-n-butyl phosphate (TBP)-30%V/V-dodecane

    International Nuclear Information System (INIS)

    Tri-n-butyl phosphate, associated with an inert hydrocarbon, is currently the principal solvent used in the reprocessing of irradiated nuclear fuel from pressurized water reactors. The combined action of radiation and nitric acid causes severe damage to the solvent during the reprocessing steps. The recovery of the solvent is therefore important, since it decreases the amount of waste and improves the economy of the process. A comparative analysis of several methods for the recovery of this solvent was carried out: alkaline washing, adsorption with resins, adsorption with aluminium oxide, adsorption by activated carbon, and adsorption by vermiculite. Some modifications of the analytical test with 95Zr were made, and two new parameters were defined mathematically: the degradation grade and the efficiency of recovery. Through this modified 95Zr test, the residence time and the ratio of degraded solvent to recuperator were determined. After the laboratory tests had been performed, vermiculite, associated with activated carbon, was employed in the treatment of 50 liters of tri-n-butyl phosphate (30% V/V)-dodecane degraded by hydrolysis. Subsequent analyses were made to assess the potential of these solids for the recovery of this solvent. (Author)

  20. Reconfigurability Analysis Method for Spacecraft Autonomous Control

    OpenAIRE

    Dayi Wang; Chengrui Liu

    2014-01-01

    As a critical requirement for spacecraft autonomous control, reconfigurability should be considered in design stage of spacecrafts by involving effective reconfigurability analysis method in guiding system designs. In this paper, a novel reconfigurability analysis method is proposed for spacecraft design. First, some basic definitions regarding spacecraft reconfigurability are given. Then, based on function tree theory, a reconfigurability modeling approach is established to properly describe...

  1. The Qualitative Method of Impact Analysis.

    Science.gov (United States)

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  2. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov

    2014-09-01

    The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of method application, the following criteria have been proposed: volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. A classification of vibration diagnostics methods according to these criteria is presented in order to determine their advantages and disadvantages and to identify directions for their development and improvement. The paper contains a comparative estimation of the methods in accordance with the proposed criteria. According to this estimation, the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
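
    The envelope-spectrum technique that the comparison ranks most efficient can be sketched in a few lines: take the magnitude of the analytic signal (Hilbert transform) and inspect the spectrum of that envelope, which exposes low-frequency fault modulation. The synthetic amplitude-modulated signal below stands in for a measured bearing vibration:

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(signal, fs):
    """Spectral analysis of a vibration signal's envelope: the envelope is
    the magnitude of the analytic signal, and its FFT reveals modulation
    frequencies. A minimal sketch of the general technique."""
    envelope = np.abs(hilbert(signal))
    envelope -= envelope.mean()                      # drop the DC component
    spectrum = np.abs(np.fft.rfft(envelope)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs, spectrum

# Synthetic bearing-style signal: 1 kHz carrier amplitude-modulated at 30 Hz.
fs = 8000
t = np.arange(0, 1, 1 / fs)
x = (1 + 0.5 * np.cos(2 * np.pi * 30 * t)) * np.sin(2 * np.pi * 1000 * t)
freqs, spec = envelope_spectrum(x, fs)
peak_hz = freqs[np.argmax(spec)]                     # dominant line near 30 Hz
```

    The direct spectrum of x would show energy at 970/1000/1030 Hz; the envelope spectrum instead shows the 30 Hz modulation, which is what carries the diagnostic information.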

  3. Developing numerical methods for experimental data processing

    International Nuclear Information System (INIS)

    Materials studies imply experimental measurements, the results of which are always affected by noise. To perform numerical data processing, for instance numerical derivation or preparatory smoothing, it is necessary to avoid instabilities. This implies extracting the noise from the experimental data. When a great amount of data can be obtained, many of the noise-related problems can be solved by using statistical indicators. In the case of high-cost experiments or problems of a unique type, the task of extracting useful information on given material parameters is of paramount significance. The paper presents several numerical methods for processing experimental data developed at INR Pitesti. These were employed in treating the experimental data obtained in nuclear materials studies aimed at materials characterization and fabrication technology development. To refine and determine the accuracy of the real experimental data processing methods, computerized simulations were largely used. These methods refer to the transfer relations for important statistical indicators in the case of mediated measurements, to increasing the resolution of measurements carried out with linear detectors, and to numerical smoothing of experimental data. A figure is given with results obtained by applying the numerical smoothing method to experimental data from X-ray diffraction measurements on Zircaloy-4. The numerical methods developed were applied in studies of the structural materials used in the CANDU 600 reactor and of advanced CANDU-type fuels, as well as of natural uranium, thorium, and thorium-uranium fuel pellets. These methods helped in increasing the measurements' accuracy and confidence level.
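
    The report does not give the INR Pitesti smoothing algorithm itself. As an illustration of why smoothing must precede numerical derivation, a Savitzky-Golay filter (a standard choice, assumed here rather than taken from the report) fits a low-order polynomial in a sliding window and can return either the smoothed signal or its derivative directly:

```python
import numpy as np
from scipy.signal import savgol_filter

# Noise must be suppressed before numerical differentiation, otherwise the
# derivative amplifies it. Synthetic "measurement": a sine curve plus noise.
x = np.linspace(0, 2 * np.pi, 200)
noisy = np.sin(x) + np.random.default_rng(1).normal(0, 0.05, x.size)

# Cubic polynomial fitted in a 21-point sliding window:
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)
# The same filter can differentiate directly (deriv=1), using the grid step:
derivative = savgol_filter(noisy, 21, 3, deriv=1, delta=x[1] - x[0])
```

    The smoothed curve should track sin(x) closely and the derivative should track cos(x), where naive finite differences of the noisy data would be dominated by noise.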

  4. Exploration of Enterprise Development Strategy Analysis Method Based on Enterprise Value System Methodology

    Institute of Scientific and Technical Information of China (English)

    陈向荣

    2014-01-01

    This paper first reviews and introduces value engineering theory and the historical development and theoretical system of enterprise value system methodology. It then puts forward an enterprise development strategy analysis method based on enterprise value system methodology, which uses PEST analysis, Porter's five forces model, SWOT analysis and AHP to carry out environmental analysis, uses RWFJ analysis to analyze the coupling relationship of key elements with matters and the environment, and uses Boston matrix analysis to analyze and formulate business portfolio development strategies. Finally, conclusions and points requiring attention are given.

  5. Development of a novel two-stage liquid desiccant dehumidification system assisted by CaCl2 solution using exergy analysis method

    International Nuclear Information System (INIS)

    Air conditioning systems based on liquid desiccants have been recognized as efficient HVAC systems for independent air humidity control. To improve the thermal coefficient of performance, a novel two-stage liquid desiccant dehumidification system assisted by calcium chloride (CaCl2) solution is developed through exergy analysis based on the second law of thermodynamics. Compared with the basic liquid desiccant dehumidification system, the proposed system is improved in two ways, i.e. by increasing the concentration variance and by pre-dehumidification with CaCl2. The exergy loss in the desiccant-desiccant heat recovery process can be significantly reduced by increasing the desiccant concentration variance between the strong desiccant solution after regeneration and the weak desiccant solution after dehumidification. Meanwhile, pre-dehumidification with CaCl2 solution can reduce the irreversibility of the regeneration/dehumidification process. Compared to the basic system, the thermal coefficient of performance and the exergy efficiency of the proposed system are increased from 0.24 to 0.73 and from 6.8% to 23.0%, respectively, under the given conditions. The useful energy storage capacities of CaCl2 and LiCl solutions at a concentration of 40% reach 237.8 and 395.1 MJ/m3, respectively. The effects of desiccant regeneration temperature, air mass flux, desiccant mass flux, etc., on the performance of the proposed system are also analyzed.

  6. Probabilistic structural analysis by extremum methods

    Science.gov (United States)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
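
    The rigid-plastic LP models described above can be illustrated on a textbook case (not one of the paper's frames): the static, lower-bound theorem for a fixed-ended beam under a central point load, posed as a linear program whose optimum recovers the classic collapse load 8·Mp/L. The dual of this LP is the corresponding kinematic problem.

```python
from scipy.optimize import linprog

# Static (lower-bound) limit analysis as a linear program, illustrated on a
# fixed-ended beam of span L with a central point load lam * P.
# Variables x = [M_end, M_mid, lam]; at collapse the free bending moment
# lam*P*L/4 equals M_mid - M_end, and both section moments obey |M| <= Mp.
Mp, P, L = 1.0, 1.0, 4.0
res = linprog(
    c=[0.0, 0.0, -1.0],                    # maximize the load factor lam
    A_eq=[[-1.0, 1.0, -P * L / 4.0]],      # M_mid - M_end - lam*P*L/4 = 0
    b_eq=[0.0],
    bounds=[(-Mp, Mp), (-Mp, Mp), (0.0, None)],
)
lam = res.x[2]   # -> 2.0, i.e. the collapse load lam*P = 8*Mp/L
```

    At the optimum the end moment sits at -Mp and the midspan moment at +Mp, which is exactly the three-hinge collapse mechanism the kinematic (dual) approach would find.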

  7. Nonlinear time series analysis methods and applications

    CERN Document Server

    Diks, Cees

    1999-01-01

    Methods of nonlinear time series analysis are discussed from a dynamical systems perspective on the one hand, and from a statistical perspective on the other. After giving an informal overview of the theory of dynamical systems relevant to the analysis of deterministic time series, time series generated by nonlinear stochastic systems and spatio-temporal dynamical systems are considered. Several statistical methods for the analysis of nonlinear time series are presented and illustrated with applications to physical and physiological time series.

  8. Development of medical application methods using radiation. Radionuclide therapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul, (Korea, Republic of); Oh, B. H. [Seoul National University. Hospital, Seoul (Korea, Republic of); Hong, H. J. [Antibody Engineering Research Unit, Taejon (Korea, Republic of)

    1999-04-01

    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied toward the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials. (2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry, or other basic life science research. (3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research, such as radiation oncology and radiation biology.

  9. Development of medical application methods using radiation. Radionuclide therapy

    International Nuclear Information System (INIS)

In this project, we studied the following subjects: (1) development of monoclonal antibodies and radiopharmaceuticals; (2) clinical applications of radionuclide therapy; (3) radioimmunoguided surgery; (4) prevention of restenosis with intracoronary radiation. The results can be applied toward the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials; (2) the newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry, or other basic life science research; (3) the new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research areas such as radiation oncology and radiation biology

  10. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables
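The cost-effectiveness ratio defined above reduces to a one-line calculation. Only the ratio's definition comes from the abstract; the dollar figures and health-effect counts below are invented for illustration.

```python
# Cost-effectiveness ratio sketch: incremental cost of an alternative standard
# divided by the health effects it avoids relative to the base case.
# All numbers below are invented examples.

def cost_effectiveness_ratio(incremental_cost, base_effects, alt_effects):
    """Incremental cost per avoided health effect, relative to the base case."""
    avoided = base_effects - alt_effects
    if avoided <= 0:
        raise ValueError("alternative avoids no health effects")
    return incremental_cost / avoided

# Base case: 40 expected fatal cancers/genetic effects over 10,000 years.
# Alternative: $12M extra in packaging/processing/transport/burial, 25 effects.
ratio = cost_effectiveness_ratio(12_000_000, 40, 25)
print(f"${ratio:,.0f} per avoided health effect")  # $800,000 per avoided health effect
```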

  11. Development of advanced nodal diffusion methods for modern computer architectures

    International Nuclear Information System (INIS)

    A family of highly efficient multidimensional multigroup advanced neutron-diffusion nodal methods, ILLICO, were implemented on sequential, vector, and vector-concurrent computers. Three-dimensional realistic benchmark problems can be solved in vectorized mode in less than 0.73 s (33.86 Mflops) on a Cray X-MP/48. Vector-concurrent implementations yield speedups as high as 9.19 on an Alliant FX/8. These results show that the ILLICO method preserves essentially all of its speed advantage over finite-difference methods. A self-consistent higher-order nodal diffusion method was developed and implemented. Nodal methods for global nuclear reactor multigroup diffusion calculations which account explicitly for heterogeneities in the assembly nuclear properties were developed and evaluated. A systematic analysis of the zero-order variable cross section nodal method was conducted. Analyzing the KWU PWR depletion benchmark problem, it is shown that when burnup heterogeneities arise, ordinary nodal methods, which do not explicitly treat the heterogeneities, suffer a significant systematic error that accumulates. A nodal method that treats explicitly the space dependence of diffusion coefficients was developed and implemented. A consistent burnup-correction method for nodal microscopic depletion analysis was developed

  12. Probabilistic structural analysis methods of hot engine structures

    Science.gov (United States)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
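The core idea, propagating uncertainties in primitive variables through a response model to obtain a response distribution and a failure probability, can be sketched with plain Monte Carlo. The stress model and every parameter value below are invented for illustration; the actual methods described use probabilistic finite elements, not this toy closed-form model.

```python
import random

# Minimal Monte Carlo sketch: sample uncertain load and geometry, compute a
# simple axial stress response, and estimate a failure probability against an
# uncertain strength. All distributions and values are invented.

random.seed(1)

def sample_stress():
    load = random.gauss(100.0, 10.0)   # kN, uncertain load
    area = random.gauss(2.0, 0.05)     # cm^2, uncertain geometry
    return load / area                 # axial stress, kN/cm^2

N = 100_000
failures = 0
for _ in range(N):
    stress = sample_stress()
    strength = random.gauss(65.0, 5.0)  # uncertain material strength
    if stress > strength:
        failures += 1

p_fail = failures / N
print(f"estimated failure probability = {p_fail:.4f}")
```

Sorting the sampled responses would likewise give the empirical cumulative distribution function of the structural response variable.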

  13. Overview of activities related to the WGCS/AG2 in the study of structural analysis methods and damage models for the development of design rules

    International Nuclear Information System (INIS)

… Two other areas, one related to Research and Development programmes and the other to industrial projects conducted by EU organizations, have been an important source of input to the WGCS work. The work produced by the WGCS therefore moves among the three boundary areas described above. The paper describes the outstanding studies and ongoing activities of the WGCS, including simplified calculation methods and established computational techniques, as well as the constitutive equations needed for complicated material descriptions, and benchmarks

  14. The application of the SWOT analysis method in the construction and development of the provincial tumor hospital's new district outpatient service

    Institute of Scientific and Technical Information of China (English)

    江锦平; 张敬; 赵翠霞; 单保恩; 王士杰; 席彪

    2014-01-01

    Objective: To investigate the application of the SWOT analysis method in the construction and development of a hospital's new district outpatient service. Methods: The construction and development of the provincial cancer hospital's new district outpatient service were analyzed using the SWOT analysis method. Results: The analysis identified the strengths, weaknesses, opportunities and threats of the new district outpatient service; corresponding SO, WO, ST and WT strategies were established and implemented, enabling sustained growth of the new district outpatient volume. Conclusion: The SWOT analysis method can help a hospital's new district outpatient service formulate practical development strategies and promote the sound development of the hospital.

  15. Development of a method for the analysis of underivatized amino acids by liquid chromatography/tandem mass spectrometry: application on Standard Reference Material 1649a (urban dust).

    Science.gov (United States)

    Buiarelli, Francesca; Gallo, Valentina; Di Filippo, Patrizia; Pomata, Donatella; Riccardi, Carmela

    2013-10-15

    A liquid chromatography-tandem mass spectrometry analytical procedure has been developed for the detection and quantitative determination of underivatized amino acids at low concentrations in a Standard Reference Material (urban dust). In order to minimize interference from other compounds, accelerated solvent extraction followed by solid-phase extraction on two different cartridges was applied prior to LC-MS/MS. Fourteen amino acids were separated by high-resolution liquid chromatography, then detected and quantified by multiple reaction monitoring on a triple quadrupole. The proposed methodology has been applied for the first time to Standard Reference Material 1649a (urban dust) from the National Institute of Standards and Technology, which does not report certified values for these compounds. The methodology avoids the derivatization step, allows amino acid quantification in a complex matrix such as atmospheric particulate matter, and represents a method well suited to analyzing this class of compounds in atmospheric aerosol. The selected strategy was demonstrated to be fit-for-purpose by applying it to a real atmospheric sample, with the aim of verifying the efficacy of the study and providing information about the organic matter content. PMID:24054689

  16. Prescription and illicit psychoactive drugs in oral fluid--LC-MS/MS method development and analysis of samples from Brazilian drivers.

    Science.gov (United States)

    Zancanaro, Ivomar; Limberger, Renata Pereira; Bohel, Paula O; dos Santos, Maíra Kerpel; De Boni, Raquel B; Pechansky, Flavio; Caldas, Eloisa Dutra

    2012-11-30

    This study is part of a larger project designed to investigate the prevalence of psychoactive drug (PAD) use among Brazilian drivers. In this paper we describe the development and validation of an analytical method to analyze 32 prescription and illicit PADs (amphetamines, benzodiazepines, cocaine, cannabis, opioids, ketamine and m-CPP) and metabolites in oral fluid samples collected with a Quantisal™ device. Samples were extracted with ethyl acetate:hexane and analyzed by LC-MS/MS. Instrumental LODs ranged from 0.26 to 0.65 ng/mL. Mean procedural recoveries at 1.3 ng/mL (LLOQ) ranged from 50% to 120% for 24 compounds. Recoveries were concentration independent, with the exception of femproporex, heroin and ecgonine methyl ester (EME), for which recovery decreased significantly at higher levels (13 and 52 ng/mL). Cocaine and its metabolites were the analytes most often detected in the samples (129; 5.8%), followed by amphetamines/metabolite (69; 3.1%), benzodiazepines (28; 1.2%), cannabinoids (23; 1.1%) and opioids (8; 0.4%). Detection of at least two PADs from different classes accounted for 9.3% of the 236 positive samples. Cocaine was found at the highest levels in the samples (up to 1165 ng/mL). Preventive measures aimed at reducing the use of PADs by drivers in Brazil will certainly contribute to decreasing the country's highway death rates. PMID:23000138
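The recovery figures quoted above come from a standard spike-recovery calculation, sketched here with invented measured values; only the 1.3 ng/mL LLOQ and the 50-120% acceptance window are taken from the abstract.

```python
# Spike-recovery sketch: recovery (%) is the measured concentration over the
# fortified (spiked) concentration. Measured/spiked values are invented.

def percent_recovery(measured_ng_ml, spiked_ng_ml):
    return 100.0 * measured_ng_ml / spiked_ng_ml

# e.g. an analyte fortified at the 1.3 ng/mL LLOQ and measured at 1.1 ng/mL
r = percent_recovery(1.1, 1.3)
print(f"{r:.0f}%")  # 85%
```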

  17. New Developments of the Shared Concern Method.

    Science.gov (United States)

    Pikas, Anatol

    2002-01-01

    Reviews and describes new developments in the Shared Concern method (SCm), a tool for tackling group bullying amongst teenagers by individual talks. The psychological mechanisms of healing in the bully group and what hinders the bully therapist in eliciting them have become better clarified. The most important recent advancement of the SCm…

  18. Benchmarking Learning and Teaching: Developing a Method

    Science.gov (United States)

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  19. A Framework for Teaching Software Development Methods

    Science.gov (United States)

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  20. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development…

  1. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development…

  2. Development of spatially dependent resonance shielding method

    International Nuclear Information System (INIS)

    A new spatially dependent resonance self-shielding method (SDDM: Spatially Dependent Dancoff Method) was developed based on the generalization of the conventional Dancoff method to multiple regions in a fuel pellet, following the Stoker/Weiss technique. SDDM correctly accounts for the radial power distribution within fuel rods in a fuel assembly. SDDM is fully consistent with the conventional method if the pellet is not sub-divided. It also has the advantage of consuming less computing time than more rigorous resonance shielding methods such as sub-group and special fine-energy-mesh methods. Moreover, it can be installed easily into the lattice physics codes widely used in commercial LWR design. To validate the method, the spatial concentrations of isotopes and the burnup distribution within a rod were evaluated using SDDM and the results compared to destructive measurement data. From the comparison, it is concluded that the spatially dependent Dancoff method, SDDM, is appropriate for generating the effective cross sections in the fuel rings. (author)

  3. CARBON SEQUESTRATION: A METHODS COMPARATIVE ANALYSIS

    International Nuclear Information System (INIS)

    All human activities are related to energy consumption. Energy requirements will continue to rise, owing to modern life and the growth of developing countries. Most of the energy demand is met by fossil fuels, whose combustion has negative environmental impacts, dominated by CO2 production. Meeting the Kyoto protocol criteria requires the minimization of CO2 emissions, so the management of CO2 emissions is an urgent matter. The use of appliances with low energy consumption and the adoption of an energy policy that prevents unnecessary energy use can lead to the reduction of carbon emissions. A different route is the introduction of 'clean' energy sources, such as renewable energy sources. Last but not least, the development of carbon sequestration methods is a promising technique with great future potential. The objective of this work is the analysis and comparison of different carbon sequestration and deposit methods. Ocean deposit, land ecosystem deposit, geological formation deposit, and radical biological and chemical approaches will be analyzed

  4. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.
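The sparse/banded-matrix theme above can be illustrated with the classic reverse Cuthill-McKee reordering, which renumbers a structure's nodes so the stiffness matrix bandwidth shrinks. This is a generic sketch on an invented toy graph, not the book's force method itself.

```python
from collections import deque

# Reverse Cuthill-McKee ordering: breadth-first traversal from a minimum-degree
# node, visiting neighbours in order of increasing degree, then reversed.

def reverse_cuthill_mckee(adj):
    """adj: dict node -> set of neighbours. Returns a bandwidth-reducing order."""
    visited, order = set(), []
    for start in sorted(adj, key=lambda n: len(adj[n])):  # min-degree starts
        if start in visited:
            continue
        visited.add(start)
        queue = deque([start])
        while queue:
            node = queue.popleft()
            order.append(node)
            for nb in sorted(adj[node], key=lambda n: len(adj[n])):
                if nb not in visited:
                    visited.add(nb)
                    queue.append(nb)
    return order[::-1]

def bandwidth(adj, order):
    pos = {n: i for i, n in enumerate(order)}
    return max(abs(pos[u] - pos[v]) for u in adj for v in adj[u])

# a 6-node "structure" whose natural numbering gives a large bandwidth
adj = {0: {5}, 1: {2, 4}, 2: {1, 3}, 3: {2}, 4: {1, 5}, 5: {0, 4}}
order = reverse_cuthill_mckee(adj)
bw_before, bw_after = bandwidth(adj, range(6)), bandwidth(adj, order)
print(bw_before, "->", bw_after)  # 5 -> 1
```

A smaller bandwidth is exactly what makes the factorizations used in structural analysis cheap in storage and arithmetic.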

  5. Radioisotope method of compound flow analysis

    Directory of Open Access Journals (Sweden)

    Petryka Leszek

    2015-01-01

    The paper presents the application of gamma radiation to the analysis of a multicomponent or multiphase flow. Information such as the content of a selected component in the mixture transported through a pipe is crucial in many industrial and laboratory installations. A properly selected sealed radioactive source and collimators deliver a photon beam that penetrates the cross section of the flow. Detectors mounted on the side of the pipe opposite the source allow recording of digital signals representing the composition of the stream. With the present development of electronics, detectors and computer software, significant progress in the know-how of this field may be observed. The paper describes the application of this method to the optimization and control of hydrotransport of solid particles and proposes monitoring that helps prevent pipe clogging or dangerous oscillations.
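The measurement principle behind such systems is Beer-Lambert attenuation of the photon beam: the recorded intensity encodes the mixture composition along the beam path. A sketch with invented attenuation coefficients and pipe geometry (a real installation calibrates against its actual source, collimation and materials):

```python
import math

# Beer-Lambert sketch for a two-component (solid/water) flow: the beam sees an
# effective attenuation that is a volume-fraction-weighted mix of the two
# coefficients, so the measured intensity can be inverted for the solid
# fraction. Coefficients and path length are invented round numbers.

def solid_fraction(I, I0, mu_solid, mu_water, path_cm):
    """Invert I = I0*exp(-(a*mu_s + (1-a)*mu_w)*d) for the solid fraction a."""
    mu_eff = -math.log(I / I0) / path_cm
    return (mu_eff - mu_water) / (mu_solid - mu_water)

mu_s, mu_w, d = 0.40, 0.085, 10.0   # 1/cm, 1/cm, and path length in cm
a_true = 0.25                       # assumed solid fraction along the beam
I = 1.0 * math.exp(-(a_true * mu_s + (1 - a_true) * mu_w) * d)

print(round(solid_fraction(I, 1.0, mu_s, mu_w, d), 3))  # 0.25
```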

  6. A liquid chromatographic method for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food using matrix solid-phase dispersion in conjunction with a zero reference material as a method development tool.

    Science.gov (United States)

    Chase, G W; Eitenmiller, R R; Long, A R

    1999-01-01

    A liquid chromatographic method is described for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food. The vitamins are extracted from medical food without saponification by matrix solid-phase dispersion and chromatographed by normal-phase chromatography with fluorescence detection. Retinyl palmitate and all-rac-alpha-tocopheryl acetate are quantitated isocratically with a mobile phase of 0.125% (v/v) and 0.5% (v/v) isopropyl alcohol in hexane, respectively. Results compared favorably with label declarations on retail medical foods. Recoveries determined on an analyte-fortified zero reference material for a milk-based medical food averaged 98.3% (n = 25) for retinyl palmitate spikes and 95.7% (n = 25) for all-rac-alpha-tocopheryl acetate spikes. Five concentrations were examined for each analyte, and results were linear (r2 = 0.995 for retinyl palmitate and 0.9998 for all-rac-alpha-tocopheryl acetate) over the concentration range examined, with coefficients of variation in the range 0.81-4.22%. The method provides a rapid, specific, and easily controlled assay for analysis of retinyl palmitate and all-rac-alpha-tocopheryl acetate in fortified medical foods. PMID:10028678

  7. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon luminescent luciferase activation. The photons measured in this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presents a constant value even though different light sources were applied. The quantification function for linear response could be applicable to the standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment exhibiting a linear response of constant light-emitting sources to measurement time
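The linear counts-versus-time behaviour described above amounts to fitting a constant counts/time ratio, e.g. a least-squares slope through the origin. The count data below are invented for illustration.

```python
# Least-squares slope through the origin: for a constant source, photon counts
# should grow linearly with measurement time, and counts/time is a constant
# characterising the source. The data points are invented.

def fit_slope(times, counts):
    """Minimise sum (c - slope*t)^2, i.e. slope = sum(t*c) / sum(t^2)."""
    return sum(t * c for t, c in zip(times, counts)) / sum(t * t for t in times)

times = [1, 2, 4, 8, 16]             # measurement times, s
counts = [103, 198, 405, 799, 1604]  # noisy but ~100 counts/s
rate = fit_slope(times, counts)
print(f"{rate:.1f} counts/s")  # 100.2 counts/s
```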

  8. Development of semiclassical molecular dynamics simulation method.

    Science.gov (United States)

    Nakamura, Hiroki; Nanbu, Shinkoh; Teranishi, Yoshiaki; Ohta, Ayumi

    2016-04-28

    Various quantum mechanical effects such as nonadiabatic transitions, quantum mechanical tunneling and coherence play crucial roles in a variety of chemical and biological systems. In this paper, we propose a method to incorporate tunneling effects into the molecular dynamics (MD) method, which is purely based on classical mechanics. Caustics, which define the boundary between classically allowed and forbidden regions, are detected along classical trajectories, and the optimal tunneling path with minimum action is determined by starting from each appropriate caustic. The real phase associated with tunneling can also be estimated. A numerical demonstration using the simple collinear chemical reaction O + HCl → OH + Cl is presented to help the reader comprehend the method proposed here. Generalization to the on-the-fly ab initio version is rather straightforward. By treating the nonadiabatic transitions at conical intersections with the Zhu-Nakamura theory, new semiclassical MD methods can be developed. PMID:27067383
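As a one-dimensional illustration of the tunneling action such methods minimize (the paper works with multidimensional paths launched from caustics, which this sketch does not attempt), the WKB action between classical turning points can be computed numerically. The parabolic barrier and all units below are invented.

```python
import math

# 1D WKB sketch: accumulate S = integral of sqrt(2m(V(x) - E)) dx between the
# classical turning points (the 1D analogue of a caustic), then estimate the
# tunneling probability as exp(-2S). Barrier shape and energy are invented.

def V(x):
    return 1.0 - x * x        # inverted parabola: barrier of height 1 at x = 0

def wkb_action(E, a, b, m=1.0, n=10_000):
    """Midpoint-rule integral of sqrt(2m(V-E)) over [a, b]."""
    h = (b - a) / n
    s = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        s += math.sqrt(max(2.0 * m * (V(x) - E), 0.0)) * h
    return s

E = 0.5
xt = math.sqrt(1.0 - E)       # turning points where V(x) = E
S = wkb_action(E, -xt, xt)
print(f"S = {S:.3f}, transmission ~ exp(-2S) = {math.exp(-2 * S):.3f}")
```

For this barrier the integral has the closed form S = pi*sqrt(2)/4, which the numerical value reproduces.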

  9. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
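Cross-comparisons of community profiles of the kind described above ultimately rest on pairwise dissimilarities between relative-abundance vectors. A minimal sketch using the common Bray-Curtis measure; the profiles are invented, and the listed methods (TRFLP, PhyloChip, etc.) feed far richer data into multivariate statistics.

```python
# Bray-Curtis dissimilarity between relative-abundance profiles: 0 means
# identical communities, 1 means no shared abundance. Profiles are invented.

def bray_curtis(u, v):
    num = sum(abs(a - b) for a, b in zip(u, v))
    den = sum(a + b for a, b in zip(u, v))
    return num / den

human  = [0.40, 0.30, 0.20, 0.10]  # relative abundances of 4 genetic groups
cattle = [0.05, 0.15, 0.35, 0.45]
sample = [0.10, 0.18, 0.32, 0.40]  # unknown water sample

d_h = bray_curtis(sample, human)
d_c = bray_curtis(sample, cattle)
print("closer to cattle profile:", d_c < d_h)  # True
```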

  10. Development of an Aerodynamic Analysis Method and Database for the SLS Service Module Panel Jettison Event Utilizing Inviscid CFD and MATLAB

    Science.gov (United States)

    Applebaum, Michael P.; Hall, Leslie, H.; Eppard, William M.; Purinton, David C.; Campbell, John R.; Blevins, John A.

    2015-01-01

    This paper describes the development, testing, and utilization of an aerodynamic force and moment database for the Space Launch System (SLS) Service Module (SM) panel jettison event. The database is a combination of inviscid Computational Fluid Dynamic (CFD) data and MATLAB code written to query the data at input values of vehicle/SM panel parameters and return the aerodynamic force and moment coefficients of the panels as they are jettisoned from the vehicle. The database encompasses over 5000 CFD simulations with the panels either in the initial stages of separation where they are hinged to the vehicle, in close proximity to the vehicle, or far enough from the vehicle that body interference effects are neglected. A series of viscous CFD check cases were performed to assess the accuracy of the Euler solutions for this class of problem and good agreement was obtained. The ultimate goal of the panel jettison database was to create a tool that could be coupled with any 6-Degree-Of-Freedom (DOF) dynamics model to rapidly predict SM panel separation from the SLS vehicle in a quasi-unsteady manner. Results are presented for panel jettison simulations that utilize the database at various SLS flight conditions. These results compare favorably to an approach that directly couples a 6-DOF model with the Cart3D Euler flow solver and obtains solutions for the panels at exact locations. This paper demonstrates a method of using inviscid CFD simulations coupled with a 6-DOF model that provides adequate fidelity to capture the physics of this complex multiple moving-body panel separation event.
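The query side of such a database can be sketched as table interpolation: precomputed CFD coefficients tabulated against separation parameters and interpolated at whatever values a 6-DOF propagator requests each time step. The single-parameter table below is an invented stand-in for the real multi-parameter, 5000-case database.

```python
import bisect

# Hypothetical aerodynamic lookup: a coefficient tabulated against one
# separation parameter, linearly interpolated (clamped to the end segments).
# Table values and the one-parameter dependence are invented for illustration.

class AeroTable:
    def __init__(self, x, cf):
        self.x, self.cf = x, cf            # sorted abscissae, coefficients

    def query(self, xq):
        i = bisect.bisect_right(self.x, xq)
        i = min(max(i, 1), len(self.x) - 1)
        x0, x1 = self.x[i - 1], self.x[i]
        f = (xq - x0) / (x1 - x0)
        return (1 - f) * self.cf[i - 1] + f * self.cf[i]

# normal-force coefficient vs. panel separation distance (body diameters)
table = AeroTable([0.0, 0.5, 1.0, 2.0], [1.20, 0.80, 0.55, 0.50])
print(round(table.query(0.75), 3))  # 0.675
```

A 6-DOF loop would call `table.query` with the panel state at each step, which is what makes the database approach so much cheaper than re-running the flow solver per trajectory point.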

  11. Probabilistic structural analysis methods for space propulsion system components

    Science.gov (United States)

    Chamis, Christos C.

    1987-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  12. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
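The linear combination assumption at the heart of this analysis can be shown in a few lines: with mixture M = a·R + (1-a)·I, the relevance distribution R is recovered algebraically from M, the seed irrelevance distribution I and the coefficient a. The toy distributions and coefficient below are invented; a KL-divergence is included only because the article extends the analysis to entropy-related measures.

```python
import math

# Sketch of linear distribution separation: given mixture M = a*R + (1-a)*I,
# solve for R, clip any numerically negative mass, and renormalise.
# Toy distributions and the coefficient a are invented.

def separate(M, I, a):
    R = [(m - (1 - a) * i) / a for m, i in zip(M, I)]
    R = [max(r, 0.0) for r in R]       # guard against negative mass
    s = sum(R)
    return [r / s for r in R]

def kl(P, Q):
    """KL-divergence D(P || Q) in nats."""
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

R_true = [0.5, 0.3, 0.1, 0.1]          # relevance distribution to recover
I      = [0.1, 0.1, 0.4, 0.4]          # seed irrelevance distribution
a = 0.6
M = [a * r + (1 - a) * i for r, i in zip(R_true, I)]

R_est = separate(M, I, a)
print([round(r, 3) for r in R_est])    # [0.5, 0.3, 0.1, 0.1]
print(round(kl(R_est, I), 3))          # divergence of the separated R from I
```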

  13. Financial Analysis: A Review of the Methods and Their Application to Employee Training. Training and Development Research Center Project Number Nine.

    Science.gov (United States)

    Mosier, Nancy R.

    Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…

  14. Expediting the method development and quality control of reversed-phase liquid chromatography electrospray ionization mass spectrometry for pharmaceutical analysis by using an LC/MS performance test mix.

    Science.gov (United States)

    Tang, L; Fitch, W L; Alexander, M S; Dolan, J W

    2000-11-01

    Mass spectrometry combined with liquid chromatography (LC/MS) has become an important analytical methodology in both pharmaceutical and biomolecule analyses. LC/MS, especially with reversed-phase HPLC (RP-LC), is extensively used in the separation and structural identification of pharmaceutical samples. However, many parameters have to be considered when a new LC/MS method is developed, whether for separation and structural analysis of unknown mixtures or for quantitative analysis of a set of known compounds in an assay, and the optimization of a new LC/MS method can be a time-consuming process. A novel kit, an LC/MS performance test mix composed of aspartame, cortisone, reserpine, and dioctyl phthalate, has been developed to accelerate the process of establishing a new RP-LC/MS method. The LC/MS mix makes the evaluation and validation of an LC/MS method more efficient and easier. It also simplifies the quality control procedure for an LC/MS method in use. PMID:11080866

  15. Development of a multiple bulked segregant analysis (MBSA) method used to locate a new stem rust resistance gene (Sr54) in the winter wheat cultivar Norin 40.

    Science.gov (United States)

    Ghazvini, Habibollah; Hiebert, Colin W; Thomas, Julian B; Fetch, Thomas

    2013-02-01

    An important aspect of studying putative new genes in wheat is determining their position on the wheat genetic map. The primary difficulty in mapping genes is determining which chromosome carries the gene of interest. Several approaches have been developed to address this problem, each with advantages and disadvantages. Here we describe a new approach called multiple bulked segregant analysis (MBSA). A set of 423 simple sequence repeat (SSR) markers was selected based on profile simplicity, frequency of polymorphism, and distribution across the wheat genome. SSR primers were preloaded in 384-well PCR plates, with each primer occupying 16 wells. In practice, 14 wells are reserved for "mini-bulks" equivalent to four gametes (e.g., two F2 individuals), comprising individuals from a segregating population that have a known homozygous genotype for the gene of interest. The remaining two wells are reserved for the parents of the population. Each well containing a mini-bulk can have one of three allele compositions for each SSR: only the allele from one parent, only the allele from the other parent, or both alleles. Simulation experiments were performed to determine the pattern of mini-bulk allele composition that would indicate putative linkage between the SSR in question and the gene of interest. As a test case, MBSA was employed to locate an unidentified stem rust resistance (Sr) gene in the winter wheat cultivar Norin 40. A doubled haploid (DH) population (n = 267) was produced from hybrids of the cross LMPG-6S/Norin 40. The DH population segregated for a single gene (χ² 1:1 = 0.093, p = 0.76) for resistance to Puccinia graminis f. sp. tritici race LCBN. Four resistant DH lines were included in each of the 14 mini-bulks for screening. The Sr gene was successfully located to the long arm of chromosome 2D using MBSA. Further mapping confirmed the chromosome location and revealed that the Sr gene was located in a linkage block that may represent an alien…
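The mini-bulk logic lends itself to a small simulation in the spirit of the one described: each mini-bulk pools four resistant doubled-haploid lines, and for an SSR with recombination fraction r to the resistance gene, each line carries the resistant parent's allele with probability (1 - r). An unlinked marker (r = 0.5) makes most bulks show both parental alleles, while a linked one shows mostly a single allele. Bulk counts and r values below are illustrative, not the paper's simulation design.

```python
import random

# Hedged mini-bulk simulation: which allele pattern does each bulk of four
# resistant DH lines show for a marker at recombination fraction r?

random.seed(7)

def bulk_pattern(r, lines=4):
    """'R' = resistant-parent allele only, 'S' = other parent only, or 'both'."""
    alleles = {"R" if random.random() < (1 - r) else "S" for _ in range(lines)}
    return "both" if len(alleles) == 2 else alleles.pop()

def screen(r, bulks=14):
    patterns = [bulk_pattern(r) for _ in range(bulks)]
    return patterns.count("R"), patterns.count("both")

print("unlinked r=0.50:", screen(0.5))   # mostly 'both'
print("linked   r=0.05:", screen(0.05))  # mostly resistant-parent allele only
```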

  16. Improving Software Development Processes with Multicriteria Methods

    CERN Document Server

    Kornyshova, Elena; Salinesi, Camille

    2009-01-01

    All software development processes include steps where several alternatives call for a choice, i.e. a decision. Sometimes, methodologies offer a way to make these decisions. However, in many cases, the arguments supporting the decision are very poor and the choice is made in an intuitive and hazardous way. The aim of our work is to offer a scientifically founded way to guide the engineer through tactical choices by applying multicriteria methods in software development processes. This approach is illustrated with three cases: risks, use cases and tools within the Rational Unified Process.

  17. Transport Test Problems for Hybrid Methods Development

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  18. Current situation and development of analysis methods of power system transient stability

    Institute of Scientific and Technical Information of China (English)

    李晨; 蒋德珑; 程生安

    2012-01-01

    As interconnected power grids expand rapidly with the development of power systems, transient stability problems become increasingly serious. Reliable transient stability analysis is one of the keys to the safe operation of a power system. This paper reviews the history and current state of power system transient stability technology by introducing the common methods of transient stability analysis. The features and applicability of the various methods are analyzed in detail, and the prospects for power system transient stability analysis are outlined. It is pointed out that wavelet analysis has broad scope for development in transient stability analysis, especially for processing transient signals, and is a promising research direction.
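One of the common methods such reviews survey is time-domain simulation, which integrates the swing equation numerically. A minimal single-machine-infinite-bus sketch (all parameter values and the explicit Euler scheme are illustrative assumptions, not taken from the paper):

```python
import math

def simulate_swing(Pm=0.8, Pmax=1.8, M=0.1, D=0.05,
                   delta0=None, dt=0.001, t_end=5.0):
    """Integrate the swing equation
        M * d2(delta)/dt2 = Pm - Pmax*sin(delta) - D * d(delta)/dt
    for one machine against an infinite bus, using explicit Euler.
    Returns the rotor-angle trajectory delta(t) in radians."""
    if delta0 is None:
        delta0 = math.asin(Pm / Pmax)  # start at the stable equilibrium
    delta, omega = delta0, 0.0
    traj = [delta]
    for _ in range(int(t_end / dt)):
        domega = (Pm - Pmax * math.sin(delta) - D * omega) / M
        delta += dt * omega
        omega += dt * domega
        traj.append(delta)
    return traj
```

A transient stability check then amounts to perturbing `delta0` (mimicking a fault) and testing whether the angle stays bounded or runs past the unstable equilibrium, i.e. loses synchronism.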

  19. A Research on Competitiveness of Guangxi City - Based on System Clustering Method and Principal Component Analysis Method

    OpenAIRE

    Fan, Chang-ke; Wu, Yu

    2010-01-01

    A total of 10 indices of regional economic development in Guangxi are selected. According to the relevant economic data, regional economic development in Guangxi City is analyzed by using the System Clustering Method and the Principal Component Analysis Method. Results show that the System Clustering Method and the Principal Component Analysis Method reveal similar results in the analysis of economic development levels. Overall economic strength of Guangxi is weak and Nanning has relatively high scores of fac...

  20. LANDSCAPE ANALYSIS METHOD OF RIVERINE TERRITORIES

    OpenAIRE

    Fedoseeva O. S.

    2013-01-01

    The article proposes a method for landscape area analysis that consists of four stages. The technique is proposed as a tool for the practical application of pre-project research materials in design solutions for the planning and organization of landscape areas.

  1. Research and Development of Evaluation Method for Interview Skills in Acupuncture-and-Moxibustion Medical Treatment : Analysis based on lecture evaluation of interview skills

    OpenAIRE

    Kaneda, Daigo

    2012-01-01

    This research centered on the objective structured clinical examination (OSCE), which was recently introduced into an acupuncture-and-moxibustion training school. The main problem in the OSCE was evaluation at the medical interview station. This research examined the validity and internal consistency of the evaluation criteria as a whole using the Cronbach alpha coefficient and factor analysis. The evaluation criteria consisted of 20 items of four factors and were validated in the factor analysis...
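The Cronbach alpha coefficient used in such consistency checks can be computed directly from its definition. This is a generic sketch (the item scores in the test are made-up illustrative data, not from the study):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a rating scale.

    scores : list of examinees, each a list of item scores.
    alpha  = k/(k-1) * (1 - sum(item variances) / variance of totals),
    using population variances.
    """
    k = len(scores[0])  # number of items

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Values near 1 indicate the items measure a common construct; perfectly correlated items give exactly 1.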

  2. Development of Gocing Storage Method for Cocoyam

    OpenAIRE

    Chukwu, G.O; Nwosu, K.I; Madu, T.U; Chinaka, C; B.C. Okoye

    2008-01-01

    Lack of good storage reduces the shelf life of harvested cocoyam (Colocasia spp and Xanthosoma spp) corms and cormels. This is a major challenge facing cocoyam farmers, processors, and marketers in Nigeria. The National Root Crops Research Institute (NRCRI), Umudike, Nigeria, which has a national mandate to research into root and tubers crops of economic importance, has developed the ‘Gocing Storage’ for improved storage of cocoyam. The paper highlights this improved method of storing cocoya...

  3. Improving Software Development Processes with Multicriteria Methods

    OpenAIRE

    Kornyshova, Elena; Deneckere, Rebecca; Salinesi, Camille

    2008-01-01

    All software development processes include steps where several alternatives call for a choice, i.e. a decision. Sometimes, methodologies offer a way to make these decisions. However, in many cases, the arguments supporting the decision are very poor and the choice is made in an intuitive and hazardous way. The aim of our work is to offer a scientifically founded way to guide the engineer through tactical choices with the application of multicriteria methods in...

  4. Raman spectroscopic analysis of cyanogenic glucosides in plants: development of a Flow Injection Surface-Enhanced Raman Scatter (FI-SERS) method for determination of cyanide

    DEFF Research Database (Denmark)

    Thygesen, Lisbeth Garbrecht; Jørgensen, Kirsten; Møller, Birger Lindberg; Engelsen, Søren Balling

    2004-01-01

    -dried sorghum leaf was also obtained using this instrument. Surface-enhanced Raman Spectroscopy (SERS) was demonstrated to be a more sensitive method that enabled determination of the cyanogenic potential of plant tissue. The SERS method was optimized by flow injection (FI) using a colloidal gold dispersion as...

  5. Recent Developments in the Methods of Estimating Shooting Distance

    Directory of Open Access Journals (Sweden)

    Arie Zeichner

    2002-01-01

    A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  6. Recent Developments in the Methods of Estimating Shooting Distance

    OpenAIRE

    Arie Zeichner; Baruch Glattstein

    2002-01-01

    A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  7. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." (Zentralblatt MATH) ". . . carefully structured with many detailed worked examples." (The Mathematical Gazette) The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  8. Development of a transient criticality evaluation method

    International Nuclear Information System (INIS)

    In developing a transient criticality evaluation method we model, in full spatial/temporal detail, the neutron fluxes and consequent power and the evolving material properties - their flows, energies, phase changes etc. These methods are embodied in the general-purpose FETCH code, which is based as far as possible on first principles and is capable of use in exploring safety-related situations somewhat beyond the range of experiment. FETCH is a general geometry code capable of addressing a range of criticality issues in fissile materials. The code embodies both transient radiation transport and transient fluid dynamics. Work on powders, granular materials, porous media and solutions is reviewed. The capability for modelling transient criticality for chemical plant, waste matrices and advanced reactors is also outlined. (author)

  9. Development of a hydraulic turbine design method

    Science.gov (United States)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define the whole runner parametrically using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
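The Bezier parameterization described above rests on ordinary Bezier curve evaluation. A minimal sketch of de Casteljau's algorithm, of the kind the meridional-plane and blade-angle curves would build on (the control points in the test are arbitrary illustrative values, not turbine data):

```python
def bezier_point(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] with
    de Casteljau's algorithm.

    control_points : list of (x, y) tuples; repeated linear
    interpolation between consecutive points collapses the polygon
    to the single curve point at t.
    """
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]
```

The design-parameter economy follows directly: a blade-angle distribution described by, say, five control points exposes only a handful of optimization variables instead of every surface node.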

  10. Development of a second order penalty method

    International Nuclear Information System (INIS)

    The simulation of fluid flows in complex geometries requires the generation of body-fitted meshes, which are difficult to create. The penalty method developed in this work is useful to simplify the mesh generation task. The governing equations of fluid flow are discretized using a finite volume method on an unfitted mesh. The immersed boundary conditions are taken into account through a penalty term added to the governing equations. We are interested in the approximation of the penalty term using a finite volume discretization on collocated and staggered grids. The penalty method is second-order accurate in space for the Poisson and Navier-Stokes equations. Finally, simulations of turbulent flows around a cylinder at Re = 3900 and turbulent motions in a rod bundle at Re = 9500 are performed. (author)
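The idea of enforcing an immersed boundary through a penalty term can be illustrated on a 1-D Poisson problem. This toy sketch is our own illustration, not the cited discretization: the grid size, penalty coefficient η, and target value are assumptions. A large η forces the solution toward the prescribed value u_s wherever the mask marks "solid" cells:

```python
def solve_penalized_poisson(n=100, eta=1e8, u_s=1.0, solid_from=0.7):
    """Solve -u'' = 0 on (0,1), u(0)=0, u(1)=1, with a penalty term
    eta*(u - u_s) switched on where x > solid_from, mimicking an
    immersed solid held at u_s.  The resulting tridiagonal system is
    solved with the Thomas algorithm.  Returns grid x and solution u."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    a = [-1.0 / h**2] * n                  # sub-diagonal
    c = [-1.0 / h**2] * n                  # super-diagonal
    b, d = [], []
    for xi in x:
        chi = 1.0 if xi > solid_from else 0.0   # solid-cell mask
        b.append(2.0 / h**2 + chi * eta)
        d.append(chi * eta * u_s)
    d[-1] += 1.0 / h**2                    # Dirichlet condition u(1)=1
    for i in range(1, n):                  # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n                          # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return x, u
```

With η much larger than 2/h², the solution is pinned to u_s inside the masked region and varies linearly in the "fluid" region, exactly the behavior a mesh fitted to the boundary would produce.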

  11. Gap analysis: Concepts, methods, and recent results

    Science.gov (United States)

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  12. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺

    2002-01-01

    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  13. Hydropower development priority using MCDM method

    International Nuclear Information System (INIS)

    Hydropower is recognized as a renewable and clean energy source and its potential should be realized in an environmentally sustainable and socially equitable manner. Traditionally, the decision criteria used when analyzing hydropower projects have mostly been technical and economic, focused on the production of electricity. However, environmental awareness and sensitivity to locally affected people should also be considered. Multi-criteria decision analysis has been applied to study the potential to develop hydropower projects with electric power greater than 100 kW in the Ping River Basin, Thailand, and to determine the advantages and disadvantages of the projects against five main criteria: electricity generation, engineering and economics, socio-economics, environment, and stakeholder involvement. There are 64 potential sites in the study area. Criteria weights were discussed and assigned by expert groups for each main criterion and subcriterion. As a consequence of weight assignment, the environmental aspect is the most important aspect in the view of the experts. Two scenarios using expert weights and fair (equal) weights have been studied to determine the priority for development of each project. This study has been done to assist policy making for hydropower development in the Ping River Basin.
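The ranking step in such a study can be sketched as a weighted-sum multi-criteria score. The site names, criterion scores, and weights below are invented placeholders; the paper's actual expert weights and MCDM technique are not reproduced here:

```python
def mcdm_rank(sites, weights):
    """Rank alternatives by weighted sum of normalized criterion scores.

    sites   : dict name -> dict criterion -> raw score (higher is better)
    weights : dict criterion -> weight (summing to 1)
    Returns a list of (name, score) pairs sorted best-first.
    """
    # normalize each criterion by its maximum so scales are comparable
    maxima = {c: max(s[c] for s in sites.values()) for c in weights}
    scored = {
        name: sum(weights[c] * s[c] / maxima[c] for c in weights)
        for name, s in sites.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)
```

Re-running the ranking with an equal-weight vector instead of expert weights reproduces the two-scenario comparison the abstract describes: sites that score well under both weightings are robust development candidates.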

  14. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming

  15. Development of a Method for Detection of Giardia duodenalis Cysts on Lettuce and for Simultaneous Analysis of Salad Products for the Presence of Giardia Cysts and Cryptosporidium Oocysts▿

    OpenAIRE

    Cook, N.; Nichols, R.A.B.; Wilkinson, N; Paton, C. A.; Barker, K; Smith, H.V.

    2007-01-01

    We report a method for detecting Giardia duodenalis cysts on lettuce, which we subsequently use to examine salad products for the presence of Giardia cysts and Cryptosporidium oocysts. The method is based on four basic steps: extraction of cysts from the foodstuffs, concentration of the extract and separation of the cysts from food materials, staining of the cysts to allow their visualization, and identification of cysts by microscopy. The concentration and separation steps are performed by c...

  16. Development of a reliable analytical method for the precise extractive spectrophotometric determination of osmium(VIII) with 2-nitrobenzaldehydethiocarbohydrazone: Analysis of alloys and real sample.

    Science.gov (United States)

    Zanje, Sunil B; Kokare, Arjun N; Suryavanshi, Vishal J; Waghmode, Duryodhan P; Joshi, Sunil S; Anuse, Mansing A

    2016-12-01

    The proposed method demonstrates that osmium(VIII) forms a complex with 2-NBATCH from 0.8 mol L⁻¹ HCl at room temperature. The complex formed was extracted into 10 mL of chloroform with a 5 min equilibration time. The absorbance of the red colored complex was measured at 440 nm against the reagent blank. Beer's law was obeyed in the range of 5-25 μg mL⁻¹; the optimum concentration range was 10-20 μg mL⁻¹ of osmium(VIII) as evaluated by Ringbom's plot. The molar absorptivity and Sandell's sensitivity of the osmium(VIII)-2NBATCH complex in chloroform are 8.94×10³ L mol⁻¹ cm⁻¹ and 0.021 μg cm⁻², respectively. The 1:2 composition of the osmium(VIII)-2NBATCH complex was established by Job's method of continuous variation, the mole ratio method and the slope ratio method. The interference of diverse ions was studied and masking agents were used wherever necessary. The present method was successfully applied to the determination of osmium(VIII) in binary, ternary and synthetic mixtures corresponding to alloys and real samples. The validity of the method was confirmed by the relative standard deviation for five determinations, which was 0.29%. PMID:27380306
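The quantitation behind such a spectrophotometric method is the Beer-Lambert law, A = ε·l·c. A small sketch using the molar absorptivity reported above (the path length and absorbance value in the test are illustrative assumptions, not measurements from the paper):

```python
def concentration_ug_per_ml(absorbance, molar_abs, path_cm, molar_mass):
    """Concentration from the Beer-Lambert law A = eps * l * c.

    absorbance : measured A (dimensionless)
    molar_abs  : eps in L mol^-1 cm^-1
    path_cm    : cuvette path length in cm
    molar_mass : g mol^-1, to convert mol L^-1 to ug mL^-1
    """
    c_mol_per_l = absorbance / (molar_abs * path_cm)
    # 1 g/L = 1000 ug/mL, so mass concentration = c * M * 1000 ug/mL
    return c_mol_per_l * molar_mass * 1000.0
```

With ε = 8.94×10³ L mol⁻¹ cm⁻¹ and a 1 cm cell, an absorbance near 0.47 corresponds to roughly 10 μg mL⁻¹ of osmium, squarely inside the optimum 10-20 μg mL⁻¹ range quoted above.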

  17. Comparison of extraction methods for analysis of flavonoids in onions

    OpenAIRE

    Soeltoft, Malene; Knuthsen, Pia; Nielsen, John

    2008-01-01

    Onions are known to contain high levels of flavonoids, and a comparison of the efficiency, reproducibility and detection limits of various extraction methods has been made in order to develop fast and reliable analytical methods for the analysis of flavonoids in onions. Conventional and classical methods are time- and solvent-consuming, and the presence of light and oxygen during sample preparation facilitates degradation reactions. Thus, classical methods were compared with microwave (irradiatio...

  18. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, it is not the same in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts at a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
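The three governing factors listed above are commonly combined into a qualitative likelihood rating. The scoring scheme below follows the general style of risk-assessment guidance such as NIST SP 800-30, but the specific levels, product rule, and thresholds are our illustrative assumptions, not taken from the paper:

```python
def likelihood_rating(motivation, vulnerability, controls):
    """Combine three 1-3 factor scores (1 = low, 3 = high) into a rating.

    motivation    : threat-source motivation and capability
    vulnerability : severity/exposability of the vulnerability
    controls      : weakness of current controls (3 = weak controls,
                    so a higher score means exploitation is MORE likely)
    """
    for v in (motivation, vulnerability, controls):
        if v not in (1, 2, 3):
            raise ValueError("factor scores must be 1, 2 or 3")
    score = motivation * vulnerability * controls   # ranges 1 .. 27
    if score >= 18:
        return "High"
    if score >= 6:
        return "Medium"
    return "Low"
```

A multiplicative combination captures the intuition that any one factor near zero (a fully mitigated vulnerability, say) suppresses the overall likelihood regardless of the other two.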

  19. ANALYSIS OF MODERN CAR BODY STRAIGHTENING METHODS

    Directory of Open Access Journals (Sweden)

    Arhun, Sch.

    2013-01-01

    An analysis of modern car body panel straightening methods is carried out. Both traditional and alternative methods of car body panel straightening are described. The urgency of magnetic pulse technology application is substantiated. The main advantages of magnetic pulse technology for car body straightening are determined.

  20. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of...