WorldWideScience

Sample records for analysis methods developed

  1. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they address. The majority of the research papers focused on quality associated with cloud based development (24 papers), while 16 papers focused on analysis and design. By considering the areas existing authors have focused on, and the gaps between them, untouched areas of cloud based development can be discovered for future research work.

  2. Allometric Scaling Analysis for City Development: Model, Method, and Applications

    CERN Document Server

    Chen, Yanguang

    2011-01-01

    An allometric scaling analysis method based on ideas from fractal theory, general system theory, and the analytical hierarchy process is proposed to make a comprehensive evaluation of the relative level of city development.
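    The allometric relation underlying such an analysis is a power law y = a·x^b whose exponent is usually estimated by least squares in log-log space. The sketch below illustrates only that estimation step, on synthetic data with an assumed exponent; it is not the paper's full evaluation procedure, which also combines fractal and hierarchy-process elements.

```python
import numpy as np

def fit_allometric_exponent(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space.

    Linearises the power law: ln y = ln a + b * ln x. Returns (a, b).
    """
    b, ln_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(ln_a), b

# Hypothetical example: urban area vs. population for a set of cities,
# generated here with a known exponent b = 0.85 for illustration.
rng = np.random.default_rng(0)
pop = rng.uniform(1e4, 1e7, size=50)
area = 2.0 * pop**0.85 * np.exp(rng.normal(0, 0.05, size=50))

a, b = fit_allometric_exponent(pop, area)
print(round(b, 2))  # recovers an exponent close to 0.85
```

    A sublinear exponent (b < 1) in such a fit is commonly read as an economy of scale; the comprehensive evaluation in the abstract then weights several such scaling relations.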

  3. Method development of gas analysis with mass spectrometer

    International Nuclear Information System (INIS)

    Dissolved gas content in deep saline groundwater is an important factor, which has to be known and taken into account when planning the deep repository for the spent nuclear fuel. Posiva has investigated dissolved gases in deep groundwaters since the 1990s. In 2002 Posiva started a project that focused on developing the mass spectrometric method for measuring the dissolved gas content in deep saline groundwater. The main idea of the project was to analyse the dissolved gas content of both the gas phase and the water phase by a mass spectrometer. The development of the method started in the autumn of 2003. One of the aims was to create a method for gas analysis parallel to the gas chromatographic method. The starting point of this project was to test whether gases could be analysed directly from water using a membrane inlet in the mass spectrometer. The main objective was to develop mass spectrometric methods for gas analysis with direct and membrane inlets. An analysis method for dissolved gases was developed for direct gas inlet mass spectrometry. The accuracy of the analysis method was tested with parallel real PAVE samples analysed in the laboratory of Insinoeoeritoimisto Paavo Ristola Oy. The results were good. The development of the membrane inlet mass spectrometric method still continues. Two different membrane materials (silicone and Teflon) were tested. Some basic tests (linearity, repeatability and detection limits for different gases) will be done by this method. (orig.)

  4. Scientific and methodical approaches to analysis of enterprise development potential

    Directory of Open Access Journals (Sweden)

    Hrechina Iryna V.

    2014-01-01

    Full Text Available The modern state of the Ukrainian economy urges enterprises to search for new possibilities of development, which makes the subject of this study topical. The article systemises existing approaches to the analysis of enterprise development potential and marks out two main scientific approaches: the first is directed at analysis of the prospects of self-development of the economic system; the second at analysis of the probability of growth opportunities. In order to improve the process of forming methods for the analysis of enterprise development potential, the article offers an organisation model of such methods and characterises its main elements. It develops analysis methods based on indicators of potentialogical sustainability. The scientific novelty of the obtained results lies in the possibility of identifying the main directions of enterprise development with the use of the enterprise development potential ratio: self-development or the probability of augmenting opportunities, which is traced through the interconnection of resources and profit.

  5. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V

    2016-01-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods to solar data. Relating solar surface observations and helioseismic proxies to solar dynamo models by means of data assimilation techniques is a promising new approach to exploring and predicting the magnetic activity cycle of the Sun.

  6. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

    Kubik, Donna L.; Greenwood, John A.

    2003-01-01

    A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). FEA analyses indicate a strong sensitivity of strain to the window thickness. Photogrammetric methods were therefore also chosen to measure the thickness of the window, providing more accurate input data for the FEA. This, plus improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.

  7. Development of Analysis Methods for Designing with Composites

    Science.gov (United States)

    Madenci, E.

    1999-01-01

    The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  8. Multiphysics methods development for high temperature gas reactor analysis

    Science.gov (United States)

    Seker, Volkan

    Multiphysics computational methods were developed to perform design and safety analysis of the next generation Pebble Bed High Temperature Gas Cooled Reactors. A suite of code modules was developed to solve the coupled thermal-hydraulics and neutronics field equations. The thermal-hydraulics module is based on the three dimensional solution of the mass, momentum and energy equations in cylindrical coordinates within the framework of the porous media method. The neutronics module is a part of the PARCS (Purdue Advanced Reactor Core Simulator) code and provides a fine mesh finite difference solution of the neutron diffusion equation in three dimensional cylindrical coordinates. Coupling of the two modules was performed by mapping the solution variables from one module to the other. Mapping is performed automatically in the code system by the use of a common material mesh in both modules. The standalone validation of the thermal-hydraulics module was performed with several cases of the SANA experiment and the standalone thermal-hydraulics exercise of the PBMR-400 benchmark problem. The standalone neutronics module was validated by performing the relevant exercises of the PBMR-268 and PBMR-400 benchmark problems. Additionally, the validation of the coupled code system was performed by analyzing several steady state and transient cases of the OECD/NEA PBMR-400 benchmark problem.
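    The data exchange described above, mapping solution fields between the thermal-hydraulics and neutronics modules at each outer iteration, amounts to a fixed-point (Picard) coupling. The toy sketch below illustrates that iteration pattern only; the two one-line "solvers" and all coefficients are invented placeholders, not the PARCS or porous-media models.

```python
# Toy Picard (fixed-point) coupling between a neutronics solve and a
# thermal-hydraulics solve. Invented scalar stand-ins for the 3D fields:
# power falls as temperature rises (negative feedback), and temperature
# responds linearly to power.

def neutronics(T):
    # power density per unit volume, with a made-up temperature feedback
    return 100.0 * (1.0 - 1e-4 * (T - 300.0))

def thermal_hydraulics(q):
    # made-up linear temperature response to the local power density
    return 300.0 + 2.0 * q

T, q = 300.0, 0.0
for it in range(100):
    q_new = neutronics(T)               # map temperature field -> power field
    T_new = thermal_hydraulics(q_new)   # map power field -> temperature field
    if abs(q_new - q) < 1e-8 and abs(T_new - T) < 1e-8:
        break                           # converged outer iteration
    q, T = q_new, T_new

print(round(q, 3), round(T, 1))  # the self-consistent coupled solution
```

    In the actual code system the "mapping" step is the transfer of nodal values on the common material mesh, and the convergence test wraps the core eigenvalue iteration rather than two scalars.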

  9. Development of root observation method by image analysis system

    OpenAIRE

    Kim, Giyoung

    1995-01-01

    Knowledge of plant roots is important for determining plant-soil relationships, managing soil effectively, studying nutrient and water extraction, and creating a soil quality index. Plant root research is limited by the large amount of time and labor required to wash the roots from the soil and measure the viable roots. A root measurement method based on image analysis was proposed to reduce the time and labor requirement. A thinning algorithm-based image analysis method was us...

  10. Task analysis method for procedural training curriculum development.

    Science.gov (United States)

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments. PMID:24366759

  11. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the resulting SSA combination becomes more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the software safety analysis techniques in use today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plan. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages, however, are completeness and complexity.
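    One plausible way to operationalize such indexes is a weighted-sum decision matrix over candidate SSA combinations. The ratings and weights below are invented placeholders for illustration, not the paper's actual evaluation:

```python
# Weighted-sum scoring of SSA technique combinations over the seven indexes
# named in the abstract. All ratings (1-5) and weights are hypothetical.

CRITERIA = ["dynamic capability", "completeness", "achievability",
            "detail", "signal/noise ratio", "complexity", "implementation cost"]
# Higher is better for the first five indexes; complexity and cost are
# penalties, so they carry negative weights.
WEIGHTS = [0.2, 0.2, 0.15, 0.15, 0.1, -0.1, -0.1]

combinations = {
    "PHA+FMEA+FTA+Markov": [2, 4, 2, 4, 3, 4, 4],
    "DFM":                 [5, 3, 4, 4, 4, 3, 3],
    "Simulation-based":    [5, 3, 4, 5, 4, 4, 4],
}

def score(ratings):
    """Total weighted score of one SSA combination."""
    return sum(w * r for w, r in zip(WEIGHTS, ratings))

ranked = sorted(combinations, key=lambda k: score(combinations[k]), reverse=True)
print(ranked)  # combinations from most to least suitable under these weights
```

    With these illustrative numbers the system-centric techniques come out ahead of the traditional combination, mirroring the qualitative conclusion of the abstract; a real evaluation would elicit the weights from the analysts' resource constraints.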

  12. Developing the UIC 406 Method for Capacity Analysis

    DEFF Research Database (Denmark)

    Khadem Sameni, Melody; Landex, Alex; Preston, John

    2011-01-01

    utilisation, two methods of CUI and the UIC 406 are compared with each other. A British and a Danish case study are explored for a periodic and a nonperiodic timetable: 1- Freeing up capacity by omitting the train that has the highest capacity consumption (British case study). 2- Adding trains to use...... the spare capacity (Danish case study). Some suggestions are made to develop meso indices by using the UIC 406 method to decide between the alternatives for adding or removing trains....

  13. Rapid Software Development: Analysis of Agile Methods for App Startups

    OpenAIRE

    Wahlqvist, Daniel

    2014-01-01

    This thesis is focused on software development using so called Agile methods. The scope of research is startup companies creating consumer apps. The thesis work was performed at a Swedish app startup; Storypic/Accelit AB. An overview of current research on Agile methods is given. A qualitative case study was undertaken in four parts; 1. Observing the team 2. Testing business hypotheses 3. Interviews with the team and 4. User feedback. Analyzing the findings some conclusions are drawn:  An ag...

  14. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions or concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed, combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.
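    The core of the proposed 3D method, accumulating scenario frequency times local fatality probability at every grid point and then extracting an iso-surface, can be sketched as follows. The Gaussian "consequence footprints" stand in for the CFD dispersion results, and all frequencies and length scales are invented for illustration:

```python
import numpy as np

# Individual risk on a 3D grid: for each accident scenario,
#   risk(x) += (scenario frequency per year) * P(fatality at x | scenario).
# A real implementation would post-process CFD concentration fields with a
# probit relation instead of these toy Gaussian footprints.

x, y, z = np.meshgrid(np.linspace(-100, 100, 21),
                      np.linspace(-100, 100, 21),
                      np.linspace(0, 50, 11), indexing="ij")

scenarios = [
    # (frequency [1/yr], release location [m], lethal footprint scale [m])
    (1e-4, (0.0, 0.0, 2.0), 30.0),
    (1e-5, (50.0, 0.0, 2.0), 60.0),
]

risk = np.zeros_like(x)
for freq, (sx, sy, sz), scale in scenarios:
    r2 = (x - sx)**2 + (y - sy)**2 + (z - sz)**2
    p_fatality = np.exp(-r2 / (2.0 * scale**2))  # toy consequence surrogate
    risk += freq * p_fatality

# Grid points at or above the 1e-5 per-year individual-risk iso-surface:
mask = risk >= 1e-5
print(int(mask.sum()))  # number of grid points inside the iso-surface
```

    Marching-cubes-style contouring of `risk` at a chosen level then yields the 3D iso-surface itself; at z = 0 the same field reduces to the familiar 2D individual risk contour.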

  15. Leadership Development Expertise: A Mixed-Method Analysis

    Science.gov (United States)

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward

    2011-01-01

    In this study, the impact of graduate curriculum, experience, and standards in the development of leadership expertise were examined. The major goals of the study were to (1) examine the impact of college content curriculum in the development of leadership expertise, (2) examine the impact of on the job experience in the development of leadership…

  16. Analysis of Cultural Development of Isfahan City Using Factor Analysis Method

    Directory of Open Access Journals (Sweden)

    J. Mohammadi

    2013-01-01

    Full Text Available Extended abstract. 1 - Introduction: Cultural spaces are considered as one of the main factors for development. Cultural development is a qualitative and valuable process; to assess it, quantitative indicators in cultural planning are used to obtain development objectives in the pattern of goods and services. The aim of the study is to determine and analyze the cultural development level and regional inequality of the different districts of Isfahan using the factor analysis technique. The statistical population of the study is the 14 districts of Isfahan municipality. The dominant approach of this study is quantitative, descriptive and analytical. In this study, 35 indices have been summarized by the factor analysis method, reduced to 5 factors, combined into significant ones and delivered. 2 - Theoretical bases: The most important objectives of spatial planning, considering the limitation of resources, are optimum distributions of facilities and services among the different locations in which people live. To do this, there is a need to identify the different locations in terms of their facilities and services, so that developed locations are specified and planners can work toward spatial equilibrium and reducing the privileged distance between districts. The present study has been conducted to move toward equal development in Isfahan's urban districts by identifying the situation and the manner of distribution of selected cultural development indices in the different districts. 3 - Discussion: Cultural development of societies is evaluated by considering the changes and improvement of its indices and measured by quantitative frames. Cultural development indices are the most important tools for cultural planning in a special district of a society. In this study, cultural development indices have been used to determine the levels of the districts. By using the factor analysis model, the share of influential factors in the cultural
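    A minimal sketch of the factor-analytic levelling procedure the abstract describes (standardize the indicators, extract factors, weight them by explained variance, rank the districts) might look like the following; the data are random placeholders, not Isfahan's actual 35 indices:

```python
import numpy as np

# Districts x indicators matrix; here 14 districts and 8 hypothetical
# cultural indices generated at random for illustration.
rng = np.random.default_rng(1)
X = rng.normal(size=(14, 8))

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise each indicator

# Principal-factor extraction from the correlation matrix.
corr = np.corrcoef(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1]           # factors by explained variance
eigval, eigvec = eigval[order], eigvec[:, order]

k = int(np.sum(eigval > 1.0))              # Kaiser criterion: keep factors > 1
scores = Z @ eigvec[:, :k]                 # factor scores per district
composite = scores @ (eigval[:k] / eigval[:k].sum())  # variance-weighted index

ranking = np.argsort(composite)[::-1]      # districts, most to least developed
print(ranking)
```

    This uses plain principal components as the extraction step; a study like the one above would typically also rotate the retained factors (e.g. varimax) before interpreting them.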

  17. Development of Methods for the Analysis of Petroleum Contaminated Soils

    OpenAIRE

    Okop, Imeh

    2010-01-01

    Soil contamination from petroleum spills is a frequent environmental problem in the world. It is obvious that petroleum exploration has contributed immensely to the economic growth of Nigeria, but over the last few decades, the Niger Delta of Nigeria has suffered grave human health risk and ecosystem degradation resulting from oil spillages, petroleum products leakages and other involuntary effluent discharges from oil exploration activities. This research seeks to develop and optimize GC-FID...

  18. Development of Rotor Diagnosis Method via Motor Current Signature Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Seok; Huh, Hyung; Kim, Min Hwan; Jeong, Kyeong Hoon; Lee, Gyu Mhan; Park, Jin Ho; Park, Keun Bae; Lee, Cheol Kwon; Hur, S

    2006-01-15

    A study on motor current signature analysis has been performed to monitor a journal bearing fault due to increasing clearance. It is known that journal bearing clearance produces side band frequencies in the motor current, at the supplied current frequency plus and minus the rotational rotor frequency. But the mere existence of the side band frequencies is not sufficient to diagnose whether the journal bearing is safe or not. Four journal bearing sets with different clearances were used to measure the side band frequency amplitude and the rotor vibration amplitude versus the journal bearing clearance. Both the side band frequency amplitude and the rotor vibration amplitude increase as the journal bearing clearance increases. This trend assures that the ASME OM vibration guideline can be applied to estimate the journal bearing clearance size. In this research, 2.5 times the reference side band amplitude is suggested as an indicator of a journal bearing fault. Further study is necessary to establish more specific quantitative relations between the side band frequency amplitude and the journal bearing clearance of a motor.
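    The detection step the abstract relies on, reading the side band amplitudes at f_supply ± f_rotor out of the current spectrum and comparing them with 2.5 times a reference amplitude, can be sketched as follows. The synthetic current signal, its amplitudes, and the reference value are invented for illustration:

```python
import numpy as np

fs, T = 5000.0, 10.0                 # sample rate [Hz], record length [s]
t = np.arange(0, T, 1 / fs)
f_supply, f_rotor = 60.0, 29.5       # supply and rotational frequencies [Hz]

# Hypothetical motor current: dominant supply-frequency component plus two
# small side bands at f_supply +/- f_rotor (the bearing-clearance signature).
current = (10.0 * np.sin(2 * np.pi * f_supply * t)
           + 0.05 * np.sin(2 * np.pi * (f_supply - f_rotor) * t)
           + 0.05 * np.sin(2 * np.pi * (f_supply + f_rotor) * t))

spec = np.abs(np.fft.rfft(current)) * 2 / len(t)   # single-sided amplitude
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amplitude_at(f):
    """Spectrum amplitude at the bin nearest frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

sideband = 0.5 * (amplitude_at(f_supply - f_rotor)
                  + amplitude_at(f_supply + f_rotor))
reference = 0.01                     # hypothetical healthy-bearing amplitude
print(sideband > 2.5 * reference)    # flag a possible journal bearing fault
```

    In practice the rotational frequency (and hence the side band positions) shifts with slip, so the search would scan a narrow band around the nominal f_supply ± f_rotor rather than a single bin.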

  19. Further development of a static seismic analysis method for piping systems: The load coefficient method

    International Nuclear Information System (INIS)

    Currently the ASME Boiler and Pressure Vessel Code is considering a simplified static analysis method for the seismic design of piping systems, for incorporation into Appendix N of Section III, Division 1, of the Code. This proposed method, called the Load Coefficient Method, uses coefficients ranging from 0.4 to 1.0, times the peak value of the in-structure response spectra, with a static analysis technique to evaluate the response of piping systems to seismic events. The coefficient used is a function of the pipe support spacing and hence of the frequency response of the system: in general, the greater the support spacing, the lower the frequency, the lower the spectral response, and hence the lower the coefficient. The results of Load Coefficient Method static analyses have been compared to analyses using the response spectrum modal analysis method. Reaction loads were also evaluated, with one important modification: a minimum support reaction load as a function of nominal pipe diameter has been established. This assures that lightly loaded supports, regardless of the analytical method used, will be designed to realistic loads, and eliminates the potential for under-designed supports. With respect to the accelerations applicable to inline components, a factor of 0.9 times the square root of the sum of squares of the horizontal floor spectra peaks was determined to envelop the horizontal accelerations, and a coefficient of 1.2 was shown to envelop the vertical accelerations. Presented in this paper are the current form of the Load Coefficient Method, a summary of the results of the over 2,700 benchmark analyses of piping system segments which form the basis for the acceptance of the method, and an explanation of the use of the method.
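    The arithmetic of the method is simple enough to sketch: an equivalent static load equal to a spacing-dependent coefficient times the peak in-structure spectral acceleration times the distributed pipe weight. The coefficient lookup below is a made-up placeholder that only mirrors the stated 0.4 to 1.0 range and trend, not the Code's actual values:

```python
def load_coefficient(support_spacing_m):
    """Hypothetical spacing-to-coefficient lookup in the 0.4-1.0 range.

    Wider spacing -> lower fundamental frequency -> lower spectral
    response -> lower coefficient, mirroring the trend in the text.
    """
    if support_spacing_m <= 2.0:
        return 1.0
    if support_spacing_m <= 4.0:
        return 0.7
    return 0.4

def static_seismic_load(spacing_m, peak_spectral_accel_g, weight_per_m):
    """Equivalent static seismic load per metre of pipe.

    coefficient * peak spectral acceleration [g] * pipe weight per metre.
    """
    return load_coefficient(spacing_m) * peak_spectral_accel_g * weight_per_m

# Illustrative case: 3 m spans, 2.0 g spectral peak, 50 kg/m pipe weight.
print(static_seismic_load(3.0, 2.0, 50.0))  # 0.7 * 2.0 * 50
```

    The resulting static load is applied to the piping model in each direction; the minimum support reaction load mentioned above would then be imposed as a floor on each computed support reaction.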

  20. Analysis of Cultural Development of Isfahan City Using Factor Analysis Method

    OpenAIRE

    Mohammadi, J.; M Izadi

    2013-01-01

    Extended abstract1-IntroductionCultural spaces are consideredas one of the main factors for development. Cultural development is a qualitative and valuable process that for assessing it, quantitative indicators in cultural planning are used to obtain development objectives in the pattern of goods and services. The aim of the study is to determine and analyze cultural development level and regional inequality of different districts of Isfahan using factor analysis technique. The statistical po...

  1. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of solid residue after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Due to the fuels used, the different compositions and the locations where solid residue forms, it was necessary to develop two methods. The first method is used for identifying solid residue composition after fuel oil combustion (Method 1), while the second method is used for identifying solid residue composition after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRMs). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of each method was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  2. Methodological aspects of development of new instrumental methods of analysis and improvement of known ones

    International Nuclear Information System (INIS)

    Consideration is given to the possibilities of instrumental methods of analysis, such as the method of precision registration of natural isotope ratios of light elements from the gaseous phase; the method of piezoquartz microweighing; probe methods of analysis in spark mass spectrometry; and extraction-atomic-emission spectroscopy with inductively coupled plasma. A prediction of the further development of these methods and the improvement of their analytical characteristics is given: increased sensitivity, accuracy and rapidity. Extension of their fields of application is forecast as well. 20 refs.; 7 figs.; 2 tabs

  3. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this.
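    The quantification step behind such real-time PCR methods, inverting a calibration line of Ct versus log copy number fitted on the plasmid calibrant dilutions and taking the ratio of the GM-specific target to the endogenous reference target, can be sketched as follows. The slope, intercept and Ct values are invented for illustration, not the method's actual calibration:

```python
# Standard-curve quantification for real-time PCR screening. A perfectly
# efficient reaction has slope -3.32 (one cycle per doubling); the intercept
# here is a made-up calibration value.

def copies_from_ct(ct, slope=-3.32, intercept=40.0):
    """Invert the calibration line Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

ct_p35s = 30.0       # GM-specific (P35S) target in the unknown sample
ct_reference = 25.0  # taxon-specific endogenous reference target

gm_ratio = copies_from_ct(ct_p35s) / copies_from_ct(ct_reference)
print(round(100 * gm_ratio, 2))  # GM content as a percent of the reference
```

    In the duplex setup described above, both Ct values come from the same reaction well, which is what halves the cost and time relative to running two simplex assays.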

  4. Recent developments in quasi-Newton methods for structural analysis and synthesis

    Science.gov (United States)

    Kamat, M. P.; Hayduk, R. J.

    1981-01-01

    Unlike the Newton-Raphson method, quasi-Newton methods, by virtue of their updates and step length control procedures, are globally convergent and hence better suited for the solution of nonlinear problems of structural analysis and synthesis. Extension of quasi-Newton algorithms to large-scale problems has led to the development of sparse update algorithms and to economical strategies for evaluating sparse Hessians. Ill-conditioning problems have led to the development of self-scaled variable metric and conjugate gradient algorithms, as well as the use of singular perturbation theory. This paper emphasizes the effectiveness of such quasi-Newton algorithms for nonlinear structural analysis and synthesis.
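    The two ingredients the abstract credits for global convergence, the curvature-preserving update and the step length control, show up directly in a textbook BFGS implementation. The sketch below is that generic dense version, not the sparse or self-scaled variants the paper discusses:

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    """Dense BFGS with backtracking (Armijo) step length control."""
    n = len(x0)
    H = np.eye(n)                        # inverse Hessian approximation
    x = np.asarray(x0, float)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton search direction
        t = 1.0                          # backtrack until Armijo holds
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                   # curvature condition keeps H pos. def.
            rho = 1.0 / sy
            I = np.eye(n)
            H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                 + rho * np.outer(s, s))
        x, g = x_new, g_new
    return x

x_min = bfgs(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
print(np.round(x_min, 6))  # converges to the minimiser [1, 1]
```

    Skipping the update when the curvature condition s·y > 0 fails is what keeps the direction a descent direction, which in turn guarantees the backtracking loop terminates; the sparse variants mentioned above replace the dense rank-two update of H with structured updates.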

  5. Development of pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    A three-dimensional direct response matrix method using a Monte Carlo calculation has been developed. The direct response matrix is formalized by four subresponse matrices in order to respond to a core eigenvalue k and thus can be recomposed at each outer iteration in core analysis. The subresponse matrices can be evaluated by ordinary single fuel assembly calculations with the Monte Carlo method in three dimensions. Since these subresponse matrices are calculated for the actual geometry of the fuel assembly, the effects of intra- and inter-assembly heterogeneities can be reflected on global partial neutron current balance calculations in core analysis. To verify this method, calculations for heterogeneous systems were performed. The results obtained using this method agreed well with those obtained using direct calculations with a Monte Carlo method. This means that this method accurately reflects the effects of intra- and inter-assembly heterogeneities and can be used for core analysis. A core analysis method, in which neutronic calculations using this direct response matrix method are coupled with thermal-hydraulic calculations, has also been developed. As it requires neither diffusion approximation nor a homogenization process of lattice constants, a precise representation of the effects of neutronic heterogeneities is possible. Moreover, the fuel rod power distribution can be directly evaluated, which enables accurate evaluations of core thermal margins. A method of reconstructing the response matrices according to the condition of each node in the core has been developed. The test revealed that the neutron multiplication factors and the fuel rod neutron production rates could be reproduced by interpolating the elements of the response matrix. A coupled analysis of neutronic calculations using the direct response matrix method and thermal-hydraulic calculations for an ABWR quarter core was performed, and it was found that the thermal power and coolant

  6. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Spectral history methods for the pin-by-pin core analysis method using the three-dimensional direct response matrix have been developed. The direct response matrix is formalized by four sub-response matrices in order to respond to a core eigenvalue k and thus can be recomposed at each outer iteration in the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One of the methods is to evaluate the nodal burn-up spectrum obtained using the out-going neutron current. The other is to correct the fuel rod neutron production rates using a pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be reduced by half during burn-up, and the nodal neutron production rate errors can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. Core analysis with the DRM method was carried out for an ABWR quarter core and it was found that both thermal power and coolant-flow distributions converged smoothly. (authors)

  7. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of human reliability analysis (HRA), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability
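    One concrete ingredient of THERP-style quantification rules of the kind the standard method builds on is the dependence model (NUREG/CR-1278), which gives the conditional human error probability (HEP) of a task when the preceding task has failed. A minimal sketch; the nominal HEP below is an illustrative value, not a number from the Korean standard method.

```python
def conditional_hep(p, level):
    """Conditional HEP of a task under a THERP dependence level,
    given failure of the preceding task (NUREG/CR-1278 equations)."""
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](p)

nominal = 0.003                              # illustrative nominal HEP
high_dep = conditional_hep(nominal, "high")  # (1 + 0.003) / 2 = 0.5015
```

    Fixing such formulas and the criteria for choosing the dependence level is exactly the kind of rule that reduces analyst-to-analyst variation.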

  8. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of human reliability analysis (HRA), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria in order to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability

  9. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.

  10. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    Science.gov (United States)

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate science and technology pre-service teachers' opinions about methods that develop reflective thinking and determine their level of reflective thinking. This is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. Questions used in the statistical analysis of…

  11. DEVELOPMENT OF TECHNOLOGY AND ANALYSIS METHODS OF COSMETICS WITH WATER EXTRACTS FROM HERBAL DRUGS RAW MATERIALS

    Directory of Open Access Journals (Sweden)

    A. A. Chakhirova

    2014-01-01

    Full Text Available The article presents studies on the development of cosmetics containing complex extracts from the herb of Bidens, flowers of Calendula, and flowers of Matricaria. Analysis methods for the resulting extract and for a drug based on it are described

  12. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    Full Text Available The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. First, learning and instructional technologies as visions of the future are discussed. Second, the importance of task analysis methods in rapid e-learning is considered, together with learning technologies for asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, all the steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches, such as m-learning and other delivery systems, are discussed. As a result, the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design are discussed, alongside learner interface design features for learners and designers.

  13. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  14. A Product Analysis Method and its Staging to Develop Redesign Competences

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e. the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry...... to be very productive. In this paper we present the product analysis method and its staging and we reflect on the students’ application of it. We conclude that the method is a valid contribution to develop the students’ redesign competences....

  15. Development of methods of the Fractal Dimension estimation for the ecological data analysis

    CERN Document Server

    Jura, Jakub; Mironovová, Martina

    2015-01-01

    This paper deals with estimating the fractal dimension of hydrometeorological variables, such as air temperature or humidity, at different sites in a landscape (to be further evaluated from the land-use point of view). Three algorithms for estimating the fractal dimension of a hydrometeorological time series were developed. The first results indicate that the developed methods are usable for the analysis of hydrometeorological variables and for testing their relation to the autoregulation functions of an ecosystem
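    The abstract does not name its three algorithms; as a sketch of one standard estimator of this kind, Higuchi's method measures how the normalized curve length L(k) of a time series scales with the sampling stride k, the slope of log L versus log(1/k) estimating the fractal dimension D.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a time series (Higuchi's method)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, kmax + 1)
    lengths = []
    for k in ks:
        curve_lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            # normalized curve length of the sub-series starting at offset m
            lm = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k)
            curve_lengths.append(lm / k)
        lengths.append(np.mean(curve_lengths))
    # L(k) ~ k**(-D): slope of log L versus log(1/k) estimates D
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

fd = higuchi_fd(np.arange(1000.0))   # a straight line has dimension 1
```

    A smooth temperature trace should give D near 1, while an irregular, noise-like humidity record pushes D toward 2.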

  16. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    Science.gov (United States)

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of the papaya genome sequence. Transgenic construct- and event-specific sequences were identified as belonging to a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for this GM papaya, applicable to various food commodities, was developed. Whole genome sequence analysis thus enabled identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities.
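    The screening idea can be sketched in miniature: sequencing reads that fail to map to the host reference are candidates for spanning transgenic sequence and can seed a construct-specific PCR target. All strings below are invented stand-ins, not real papaya or construct sequences, and exact substring search stands in for a real read aligner.

```python
# Invented "host reference" and "reads" for illustration only
host_ref = "ATGGCGTACGTTAGCCGGATCCATGAAACTGTTGCA"
reads = [
    "GCGTACGTTAGC",      # maps to the host reference
    "CCGGATCCATGA",      # maps to the host reference
    "TTTCAGGTCGACCAAT",  # no host hit -> candidate transgenic read
]

# Reads absent from the host genome are flagged for closer inspection
unmapped = [r for r in reads if r not in host_ref]
```

    In the actual workflow this mapping step is done with a genome aligner against the public papaya assembly, and the flagged sequences are then compared with known vector and construct elements.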

  18. Development of an adjoint sensitivity method for site characterization, uncertainty analysis, and code calibration/validation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, A.H.

    1991-09-01

    The adjoint method is applied to groundwater flow-mass transport coupled equations in variably saturated media. The sensitivity coefficients derived by this method can be calculated by a single execution for each performance measure regardless of the number of parameters in question. The method provides an efficient and effective way to rank the importance of the parameters, so that data collection can be guided in support of site characterization programs. The developed code will facilitate the sensitivity/uncertainty analysis in both model prediction and model calibration/validation. 13 refs., 1 tab.
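    The key property, one adjoint solve per performance measure regardless of the number of parameters, can be sketched on a steady linear model A(p)u = b with performance measure J = c·u. The 2×2 system is invented for illustration; the report treats coupled flow and transport equations in variably saturated media.

```python
import numpy as np

p = np.array([2.0, 3.0])                    # model parameters
b = np.array([1.0, 0.0])
c = np.array([0.0, 1.0])                    # defines J = c . u

def A(p):
    """Invented parameter-dependent system matrix."""
    return np.array([[p[0] + 1.0, -1.0],
                     [-1.0, p[1] + 1.0]])

u = np.linalg.solve(A(p), b)                # forward solve
lam = np.linalg.solve(A(p).T, c)            # single adjoint solve

# With b independent of p:  dJ/dp_k = -lam . (dA/dp_k) u
dA = [np.array([[1.0, 0.0], [0.0, 0.0]]),   # dA/dp_0
      np.array([[0.0, 0.0], [0.0, 1.0]])]   # dA/dp_1
grad = np.array([-lam @ (dAk @ u) for dAk in dA])
```

    The gradient over all parameters comes from the one pair of solves; a finite-difference check would instead need one extra solve per parameter.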

  19. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    Science.gov (United States)

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study.
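    The chemometric core of such a method, a partial least-squares calibration relating spectra to analyte content, can be sketched with a minimal PLS1 (NIPALS) fit. The synthetic "spectra" and concentrations below are invented; the published method used real transmission Raman spectra and the design of experiments described above.

```python
import numpy as np

def pls1(X, y, n_comp):
    """Minimal PLS1 (NIPALS); returns a prediction function."""
    xm, ym = X.mean(0), y.mean()
    Xr, yr = X - xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)
        t = Xr @ w
        tt = t @ t
        p = Xr.T @ t / tt
        qk = yr @ t / tt
        Xr -= np.outer(t, p)        # deflate spectra
        yr -= qk * t                # deflate response
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)        # regression vector
    return lambda Xnew: (Xnew - xm) @ B + ym

# Synthetic two-component mixture spectra on 100 "wavenumbers"
rng = np.random.default_rng(0)
wn = np.linspace(0, 1, 100)
band_api = np.exp(-((wn - 0.3) / 0.05) ** 2)   # analyte band
band_exc = np.exp(-((wn - 0.7) / 0.10) ** 2)   # excipient band
c_api = rng.uniform(0.5, 1.5, 60)
c_exc = rng.uniform(0.5, 1.5, 60)
X = np.outer(c_api, band_api) + np.outer(c_exc, band_exc)
X += rng.normal(0, 1e-3, X.shape)
predict = pls1(X, c_api, n_comp=2)
```

    Validation in the paper's sense then consists of challenging such a model with independent batches and assessing linearity, accuracy, precision, robustness, and specificity.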

  1. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    Science.gov (United States)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.
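    Probabilistic eigenanalysis of this kind rests on eigenvalue sensitivities rather than brute-force sampling. A minimal sketch for an invented 2-DOF spring-mass chain with unit masses (not the paper's residual-flexibility formulation) follows.

```python
import numpy as np

k1, k2 = 100.0, 150.0

def K(k1, k2):
    """Stiffness matrix of a 2-DOF chain: wall -k1- m1 -k2- m2."""
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

# With M = I the generalized eigenproblem reduces to a symmetric one.
lam, phi = np.linalg.eigh(K(k1, k2))
v = phi[:, 0]                       # mass-normalized first mode (M = I)

dK_dk1 = np.array([[1.0, 0.0],
                   [0.0, 0.0]])
dlam_dk1 = v @ dK_dk1 @ v           # d(lambda_1)/d(k1) = v^T (dK/dk1) v

# First-order propagation: if k1 ~ N(100, 5^2), then
# std(lambda_1) is approximately |dlam_dk1| * 5.
sigma_lam1 = abs(dlam_dk1) * 5.0
```

    Replacing the primitive variables (k1, k2) with measured substructure modal data, as the paper does, changes what the random variables are but not this basic sensitivity machinery.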

  2. REQUIREMENT ANALYSIS METHOD OF E-COMMERCE WEBSITES DEVELOPMENT FOR SMALL-MEDIUM ENTERPRISES, CASE STUDY: INDONESIA

    Directory of Open Access Journals (Sweden)

    Veronica S. Moertini

    2014-03-01

    Full Text Available Along with the growth of the Internet, the trend shows that e-commerce has grown significantly in the last several years. This means business opportunities for small-medium enterprises (SMEs), which are recognized as the backbone of the economy. SMEs may develop and run small- to medium-sized e-commerce websites as solutions for specific business opportunities. Certainly, such websites should be developed so as to support business success. In developing the websites, the key elements of the e-commerce business model that are necessary to ensure success should be resolved at the requirement stage of development. In this paper, we propose an enhancement of the requirement analysis methods found in the literature such that it includes activities to resolve these key elements. The method has been applied in three case studies based on the situation in Indonesia, and we conclude that it is suitable for adoption by SMEs.

  3. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeland, K.A.

    1996-11-01

    This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that can not be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs.

  5. The development of controlled damage mechanisms-based design method for nonlinear static pushover analysis

    Directory of Open Access Journals (Sweden)

    Ćosić Mladen

    2014-01-01

    Full Text Available This paper presents an original method of controlled building damage mechanisms based on Nonlinear Static Pushover Analysis (NSPA-DMBD). The optimal building damage mechanism is determined from the solution of the Capacity Design Method (CDM), and the response of the building is considered in incremental situations. The development of the damage mechanism of the system in such incremental situations is controlled at the strain level, by examining the relationship between current and limit strains in the concrete and reinforcement steel. Since the system damage mechanism analysis according to the NSPA-DMBD method is implemented iteratively, with design checks after the strain reaches its limit, the term Iterative-Interactive Design (IID) has been introduced for this analysis. By selecting, monitoring, and controlling the optimal damage mechanism of the system with the developed NSPA-DMBD method, the damage mechanism of the building is controlled and its resistance to early collapse is increased. [Projekat Ministarstva nauke Republike Srbije, br. TR 36043]

  6. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working-memory relief points of the procedures, and working-memory load. The proposed method is relatively simple and, when incorporated in a full task analysis scheme, is directly applicable to the design and evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system was developed to support the practical application of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  7. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh

    2012-05-01

    Full Text Available Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse-phase high performance liquid chromatography (RP-HPLC) method for quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution on a Knauer C18 column (5 µm, 4.6 × 150 mm), with a mobile phase consisting of acetonitrile and ammonium acetate buffer at a flow rate of 1.5 ml/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 125-2000 ng/ml. For all quality control (QC) standards in the intraday and interday assays, accuracy ranged from -0.96 to 6.30 and precision from 0.86 to 13.74, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedures. Conclusion: The rapid and sensitive method developed here can therefore be used for routine analysis of sirolimus, such as dissolution and stability assays of pre- and post-marketed dosage forms.
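    The reported linearity (r2 > 0.99) corresponds to a simple least-squares check on the standard curve, sketched below. The concentration/response pairs are invented example numbers, not the published sirolimus data.

```python
import numpy as np

conc = np.array([125.0, 250.0, 500.0, 1000.0, 2000.0])   # ng/ml standards
resp = np.array([0.031, 0.060, 0.124, 0.247, 0.498])     # peak area (a.u.)

# Fit response = slope * concentration + intercept
slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept

# Coefficient of determination for the calibration line
ss_res = np.sum((resp - pred) ** 2)
ss_tot = np.sum((resp - resp.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

    The same fitted line is then inverted to read unknown sample concentrations from their peak areas within the validated range.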

  8. Recent developments of nanoparticle-based enrichment methods for mass spectrometric analysis in proteomics

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In proteome research, rapid and effective separation strategies are essential for successful protein identification due to the broad dynamic range of proteins in biological samples. Some important proteins are expressed in ultra-low abundance, making a pre-concentration procedure before mass spectrometric analysis a prerequisite. The main purpose of enrichment is to isolate target molecules from complex mixtures to reduce sample complexity and facilitate the subsequent analysis steps. The introduction of nanoparticles into this field has accelerated the development of enrichment methods. In this review, we focus on recent developments in using different nanomaterials for pre-concentration of low-abundance peptides/proteins, including those carrying post-translational modifications such as phosphorylation and glycosylation, prior to mass spectrometric analysis.

  9. Development of prediction method of void fraction distribution in fuel assemblies for use in safety analysis

    International Nuclear Information System (INIS)

    The establishment of a code system for BWR safety analysis is in progress at the Institute of Nuclear Safety (INS), with the aim of predicting the onset of boiling transition (BT) in nuclear fuel assemblies under any thermal-hydraulic condition without relying on the thermal-hydraulic characteristic data provided by the licensee. A prediction method for the void fraction distribution across the cross section of BWR fuel assemblies has been developed based on a multi-dimensional two-fluid model. Lift forces acting on bubbles and void diffusion, which cannot be handled with one-dimensional analysis, were considered. Comparisons between calculated results and experimental data obtained from thermal-hydraulic tests of PWR and BWR mock-up fuel assemblies showed good agreement. The lift force models remain empirical and further studies are needed, but the calculations showed the possibility of applying these models to multi-dimensional gas-liquid two-phase flow analysis. (author)

  10. Development of Finite Elements for Two-Dimensional Structural Analysis Using the Integrated Force Method

    Science.gov (United States)

    Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.

    1996-01-01

    The Integrated Force Method has been developed in recent years for the analysis of structural mechanics problems. This method treats all independent internal forces as unknown variables that can be calculated by simultaneously imposing equations of equilibrium and compatibility conditions. In this paper a finite element library for analyzing two-dimensional problems by the Integrated Force Method is presented. Triangular- and quadrilateral-shaped elements capable of modeling arbitrary domain configurations are presented. The element equilibrium and flexibility matrices are derived by discretizing the expressions for potential and complementary energies, respectively. The displacement and stress fields within the finite elements are independently approximated. The displacement field is interpolated as it is in the standard displacement method, and the stress field is approximated by using complete polynomials of the correct order. A procedure that uses the definitions of stress components in terms of an Airy stress function is developed to derive the stress interpolation polynomials. Such derived stress fields identically satisfy the equations of equilibrium. Moreover, the resulting element matrices are insensitive to the orientation of local coordinate systems. A method is devised to calculate the number of rigid body modes, and the present elements are shown to be free of spurious zero-energy modes. A number of example problems are solved by using the present library, and the results are compared with corresponding analytical solutions and with results from the standard displacement finite element method. The Integrated Force Method not only gives results that agree well with analytical and displacement method results but also outperforms the displacement method in stress calculations.
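    The equilibrium-plus-compatibility idea at the heart of the force method can be sketched on a one-dimensional statically indeterminate example; the two-spring fixed-fixed system below is invented for illustration, whereas the paper develops two-dimensional finite elements.

```python
import numpy as np

k1, k2, P = 2.0, 3.0, 10.0      # spring stiffnesses and applied load

# Unknowns: tensile forces F1, F2 in the springs (wall -F1- node -F2- wall).
# Equilibrium of the loaded node:        F1 - F2 = P
# Compatibility (total elongation zero): F1/k1 + F2/k2 = 0
S = np.array([[1.0, -1.0],
              [1.0 / k1, 1.0 / k2]])
F = np.linalg.solve(S, np.array([P, 0.0]))   # internal forces directly
```

    The displacement method would instead solve (k1 + k2)u = P for the node displacement and recover the forces afterwards; here they are primary unknowns, which is why stress quantities come out without post-processing.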

  11. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  12. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    International Nuclear Information System (INIS)

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  13. Recent developments in methods of chemical analysis in investigations of firearm-related events.

    Science.gov (United States)

    Zeichner, Arie

    2003-08-01

    A review of recent developments (approximately the last ten years) in the methods used for chemical analysis in investigations of firearm-related events is provided. This review discusses: examination of gunshot (primer) residues (GSR) and gunpowder (propellant) residues on suspects and their clothing; detection of firearm imprints on the hands of suspects; identification of bullet entry holes and estimation of shooting distance; linking weapons and/or fired ammunition to gunshot entries; and estimation of the time since discharge. PMID:12811451

  14. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  15. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions in PRAs that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been a NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification

  16. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States)]; Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others]

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions in PRAs that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been a NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  17. Methods and considerations for longitudinal structural brain imaging analysis across development

    Directory of Open Access Journals (Sweden)

    Kathryn L. Mills

    2014-07-01

    Full Text Available Magnetic resonance imaging (MRI) has allowed the unprecedented capability to measure the human brain in vivo. This technique has paved the way for longitudinal studies exploring brain changes across the entire life span. Results from these studies have given us a glimpse into the remarkably extended and multifaceted development of our brain, converging with evidence from anatomical and histological studies. Ever-evolving techniques and analytical methods provide new avenues to explore and questions to consider, requiring researchers to balance excitement with caution. This review addresses what MRI studies of structural brain development in children and adolescents typically measure, and how. We focus on measurements of brain morphometry (e.g., volume, cortical thickness, surface area, folding patterns), as well as measurements derived from diffusion tensor imaging (DTI). By integrating findings from multiple longitudinal investigations, we give an update on current knowledge of structural brain development and how it relates to other aspects of biological development and possible underlying physiological mechanisms. Further, we review and discuss current strategies in image processing, analysis techniques, and modeling of brain development. We hope this review will aid current and future longitudinal investigations of brain development, as well as evoke a discussion amongst researchers regarding best practices.

  18. Development of a Bayesian method for the analysis of inertial confinement fusion experiments on the NIF

    CERN Document Server

    Gaffney, Jim A; Sonnad, Vijay; Libby, Stephen B

    2013-01-01

    The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters that are only known with limited reliability. These parameters, combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating underlying microphysics models. The result is framed as a modified $\\chi^{2}$ analysis which is easy to implement in existing analyses, and quite portable. We present a first application to a recent convergent ablator experiment performed at the NIF, and investigate the effect of variations in all physical dimensions of the target (very difficult to do using other methods). We show that for well characterised targets in which dimensions vary at the 0.5% level there is little effect, but 3% variations change the results of i...
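The profiled, "modified chi-squared" idea can be illustrated with a toy sketch (the three-point observable, the single nuisance "target dimension", its 0.5% prior, and all numbers below are invented for illustration; this is not NIF data or the authors' actual implementation). At each value of the physics parameter, the nuisance parameter is optimized away under a Gaussian prior, so the quoted chi-squared already accounts for setup uncertainty:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: a physics parameter theta scales a 3-point observable, and a
# nuisance "target dimension" nu (nominal 1.0, known to ~0.5%) scales it too.
def model(theta, nu):
    return theta * nu * np.array([1.0, 2.0, 3.0])

data = model(2.0, 1.0) + np.array([0.05, -0.10, 0.08])   # pseudo-measurements
sigma = np.array([0.1, 0.1, 0.1])                        # measurement errors
nu0, sigma_nu = 1.0, 0.005                               # Gaussian prior on nu

def chi2_profiled(theta):
    """chi2 with the nuisance parameter profiled out under its prior."""
    def objective(p):
        nu = p[0]
        resid = (data - model(theta, nu)) / sigma
        return float(np.sum(resid**2) + ((nu - nu0) / sigma_nu) ** 2)
    return minimize(objective, x0=[nu0], method="Nelder-Mead").fun

chi2_true = chi2_profiled(2.0)   # near the generating value: small chi2
chi2_off = chi2_profiled(2.5)    # wrong physics cannot be absorbed by nu
```

Because the prior on the nuisance dimension is tight, a wrong physics parameter cannot be compensated by shifting the target geometry, mirroring the paper's finding that sub-percent dimensional variations have little effect on the inference.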

  19. Development of an improved method to perform single particle analysis by TIMS for nuclear safeguards.

    Science.gov (United States)

    Kraiem, M; Richter, S; Kühn, H; Aregbe, Y

    2011-02-28

    A method is described that allows measurement of the isotopic composition of small uranium oxide particles (less than 1 μm in diameter) for nuclear safeguards purposes. In support of the development of reliable tools for the identification of uranium and plutonium signatures in trace amounts of nuclear materials, improvements in scanning electron microscopy (SEM) and thermal ionization mass spectrometry (TIMS), in combination with filament carburization and multiple ion counting (MIC) detection, were investigated. The method that has been set up enables the analysis of single particles by a combination of analytical tools, yielding morphological, elemental and isotopic information. Individual particles of certified reference materials (CRMs) containing uranium at femtogram levels were analysed. The results showed that the combination of techniques proposed in this work is suitable for the accurate determination of uranium isotope ratios in single particles, with improved capabilities for the minor abundant isotopes. PMID:21296200

  20. Development and validation of HPLC method for quantitative analysis of triamcinolone in biodegradable microparticles

    Directory of Open Access Journals (Sweden)

    A. A. Silva-Júnior

    2009-01-01

    Full Text Available

    A simple, rapid, selective and specific high performance liquid chromatographic (HPLC method for quantitative analysis of the triamcinolone in polylactide-co-glycolide acid (PLGA microparticles was developed. The chromatographic parameters were reversed-phase C18 column, 250mm x 4.6mm, with particle size 5 m. The column oven was thermostated at 35 ºC ± 2 ºC. The mobile phase was methanol/water 45:55 (v/v and elution was isocratic at a flow-rate of 1mL.mL-1. The determinations were performed using a UV-Vis detector at 239 nm. The injected sample volume was 10 µL. The standard curve was linear (r2 > 0.999 in the concentration range 100-2500 ng.mL-1. The method showed adequate precision, with a relative standard deviation (RSD was smaller than 3%. The accuracy was analyzed by adding a standard drug and good recovery values were obtained for all drug concentrations used. The method showed specificity and selectivity with linearity in the working range and good precision and accuracy, making it very suitable for quantitation of triamcinolone in PLGA microparticles. Keywords: triamcinolone; HPLC analytical method; PLGA microparticles; analytical method validation.

  1. Analysis and development of methods of correcting for heterogeneities to cobalt-60: computing application

    International Nuclear Information System (INIS)

    The purpose of this work is the analysis of the influence of inhomogeneities of the human body on the determination of the dose in Cobalt-60 radiation therapy. The first part is dedicated to the physical characteristics of inhomogeneities and to the conventional methods of correction. New methods of correction are proposed based on the analysis of the scatter. This analysis allows the physical characteristics of the inhomogeneities, and the corresponding modifications of the dose, to be taken into account with greater accuracy: ''the differential TAR method'' and ''the Beam Subtraction Method''. The second part is dedicated to the computer implementation of the second method of correction for routine application in hospital

  2. Priority-sequence of mineral resources’ development and utilization based on grey relational analysis method

    Institute of Scientific and Technical Information of China (English)

    Wang Ying; Zhang Chang; Jiang Gaopeng

    2016-01-01

    Generally, the sequence decision for the development and utilization of Chinese mineral resources is based on national and provincial overall plans for the mineral resources. Such plans usually cannot reflect the relative suitability of the development and utilization of the mineral resources. To solve this problem, this paper selects the endowment condition, the market condition, the technological condition, the socio-economic condition and the environmental condition as the starting points for analyzing the factors that influence the priority sequence of mineral resources' development and utilization. These 5 conditions are further specified into 9 evaluative indicators to establish an evaluation indicator system. We then propose a decision model of the priority sequence based on the grey relational analysis method, and rank the observation objects by their suitability index of development. Finally, the mineral resources of a certain province in China were analyzed as an example. The calculation results indicate that silver (2.0057), coal (1.9955), zinc (1.9442), cement limestone (1.9077), solvent limestone (1.5624) and other minerals in the province are suitable for development and utilization.
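The grey relational ranking step can be sketched as follows (the 5x3 score matrix, the benefit-type min-max normalization, and the distinguishing coefficient rho = 0.5 are illustrative assumptions, not the paper's indicator data):

```python
import numpy as np

def grey_relational_grades(data, rho=0.5):
    """Deng's grey relational analysis: rank alternatives (rows) on
    benefit-type indicators (columns) against an ideal reference row."""
    data = np.asarray(data, dtype=float)
    # Min-max normalize each indicator to [0, 1]; larger-is-better assumed,
    # and each column is assumed to vary (nonzero range).
    lo, hi = data.min(axis=0), data.max(axis=0)
    norm = (data - lo) / (hi - lo)
    ref = norm.max(axis=0)              # ideal reference sequence (all ones)
    delta = np.abs(norm - ref)          # deviation from the reference
    dmin, dmax = delta.min(), delta.max()
    # Grey relational coefficients, then grades (row means).
    xi = (dmin + rho * dmax) / (delta + rho * dmax)
    return xi.mean(axis=1)

# Hypothetical 5-mineral x 3-indicator matrix (endowment, market, technology).
scores = [[9, 8, 7],   # silver
          [8, 9, 6],   # coal
          [7, 7, 7],   # zinc
          [6, 8, 6],   # cement limestone
          [4, 5, 5]]   # solvent limestone
grades = grey_relational_grades(scores)
order = np.argsort(grades)[::-1]   # priority sequence, most suitable first
```

A higher grade means the alternative's indicator profile lies closer to the ideal reference, which is how the suitability index orders the minerals.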

  3. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report

    International Nuclear Information System (INIS)

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth', to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks that grow until a stable leak forms. Calculating the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, PROST was restructured to allow damage mechanisms to be combined during a calculation, and several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that the provided flexible probabilistic analysis method permits quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as
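A toy Monte Carlo in the spirit of such a probabilistic structure code can illustrate how leak probabilities arise from sampled crack parameters (the lognormal distributions, the power-law growth step, and the wall thickness below are invented for this sketch; PROST's actual damage models are far more detailed):

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample an initial crack depth a0 and a growth coefficient C per trial,
# grow each crack year by year, and count wall penetrations (leaks).
n_samples = 100_000
wall = 10.0                                        # wall thickness, mm
a0 = rng.lognormal(0.0, 0.5, n_samples)            # initial crack depth, mm
C = rng.lognormal(np.log(0.05), 0.3, n_samples)    # growth coefficient, mm/yr

a = a0.copy()
leak_20 = 0.0
for year in range(40):
    # Power-law growth step, capped at the wall (through-wall = leak).
    a = np.minimum(a + C * a**1.5, wall)
    if year == 19:
        leak_20 = float(np.mean(a >= wall))
leak_40 = float(np.mean(a >= wall))
```

The leak probability is simply the fraction of sampled cracks that penetrate the wall within the service period, and it grows monotonically with operating time.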

  4. Development and Analysis of Train Brake Curve Calculation Methods with Complex Simulation

    Directory of Open Access Journals (Sweden)

    Bela Vincze

    2006-01-01

    Full Text Available This paper describes an efficient method using simulation for developing and analyzing train brake curve calculation methods for the on-board computer of the ETCS system. An application example with actual measurements is also presented.

  5. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    The field of environmentally sustainable architecture has been under development since the late 1960's, when mankind first started to notice the consequences of industrialisation and modern lifestyle. Energy crises in 1973 and 1979, and global climatic changes ascribed to global warming, have caused… The research methodology applied in the project combines a literature study of descriptions of methodical approaches and built examples with a sensitivity analysis and a qualitative interview with two designers from a best-practice example of a practice that has achieved environmentally sustainable architecture through an integrated design approach. The findings of the literature study and the qualitative interview have directed the PhD project towards the importance of project-specific design strategies and an integrated and multiprofessional approach to environmentally sustainable building design…

  6. CZECHOSLOVAK FOOTPRINTS IN THE DEVELOPMENT OF METHODS OF THERMOMETRY, CALORIMETRY AND THERMAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pavel Holba

    2012-07-01

    Full Text Available A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in underpinning special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as that of Vladimír Šatava in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) published by noteworthy postwar Czechoslovak scholars and scientists, and by their disciples, in 1950-1980. The 227 itemized references, given with titles, show a rich scientific productivity, revealing that many of these works were ahead of their time, even in an international context.

  7. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    This study intends to develop a more sophisticated tool that will advance the event tree method currently used in all PSA, and to focus on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic-event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload must therefore be reduced, which requires an automated method and tool. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To make the scenario generator a practical tool, a simulation model associated with AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, was added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
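The core of automatic event-sequence construction can be sketched as enumeration of branch states plus logic-based pruning (the four headings, the dependency rule, and the end-state classification below are hypothetical, not the study's actual plant logic):

```python
from itertools import product

# Event-tree headings (hypothetical) and their branch states.
headings = ["ReactorTrip", "HighPressureInjection",
            "LowPressureInjection", "HeatRemoval"]

def is_consistent(seq):
    """Prune branch combinations that violate a simple dependency rule."""
    s = dict(zip(headings, seq))
    # Low-pressure injection is only challenged after high-pressure failure.
    if s["HighPressureInjection"] == "success" and s["LowPressureInjection"] == "fail":
        return False
    return True

def end_state(seq):
    """Classify the end state of a sequence (illustrative logic only)."""
    s = dict(zip(headings, seq))
    if s["ReactorTrip"] == "fail":
        return "ATWS"
    if s["HighPressureInjection"] == "fail" and s["LowPressureInjection"] == "fail":
        return "core-damage"
    if s["HeatRemoval"] == "fail":
        return "degraded"
    return "ok"

# Enumerate all success/fail combinations, keep only consistent sequences.
sequences = [seq for seq in product(["success", "fail"], repeat=len(headings))
             if is_consistent(seq)]
end_states = {seq: end_state(seq) for seq in sequences}
```

Even this toy version shows why automation matters: the sequence count grows exponentially in the number of headings, and the construction logic (here `is_consistent`) is what keeps the enumeration tractable and physically meaningful.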

  8. Regional development assessment using parametric and non-parametric ranking methods: A comparative analysis of Slovenia and Croatia

    OpenAIRE

    Cziraky, Dario; Puljiz, Jaksa; Rovan, Joze; Sambt, Joze; Polic, Mario; Malekovic, Sanja

    2003-01-01

    In this paper we describe several regional development-assessment methods and subsequently apply them in a comparative development level analysis of the Slovenian and Croatian municipalities. The aim is to compare performance and suitability of several parametric and non-parametric ranking methods and to develop a suitable multivariate methodological framework for distinguishing development level of particular territorial units. However, the usefulness and appropriateness of various multivari...
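The parametric versus non-parametric contrast can be sketched on synthetic data (the 20 municipalities, 4 indicators, development gradient, and both composite definitions below are illustrative assumptions, not the Slovenian/Croatian data or the paper's exact methods):

```python
import numpy as np
from scipy.stats import rankdata, spearmanr

rng = np.random.default_rng(4)

# 20 municipalities x 4 development indicators (larger = more developed),
# with a built-in development gradient plus noise.
X = rng.normal(size=(20, 4)) + np.linspace(0.0, 2.0, 20)[:, None]

# Parametric composite: mean of z-scores per municipality.
z_composite = ((X - X.mean(axis=0)) / X.std(axis=0)).mean(axis=1)
# Non-parametric composite: mean of within-indicator ranks.
rank_composite = np.mean([rankdata(X[:, j]) for j in range(4)], axis=0)

# The two rankings should largely agree on well-behaved data.
rho, _ = spearmanr(z_composite, rank_composite)
```

Comparing the rank correlation of the two composites is one simple way to check whether the parametric and non-parametric methods would classify territorial units consistently; disagreements flag units whose indicator profiles are outliers.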

  9. Analysis and development of stochastic multigrid methods in lattice field theory

    International Nuclear Information System (INIS)

    We study the relation between the dynamical critical behavior and the kinematics of stochastic multigrid algorithms. The scale dependence of acceptance rates for nonlocal Metropolis updates is analyzed with the help of an approximation formula. A quantitative study of the kinematics of multigrid algorithms in several interacting models is performed. We find that for a critical model with Hamiltonian H(Φ), absence of critical slowing down can only be expected if the expansion of H(Φ+ψ) in terms of the shift ψ contains no relevant term (mass term). The predictions of this rule were verified in a multigrid Monte Carlo simulation of the Sine-Gordon model in two dimensions. Our analysis can serve as a guideline for the development of new algorithms: we propose a new multigrid method for nonabelian lattice gauge theory, the time-slice blocking. For SU(2) gauge fields in two dimensions, critical slowing down is almost completely eliminated by this method, in accordance with the theoretical prediction. The generalization of the time-slice blocking to SU(2) in four dimensions is investigated analytically and by numerical simulations. Compared to two dimensions, the local disorder in the four-dimensional gauge field leads to kinematical problems. (orig.)

  10. Development of Methods for Sampling and Analysis of Particulate and Gaseous Fluorides from Stationary Sources.

    Science.gov (United States)

    Peters, E. T.; And Others

    A study was conducted which has resulted in the development of tentative methods for the sampling and analysis of fluorides emitted from various stationary sources. The study was directed toward developing an understanding of the kinds of species which are present in each source emission. The report presents the following information: review of the various unit…

  11. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    Science.gov (United States)

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  12. Development of a forensically useful age prediction method based on DNA methylation analysis.

    Science.gov (United States)

    Zbieć-Piekarska, Renata; Spólnicka, Magdalena; Kupiec, Tomasz; Parys-Proszek, Agnieszka; Makowska, Żanetta; Pałeczka, Anna; Kucharczyk, Krzysztof; Płoski, Rafał; Branicki, Wojciech

    2015-07-01

    Forensic DNA phenotyping needs to be supplemented with age prediction to become a relevant source of information on human appearance. Recent progress in analysis of the human methylome has enabled selection of multiple candidate loci showing linear correlation with chronological age. Practical application in forensic science depends on successful validation of these potential age predictors. In this study, eight DNA methylation candidate loci were analysed using convenient and reliable pyrosequencing technology. A total number of 41 CpG sites was investigated in 420 samples collected from men and women aged from 2 to 75 years. The study confirmed correlation of all the investigated markers with human age. The five most significantly correlated CpG sites in ELOVL2 on 6p24.2, C1orf132 on 1q32.2, TRIM59 on 3q25.33, KLF14 on 7q32.3 and FHL2 on 2q12.2 were chosen to build a prediction model. This restriction allowed the technical analysis to be simplified without lowering the prediction accuracy significantly. Model parameters for a discovery set of 300 samples were R(2)=0.94 and the standard error of the estimate=4.5 years. An independent set of 120 samples was used to test the model performance. Mean absolute deviation for this testing set was 3.9 years. The number of correct predictions ±5 years achieved a very high level of 86.7% in the age category 2-19 and gradually decreased to 50% in the age category 60-75. The prediction model was deterministic for individuals belonging to these two extreme age categories. The developed method was implemented in a freely available online age prediction calculator. PMID:26026729
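The modeling step can be sketched with a minimal linear-regression example on simulated data (the site weights, beta-value ranges, and noise level below are invented for illustration; the actual model was built from pyrosequencing data at the five named CpG sites, with a reported SEE of 4.5 years):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated donors: methylation beta-values (0..1) at 5 CpG sites; age is a
# linear function of methylation plus noise with SEE ~ 4.5 years.
true_w = np.array([60.0, -25.0, 30.0, 15.0, -20.0])  # assumed site weights
true_b = 20.0

def simulate(n):
    X = rng.uniform(0.1, 0.9, size=(n, 5))           # beta-values per CpG site
    age = X @ true_w + true_b + rng.normal(0.0, 4.5, n)
    return X, age

X_tr, y_tr = simulate(300)   # discovery set, as in the paper
X_te, y_te = simulate(120)   # independent testing set, as in the paper

# Ordinary least squares with an intercept column.
A = np.column_stack([X_tr, np.ones(len(y_tr))])
coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)

pred = np.column_stack([X_te, np.ones(len(y_te))]) @ coef
mad = float(np.mean(np.abs(pred - y_te)))   # mean absolute deviation, years
```

On such data the test-set mean absolute deviation lands near the paper's reported 3.9 years, since it is dominated by the residual noise rather than by the fit itself.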

  13. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    Science.gov (United States)

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  14. Development of Design Analysis Methods for C/SiC Composite Structures

    Science.gov (United States)

    Sullivan, Roy M.; Mital, Subodh K.; Murthy, Pappu L. N.; Palko, Joseph L.; Cueno, Jacques C.; Koenig, John R.

    2006-01-01

    The stress-strain behavior at room temperature and at 1100 C (2000 F) was measured for two carbon-fiber-reinforced silicon carbide (C/SiC) composite materials: a two-dimensional plain-weave quasi-isotropic laminate and a three-dimensional angle-interlock woven composite. Micromechanics-based material models were developed for predicting the response properties of these two materials. The micromechanics based material models were calibrated by correlating the predicted material property values with the measured values. Four-point beam bending sub-element specimens were fabricated with these two fiber architectures and four-point bending tests were performed at room temperature and at 1100 C. Displacements and strains were measured at various locations along the beam and recorded as a function of load magnitude. The calibrated material models were used in concert with a nonlinear finite element solution to simulate the structural response of these two materials in the four-point beam bending tests. The structural response predicted by the nonlinear analysis method compares favorably with the measured response for both materials and for both test temperatures. Results show that the material models scale up fairly well from coupon to subcomponent level.

  15. Development of Evaluation Methods for Lower Limb Function between Aged and Young Using Principal Component Analysis

    Science.gov (United States)

    Nomoto, Yohei; Yamashita, Kazuhiko; Ohya, Tetsuya; Koyama, Hironori; Kawasumi, Masashi

    There is increasing concern in society with preventing falls among the aged. Improvements in aged people's lower-limb muscular strength, postural control and walking ability are important for quality of life and fall prevention. The aim of this study was to develop multiple evaluation methods in order to advise on the improvement and maintenance of lower limb function in the aged and the young. The subjects were 16 healthy young volunteers (mean ± S.D.: 19.9 ± 0.6 years) and 10 healthy aged volunteers (mean ± S.D.: 80.6 ± 6.1 years). Measurement items related to lower limb function were selected from the items we have used previously. The selected measurement items of lower limb function are distance of extroversion of the toe, angle of flexion of the toe, maximum width of step, knee elevation, moving distance of greater trochanter, walking balance, toe-gap force and rotation range of the ankle joint. The measurement items were summarized by principal component analysis into lower-limb evaluation measures covering walking ability, muscle strength of the lower limb and flexibility of the ankle. The young group's assessment score was a factor of 1.6 greater than the aged group's for walking ability, a factor of 1.4 greater for muscle strength of the lower limb, and a factor of 1.2 greater for flexibility of the ankle. These results suggest that it is possible to assess the lower limb function of the aged and the young numerically and to advise on their foot function.
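The summarization step can be sketched with a small principal component analysis (the group sizes 16 and 10 match the study, but the 8 synthetic measurement items and their score distributions are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# 16 young and 10 aged subjects, 8 measurement items; young scores shifted up.
young = rng.normal(loc=1.0, scale=0.3, size=(16, 8))
aged = rng.normal(loc=0.6, scale=0.3, size=(10, 8))
X = np.vstack([young, aged])

# Standardize each item, then PCA via SVD of the centered/scaled matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance ratio per principal component
scores = Z @ Vt.T                 # component scores per subject

pc1_young = scores[:16, 0].mean()  # group means on PC1 have opposite signs,
pc1_aged = scores[16:, 0].mean()   # since all PC scores sum to zero
```

Comparing group means of the leading component scores is how a numerical assessment like the study's factor-of-1.6 walking-ability contrast can be derived from many correlated measurement items.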

  16. X-RAY FLUORESCENCE ANALYSIS OF HANFORD LOW ACTIVITY WASTE SIMULANTS METHOD DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    Jurgensen, A.; Missimer, D.; Rutherford, R.

    2007-08-08

    The x-ray fluorescence laboratory (XRF) in the Analytical Development Directorate (ADD) of the Savannah River National Laboratory (SRNL) was requested to develop an x-ray fluorescence spectrometry method for elemental characterization of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) pretreated low activity waste (LAW) stream to the LAW Vitrification Plant. The WTP is evaluating the potential for using XRF as a rapid turnaround technique to support LAW product compliance and glass former batching. The overall objective of this task was to develop an XRF analytical method that provides rapid turnaround time (<8 hours), while providing sufficient accuracy and precision to determine variations in waste.

  17. Pathways to lean software development: An analysis of effective methods of change

    Science.gov (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of these proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  18. Development and validation of HPLC method for analysis of dexamethasone acetate in microemulsions

    Directory of Open Access Journals (Sweden)

    Maria Cristina Cocenza Urban

    2009-03-01

    Full Text Available A simple, rapid, accurate and sensitive method was developed for quantitative analysis of dexamethasone acetate in microemulsions using high-performance liquid chromatography (HPLC) with UV detection. The chromatographic parameters were a stainless-steel Lichrospher 100 RP-18 column (250 mm x 4 mm i.d., 5 μm particle size) at 30 ± 2 ºC. The isocratic mobile phase was methanol:water (65:35, v/v) at a flow rate of 1.0 mL.min-1. The determinations were performed using a UV-Vis detector set at 239 nm. Samples were prepared with methanol and the volume injected was 20 μL. The analytical curve was linear (r² = 0.9995) over a wide concentration range (2.0-30.0 μg.mL-1). The presence of components of the microemulsion did not interfere in the results of the analysis. The method showed adequate precision, with a relative standard deviation (RSD) smaller than 3%. The accuracy was analyzed by adding a standard drug, and good recovery values were obtained for all drug concentrations used. The HPLC method developed in this study showed specificity and selectivity, with linearity in the working range and good precision and accuracy, making it very suitable for quantification of dexamethasone in microemulsions. The analytical procedure is reliable and offers advantages in terms of speed and low cost of reagents.

  19. Recursive Frame Analysis: Reflections on the Development of a Qualitative Research Method

    Science.gov (United States)

    Keeney, Hillary; Keeney, Bradford

    2012-01-01

    The origin of recursive frame analysis (RFA) is revisited and discussed as a postmodern alternative to modernist therapeutic models and research methods that foster hegemony of a preferred therapeutic metaphor, narrative, or strategy. It encourages improvisational performance while enabling a means of scoring the change and movement of the…

  20. Development of LC-MS/MS method for analysis of polyphenolic compounds in juice, tea and coffee samples

    Science.gov (United States)

    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation “dilute and shoot” approach, and LC-MS/MS triple qu...

  1. Level set method for computational multi-fluid dynamics: A review on developments, applications and analysis

    Indian Academy of Sciences (India)

    Atul Sharma

    2015-05-01

    Functions and conservation as well as subsidiary equations in the Level Set Method (LSM) are presented. After the mathematical formulation, improvements in the numerical methodology for LSM are reviewed here for advection schemes, reinitialization methods, hybrid methods, adaptive-grid LSM, dual-resolution LSM, sharp-interface LSM, conservative LSM, parallel computing and extension from two to multiple fluids/phases as well as to various types of two-phase flow. In the second part of this article, LSM-based Computational Multi-Fluid Dynamics (CMFD) applications and analysis are reviewed for four different types of multi-phase flow: separated and parallel internal flow, drop/bubble dynamics during jet break-up, drop impact dynamics on a solid or liquid surface, and boiling. In the last twenty years, LSM has established itself as a method which is easy to program and is accurate as well as computationally efficient.
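To make the transport step concrete, here is a minimal sketch (not from the review) of first-order upwind advection of a 1-D level-set function; the grid spacing, speed, and function name are illustrative assumptions.

```python
def advect_level_set(phi, u, dx, dt, steps):
    """First-order upwind advection of a 1-D level-set function phi
    with constant speed u: the basic LSM transport step. Boundary
    values are held fixed for simplicity."""
    for _ in range(steps):
        new = phi[:]
        for i in range(1, len(phi) - 1):
            if u > 0:
                # upwind difference looks backward when flow is rightward
                new[i] = phi[i] - u * dt / dx * (phi[i] - phi[i - 1])
            else:
                new[i] = phi[i] - u * dt / dx * (phi[i + 1] - phi[i])
        phi = new
    return phi
```

For a signed-distance initial condition the zero crossing of phi (the interface) is simply carried along at speed u, which is what the more elaborate schemes the review surveys refine for accuracy and mass conservation.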

  2. A Product Analysis Method and Its Staging to Develop Redesign Competences

    Science.gov (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker

    2013-01-01

    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  3. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H

    2006-03-01

    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of the averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
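The region-independent chi-square statistic and the conserved-residue probability described above can be sketched as follows; the function names and the simple binomial conservation model are illustrative assumptions, not the paper's code.

```python
def chi_square_region(observed, expected):
    """Region-independent chi-square: sum (O - E)^2 / E over residues,
    avoiding the information loss of window averaging."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def p_conserved_by_chance(per_variant_rate, n_variants):
    """Probability that a residue stayed unmutated in all n functional
    variants purely by chance, assuming an independent per-variant
    mutation probability (a simplifying assumption)."""
    return (1.0 - per_variant_rate) ** n_variants
```

A residue with a small `p_conserved_by_chance` value across many variants is a candidate for being functionally essential, which is the question the paper's improved estimators address more rigorously.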

  4. Development of advanced methods for analysis of experimental data in diffusion

    Science.gov (United States)

    Jaques, Alonso V.

    There are numerous experimental configurations and data analysis techniques for the characterization of diffusion phenomena. However, the mathematical methods for estimating diffusivities traditionally do not take into account the effects of experimental errors in the data, and often require smooth, noiseless data sets to perform the necessary analysis steps. The current methods used for data smoothing require strong assumptions which can introduce numerical "artifacts" into the data, affecting confidence in the estimated parameters. The Boltzmann-Matano method is used extensively in the determination of concentration-dependent diffusivities, D(C), in alloys. In the course of analyzing experimental data, numerical integrations and differentiations of the concentration profile are performed. These methods require smoothing of the data prior to analysis. We present here an approach to the Boltzmann-Matano method that is based on a regularization method to estimate a differentiation operation on the data, i.e., to estimate the concentration gradient term, which is important in the analysis process for determining the diffusivity. This approach, therefore, has the potential to be less subjective, and in numerical simulations shows an increased accuracy in the estimated diffusion coefficients. We present a regression approach to estimate linear multicomponent diffusion coefficients that eliminates the need to pre-treat or pre-condition the concentration profile. This approach fits the data to a functional form of the mathematical expression for the concentration profile, and allows us to determine the diffusivity matrix directly from the fitted parameters. Reformulation of the equation for the analytical solution is done in order to reduce the size of the problem and accelerate the convergence. The objective function for the regression can incorporate point estimates of the error in the concentration, improving the statistical confidence in the estimated diffusivity matrix.
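A minimal discrete sketch of the classic Boltzmann-Matano evaluation, the baseline that the regularized approach above improves on; the function name and the naive central-difference gradient are illustrative assumptions.

```python
def matano_diffusivity(x, c, t):
    """Classic Boltzmann-Matano estimate of D(C) at interior points:
    D = -(1/2t) * (dx/dC) * integral of x dC from one end to C.
    x: positions measured from the Matano plane, c: concentrations,
    t: diffusion time. A real analysis would regularize dC/dx instead
    of using this noise-sensitive central difference."""
    D = {}
    for i in range(1, len(x) - 1):
        dCdx = (c[i + 1] - c[i - 1]) / (x[i + 1] - x[i - 1])
        # trapezoidal integral of x dC from c[0] up to c[i]
        integral = sum(0.5 * (x[j] + x[j + 1]) * (c[j + 1] - c[j])
                       for j in range(i))
        D[i] = -integral / (2.0 * t * dCdx)
    return D
```

For an error-function profile c(x) = 0.5·erfc(x / (2·sqrt(D·t))) with constant D, the estimate recovers D at the Matano plane, which is a convenient self-check before applying the method to noisy data.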

  5. Development and Validation of a Triple Quad LC/MS Method for Fiber Dye Analysis

    Science.gov (United States)

    Connolly-Ingram, Ceirin M.

    This study aims to determine whether the analysis of dyed fiber through liquid chromatography (HPLC) with triple-quadrupole mass spectrometry (MS) can be used as a reliable alternative to the current chemical techniques used to differentiate dyes. Other methods of analysis involving HPLC and MS have proven capable of distinguishing chemically different dyes within a few dye classifications, but none have proven capable of providing a complete alternative to the currently accepted technique of thin layer chromatography (TLC). In theory, HPLC-triple quad MS is capable of providing more reproducible and reliable data than the conventional TLC methods, with a much greater depth of measurable information with which to characterize dye components. In this study, dyes will be extracted from various types of fibers, including commonly worn types like cotton, polyester, nylon, and wool, and dyes from most of the eight different dye classes will be examined.

  6. Personnel planning in general practices: development and testing of a skill mix analysis method.

    NARCIS (Netherlands)

    Eitzen-Strassel, J. von; Vrijhoef, H.J.M.; Derckx, E.W.C.C.; Bakker, D.H. de

    2014-01-01

    Background: General practitioners (GPs) have to match patients’ demands with the mix of their practice staff’s competencies. However, apart from some general principles, there is little guidance on recruiting new staff. The purpose of this study was to develop and test a method which would allow GPs

  7. Development of unstructured grid methods for steady and unsteady aerodynamic analysis

    Science.gov (United States)

    Batina, John T.

    1990-01-01

    The current status of the development of unstructured grid methods in the Unsteady Aerodynamics Branch at NASA-Langley is described. These methods are being developed for steady and unsteady aerodynamic applications. The flow solvers that were developed for the solution of the unsteady Euler and Navier-Stokes equations are highlighted and selected results are given which demonstrate various features of the capability. The results demonstrate 2-D and 3-D applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement, thus verifying the accuracy of the approach.

  8. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  9. Development of Multi-Disciplinary Finite Element Method Analysis Courses at California State University, Los Angeles

    Science.gov (United States)

    McKinney, John; Wu, Chivey

    1998-01-01

    The NASA Dryden Flight Research Center (DFRC) Partnership Awards Grant to California State University, Los Angeles (CSULA) has two primary goals that help to achieve NASA objectives. The overall objectives of the NASA Partnership Awards are to create opportunities for joint University NASA/Government sponsored research and related activities. One of the goals of the grant is to have university faculty researchers participate and contribute to the development of NASA technology that supports NASA goals for research and development (R&D) in Aeronautics and Astronautics. The other goal is technology transfer in the other direction, where NASA developed technology is made available to the general public and more specifically, targeted to industries that can profit from utilization of government developed technology. This year's NASA Dryden Partnership Awards grant to CSULA, entitled "Computer Simulation of Multi-Disciplinary Engineering Systems", has two major tasks that satisfy overall NASA objectives. The first task conducts basic and applied research that contributes to technology development at the Dryden Flight Research Center. The second part of the grant provides for dissemination of NASA developed technology, by using the teaching environment created in the CSULA classroom. The second task and how this is accomplished is the topic of this paper. The NASA STARS (Structural Analysis Routines) computer simulation program is used at the Dryden center to support flight testing of high-performance experimental aircraft and to conduct research and development of new and advanced Aerospace technology.

  10. Development of a method for the analysis of perfluoroalkylated compounds in whole blood

    Energy Technology Data Exchange (ETDEWEB)

    Kaerrman, A.; Bavel, B. van; Lindstroem, G. [Oerebro Univ. (Sweden). Man-Technology-Environmental Research Centre; Jaernberg, U. [Stockholm Univ. (Sweden). Inst. of Applied Environmental Research

    2004-09-15

    The commercialisation of interfaced high performance liquid chromatography mass spectrometry (HPLC-MS) facilitated selective and sensitive analysis of perfluoroalkylated (PFA) acids, a group of compounds frequently used for example as industrial surfactants and which are very persistent and biologically active, in a more convenient way than before. Since then a number of reports on PFA compounds found in humans and wildlife have been published. The most widely used technique for the analysis of perfluoroalkylated compounds has been ion-pair extraction followed by high performance liquid chromatography (HPLC) and negative electrospray tandem mass spectrometry (MS/MS). Tetrabutylammonium ion as the counter ion in the ion-pair extraction has been used together with GC analysis, LC-fluorescence and LC-MS/MS. Recently, solid phase extraction (SPE) has been used instead of ion-pair extraction for the extraction of human serum. Previously reported studies on human exposure have mainly been on serum, probably because there are indications that PFA acids bind to plasma proteins. We present here a fast and simple method that involves SPE and is suitable for extracting whole blood samples. Furthermore, 13 PFAs were included in the method, which uses HPLC and single quadrupole mass spectrometry.

  11. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette;

    2008-01-01

    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call 'cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach, focusing on shop floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development…

  12. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.

    1989-12-01

    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during the verification of the methods are included. In addition, published work on analysis of heavy oils, tar sand bitumens, and like materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200{degree}C distillates and the residue into acid, base, neutral, saturated hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  13. Methods developed for the mass sampling analysis of CO and carboxyhemoglobin in man

    Energy Technology Data Exchange (ETDEWEB)

    Baretta, E.D.; Stewart, R.D.; Graff, S.A.; Donahoo, K.K.

    1978-03-01

    Gas chromatography was used to quantitate CO in air and also as an indirect means of determining %COHb in blood. The blood was then used to calibrate four CO-Oximeters used in a survey to determine average COHb levels in various segments of the U.S. population. Mean differences, both between the two methods of analysis and between pairs of CO-Oximeters, were less than 0.1% COHb saturation. COHb values obtained on consecutive days using one CO-Oximeter were repeatable within a S.D. of ±0.13% COHb.

  14. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...
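The voxel-by-voxel comparison against a control group can be illustrated with a simple z-score map; the names, threshold, and flat-array layout are assumptions, and the paper's Monte Carlo inference and morphological clean-up steps are omitted.

```python
def voxel_z_map(scan, control_scans, z_thresh=3.0):
    """Flag voxels of an individual scan whose uptake deviates from the
    control-group mean by more than z_thresh control-group standard
    deviations. Scans are flattened voxel lists (an illustrative
    simplification of a 3-D SPET volume)."""
    n = len(control_scans)
    flagged = []
    for v in range(len(scan)):
        vals = [c[v] for c in control_scans]
        mean = sum(vals) / n
        sd = (sum((x - mean) ** 2 for x in vals) / (n - 1)) ** 0.5
        if abs(scan[v] - mean) > z_thresh * sd:
            flagged.append(v)
    return flagged
```

Clusters of flagged voxels would then be cleaned with morphological operators and assessed for statistical significance, as the abstract describes.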

  15. Development of an evaluation method for the quality of NPP MCR operators' communication using Work Domain Analysis (WDA)

    International Nuclear Information System (INIS)

    Research highlights: → No evaluation method is available for operators' communication quality in NPPs. → To model this evaluation method, the Work Domain Analysis (WDA) method was selected. → The proposed method was applied to NPP MCR operators. → The quality of operators' communication can be evaluated with the proposed method. - Abstract: The evolution of work demands has seen industrial work evolve into the computerization of these demands, making systems more complex. This field is now known as that of Complex Socio-Technical Systems. Communication failures are problems associated with Complex Socio-Technical Systems and have been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the many studies on the severity of communication failures, there is no evaluation method for operators' communication quality in Nuclear Power Plants (NPPs). Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. To develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, including the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing an evaluation method for the quality of NPP MCR operators' communication. In addition, to apply the proposed method, nine teams working in NPPs participated in a field simulation. The results of this evaluation reveal that operators' communication quality improved as a greater proportion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating communication quality in any complex system.

  16. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    Directory of Open Access Journals (Sweden)

    Deborah Alcici Salomão

    2015-12-01

    Full Text Available This study examines consumer protection in arbitration, especially under the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper concludes by providing suggestions specific to consumer protection in arbitration based on this analysis.

  17. Development of a preparation and staining method for fetal erythroblasts in maternal blood : Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ

    1998-01-01

    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi w

  18. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-01

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar; cosmetic formulations are thus usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suited stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, maintaining the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  19. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network.

    Science.gov (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occurs in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited in generating highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development, and which has excellent highway safety records in its database, is chosen and utilized for this purpose. Validation of the methodology confirms the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries of research. PMID:27183516
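As a toy illustration of combining Delphi-elicited cause weights with a simple cause-to-accident network, one can marginalize over causes; all names and numbers here are hypothetical and far simpler than the paper's Bayesian network.

```python
def forecast_accident_rate(cause_priors, p_accident_given_cause):
    """Marginal accident probability from a two-layer cause->accident
    network: Delphi panels supply the cause priors, expert judgment the
    conditionals. P(accident) = sum over causes of P(cause) * P(acc|cause)."""
    return sum(cause_priors[c] * p_accident_given_cause[c]
               for c in cause_priors)
```

A full Bayesian network would also model dependencies among causes and allow evidence to be propagated, but the marginalization step is the same in spirit.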

  20. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. 
These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of
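The Monte Carlo route to the CDF amounts to building an empirical distribution from sampled responses; this sketch (with assumed names) shows that final step only.

```python
import bisect

def empirical_cdf(samples):
    """Empirical CDF of a Monte Carlo sample of a response variable
    (e.g. natural frequency or maximum stress), from which exceedance
    probabilities can be read off directly."""
    xs = sorted(samples)

    def cdf(x):
        # fraction of samples less than or equal to x
        return bisect.bisect_right(xs, x) / len(xs)

    return cdf
```

In the PDS method each sample would come from a reduced synthesized system model built from measured substructure modes, rather than from the simple list used here.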

  1. Development and validation of a GC–FID method for quantitative analysis of oleic acid and related fatty acids

    OpenAIRE

    Honggen Zhang; Zhenyu Wang; Oscar Liu

    2015-01-01

    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography–flame ionizati...

  2. A Review of the Use of Contingent Valuation Methods in Project Analysis at the Inter-American Development Bank

    OpenAIRE

    Sergio Ardila; Ricardo Quiroga; William J. Vaughan

    1998-01-01

    This paper (ENV-126) was originally presented at a National Science Foundation Workshop on Alternatives to Traditional Contingent Valuation Methods in Environmental Valuation, held at Vanderbilt University, Nashville Tennessee, on October 15-16, 1998. This paper reviews the past ten years of the Inter-American Development Bank's experience with stated preference methods, concentrating on their use in the cost-benefit analysis of projects supplying sewer service and improving ambient water qua...

  3. Development of Translational Methods in Spectral Analysis of Human Infant Crying and Rat Pup Ultrasonic Vocalizations for Early Neurobehavioral Assessment

    OpenAIRE

    Philip Sanford Zeskind; McMurray, Matthew S.; Kristin Ann Garber; Juliana Miriam Neuspiel; Elizabeth Thomas Cox; Grewen, Karen M.; Mayes, Linda C.; Johns, Josephine M.

    2011-01-01

    The purpose of this article is to describe the development of translational methods by which spectrum analysis of human infant crying and rat pup ultrasonic vocalizations (USVs) can be used to assess potentially adverse effects of various prenatal conditions on early neurobehavioral development. The study of human infant crying has resulted in a rich set of measures that has long been used to assess early neurobehavioral insult due to non-optimal prenatal environments, even among seemingly he...

  4. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation

    International Nuclear Information System (INIS)

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th-generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutronic flux, the solution of this equation, depends on energy, angle and space. The different variables are successively discretized: the energy with a multigroup approach, the quantities being taken as constant within each group, and the angle by a collocation method called the SN approximation. Once the energy and angle variables are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting of subdividing a cell into sub-cells, or by order refinement (p-refinement), increasing the order of the polynomial basis. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Thus, two error estimators are used to lead the refinement process. Whereas the first one requires strong regularity hypotheses (an analytical solution), the second one supposes only the minimal hypotheses required for the solution to exist. The comparison of both estimators is done on benchmarks where the analytic solution is known by the method of manufactured solutions. Thus, the behaviour of the solution with regard to its regularity can be studied. This leads to an hp-refinement method using the two estimators. Then, a comparison is done with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed.
Actually, as the
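The h-refinement half of the hp strategy can be caricatured in one sweep over a 1-D mesh; the cell representation, estimator, and tolerance are illustrative assumptions.

```python
def h_refine(cells, error_estimate, tol):
    """One sweep of h-refinement: split every 1-D cell (a, b) whose
    local error estimate exceeds tol. A p-refinement step would instead
    raise the local polynomial order on the flagged cells."""
    refined = []
    for a, b in cells:
        if error_estimate(a, b) > tol:
            mid = 0.5 * (a + b)
            refined += [(a, mid), (mid, b)]  # subdivide the cell
        else:
            refined.append((a, b))           # keep the cell as-is
    return refined
```

An hp strategy, as studied in the thesis, would use the estimated local regularity of the solution to decide, cell by cell, whether splitting or raising the order gives the better accuracy per degree of freedom.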

  5. METHODS AND MODELS FOR ANALYSIS OF THE ORGANIZATIONAL ECONOMICS ACTIVITY USED FOR DEVELOPMENT OF INFORMATICS SYSTEMS

    Directory of Open Access Journals (Sweden)

    TEODORA VĂTUIU

    2014-10-01

    Full Text Available The study of organizational activity and the identification of problem situations that require specific solutions call for a detailed analysis of the models defined for the real system of economic companies, regarded not as a sum of assets but as organizations in which activities are related into processes. In addition to the usual approach of using modeling languages in the development of information systems, in this paper we present some examples that demonstrate the usefulness of a standard modeling language (UML) for analyzing organizational activities and reporting problem situations that may occur in the management of data recorded on primary documents or in processes that bring together activities. The examples, which focus on a travel agency, can be extrapolated to any other organization, and the diagrams can be used in different contexts, depending on the complexity of the activities identified.

  6. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho

    2011-01-15

    During the last two years of the project, we have tried to develop a technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing health food safety for the public and the stability of the ginseng market. This year, we collected ginseng samples cultivated in the northeastern provinces of the Chinese mainland, namely Liaoning province, Jilin province and the Baekdu mountain area within Jilin province; 10 ginseng samples were collected from each province. The elemental concentrations in the ginsengs were analyzed by a neutron activation analysis technique at the HANARO research reactor, and the production areas were distinguished using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, although with the limitation that the number of samples analyzed was very small.
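The statistical distinction step can be sketched with a simple nearest-centroid classifier on standardized element concentrations. The element values below are invented placeholders, not the measured HANARO data, and the abstract does not say which statistical technique was actually used.

```python
import numpy as np

# Hypothetical elemental concentrations (three elements per sample, e.g. in ppm);
# in the study these would come from the INAA measurements at HANARO.
korea = np.array([[1.2, 0.80, 310.0], [1.3, 0.90, 295.0], [1.1, 0.70, 320.0]])
china = np.array([[2.4, 0.30, 150.0], [2.6, 0.40, 140.0], [2.2, 0.35, 160.0]])

def nearest_centroid(groups, labels, sample):
    """Classify a sample by its distance to each group's centroid,
    with features standardized so no single element dominates."""
    sd = np.vstack(groups).std(axis=0)
    dists = [np.linalg.norm((sample - g.mean(axis=0)) / sd) for g in groups]
    return labels[int(np.argmin(dists))]

label = nearest_centroid([korea, china], ["Korea", "China"],
                         np.array([1.25, 0.85, 305.0]))
```

With more samples per region, a proper discriminant analysis would replace the centroid distance, but the standardize-then-compare structure stays the same.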

  7. Comparative analysis of methods used to define eustatic variations in outcrop: Late Cambrian interbasinal sequence development

    Energy Technology Data Exchange (ETDEWEB)

    Osleger, D. (Univ. of California, Riverside (United States)); Read, J.F. (Virginia Polytechnic Inst. and State Univ., Blacksburg (United States))

    1993-03-01

    Interbasinal correlation of Late Cambrian cyclic carbonates from the Appalachian and Cordilleran passive margins, the Texas craton, and the southern Oklahoma aulacogen defines six major third-order depositional sequences. Graphic correlation of biostratigraphically-constrained strata was used to establish equivalency of stratigraphic sequences between the individual sections. Relatively isochronous biomere boundaries were used as time datums for lithostratigraphic correlation. Although the individual sections are composed of different types of meter-scale cycles and component lithofacies that reflect the various environmental settings of the localities, the overall upward-shallowing character of individual sequences is evident. The sequences are: late Cedaria, mid-Crepicephalus, late Crepicephalus, Aphelaspis to earliest Elvinia, Elvinia to early Saukia, and Saukia to the Cambrian-Ordovician boundary. Interbasinal correlation of stratigraphic sequences permits an evaluation of quantitative techniques for determining accommodation history. Correlation of Fischer plots of cyclic successions from separate basins supports eustatic control of Late Cambrian sequence development. R2/R3 curves derived from subsidence analysis of the Late Cambrian sections provide good resolution of the second- and third-order scales of accommodation change, and interbasinal correlations of R2/R3 curves also support eustatic control on sequence development. Comparing the accommodation curves and subsidence analysis with paleobathymetric trends of Late Cambrian cyclic strata suggests that the curves may approximate the form of the eustatic sea level signal. A composite eustatic sea level curve for Late Cambrian time in North America was created by qualitatively combining the accommodation curves defined by the different techniques for each of the four localities. 129 refs., 16 figs., 3 tabs.

  8. Development of breached pin performance analysis code SAFFRON (System of Analyzing Failed Fuel under Reactor Operation by Numerical method)

    Energy Technology Data Exchange (ETDEWEB)

    Ukai, Shigeharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1995-03-01

    On the assumption of fuel pin failure, the breached pin performance analysis code SAFFRON was developed to evaluate fuel pin behavior in relation to the delayed neutron signal response during operation beyond cladding failure. The following characteristic behaviors of a breached fuel pin are modeled with a 3-dimensional finite element method: pellet swelling by the fuel-sodium reaction, fuel temperature change, and the resultant extension of the cladding breach and release of delayed neutron precursors into the coolant. In particular, a practical numerical algorithm for the finite element method was originally developed in order to solve the 3-dimensional non-linear contact problem between the pellet, swollen by the fuel-sodium reaction, and the breached cladding. (author).

  9. Development and validation of an HPLC method for analysis of etoricoxib in human plasma

    Directory of Open Access Journals (Sweden)

    Mandal U

    2006-01-01

    Full Text Available A simple high-performance liquid chromatographic method for the determination of etoricoxib in human plasma has been developed. An aliquot of 1 ml of plasma was taken, and 10 ml of internal standard was added and mixed. Then 0.3 ml of saturated borate solution was added and mixed for 1 minute, followed by liquid-liquid extraction with ethyl acetate. The organic layer was separated and evaporated to dryness under a nitrogen atmosphere at low temperature (below 50°). The residue was reconstituted with 150 µl of mobile phase. During the whole procedure the samples were protected from light. The assay was performed on a Hypersil BDS C18 (150 x 4.6 mm, 5 µm particle size) column, using 10 mM ammonium acetate buffer:acetonitrile (65:35, v/v) as mobile phase with ultraviolet detection at 235 nm. The lower limit of detection was 10 ng/ml and the lower limit of quantitation was 20 ng/ml. The maximum between-run precision was 7.94%. The mean extraction recovery was found to be 79.53 to 85.70%. A stability study showed that after three freeze-thaw cycles the loss in three quality control samples was less than 10%. Samples were stable at room temperature for 12 h and at -20° for 3 months. Before injection onto the HPLC system, the processed samples were stable for at least 8 h. The method was used to perform a bioequivalence study in humans.

  10. Development of numerical analysis methods for natural circulation decay heat removal system applied to a large scale JSFR

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, O.; Suemori, M.; Endoh, J.; Oyama, K. [Mitsubishi FBR Systems, Inc. (MFBR), Tokyo (Japan); Koga, T. [Central Research Inst. of Electric Power Industry (CRIEPI), Chiba (Japan); Kamide, H. [Japan Atomic Energy Agency (JAEA), Ibaraki (Japan)

    2011-07-01

    A decay heat removal system utilizing passive natural circulation is applied to a large scale Japan Sodium-cooled Fast Reactor. In preparation for future licensing, a one-dimensional flow network method and a three-dimensional numerical analysis method were developed to evaluate core cooling capability and thermal transients under decay heat removal modes after reactor trip. The one-dimensional method was applied to a water test simulating the primary system of the reactor, while the three-dimensional method was applied to the water test and to a sodium test focusing on the decay heat removal system. The numerical results of both methods turned out to agree well with the test results. The thermal-hydraulic behavior under a typical decay heat removal mode of the reactor was then predicted by the three-dimensional method. (author)
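As a rough illustration of what a one-dimensional flow network balance does for a natural-circulation loop, the sketch below equates the buoyancy driving head with the friction loss and closes the loop with the core energy balance. All property values, geometry, and the lumped loss coefficient are invented sodium-loop-like numbers, not JSFR design data.

```python
import math

# Illustrative one-dimensional loop balance for a natural-circulation
# decay heat removal path (made-up values, not JSFR data).
rho, beta, cp = 850.0, 2.8e-4, 1270.0   # sodium density [kg/m^3], expansion coeff [1/K], cp [J/kg.K]
g, H = 9.81, 10.0                        # gravity [m/s^2], thermal-centre elevation difference [m]
K_over_A2 = 2.0e4                        # lumped loss coefficient over flow area squared [1/m^4]
Q = 5.0e5                                # decay heat to remove [W]

def loop_flow(mdot=10.0):
    """Fixed-point iteration: guess flow -> dT -> buoyancy head -> new flow."""
    for _ in range(200):
        dT = Q / (mdot * cp)                                   # energy balance over the core
        dp_buoy = rho * beta * g * H * dT                      # thermal driving head [Pa]
        mdot_new = math.sqrt(2.0 * rho * dp_buoy / K_over_A2)  # friction balance
        if abs(mdot_new - mdot) < 1e-10:
            break
        mdot = 0.5 * (mdot + mdot_new)                         # damped update for stability
    return mdot, Q / (mdot * cp)

mdot, dT = loop_flow()
```

The real flow network codes solve the same kind of balance simultaneously over many branches and junctions; this single-loop version only shows the self-adjusting nature of natural circulation (more decay heat raises dT, which raises the driving head and the flow).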

  11. Development and validation of a GC–FID method for quantitative analysis of oleic acid and related fatty acids

    Directory of Open Access Journals (Sweden)

    Honggen Zhang

    2015-08-01

    Full Text Available Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary material. A gas chromatography–flame ionization detection (GC–FID method was developed for direct quantitative analysis of oleic acid and related fatty acids in oleic acid USP-NF material. Fifteen fatty acids were separated using a DB-FFAP (nitroterephthalic acid modified polyethylene glycol) capillary GC column (30 m × 0.32 mm i.d.) with a total run time of 20 min. The method was validated in terms of specificity, linearity, precision, accuracy, sensitivity, and robustness. The method can be routinely used for the purpose of oleic acid USP-NF material analysis.

  12. Development and validation of a GC-FID method for quantitative analysis of oleic acid and related fatty acids☆

    Institute of Scientific and Technical Information of China (English)

    Honggen Zhang; Zhenyu Wang; Oscar Liu

    2015-01-01

    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography-flame ionization detection (GC-FID) method was developed for direct quantitative analysis of oleic acid and related fatty acids in oleic acid USP-NF material. Fifteen fatty acids were separated using a DB-FFAP (nitroterephthalic acid modified polyethylene glycol) capillary GC column (30 m × 0.32 mm i.d.) with a total run time of 20 min. The method was validated in terms of specificity, linearity, precision, accuracy, sensitivity, and robustness. The method can be routinely used for the purpose of oleic acid USP-NF material analysis.

  13. Computational Methods Development at Ames

    Science.gov (United States)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and the study of flow physics. The presentation gives historical precedents for the above research and speculates on its future course.

  14. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    Science.gov (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score) in the case of more than one repetitive task, a "traditional" procedure was already proposed, the results of which could be defined as a "time-weighted average". This approach appears appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or for shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 hours or more), the "time-weighted average" approach could result in an underestimation of the exposure level (as it practically flattens peaks of high exposure). For those scenarios an alternative approach based on the "most stressful task as minimum" might be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically a different Du(M) for each step of one hour of duration of the repetitive task), it is now possible to define a procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (rotations every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all examined repetitive tasks. The procedure is based on the following formula: Complex Ocra Multitask Index = ocra1(Dum1) + (Delta ocra1 x K), where 1, 2, 3, ..., N = repetitive tasks ordered by Ocra index values (1 = highest; N = lowest) computed considering the respective real duration multipliers (Dum(i)). 
ocra1 = ocra index of
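The formula reported in the abstract interpolates between the two stated bounds: the most stressful task at its real daily duration (lower bound) and the same task stretched over the whole daily repetitive time (upper bound). A minimal sketch, treating K as a given weighting in [0, 1], since its exact definition is not reproduced in the abstract:

```python
def complex_ocra_index(ocra1_own, ocra1_total, k):
    """
    cOCRA = ocra1(Dum1) + (Delta ocra1 * K), following the abstract:
      ocra1_own   - index of the most stressful task at its real daily duration
      ocra1_total - index of the same task if it lasted the whole repetitive time
      k           - weighting in [0, 1] reflecting the mix of the other tasks
                    (its exact definition is not given in the abstract)
    """
    if not 0.0 <= k <= 1.0:
        raise ValueError("k must lie in [0, 1]")
    return ocra1_own + (ocra1_total - ocra1_own) * k

lo = complex_ocra_index(3.5, 5.0, 0.0)   # lower bound: most stressful task, own duration
hi = complex_ocra_index(3.5, 5.0, 1.0)   # upper bound: most stressful task, total duration
```

Any valid K therefore yields a result between the two bounds the abstract states, which is the defining property of the procedure.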

  15. Development of soil-structure interaction analysis method (II) - Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others [Seoul National Univ., Seoul (Korea, Republic of)

    1994-02-15

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strain; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents: a state-of-the-art review of the item and construction of a database of past research; a theoretical review of soil-structure interaction analysis technology; a proposal of the preferable technology and an estimate of its domestic applicability; and proposed guidelines for safety evaluation and analysis schemes.

  16. Development of soil-structure interaction analysis method (II) - Volume 1

    International Nuclear Information System (INIS)

    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strain; strong geometric nonlinearity due to uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents: a state-of-the-art review of the item and construction of a database of past research; a theoretical review of soil-structure interaction analysis technology; a proposal of the preferable technology and an estimate of its domestic applicability; and proposed guidelines for safety evaluation and analysis schemes

  17. Development of evaluation method for the quality of NPP MCR operators' communication using work domain analysis (WDA)

    International Nuclear Information System (INIS)

    The evolution of work demands has pushed industry toward computerization, which makes systems complex and complicated; this field is called Complex Socio-Technical Systems. Communication failure is one problem of Complex Socio-Technical Systems, and it has been found to be the cause of many incidents and accidents in various industries, including the nuclear, aerospace and railway industries. Despite the many studies on the severity of communication failure, there is no evaluation method for operators' communication quality in NPPs. Therefore, the objectives of this study are to develop an evaluation method for the quality of NPP Main Control Room (MCR) operators' communication and to apply the proposed method to operators in a full-scope simulator. In order to develop the proposed method, the Work Domain Analysis (WDA) method is introduced. Several characteristics of WDA, such as the Abstraction Decomposition Space (ADS) and the diagonal of the ADS, are the key points in developing an evaluation method for the quality of NPP MCR operators' communication. In order to apply the proposed method, nine teams working in NPPs participated in a field simulation. The evaluation results reveal that operators' communication quality was higher when a larger portion of the components in the developed evaluation criteria were mentioned. Therefore, the proposed method could be useful for evaluating communication quality in any complex system. In order to verify that the proposed method is meaningful for evaluating communication quality, the evaluation results were further investigated with objective performance measures. This further investigation also supports the idea that the proposed method can be used in evaluating communication quality

  18. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices into current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, mainly reliability statistics to support internal structure and support for test content. The instruments reviewed for this study lacked supporting evidence based on relationships with other variables and on response process, and evidence based on consequences of testing was not evident. The findings suggest a need for further professional development within the medical education research community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. 
Finally, editors and reviewers are encouraged to recognize
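As a minimal illustration of the extraction step this review examines, the sketch below runs a principal component extraction on simulated item responses and applies the Kaiser (eigenvalue > 1) retention rule, one of the common but often misapplied practices the review discusses. The data, loadings and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated survey responses: 6 items driven by 2 latent factors
# (illustrative only -- not data from the reviewed instruments).
n = 500
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
items = factors @ loadings.T + 0.4 * rng.normal(size=(n, 6))

# Principal component extraction on the correlation matrix.
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # sort descending

n_factors = int(np.sum(eigvals > 1.0))   # Kaiser criterion: retain eigenvalues > 1
```

The review's point is precisely that such mechanical rules (Kaiser criterion, unrotated PCA) are frequently reported without the parallel analysis, scree inspection, or validity evidence that best practice demands.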


  20. Development of an Ion Chromatography Method for Analysis of Organic Anions (Fumarate, Oxalate, Succinate, and Tartrate) in Single Chromatographic Conditions.

    Science.gov (United States)

    Kaviraj, Yarbagi; Srikanth, B; Moses Babu, J; Venkateswara Rao, B; Paul Douglas, S

    2015-01-01

    A single organic counterion analysis method was developed using an ion chromatography separation technique and a conductivity detector. This allows rapid characterization of an API to support clinical studies and to fulfil the regulatory requirements for the quantitation of fumarate, oxalate, succinate, and tartrate counterions in active pharmaceutical ingredients (quetiapine fumarate, escitalopram oxalate, sumatriptan succinate, and tolterodine tartrate). The method was developed using a Metrohm Metrosep A Supp 1 (250 × 4.0 mm, 5.0 µm particle size) column with a mobile phase containing an isocratic mixture of solution A (7.5 mM sodium carbonate and 2.0 mM sodium bicarbonate in Milli-Q water) and solution B (acetonitrile). The flow rate was set at 1.0 mL/min and the run time was 25 minutes. The developed method was validated as per ICH guidelines, and the method parameters were chosen to ensure the simultaneous quantitation of all four anions. The method was validated for all four anions to demonstrate its applicability to common anions present in various APIs.

  1. Development of Optimized Core Design and Analysis Methods for High Power Density BWRs

    Science.gov (United States)

    Shirvan, Koroush

    temperature was kept the same for the BWR-HD and ABWR, which resulted in a 4 K cooler core inlet temperature for the BWR-HD, given that its feedwater makes up a larger fraction of total core flow. The stability analysis using the STAB and S3K codes showed satisfactory results for the hot channel, coupled regional out-of-phase and coupled core-wide in-phase modes. A RELAP5 model of the ABWR system was constructed and applied to six transients for the BWR-HD and ABWR. The ΔMCPRs during all the transients were found to be equal or smaller for the new design, and the core remained covered for both. The lower void coefficient along with the smaller core volume proved to be advantageous for the simulated transients. Helical Cruciform Fuel (HCF) rods were proposed in prior MIT studies to enhance the fuel surface-to-volume ratio. In this work, higher fidelity models (e.g. CFD instead of subchannel methods for the hydraulic behaviour) are used to investigate the resolution needed for accurate assessment of the HCF design. For neutronics, conserving the fuel area of cylindrical rods results in a different reactivity level with a lower void coefficient for the HCF design. In single-phase flow, for which experimental results existed, the friction factor is found to be sensitive to the HCF geometry and cannot be calculated using current empirical models. A new approach for the analysis of flow crisis conditions for HCF rods, in the context of Departure from Nucleate Boiling (DNB) and dryout, using the two-phase interface tracking method is proposed and initial results are presented. It is shown that the twist of the HCF rods promotes detachment of a vapour bubble along the elbows, which indicates no possibility of an early DNB for the HCF rods and in fact a potential for a higher DNB heat flux. 
Under annular flow conditions, it was found that the twist suppressed the liquid film thickness on the HCF rods, at the locations of the highest heat flux, which increases the possibility of reaching early dryout. It

  2. Development of Distinction Method of Production Area of Ginsengs by Using a Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chung, Yong Sam; Sun, Gwang Min; Lee, Yu Na; Yoo, Sang Ho [KAERI, Daejeon (Korea, Republic of)

    2010-05-15

    Distinction of the production area of Korean ginsengs has been attempted by using neutron activation techniques such as instrumental neutron activation analysis (INAA) and prompt gamma activation analysis (PGAA). The distribution of elements varies according to the part of the plant due to differences in enrichment effects and the influence of the soil in which the plants have been grown, so the correlation between plants and soil has been an issue. In this study, the distribution of trace elements within a Korean ginseng was investigated by using instrumental neutron activation analysis

  3. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor

    International Nuclear Information System (INIS)

    The purposes of this study are to develop the safety evaluation methods and analysis codes needed in the design and construction stage of the fast breeder reactor (FBR). In JFY 2012, the following results were obtained. As for the safety evaluation methods needed in the safety examination conducted for the reactor establishment permission, development of the analysis codes, such as the core damage analysis code, was carried out following the planned schedule. As for the safety evaluation method needed for risk-informed safety regulation, a quantification technique for the event tree using the Continuous Markov chain Monte Carlo method (CMMC method) was studied. (author)
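The general idea behind quantifying an event tree by sampling continuous-time Markov histories can be illustrated with a toy model; the states, transition rates and mission time below are invented for illustration and are not from the actual FBR evaluation codes.

```python
import random

# Toy continuous-time Markov chain with transition rates in 1/h:
# "ok" -> "degraded" -> "failed", with repair "degraded" -> "ok".
RATES = {
    "ok":       {"degraded": 1.0e-3},
    "degraded": {"ok": 5.0e-2, "failed": 1.0e-2},
    "failed":   {},                      # absorbing state
}

def reaches_failed(t_mission, rng):
    """Sample one CTMC history: does the system fail within the mission time?"""
    t, state = 0.0, "ok"
    while state != "failed":
        out = RATES[state]
        total = sum(out.values())
        if total == 0.0:
            return False
        t += rng.expovariate(total)      # holding time ~ Exp(total outgoing rate)
        if t > t_mission:
            return False
        r, acc = rng.random() * total, 0.0
        for nxt, lam in out.items():     # pick the next state proportionally to its rate
            acc += lam
            if r <= acc:
                state = nxt
                break
    return True

rng = random.Random(42)
n = 20000
p_fail = sum(reaches_failed(1000.0, rng) for _ in range(n)) / n
```

Each sampled history corresponds to one path through the branches of the tree, so the Monte Carlo average directly estimates the end-state probability, including effects (like repair before failure) that a static event tree handles only approximately.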

  4. Application of a method of self-identification of territory for its analysis and estimation of prospects of development

    Directory of Open Access Journals (Sweden)

    Vladimir Stepanovich Bochko

    2011-09-01

    Full Text Available This paper presents an analysis of the essence of a territory and the definition of the prospects for its development; a new tool is proposed for assessing the socio-economic transformations that have taken place and for forming an objective vision of the future, called the method of self-identification of a territory. It consists in identifying the territory's current image and establishing its connection with the previous image, formed by the features of its labour, intellectual, moral, industrial and other potential. With the help of the method of self-identification, a territory carries out self-recognition by connecting the information resources of its past, present and future development into a single whole. The self-identification of a territory is a continuous phenomenon: it is capable not only of reacting to the challenges of the time, but also of providing a scientific view of the processes of socio-economic transformation of the territory, both from the past into the present and from the present into the future. Based on the application of the method of self-identification, a comparative analysis of the development of the branches of the Sverdlovsk region is given.

  5. Developments in Surrogating Methods

    Directory of Open Access Journals (Sweden)

    Hans van Dormolen

    2005-11-01

    Full Text Available In this paper, I would like to talk about the developments in surrogating methods for preservation. My main focus will be on the technical aspects of preservation surrogates. This means that I will tell you something about my job as Quality Manager Microfilming for the Netherlands’ national preservation program, Metamorfoze, which is coordinated by the National Library. I am responsible for the quality of the preservation microfilms, which are produced for Metamorfoze. Firstly, I will elaborate on developments in preservation methods in relation to the following subjects: · Preservation microfilms · Scanning of preservation microfilms · Preservation scanning · Computer Output Microfilm. In the closing paragraphs of this paper, I would like to tell you something about the methylene blue test. This is an important test for long-term storage of preservation microfilms. Also, I will give you a brief report on the Cellulose Acetate Microfilm Conference that was held in the British Library in London, May 2005.

  6. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    Science.gov (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-01

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. PMID:23277151
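The LOD and LOQ figures reported in abstracts like this one are commonly derived from a calibration line using the ICH Q2 formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration and S its slope. The calibration points below are made-up numbers, not the oxytetracycline data from the paper.

```python
import numpy as np

# Illustrative calibration line (made-up data): concentration vs. peak area.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # µg/ml
area = np.array([51.0, 99.0, 204.0, 398.0, 801.0])  # detector response

S, b = np.polyfit(conc, area, 1)                    # least-squares line: area = S*conc + b
resid = area - (S * conc + b)
sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2)) # residual standard deviation (n-2 dof)

lod = 3.3 * sigma / S   # ICH Q2 calibration-curve formulas
loq = 10.0 * sigma / S
```

Other ICH-accepted routes (signal-to-noise ratio, blank standard deviation) give the same order of magnitude; the calibration-residual route shown here is just the one that needs no extra measurements.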

  7. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model

    Science.gov (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  8. Development and evaluation of the piecewise Prony method for evoked potential analysis.

    Science.gov (United States)

    Garoosi, V; Jansen, B H

    2000-12-01

    A new method is presented to decompose nonstationary signals into a summation of oscillatory components with time-varying frequency, amplitude, and phase characteristics. This method, referred to as the piecewise Prony method (PPM), is an improvement over the classical Prony method, which can only deal with signals containing components with fixed frequency, amplitude and phase, and a monotonically increasing or decreasing rate of change. PPM allows the study of the temporal profile of post-stimulus signal changes in single-trial evoked potentials (EPs), which can lead to new insights in EP generation. We have evaluated this method on simulated data to test its limitations and capabilities, and also on single-trial EPs. The simulation experiments showed that the PPM can detect amplitude changes as small as 10%, rate changes as small as 10%, and frequency changes as small as 0.15 Hz. The capabilities of the PPM were demonstrated using single electroencephalogram/EP trials of flash visual EPs recorded from one normal subject. The trial-by-trial results confirmed that the stimulation drastically attenuates the alpha activity shortly after stimulus presentation, with the alpha activity returning about 0.5 s later. The PPM results also provided evidence that delta activity undergoes phase alignment following stimulus presentation.
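
    The core idea of the piecewise approach, fitting a local oscillatory model on successive segments so that frequency changes over time become visible, can be sketched for the simplest case of a single undamped component (a toy reduction of the Prony fit, not the authors' full PPM, which also tracks amplitude, phase, and damping):

```python
import math

def prony_freq(x):
    """One-component Prony-type fit: a real sinusoid satisfies
    x[n] = 2*cos(w)*x[n-1] - x[n-2]; solve for 2*cos(w) in least squares."""
    num = sum(x[n - 1] * (x[n] + x[n - 2]) for n in range(2, len(x)))
    den = sum(x[n - 1] ** 2 for n in range(2, len(x)))
    return math.acos(max(-1.0, min(1.0, num / (2.0 * den))))

def piecewise_prony(x, seg_len):
    """Piecewise variant: estimate the frequency on consecutive
    non-overlapping segments, giving a temporal frequency profile."""
    return [prony_freq(x[i:i + seg_len])
            for i in range(0, len(x) - seg_len + 1, seg_len)]
```

Applied to a signal whose frequency jumps between segments, the per-segment estimates recover the two frequencies, which is the kind of temporal profile the PPM extracts from single-trial EPs.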

  9. Statistical Methods for Analysis of High Throughput Experiments in Early Drug Development

    OpenAIRE

    Khamiakova, Tatsiana

    2013-01-01

    Introduction: Advances in biotechnology and the ability to obtain molecular profiles of biological samples, and in particular, the transcriptomic data, have been transforming the way biomedical research and early drug development are carried out for more than a decade (Clarke et al., 2004; Chengalvala et al., 2007; Hughes et al., 2011). In view of increasing costs of the drug development and nevertheless a large number of drugs which fail the clinical trials either due to the lack of efficacy...

  10. Analysis of Indonesian Agroindustry Competitiveness in Nanotechnology Development Perspective Using SWOT-AHP Method

    OpenAIRE

    Nurul Taufiqu Rochman; Gumbira-Sa’id, E.; Arief Daryanto; Nunung Nuryartono

    2011-01-01

    Application of nanotechnology opens vast opportunities for increasing the competitiveness of the national agroindustries. In this study, five agroindustries that potentially applied nanotechnology were reviewed and analyzed by using a SWOT-AHP (strength, weakness, opportunity, threat, and analytic hierarchy process) approach to determine the position of the competitiveness of each industry. Criteria were analyzed based on internal factors that have the potential to be the strengths and weaknesses, and ext...

  11. The analysis of lipophilic marine toxins : development of an alternative method

    NARCIS (Netherlands)

    Gerssen, A.

    2010-01-01

    Lipophilic marine toxins are produced by certain algae species and can accumulate in filter feeding shellfish such as mussels, scallops and oysters. Consumption of contaminated shellfish can lead to severe intoxications such as diarrhea, abdominal cramps and vomiting. Methods described in European U

  12. Development of Methods for Sampling and Analysis of Polychlorinated Naphthalenes in Ambient Air

    Science.gov (United States)

    Erickson, Mitchell D.; And Others

    1978-01-01

    The procedure and sampler described permit detection of less than 50 pg of a single polychlorinated naphthalene (PCN) isomer. The method uses gas chromatography-mass spectrometry. The PCNs are collected on a glass fiber filter and two polyurethane foam plugs and extracted with toluene at 25 degrees Celsius. (BB)

  13. Spatial analysis method of assessing water supply and demand applied to energy development in the Ohio River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, A.D.

    1979-08-01

    The focus of the study is on water availability for energy development in the Ohio River Basin; however, the techniques developed are applicable to water supply investigations for other regions and uses. The study assesses the spatial association between water supply and demand for future energy development in the Basin. The problem is the development of a method that accurately portrays the actual spatial coincidence of water availability and use within a basin. The issues addressed involve questions of scale and methods used to create a model distribution of streamflow and to compare it with projected patterns of water requirements for energy production. The analysis procedure involves the compilation of streamflow data and calculation of 7-day/10-year low-flow estimates within the Basin. Low-flow probabilities are based on historical flows at gaging stations and are adjusted for the effects of reservoir augmentation. Once streamflow estimates have been determined at gaging stations, interpolation of these values is made between known data points to enable direct comparison with projected energy water-use data. Finally, a method is devised to compare the patterns of projected water requirements with the model distribution of streamflow, in sequential downstream order.
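
    The 7-day/10-year low-flow statistic used above can be sketched as a two-step computation: a 7-day moving-average minimum per year, followed by an empirical plotting-position estimate across years (the study additionally adjusted for reservoir augmentation, which is omitted here, and the function names are mine):

```python
def seven_day_min(daily_flows):
    """Lowest 7-day moving-average flow in one year of daily records."""
    return min(sum(daily_flows[i:i + 7]) / 7.0
               for i in range(len(daily_flows) - 6))

def low_flow_estimate(annual_minima, return_period=10):
    """Empirical 7Q10-style value: the annual 7-day minimum whose Weibull
    plotting position m/(n+1) is closest to 1/return_period."""
    xs = sorted(annual_minima)  # ascending: rank 1 is the driest year
    n = len(xs)
    target = 1.0 / return_period
    m = min(range(n), key=lambda k: abs((k + 1) / (n + 1) - target))
    return xs[m]
```

In the study these per-gage values were then interpolated spatially for comparison with projected water demands.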

  14. Methodical Approaches To Analysis And Forecasting Of Development Fuel And Energy Complex And Gas Industry In The Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov

    2014-12-01

    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, in which the interests of all economic entities intertwine. To ensure economic growth, a region must maintain an internal balance of energy resources, developed with account of the regional specifics of economic growth and energy security. The study examines the status of this equilibrium as indicated by the regional fuel and energy balance (FEB). The aim of the research is the development of a fuel and energy balance that makes it possible to determine exactly how many and which resources are insufficient to support the regional development strategy, and which resources need to be brought in. The energy balance brings into focus all issues of regional development, so the FEB is necessary both as a mechanism for analyzing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, energy threats and ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that the development of a regional fuel and energy balance has to go beyond the energy sector itself, involving the analysis of other sectors of the economy, as well as systems such as banking, budget, legislation and taxation. Due to the complexity of the problems discussed, there is an obvious need to develop an appropriate forecasting and analytical system allowing regional authorities to make evidence-based predictions of the consequences of management decisions. A multivariant scenario study of the development of the fuel and energy complex, and of the gas industry separately, makes it possible to use the methods of project-based management and a harmonized application of state strategic regulation and market mechanisms in the operational directions of development of the fuel and energy complex and the gas industry in the regional economy.

  15. Development of a cell formation heuristic by considering realistic data using principal component analysis and Taguchi's method

    Science.gov (United States)

    Kumar, Shailendra; Sharma, Rajiv Kumar

    2015-12-01

    Over the last four decades of research, numerous cell formation algorithms have been developed and tested, yet this research remains of interest to this day. Appropriate formation of manufacturing cells is the first step in designing a cellular manufacturing system. In cellular manufacturing, consideration of manufacturing flexibility and production-related data is vital for cell formation. Consideration of this realistic data makes the cell formation problem very complex and tedious, which has led to the invention and implementation of highly advanced and complex cell formation methods. In this paper an effort has been made to develop a simple and easy-to-understand/implement manufacturing cell formation heuristic procedure that considers a number of production and manufacturing flexibility-related parameters. The heuristic minimizes inter-cellular movement cost/time. Further, the proposed heuristic is modified for the application of principal component analysis and Taguchi's method. A numerical example is presented to illustrate the approach. A refinement in the results is observed with the adoption of principal component analysis and Taguchi's method.

  16. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country

    Science.gov (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, it is increasingly common to introduce the Japanese engineering educational model in developing countries to improve the quality of education and research activity. A naive implementation of such a model in different cultures and educational systems may lead to several problems. In this paper, we evaluated the Project Based Learning (PBL) class, developed at Waseda University in Japan and employed in the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as: non-homogeneous student backgrounds, disconnection from the students' research, weak learning-style adaptation, and irregular course conduction. To solve these difficulties at E-JUST, we proposed: the introduction of groupware, project theme choice based on student motivation, and curriculum modification.

  17. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    OpenAIRE

    Hansen, Hanne Tine Ring

    2007-01-01

    The field of environmentally sustainable architecture has been under development since the late 1960s, when mankind first started to notice the consequences of industrialisation and modern lifestyle. Energy crises in 1973 and 1979, and global climatic changes ascribed to global warming, have caused an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands for the energy consumption of ...

  18. Generation of Synthetic Transcriptome Data with Defined Statistical Properties for the Development and Testing of New Analysis Methods

    Institute of Scientific and Technical Information of China (English)

    Guillaume Brysbaert; Sebastian Noth; Arndt Benecke

    2007-01-01

    We have previously developed a combined signal/variance distribution model that accounts for the particular statistical properties of datasets generated on the Applied Biosystems AB1700 transcriptome system. Here we show that this model can be efficiently used to generate synthetic datasets with statistical properties virtually identical to those of the actual data, with the aid of the Java application ace.map creator 1.0 that we have developed. The fundamentally different structure of AB1700 transcriptome profiles requires re-evaluation, adaptation, or even redevelopment of many of the standard microarray analysis methods in order to avoid misinterpretation of the data on the one hand, and to draw full benefit from their increased specificity and sensitivity on the other hand. Our composite data model and the ace.map creator 1.0 application thereby not only present proof of the correctness of our parameter estimation, but also provide a tool for the generation of synthetic test data that will be useful for further development and testing of analysis methods.
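
    The general trick of generating synthetic values whose sample statistics hit prescribed targets can be sketched as follows; this is a heavily simplified stand-in for the authors' combined signal/variance model, matching only the first two moments of a single distribution:

```python
import math
import random

def synthetic_signals(n, mean, sd, seed=0):
    """Draw n synthetic log-intensities, then re-standardize so the sample
    mean and (population) standard deviation match the targets exactly:
    a toy version of generating data with defined statistical properties."""
    rng = random.Random(seed)
    raw = [rng.gauss(0.0, 1.0) for _ in range(n)]
    m = sum(raw) / n
    s = math.sqrt(sum((x - m) ** 2 for x in raw) / n)
    return [mean + sd * (x - m) / s for x in raw]
```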

  19. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models, under the assumption that the free-air anomaly consists of an effect of topography, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano area, Guadeloupe (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from a previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density on La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
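
    The Nettleton criterion described above, choosing the slab density that decorrelates the Bouguer anomaly from topography, can be sketched as a simple density scan (the slab constant is the standard 2*pi*G value in mGal per metre per g/cm3; the station values in the test are synthetic, not survey data):

```python
import math

TWO_PI_G = 0.0419  # mGal per (g/cm^3 * m): simple Bouguer slab constant

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def nettleton_density(free_air, heights, candidate_densities):
    """Nettleton scan: return the density whose simple Bouguer anomaly
    (free-air anomaly minus slab correction) is least correlated with
    the station heights."""
    def bouguer(rho):
        return [fa - TWO_PI_G * rho * h for fa, h in zip(free_air, heights)]
    return min(candidate_densities,
               key=lambda rho: abs(pearson(bouguer(rho), heights)))
```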

  20. Stakeholder Analysis as a tool for working with Social Responsibility: Developing a stakeholder analysis method for ISO 26000

    OpenAIRE

    Weinestedt, Henrik

    2009-01-01

    This thesis aims to develop a stakeholder analysis method for the upcoming standard on social responsibility (SR) ISO 26000. The goal is a method that, by adhering to three criteria regarding comprehensibility, flexibility and ease of use, can be used and applied by organizations regardless of type (corporation, NGO, municipality etc.) and value chain size. The method consists of six different steps which, when completed, will produce a situation map of the organization's key SR...

  1. Episode-Based Evolution Pattern Analysis of Haze Pollution: Method Development and Results from Beijing, China.

    Science.gov (United States)

    Zheng, Guangjie; Duan, Fengkui; Ma, Yongliang; Zhang, Qiang; Huang, Tao; Kimoto, Takashi; Cheng, Yafang; Su, Hang; He, Kebin

    2016-05-01

    Haze episodes occurred repeatedly in Beijing in 2013, resulting in 189 polluted days. These episodes differed in terms of sources, formation processes, and chemical composition, and thus required different control policies. Therefore, an overview of the similarities and differences among these episodes is needed. For this purpose, we conducted one-year online observations and developed a program that can simultaneously divide haze episodes and identify their shapes. A total of 73 episodes were identified, and their shapes were linked with synoptic conditions. Pure-haze events dominated in wintertime, whereas mixed haze-dust (marked by the PM2.5/PM10 ratio) and haze-fog (Aerosol Water/PM2.5 ∼ 0.3) events dominated in spring and in summer-autumn, respectively. For all types, the increase in the PM2.5/PM10 ratio was typically complete before PM2.5 reached ∼150 μg/m(3). Of all PM2.5 species observed, organic matter (OM) was always the most abundant component (18-60%), but it was rarely the driving factor: its relative contribution usually decreased as the pollution level increased. The only OM-driven episode observed was associated with intensive biomass-burning activities. In comparison, haze evolution generally coincided with increasing sulfur and nitrogen oxidation ratios (SOR and NOR), indicating the enhanced production of secondary inorganic species. The applicability of these conclusions requires further tests with simultaneous multisite observations.
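
    The sulfur and nitrogen oxidation ratios (SOR, NOR) referred to above are molar fractions of the oxidized product, e.g. SOR = n(SO4)/(n(SO4) + n(SO2)); a direct computation from mass concentrations can be sketched as follows (the molar-mass table is the only input I add):

```python
# Molar masses in g/mol (standard values)
MOLAR_MASS = {"SO4": 96.06, "SO2": 64.07, "NO3": 62.00, "NO2": 46.01}

def _moles(ug_per_m3, species):
    """Convert a mass concentration to a molar concentration."""
    return ug_per_m3 / MOLAR_MASS[species]

def sor(so4_ug, so2_ug):
    """Sulfur oxidation ratio: n(SO4^2-) / (n(SO4^2-) + n(SO2))."""
    return _moles(so4_ug, "SO4") / (_moles(so4_ug, "SO4") + _moles(so2_ug, "SO2"))

def nor(no3_ug, no2_ug):
    """Nitrogen oxidation ratio: n(NO3^-) / (n(NO3^-) + n(NO2))."""
    return _moles(no3_ug, "NO3") / (_moles(no3_ug, "NO3") + _moles(no2_ug, "NO2"))
```

Rising SOR/NOR values during an episode indicate enhanced secondary conversion of the gaseous precursors.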

  2. Development of neutron multiplication analysis method for a subcritical system by reaction rate distribution measurement

    International Nuclear Information System (INIS)

    Basic experiments for ADSR are performed in KUCA to study the nuclear characteristics for establishing a new neutron source for research. Usually, nuclear reactors are operated in a critical state. Even when they are operated in a subcritical state, they are very close to the critical state, and there is no problem in using the effective multiplication factor keff, obtained by solving the homogeneous neutron balance equation without an external source, to express the subcriticality. However, ADSRs are operated in a subcritical state, and experiments fairly far from the critical state may be performed to investigate their nuclear properties. In subcritical systems, the neutron flux distribution produced by an external source depends on the energy and position of the external source, and hence the multiplication rate of fission neutrons and the effectiveness of the external source should depend on the position of the external source. The effective multiplication factor keff cannot take this effect into account. For a subcritical system, the neutron multiplication, defined as the ratio of the total neutrons produced in the system by either fission or the external source to those produced by the external source only, can be a good measure of the efficiency of the system in producing neutrons with a specific spectrum, which is one of the final goals of the 'Neutron Factory' project. Unlike the theoretical neutron multiplication definition based on the one-point reactor approximation, which depends only on the subcriticality of the system, the method considered in this study takes into account the effect of the neutron source position and energy, which plays an important role in the level of neutron multiplication for a given subcritical system.
    In this study, the value of neutron multiplication will be evaluated by utilizing the reaction rate distribution of a KUCA A-core experiment which is analyzed in a subcritical system combined with
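
    The contrast drawn above between the point-approximation multiplication and the source-dependent definition can be written out directly (a sketch of the definitions only, with illustrative numbers; the study itself evaluates the fission-neutron total from measured reaction rate distributions):

```python
def multiplication_point(keff):
    """One-point-approximation multiplication, M = 1 / (1 - k_eff):
    depends only on the subcriticality, not on the source position."""
    return 1.0 / (1.0 - keff)

def multiplication_measured(fission_neutrons, source_neutrons):
    """Source-dependent definition: total neutrons produced in the system
    (fission plus external source) divided by external-source neutrons;
    evaluated from reaction rates, this reflects source position and energy."""
    return (fission_neutrons + source_neutrons) / source_neutrons
```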

  3. Development of synthetic velocity - depth damage curves using a Weighted Monte Carlo method and Logistic Regression analysis

    Science.gov (United States)

    Vozinaki, Anthi Eirini K.; Karatzas, George P.; Sibetheros, Ioannis A.; Varouchakis, Emmanouil A.

    2014-05-01

    Damage curves are the most significant component of the flood loss estimation models. Their development is quite complex. Two types of damage curves exist, historical and synthetic curves. Historical curves are developed from historical loss data from actual flood events. However, due to the scarcity of historical data, synthetic damage curves can be alternatively developed. Synthetic curves rely on the analysis of expected damage under certain hypothetical flooding conditions. A synthetic approach was developed and presented in this work for the development of damage curves, which are subsequently used as the basic input to a flood loss estimation model. A questionnaire-based survey took place among practicing and research agronomists, in order to generate rural loss data based on the responders' loss estimates, for several flood condition scenarios. In addition, a similar questionnaire-based survey took place among building experts, i.e. civil engineers and architects, in order to generate loss data for the urban sector. By answering the questionnaire, the experts were in essence expressing their opinion on how damage to various crop types or building types is related to a range of values of flood inundation parameters, such as floodwater depth and velocity. However, the loss data compiled from the completed questionnaires were not sufficient for the construction of workable damage curves; to overcome this problem, a Weighted Monte Carlo method was implemented, in order to generate extra synthetic datasets with statistical properties identical to those of the questionnaire-based data. The data generated by the Weighted Monte Carlo method were processed via Logistic Regression techniques in order to develop accurate logistic damage curves for the rural and the urban sectors. A Python-based code was developed, which combines the Weighted Monte Carlo method and the Logistic Regression analysis into a single code (WMCLR Python code). Each WMCLR code execution
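
    The WMCLR pipeline, weighted Monte Carlo resampling of the questionnaire records followed by a logistic fit of damage against an inundation parameter, can be sketched in one dimension as follows (a toy version: the actual curves combine floodwater depth and velocity, and the records and weights here are assumptions, not survey data):

```python
import math
import random

def weighted_resample(records, weights, k, seed=0):
    """Weighted Monte Carlo: draw k synthetic (depth, damaged) records
    with probability proportional to the expert-assigned weights."""
    rng = random.Random(seed)
    return rng.choices(records, weights=weights, k=k)

def fit_logistic(xs, ys, lr=0.5, steps=3000):
    """Fit P(damage | depth) = 1 / (1 + exp(-(a + b*x))) by batch
    gradient ascent on the log-likelihood."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += y - p
            gb += (y - p) * x
        a += lr * ga / n
        b += lr * gb / n
    return a, b
```

The fitted curve then serves as the damage function fed into the loss estimation model.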

  4. Digital representation of meso-geomaterial spatial distribution and associated numerical analysis of geomechanics:methods,applications and developments

    Institute of Scientific and Technical Information of China (English)

    YUE Zhongqi

    2007-01-01

    This paper presents the author's efforts in the past decade to establish a practical approach for digital representation of the geomaterial distribution of different minerals, particles, and components in the meso-scale range (0.1 to 500 mm). The primary goal of the approach is to provide a possible solution to the two intrinsic problems associated with the current mainstream methods for geomechanics. The problems are that (1) the constitutive models and parameters of soils and rocks cannot be given accurately in geomechanical prediction; and (2) there are numerous constitutive models of soils and rocks in the literature. These problems are possibly caused by the homogenization or averaging method used in analyzing laboratory test results for establishing the constitutive models and parameters. The averaging method employs the assumption that the test samples can be represented by a homogeneous medium. Such an averaging method ignores the fact that geomaterial samples also consist of a number of materials and components whose properties may differ significantly. In the proposed approach, digital image processing methods are used as measurement tools to construct a digital representation of the actual spatial distribution of the different materials and components in geomaterial samples. The digital data are further processed to automatically generate meshes or grids for numerical analysis. These meshes or grids can be easily incorporated into existing numerical software packages for further mechanical analysis and failure prediction of the geomaterials under external loading. The paper presents case studies to illustrate the proposed approach. Further discussions are also made on how to use the proposed approach to develop geomechanics by taking into account geomaterial behavior at the micro-scale, meso-scale and macro-scale levels. A literature review of the related developments is given by examining the SCI papers in the database of Science Citation
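
    The step from a segmented meso-scale image to an analysis grid, where each pixel becomes one element tagged with its material, can be sketched minimally as follows (a structured-grid toy of my own; in practice the per-material element sets would be exported to a numerical package):

```python
def image_to_elements(labels):
    """Map a segmented meso-scale image (a 2-D list of material IDs) onto a
    structured grid: each pixel becomes one element, grouped by material so
    that each group can carry its own constitutive parameters."""
    elems = {}
    for i, row in enumerate(labels):
        for j, mat in enumerate(row):
            elems.setdefault(mat, []).append((i, j))
    return elems
```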

  5. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems: Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well establish

  6. Generic Analysis Methods for Gas Turbine Engine Performance: The development of the gas turbine simulation program GSP

    NARCIS (Netherlands)

    Visser, W.P.J.

    2015-01-01

    Numerical modelling and simulation have played a critical role in the research and development towards today’s powerful and efficient gas turbine engines for both aviation and power generation. The simultaneous progress in modelling methods, numerical methods, software development tools and methods,

  7. Development of partitioning method

    International Nuclear Information System (INIS)

    A partitioning method has been developed under the concepts of separating radioactive nuclides from a high-level waste according to their half-lives and radioactive toxicity, and of disposing of the waste safely. A partitioning test using about 18 liters (∼220 Ci) of the fuel reprocessing waste prepared at PNC was started in October 1982. In this test the behavior of radioactive nuclides was clarified. The present paper describes the chemical behavior of non-radioactive elements contained in the high-level liquid waste during extraction with di-isodecyl phosphoric acid (DIDPA). Distribution ratios of most metal ions for DIDPA were less than 0.05, except those of Mo, Zr and Fe, which were higher than 7. Ferric ion could not be back-extracted with 4 M HNO3, but could with 0.5 M (COOH)2. In the extraction with DIDPA, a third phase, which causes clogging of the settling banks or the flow paths in a mixer-settler, was formed when the ferric ion concentration was over 0.02 M. This unfavorable phenomenon, however, was found to be suppressed by diluting the ferric ion concentration to lower than 0.01 M or by reducing ferric ion to ferrous ion. (author)
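
    The distribution ratios quoted above follow the standard solvent-extraction definitions, which can be written out as (generic textbook formulas, not code from the study):

```python
def distribution_ratio(conc_org, conc_aq):
    """D = [M]_org / [M]_aq after the two phases are equilibrated."""
    return conc_org / conc_aq

def extracted_fraction(D, phase_ratio=1.0):
    """Fraction of metal extracted in one contact:
    E = D*R / (1 + D*R), with R = V_org / V_aq."""
    return D * phase_ratio / (1.0 + D * phase_ratio)
```

So a species with D = 7 is about 87% extracted in a single equal-volume contact, while one with D = 0.05 largely stays in the aqueous phase.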

  8. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lim, J.M.; Ji, Y.Y.; Lee, H.; Park, J.H.; Jang, M.; Chung, K.H.; Kang, M.J.; Choi, G.S. [Korea Atomic Energy Research Institute (Korea, Republic of)

    2014-07-01

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment. The concern with radioactivity from these materials has therefore been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide on the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., {sup 238}U, {sup 235}U, {sup 232}Th, {sup 226}Ra, and {sup 40}K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of {sup 238}U, {sup 235}U, {sup 232}Th, and {sup 226}Ra. In contrast to the α-spectrometry method, a measurement technique using ICP-MS allows radioactivity in many samples to be measured in a short time period with a high degree of accuracy and precision. In this study, a method was developed for a rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO{sub 2} fusion and Fe co-precipitation. A magnetic

  9. Method Development for Rapid Analysis of Natural Radioactive Nuclides Using Sector Field Inductively Coupled Plasma Mass Spectrometry

    International Nuclear Information System (INIS)

    As an attempt to reduce the social costs and apprehension arising from radioactivity in the environment, an accurate and rapid assessment of radioactivity is highly desirable. Naturally occurring radioactive materials (NORM) are widely spread throughout the environment. The concern with radioactivity from these materials has therefore been growing for the last decade. In particular, radiation exposure in industries handling raw materials (e.g., coal mining and combustion, oil and gas production, metal mining and smelting, mineral sands (REE, Ti, Zr), fertilizer (phosphate), and building materials) has been brought to the public's attention. To decide on the proper handling options, a rapid and accurate analytical method that can be used to evaluate the radioactivity of radionuclides (e.g., 238U, 235U, 232Th, 226Ra, and 40K) should be developed and validated. Direct measuring methods such as alpha spectrometry, liquid scintillation counting (LSC), and mass spectrometry are usually used for the measurement of radioactivity in NORM samples, and they encounter the most significant difficulties during pretreatment (e.g., purification, speciation, and dilution/enrichment). Since the pretreatment process plays an important role in the measurement uncertainty, method development and validation should be performed. Furthermore, α-spectrometry has the major disadvantage of a long counting time, although it has a prominent measurement capability at very low activity levels of 238U, 235U, 232Th, and 226Ra. In contrast to the α-spectrometry method, a measurement technique using ICP-MS allows radioactivity in many samples to be measured in a short time period with a high degree of accuracy and precision. In this study, a method was developed for a rapid analysis of natural radioactive nuclides using ICP-MS. A sample digestion process was established using LiBO2 fusion and Fe co-precipitation. A magnetic sector field ICP-MS (SPECTRO MS) was used for a rapid
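
    Converting an ICP-MS mass measurement of a long-lived nuclide into activity uses A = λN; the specific activity per gram can be sketched as follows (half-lives are standard nuclear-data values; this generic conversion is my illustration, not the study's procedure):

```python
import math

AVOGADRO = 6.02214076e23          # atoms per mole
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Half-lives in years (standard nuclear-data values)
HALF_LIFE_Y = {"U-238": 4.468e9, "U-235": 7.04e8, "Th-232": 1.405e10}

def specific_activity(nuclide, molar_mass):
    """Specific activity in Bq/g from A = lambda * N, where
    lambda = ln(2) / T_half and N is the number of atoms per gram."""
    lam = math.log(2.0) / (HALF_LIFE_Y[nuclide] * SECONDS_PER_YEAR)
    return lam * AVOGADRO / molar_mass
```

Multiplying by the mass concentration measured with ICP-MS then gives the activity concentration of the sample.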

  10. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  11. A bottom-up method for module-based product platform development through mapping, clustering and matching analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Meng; LI Guo-xi; CAO Jian-ping; GONG Jing-zhong; WU Bao-zhong

    2016-01-01

    Designing a product platform can be an effective and efficient solution for manufacturing firms. Product platforms enable firms to provide increased product variety for the marketplace with as little variety between products as possible. Consumer products and modules already developed within a firm can be further investigated to find out the possibility of product platform creation. A bottom-up method is proposed for module-based product platform development through mapping, clustering and matching analysis. The framework and the parametric model of the method are presented, which consist of three steps: (1) mapping parameters from existing product families to functional modules, (2) clustering the modules within existing module families based on their parameters so as to generate module clusters, and selecting the satisfactory module clusters based on commonality, and (3) matching the parameters of the module clusters to the functional modules in order to capture platform elements. In addition, a parameter matching criterion and a mismatching treatment are put forward to ensure the effectiveness of the platform process, while standardization and serialization of the platform elements are presented. A design case of a belt conveyor is studied to demonstrate the feasibility of the proposed method.

  12. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one o
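The contrast between the covariance and Monte Carlo methods that COVAD supports can be illustrated on a toy linear system (this sketch is not taken from the tool itself; the gains and variances are invented for illustration). For a linear system the covariance method propagates the output variance exactly, while Monte Carlo recovers the same number by sampling:

```python
import random
import statistics

# Toy linear system y = a*x + b*w (x: zero-mean initial state, w: zero-mean noise)
a, b = 0.9, 0.4
var_x, var_w = 1.0, 0.25

# Covariance method: exact output variance for a linear system
var_y_cov = a * a * var_x + b * b * var_w   # = 0.85

# Monte Carlo method: estimate the same variance by sampling
random.seed(1)
samples = [a * random.gauss(0, var_x ** 0.5) + b * random.gauss(0, var_w ** 0.5)
           for _ in range(20000)]
var_y_mc = statistics.pvariance(samples)

print(round(var_y_cov, 3))   # 0.85
print(abs(var_y_mc - var_y_cov) < 0.05)
```

The covariance result is exact and cheap; the Monte Carlo estimate converges to it at the usual 1/sqrt(N) rate, which is why covariance/adjoint analysis is attractive when the system is (approximately) linear.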

  13. Development and application of a validated HPLC method for the analysis of dissolution samples of mexiletine hydrochloride capsules

    Directory of Open Access Journals (Sweden)

ZORAN B. TODOROVIĆ

    2010-07-01

The aim of this work was to develop and validate a simple, efficient, sensitive and selective HPLC method for the analysis of dissolution samples of mexiletine hydrochloride capsules, without the need for any time-consuming extraction, dilution or evaporation steps prior to drug assay. Separation was performed isocratically on a 5 µm LiChrospher 60 RP-Select B column (250 × 4 mm ID) using a buffer–acetonitrile mobile phase (60:42, v/v) at a flow rate of 1.2 mL min-1 and UV detection at 262 nm. Elution occurred in less than 10 min. The assay was linear in the concentration range 50–300 µg mL-1 (r² = 0.9998). The validation characteristics included accuracy, precision, linearity, specificity, limits of detection and quantification, stability, and robustness. Validation acceptance criteria were met in all cases (percent recoveries ranged between 100.01 and 101.68 %, RSD < 0.44 %). The method can be used for the determination of mexiletine hydrochloride and for monitoring its concentration in in vitro dissolution studies.
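The linearity check reported above (r² over a calibration range) is a small least-squares computation. The sketch below uses hypothetical peak areas, not data from the paper, and shows how a calibration slope, intercept, and coefficient of determination would be evaluated against an acceptance criterion:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares line fit returning slope, intercept and r^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical detector responses for standards across 50-300 µg/mL
conc = [50, 100, 150, 200, 250, 300]
area = [102.1, 205.0, 306.8, 410.2, 512.4, 615.9]
slope, intercept, r2 = linear_fit(conc, area)
print(r2 > 0.999)   # linearity criterion of the same order as the reported r²
```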

  14. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Directory of Open Access Journals (Sweden)

    H. Apel

    2015-08-01

Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as example. In this tropical environment the annual monsoon triggered floods of the Mekong River can coincide with heavy local convective precipitation events causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median)

  15. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Science.gov (United States)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as example. In this tropical environment the annual monsoon triggered floods of the Mekong River can coincide with heavy local convective precipitation events causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. 
This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
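The peak-over-threshold frequency estimation mentioned for the pluvial hazard can be sketched with a simple parametric model. This is an illustration under stated assumptions, not the authors' implementation: exceedances above the threshold are modelled as exponentially distributed (a shape-zero special case of the generalized Pareto family), the annual exceedance count as Poisson, and the rainfall totals are invented:

```python
import math

def pot_return_level(exceedances_mm, threshold_mm, years, T):
    """Peak-over-threshold sketch: returns the T-year rainfall depth,
    assuming exponential excesses and a Poisson exceedance rate."""
    excess = [x - threshold_mm for x in exceedances_mm]
    scale = sum(excess) / len(excess)        # MLE of the exponential scale
    rate = len(exceedances_mm) / years       # mean exceedances per year
    return threshold_mm + scale * math.log(rate * T)

# Hypothetical daily rain totals (mm) exceeding a 50 mm threshold in 10 years
events = [55, 58, 62, 66, 71, 80, 95, 104]
level_100yr = pot_return_level(events, threshold_mm=50.0, years=10.0, T=100)
print(round(level_100yr, 1))   # 154.6
```

A stochastic storm generator would then sample events consistent with this fitted frequency model to drive the 2-D inundation runs.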

  16. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education

    Science.gov (United States)

    Wetzel, Angela Payne

    2011-01-01

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  17. Forces in bolted joints: analysis methods and test results utilized for nuclear core applications (LWBR Development Program)

    Energy Technology Data Exchange (ETDEWEB)

    Crescimanno, P.J.; Keller, K.L.

    1981-03-01

    Analytical methods and test data employed in the core design of bolted joints for the LWBR core are presented. The effects of external working loads, thermal expansion, and material stress relaxation are considered in the formulation developed to analyze joint performance. Extensions of these methods are also provided for bolted joints having both axial and bending flexibilities, and for the effect of plastic deformation on internal forces developed in a bolted joint. Design applications are illustrated by examples.
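The classical stiffness-based load partition that underlies this kind of bolted-joint formulation can be shown in a few lines. The numbers below are hypothetical and the sketch ignores the thermal-expansion and stress-relaxation terms the report adds; it only illustrates how an external working load divides between bolt and clamped members:

```python
def joint_forces(preload, f_ext, k_bolt, k_member):
    """Stiffness-based load partition for a preloaded joint: the bolt
    carries only the fraction C = k_b/(k_b + k_m) of the external load."""
    C = k_bolt / (k_bolt + k_member)
    bolt_force = preload + C * f_ext
    clamp_force = preload - (1.0 - C) * f_ext
    return bolt_force, clamp_force, clamp_force <= 0.0  # True -> joint separation

# Hypothetical joint: 20 kN preload, bolt one third as stiff as the members
fb, fc, separated = joint_forces(preload=20e3, f_ext=12e3,
                                 k_bolt=200e6, k_member=600e6)
print(round(fb), round(fc), separated)   # 23000 11000 False
```

Because the members are stiffer than the bolt, most of the external load relieves the clamp rather than stretching the bolt, which is why preloaded joints tolerate working loads well until separation.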

  18. Forces in bolted joints: analysis methods and test results utilized for nuclear core applications (LWBR Development Program)

    International Nuclear Information System (INIS)

    Analytical methods and test data employed in the core design of bolted joints for the LWBR core are presented. The effects of external working loads, thermal expansion, and material stress relaxation are considered in the formulation developed to analyze joint performance. Extensions of these methods are also provided for bolted joints having both axial and bending flexibilities, and for the effect of plastic deformation on internal forces developed in a bolted joint. Design applications are illustrated by examples

  19. DEVELOPMENT OF METHOD OF QUALITATIVE ANALYSIS OF BIRD CHERRY FRUIT FOR INCLUSION IN THE MONOGRAPH OF STATE PHARMACOPOEIA OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Lenchyk L.V.

    2016-06-01

Introduction. Bird cherry, Padus avium Mill. (Rosaceae), is widespread in Ukraine, especially in forest and forest-steppe areas. Bird cherry fruits have long been used in medicine and are a valuable medicinal raw material. They are stated to possess astringent, anti-inflammatory and phytoncidal properties. Bird cherry fruits are included in the USSR Pharmacopoeia IX ed., the State Pharmacopoeia of the Russian Federation, and the State Pharmacopoeia of the Republic of Belarus. In Ukraine there are no contemporary normative documents for this medicinal plant material; it is therefore relevant to develop the draft national monographs "dry bird cherry fruit" and "fresh bird cherry fruit" for inclusion in the State Pharmacopoeia of Ukraine. According to European Pharmacopoeia recommendations, thin-layer chromatography (TLC) is prescribed only for the identification of the herbal drug. The principles of thin-layer chromatography and the application of the technique in pharmaceutical analysis are described in the State Pharmacopoeia of Ukraine. As it is effective and easy to perform, and the required equipment is inexpensive, the technique is frequently used for evaluating medicinal plant materials and their preparations. The TLC is aimed at elucidating the chromatogram of the drug with respect to selected reference compounds that are described for inclusion as reagents. The aim of this study was to develop methods of qualitative analysis of bird cherry fruits for a monograph in the State Pharmacopoeia of Ukraine (SPU). Materials and Methods. The objects of our study were dried bird cherry fruits (7 samples) and fresh bird cherry fruits (7 samples) harvested in 2013-2015 in the Kharkiv, Poltava, Luhansk, Sumy, Lviv and Mykolaiv regions and the city of Mariupol. Samples were registered in the SPU department of the State Enterprise "Pharmacopeia center". In accordance with the Ph. Eur. and SPU requirements, determination in "identification C" was performed by TLC. TLC was performed on

  20. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

Computational methods for global/local analysis of structures, including both uncoupled and coupled methods, are described. In addition, a global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  1. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Opp, Daniel; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2014-06-15

Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion—previously demonstrated for VMAT—is studied for static gantry IMRT. The method's accuracy is improved for the voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region-of-interest is propagated according to the predefined motion kernel throughout time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference in dose-to-moving voxel vs dose-to-static voxel in-phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low dose voxels into high dose regions, causing appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameters optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with the gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period
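The core bookkeeping of the perturbation approach (accumulating dose to a voxel as it moves through time-resolved dose grids, then differencing against the static case) can be sketched in one dimension. Everything here is a hypothetical toy: the grids, trajectory, and units are invented, and the real method works on 3D grids with a measured motion kernel:

```python
def moving_voxel_dose(dose_grids, start_idx, trajectory):
    """Accumulate dose to a voxel that moves through time-resolved 1-D
    dose grids; trajectory[t] is the voxel's index offset at time step t."""
    total = 0.0
    for grid, offset in zip(dose_grids, trajectory):
        idx = min(max(start_idx + offset, 0), len(grid) - 1)  # clamp to grid
        total += grid[idx]
    return total

# Hypothetical dose-per-time-step profiles (arbitrary units), static field
grids = [[0.0, 1.0, 4.0, 1.0, 0.0]] * 4            # 4 time steps
static = moving_voxel_dose(grids, 0, [0, 0, 0, 0])  # low-dose voxel, no motion
moving = moving_voxel_dose(grids, 0, [0, 1, 2, 1])  # excursion into high dose
perturbation = moving - static   # correction applied to the patient dose voxel
print(static, moving, perturbation)   # 0.0 6.0 6.0
```

The example also shows why the LDVE refinement matters: a voxel with essentially zero planned dose picks up substantial dose purely through its excursion, so the perturbation term dominates its final estimate.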

  2. The Development of a SPME-GC/MS Method for the Analysis of VOC Emissions from Historic Plastic and Rubber Materials

    OpenAIRE

    Curran, K.; Underhill, M.; Gibson, L. T.; Strlic, M.

    2015-01-01

    Analytical methods have been developed for the analysis of VOC emissions from historic plastic and rubber materials using SPME-GC/MS. Parameters such as analysis temperature, sampling time and choice of SPME fibre coating were investigated and sampling preparation strategies explored, including headspace sampling in vials and in gas sampling bags. The repeatability of the method was evaluated. It was found that a 7 d accumulation time at room temperature, followed by sampling using a DVB/CAR/...

  3. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons

  4. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions. PMID:18976857
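One of the multivariate tools named above, correlation analysis, can be illustrated directly: metals that co-vary across sites suggest a common (often anthropogenic) source, while a metal uncorrelated with the others points to parent material. The concentrations below are hypothetical, not the Shanghai data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical topsoil concentrations (mg/kg) at five sampling sites
metals = {
    "Cu": [28, 35, 22, 40, 31],
    "Pb": [30, 38, 24, 44, 33],   # tracks Cu -> suggests a shared source
    "Cr": [61, 59, 60, 62, 58],   # varies independently -> parent material
}
print(round(pearson(metals["Cu"], metals["Pb"]), 2))
print(round(pearson(metals["Cu"], metals["Cr"]), 2))
```

In a full source-apportionment study this pairwise screening is complemented by principal component analysis and clustering, which generalize the same covariance information to all metals at once.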

  6. Development of an improved method for quantitative analysis of skin blotting: Increasing reliability and applicability for skin assessment

    OpenAIRE

    Ogai, Kazuhiro; Matsumoto, Masaru; Minematsu, T; Kitamura, Keiichiro; Kobayashi, M.; Sugama, Junko; Sanada, Hiromi

    2015-01-01

Objective: A novel skin assessment tool named 'skin blotting' has been recently developed, which can easily predict skin status to avoid its deterioration. The aim of this study was to propose a normalization method for skin blotting to compensate for individual differences that can hamper quantitative comparisons and clinical applications. Methods: To normalize individual differences, we utilized total protein as a 'normalizer' with calibration curves. For evaluation, we performed a ...

  7. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification

  8. Method development for liquid chromatographic/triple quadrupole mass spectrometric analysis of trace level perfluorocarboxylic acids in articles of commerce

    Science.gov (United States)

    An analytical method to identify and quantify trace levels of C5 to C12 perfluorocarboxylic acids (PFCAs) in articles of commerce (AOC) is developed and rigorously validated. Solid samples were extracted in methanol, and liquid samples were diluted with a solvent consisting of 60...

  9. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

Through the developed risk analysis model it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  10. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  12. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we will analyze the neutronics characteristics of the SMART core using the MCNAP code and compare these results with results of the MASTER code. We improved the parallel computing module and developed the error analysis module of the MCNAP code. We analyzed the mechanism of error propagation through depletion computation and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled a 3-D structure of the SMART core, considered the variation of material compositions due to control rod operation, and performed depletion analysis for the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed core reactivity coefficients (MTC and FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  13. Guidelines for Analysis of Health Sector Financing in Developing Countries. Volume 8: Health Sector Financing in Developing Countries. International Health Planning Methods Series.

    Science.gov (United States)

    Robertson, Robert L.; And Others

    Intended to assist Agency for International Development officers, advisors, and health officials in incorporating health planning into national plans for economic development, this eighth of ten manuals in the International Health Planning Methods series provides a methodology for conducting a study of health sector financing. It presents an…

  14. THE DEVELOPMENT OF METHOD FOR MINT AND TURMERIC ESSENTIAL OILS IDENTIFICATION AND QUANTITATIVE ANALYSIS IN COMPLEX DRUG

    OpenAIRE

    Smalyuh, O. G.

    2015-01-01

Introduction. The aim of our study was to develop a method for the identification and assay of essential oils of mint and turmeric in a complex medicinal product in capsule form. Materials and method. The paper used samples of turmeric and mint essential oils and a complex drug, in the form of capsules containing oil of peppermint, oil of Curcuma longa, and a mixture of extracts of sandy everlasting (Helichrysum arenarium (L.) Moench), marigold (Calendula officinalis L.), wild carrot (Daucus carota) and...

  15. Development and validation of an alternative to conventional pretreatment methods for residue analysis of butachlor in water, soil, and rice.

    Science.gov (United States)

    Xue, Jiaying; Jiang, Wenqing; Liu, Fengmao; Zhao, Huiyu; Wang, Suli; Peng, Wei

    2014-01-01

A rapid and effective alternative analytical method for residues of butachlor in water, soil, and rice was established. The operating variables affecting performance of this method, including different extraction conditions and cleanup adsorbents, were evaluated. The determination of butachlor residues in soil, straw, rice hull, and husked rice was performed using GC/MS after extraction with n-hexane and cleanup with graphite carbon black. The average recoveries ranged from 81.5 to 102.7%, with RSDs of 0.6-7.7% for all of the matrixes investigated. The limits of quantitation were 0.05 mg/kg in water and rice plant, and 0.01 mg/kg in soil, straw, rice hull, and husked rice. A comparison among the proposed method, conventional liquid-liquid extraction, the QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) method, and Soxhlet extraction indicated that this method was more suitable for analyzing butachlor in rice samples. Further validation of the proposed method was carried out by Soxhlet extraction for the determination of butachlor residues in the husked rice samples, and the residue results showed no obvious difference between the two methods. Samples from a rice field were found to contain butachlor residues below the maximum residue limits set by China (0.5 mg/kg) and Japan (0.1 mg/kg). The proposed method has a strong potential for application in routine screening and processing of large numbers of samples. This study developed a more effective alternative to the conventional analytical methods for analyzing butachlor residues in various matrixes.
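The recovery and RSD figures quoted above are the standard validation statistics for a fortified sample. The sketch below shows how they are computed; the replicate values are hypothetical and chosen only to land inside a typical acceptance window:

```python
import statistics

def recovery_stats(spiked_mg_kg, measured_mg_kg):
    """Mean recovery (%) and relative standard deviation (%) from
    replicate measurements of a sample fortified at a known level."""
    recoveries = [100.0 * m / spiked_mg_kg for m in measured_mg_kg]
    mean = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean
    return mean, rsd

# Hypothetical replicates for a rice sample fortified at 0.05 mg/kg butachlor
mean_rec, rsd = recovery_stats(0.05, [0.047, 0.049, 0.048, 0.050, 0.046])
print(round(mean_rec, 1), round(rsd, 1))   # 96.0 3.3
```

Values like these (recovery near 100 %, RSD of a few percent) are what the reported 81.5-102.7 % recovery and 0.6-7.7 % RSD ranges summarize across all matrixes and fortification levels.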

  16. Development of a Direct Headspace Collection Method from Arabidopsis Seedlings Using HS-SPME-GC-TOF-MS Analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Saito

    2013-04-01

Plants produce various volatile organic compounds (VOCs), which are thought to be a crucial factor in their interactions with harmful insects, plants and animals. The composition of VOCs may differ when plants are grown under different nutrient conditions, i.e., macronutrient-deficient conditions. However, in plants, relationships between macronutrient assimilation and VOC composition remain unclear. In order to identify the kinds of VOCs that can be emitted when plants are grown under various environmental conditions, we established a conventional method for VOC profiling in Arabidopsis thaliana (Arabidopsis) involving headspace-solid-phase microextraction-gas chromatography-time-of-flight-mass spectrometry (HS-SPME-GC-TOF-MS). We grew Arabidopsis seedlings in an HS vial to directly perform HS analysis. To maximize the analytical performance of VOCs, we optimized the extraction method and the analytical conditions of HS-SPME-GC-TOF-MS. Using the optimized method, we conducted VOC profiling of Arabidopsis seedlings, which were grown under two different nutrition conditions, nutrition-rich and nutrition-deficient conditions. The VOC profiles clearly showed a distinct pattern with respect to each condition. This study suggests that HS-SPME-GC-TOF-MS analysis has immense potential to detect changes in the levels of VOCs not only in Arabidopsis, but also in other plants grown under various environmental conditions.

  17. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward

    2011-01-01

    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required, and methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results, which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best-estimate behavior of the core appears to be acceptable.
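The oblique insertion treatment can be pictured with a small geometric sketch. This is not the actual PARCS routine; the angle, insertion length, and node heights below are entirely hypothetical, and the code only illustrates how an inclined rod's coverage of each axial node might be computed from geometry:

```python
import math

# Hypothetical sketch (not the PARCS routine): a rod inserted at angle
# theta from vertical travels `insertion` cm along its own axis, so it
# penetrates insertion*cos(theta) cm of core height. The fraction of
# each axial node covered would drive the cross-section weighting.
theta = math.radians(18.0)       # tilt from vertical (assumed value)
insertion = 120.0                # cm travelled along the rod (assumed)
node_height = 25.0               # cm per axial node (assumed)
n_nodes = 10

depth = insertion * math.cos(theta)   # vertical penetration, cm
fractions = []
for i in range(n_nodes):
    top = i * node_height                          # node lower boundary
    covered = min(max(depth - top, 0.0), node_height)
    fractions.append(covered / node_height)

print([round(f, 3) for f in fractions])
```

Fully traversed nodes get a fraction of 1.0, the node containing the rod tip gets a partial fraction, and deeper nodes get 0.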

  18. Analysis of Scientific and Methodical Approaches to Portfolio Investment as a Tool of Financial Provision of Sustainable Economic Development

    Directory of Open Access Journals (Sweden)

    Leus Daryna V.

    2013-12-01

    Full Text Available The article analyses scientific and methodical approaches to portfolio investment. It develops recommendations on specifying the categorical apparatus of portfolio investment in the context of the differentiation of strategic (direct) and portfolio investments as alternative approaches to the conduct of investment activity. It identifies the composition and functions of the objects and subjects of portfolio investment under conditions of globalisation of the world financial markets. It studies the main postulates of portfolio theory and justifies the need to identify the place, role and functions of the subjects of portfolio investment in ensuring sustainable development of the economy. As one avenue for the further development of portfolio theories, it proposes a separate direction in the financial provision of the economy that takes ecological and social components into consideration – socially responsible investment.

  19. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory

    2009-01-01

    One of the crucial steps in the second generation high temperature superconducting wire program was the development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting YBa2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and of the YBCO films. These results assisted in understanding a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  20. Electroporation-based methods for in vivo, whole mount and primary culture analysis of zebrafish brain development

    Directory of Open Access Journals (Sweden)

    Jesuthasan Suresh

    2007-03-01

    Full Text Available Abstract Background Electroporation is a technique for the introduction of nucleic acids and other macromolecules into cells. In chick embryos it has been a particularly powerful technique for the spatial and temporal control of gene expression in developmental studies. Electroporation methods have also been reported for Xenopus, zebrafish, and mouse. Results We present a new protocol for zebrafish brain electroporation. Using a simple set-up with fixed spaced electrodes and microinjection equipment, it is possible to electroporate 50 to 100 embryos in 1 hour with no lethality and consistently high levels of transgene expression in numerous cells. Transfected cells in the zebrafish brain are amenable to in vivo time lapse imaging. Explants containing transfected neurons can be cultured for in vitro analysis. We also present a simple enzymatic method to isolate whole brains from fixed zebrafish for immunocytochemistry. Conclusion Building on previously described methods, we have optimized several parameters to allow for highly efficient unilateral or bilateral transgenesis of a large number of cells in the zebrafish brain. This method is simple and provides consistently high levels of transgenesis for large numbers of embryos.

  1. Development of a sample preparation method for the analysis of current-use pesticides in sediment using gas chromatography.

    Science.gov (United States)

    Wang, Dongli; Weston, Donald P; Ding, Yuping; Lydy, Michael J

    2010-02-01

    Pyrethroid insecticides have been implicated as the cause of sediment toxicity to Hyalella azteca in both agricultural and urban areas of California; however, for a subset of these toxic sediments (approximately 30%), the cause of toxicity remains unidentified. This article describes the analytical method development for seven additional pesticides that are being examined to determine if they might play a role in the unexplained toxicity. A pressurized liquid extraction method was optimized to simultaneously extract diazinon, methyl parathion, oxyfluorfen, dicofol, fenpropathrin, pyraclostrobin, and indoxacarb from sediment, and the extracts were cleaned using a two-step solid-phase extraction procedure. The final extract was analyzed for the target pesticides by gas chromatography/nitrogen-phosphorus detector (GC/NPD), and gas chromatography/electron capture detector (GC/ECD), after sulfur was removed by shaking with copper and cold crystallization. Three sediments were used as reference matrices to assess method accuracy and precision. Method detection limits were 0.23-1.8 ng/g dry sediment using seven replicates of sediment spiked at 1.0 ng/g dry sediment. Recoveries ranged from 61.6 to 118% with relative standard deviations of 2.1-17% when spiked at 5.0 and 50 ng/g dry sediment. The three reference sediments, spiked with 50 ng/g dry weight of the pesticide mixture, were aged for 0.25, 1, 4, 7, and 14 days. Recoveries of the pesticides in the sediments generally decreased with increased aging time, but the magnitude of the decline was pesticide and sediment dependent. The developed method was applied to field-collected sediments from the Central Valley of California. PMID:19798461
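The method detection limits quoted above follow the standard replicate-spike calculation: MDL = Student's t (one-tailed, 99% confidence, n-1 degrees of freedom) times the standard deviation of n = 7 spiked replicates. A minimal sketch with made-up replicate values, not the paper's data:

```python
import statistics

# Hypothetical recoveries (ng/g) from seven sediment replicates spiked
# at 1.0 ng/g dry weight -- illustrative values only.
replicates = [0.94, 1.08, 1.01, 0.89, 1.05, 0.97, 1.02]

# EPA-style MDL: t(0.99, 6 df) = 3.143 for seven replicates.
T_99_6DF = 3.143
s = statistics.stdev(replicates)
mdl = T_99_6DF * s
print(f"std dev = {s:.4f} ng/g, MDL = {mdl:.2f} ng/g")
```

With these invented values the MDL comes out around 0.21 ng/g, the same order as the 0.23-1.8 ng/g range reported.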

  2. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    The IAEA defined safety culture as follows: 'Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance.' The behavioral scientist Cooper defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis', with his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. We then modeled the new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria for obtaining the nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full scope main control room simulator of an NPP in Korea.

  3. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    The IAEA defined safety culture as follows: 'Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance.' The behavioral scientist Cooper defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis', with his internal psychological, situational, and behavioral context model. With these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. We then modeled the new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria for obtaining the nominal success probabilities of the assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full scope main control room simulator of an NPP in Korea.
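The "level 1 PSA concept" can be illustrated with a toy calculation: treat each assessment item as a basic event with a nominal success probability and combine the items in series, the way an event-tree branch multiplies probabilities. The item names and probabilities below are entirely hypothetical, not the authors' actual model:

```python
# Toy sketch of a PSA-style aggregation. All item names and nominal
# success probabilities are invented for illustration.
items = {
    "procedure_adherence": 0.98,
    "peer_checking": 0.95,
    "conservative_decision_making": 0.97,
    "open_communication": 0.96,
}

# Series logic: the team-level evaluation "succeeds" only if every
# assessment item succeeds, so successes multiply.
p_success = 1.0
for name, p in items.items():
    p_success *= p
p_failure = 1.0 - p_success

print(f"team success probability = {p_success:.4f}")
print(f"team failure probability = {p_failure:.4f}")
```

A real level 1 PSA model would use event/fault trees with minimal cut sets rather than a plain product, but the multiplicative branch logic is the same idea.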

  4. Method development and validation for the analysis of a new anti-cancer infusion solution via HPLC.

    Science.gov (United States)

    Donnarumma, Fabrizio; Schober, Margot; Greilberger, Joachim; Matzi, Veronika; Lindenmann, Jörg; Maier, Alfred; Herwig, Ralf; Wintersteiger, Reinhold

    2011-01-01

    A fast and simple HPLC method has been developed and validated for the quantification of a completely new anti-cancer drug during the manufacturing process. The combination of four compounds, including α-ketoglutaric acid, hydroxymethylfurfural, N-acetyl-L-methionine and N-acetyl-L-selenomethionine, administered intravenously, is still in the test phase but has already shown promising results in cancer therapy. HPLC separation was achieved on an RP-18 column with a gradient system. However, the highly different concentrations of the compounds required a variation in the detection wavelength within one run. In order to produce a chromatogram in which peaks were comparable on a similar range scale, detection at the absorption maxima of the two most concentrated components was avoided. After optimization of the gradient program it was possible to detect all four substances within 14 min in spite of their strongly different chemical structures. The method developed was validated for accuracy, repeatability, reproducibility and robustness in relation to temperature and pH of the buffer. Linearity as well as the limits of detection and quantification were determined. This HPLC method was found to be precise, accurate and reproducible and can easily be used for in-line process control during manufacture of the anti-tumour infusion solution. PMID:21246718

  5. Development of CAD based on ANN analysis of power spectra for pneumoconiosis in chest radiographs: effect of three new enhancement methods

    OpenAIRE

    Okumura, Eiichiro; Kawashita, Ikuo; Ishida, Takayuki

    2014-01-01

    We have been developing a computer-aided detection (CAD) scheme for pneumoconiosis based on a rule-based plus artificial neural network (ANN) analysis of power spectra. In this study, we have developed three enhancement methods for the abnormal patterns to reduce false-positive and false-negative values. The image database consisted of 2 normal and 15 abnormal chest radiographs. The International Labour Organization standard chest radiographs with pneumoconiosis were categorized as subcategor...

  6. Development and validation of a cleanup method for hydrocarbon containing samples for the analysis of semivolatile organic compounds

    International Nuclear Information System (INIS)

    Samples obtained from the Hanford single shell tanks (SSTs) are contaminated with normal paraffin hydrocarbon (NPH) as hydrostatic fluid from the sampling process or can be native to the tank waste. The contamination is usually high enough that a dilution of up to several orders of magnitude may be required before the sample can be analyzed by the conventional gas chromatography/mass spectrometry methodology. This can prevent detection and measurement of organic constituents that are present at lower concentration levels. To eliminate or minimize the problem, a sample cleanup method has been developed and validated and is presented in this document

  7. SIFT-MS and FA-MS methods for ambient gas phase analysis: developments and applications in the UK

    OpenAIRE

    Smith, D.; Spanel, P

    2015-01-01

    Selected ion flow tube mass spectrometry, SIFT-MS, a relatively new gas/vapour phase analytical method, is derived from the much earlier selected ion flow tube, SIFT, used for the study of gas phase ion-molecule reactions. Both the SIFT and SIFT-MS techniques were conceived and developed in the UK, the former at Birmingham University, the latter at Keele University along with the complementary flowing afterglow mass spectrometry, FA-MS, technique. The focus of this short review is largely to ...

  8. Development of detection filter for mammographic microcalcifications. A method based on density gradient and triple-ring filter analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hirako, Kenichi; Fujita, Hiroshi; Hara, Takeshi [Gifu Univ. (Japan); Endo, Tokiko

    1995-09-01

    We have developed a new automated detection algorithm for clustered microcalcifications on digital mammograms (breast radiographs). The vectors of the density gradient were first calculated within the area of the breast, which was segmented automatically. Second, a triple-ring filter was developed to extract the specific features of microcalcification patterns from the vectors. A simulation study was performed to demonstrate the effectiveness of the new method. The true-positive detection rate of our algorithm was 90.3% with 0.84 false-positive detections per image on our database of 132 mammograms. These results show that our new algorithm is applicable to mammographic computer-aided diagnostic systems. (author).
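The first stage described above, computing density-gradient vectors, can be sketched with NumPy on a synthetic patch. This illustrates only the general idea of the gradient stage, not the triple-ring filter itself:

```python
import numpy as np

# Synthetic 5x5 "image": microcalcifications appear as bright spots,
# so gradient vectors around a spot point toward its center.
img = np.zeros((5, 5), dtype=float)
img[2, 2] = 100.0  # one bright "calcification" pixel

# np.gradient returns per-axis central differences: rows (y), then cols (x).
gy, gx = np.gradient(img)
magnitude = np.hypot(gx, gy)
direction = np.arctan2(gy, gx)  # radians

# The pixel just left of the spot has a strong gradient pointing in +x
# (toward the spot); the pixel just right of it points in -x.
print(magnitude[2, 1], direction[2, 1])
print(magnitude[2, 3], direction[2, 3])
```

A ring filter then tests whether the directions around a candidate pixel are consistently inward, which is what distinguishes a compact bright spot from an edge.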

  9. Negotiating a Systems Development Method

    Science.gov (United States)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods) are often applied in a tailored version to fit the actual situation. Method tailoring is, in most of the existing literature, viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): the driver role rotates between the project members, and design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  10. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    This technical report describes the early progress of the establishment of a human error analysis method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of the existing HRA methods through an example application. To counter the bias of these methods toward the quantitative aspect, we focused on the qualitative aspect, i.e., human error analysis (HEA), in proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories of and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be easily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  11. Development of Methods for Determination of Aflatoxins.

    Science.gov (United States)

    Xie, Lijuan; Chen, Min; Ying, Yibin

    2016-12-01

    Aflatoxins can cause damage to the health of humans and animals. Several institutions around the world have established regulations to limit the levels of aflatoxins in food, and numerous analytical methods have been developed for aflatoxin determination. This review covers the currently used analytical methods for the determination of aflatoxins in different food matrices, including sampling and sample preparation, sample pretreatment (extraction and purification of aflatoxin extracts), and separation and determination methods. Validation of aflatoxin analyses and the safety considerations and precautions to take when doing the experiments are also discussed. PMID:25840003

  12. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard

    2006-03-01

    Full Text Available Abstract Introduction Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movement. In the long-term future it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still no accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  13. Development and application of a method for the analysis of 9 mycotoxins in maize by HPLC-MS/MS.

    Science.gov (United States)

    Wang, Yutang; Xiao, Chunxia; Guo, Jing; Yuan, Yahong; Wang, Jianguo; Liu, Laping; Yue, Tianli

    2013-11-01

    A reliable and sensitive liquid chromatography/tandem mass spectrometry (LC-MS/MS) method was developed for the simultaneous determination of aflatoxins (AFB1, AFB2, AFG1, and AFG2), ochratoxin A (OTA), deoxynivalenol (DON), zearalenone (ZEA), fumonisin B1 (FB1), and T-2 toxin in maize. The samples were first extracted using acetonitrile:water:acetic acid (79:20:1), and then further cleaned up using an OASIS HLB cartridge. Optimum conditions for the extraction and chromatographic separation were investigated. The mean recoveries of mycotoxins in spiked maize ranged from 68.3% to 94.3%. Limits of detection and quantification ranged from 0.01 to 0.64 μg/kg and from 0.03 to 2.12 μg/kg, respectively. The LC-MS/MS method was also successfully applied to 60 maize samples collected from Shaanxi Province of China. Twenty-four of the 60 samples (40%) were contaminated with at least 1 of these 9 mycotoxins. Occurrence rates were 6.7%, 1.7%, 3.3%, 6.7%, 1.7%, 23.3%, and 3.3% for AFB1, AFB2, OTA, ZEA, DON, FB1, and T-2 toxin, respectively. The results demonstrated that the procedure is suitable for the simultaneous determination of these mycotoxins in a maize matrix. PMID:24245893

  14. Development and validation of a method for removal of normal paraffin hydrocarbon in radioactive waste samples prior to analysis of semi volatile components

    International Nuclear Information System (INIS)

    A method has been developed at Pacific Northwest Laboratory (PNL) to remove normal paraffin hydrocarbon (NPH) from radioactive waste samples prior to gas chromatography/mass spectrometry analysis of semivolatile components. The effectiveness of the cleanup procedure was demonstrated for all the EPA semivolatile target list compounds. Blanks and spiked actual waste samples were utilized in the development and validation study. Approximately 95% of the NPH was removed from the single-shell tank samples. The recoveries were good for most of the target compounds. Results were compared with those obtained by utilizing EPA method 3630. The recoveries were much better for the PNL-developed method. (author). 4 refs., 3 figs., 6 tabs

  15. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    This work presents a low-cost, simple and new methodology for the direct determination of uranium in different matrices: an organic phase (UO2(NO3)2.2TBP - uranyl nitrate complex) and an aqueous phase (UO2(NO3)2 - NTU - uranyl nitrate), based on Fourier transform infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to establish whether a developed methodology is fully suited to its intended purpose, and it is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO2(NO3)2.2TBP in hexane/embedded in KBr) was linear (r = 0.9989) over the range of 1.0 g L-1 to 14.3 g L-1, with an LD of 92.1 mg L-1 and an LQ of 113.1 mg L-1; it was precise (RSD < 1.6% and p-value < 0.05) and accurate (recovery of 100.1% - 102.9%). The method for uranium in the aqueous phase (UO2(NO3)2/embedded in KBr) was linear (r = 0.9964) over the range of 5.4 g L-1 to 51.2 g L-1, with an LD of 835 mg L-1 and an LQ of 958 mg L-1; it was precise (RSD < 1.0% and p-value < 0.05) and accurate (recovery of 99.1% - 102.0%). The FTIR method is robust with regard to most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results in all three methods. The statistical tests (Student's t and Fisher's) showed that the techniques are equivalent. (author)
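The linearity and detection-limit figures reported above come from standard calibration arithmetic: fit a least-squares line to the calibration points, then take LD = 3.3 s/slope and LQ = 10 s/slope from the residual standard deviation of the fit. A sketch with invented absorbance values, not the paper's data:

```python
import numpy as np

# Hypothetical calibration points over the paper's organic-phase range;
# the signal values are made up for illustration.
conc = np.array([1.0, 3.0, 5.0, 8.0, 11.0, 14.3])        # g/L uranium
signal = np.array([0.052, 0.149, 0.251, 0.402, 0.548, 0.713])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))  # residual std

ld = 3.3 * s_res / slope    # limit of detection, in conc units
lq = 10.0 * s_res / slope   # limit of quantitation
r = np.corrcoef(conc, signal)[0, 1]
print(f"r = {r:.4f}, LD = {ld:.3f} g/L, LQ = {lq:.3f} g/L")
```

The 3.3 and 10 multipliers are the ICH convention; other conventions (e.g., blank-based estimates) give somewhat different limits.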

  16. HPLC method development for the simultaneous analysis of amlodipine and valsartan in combined dosage forms and in vitro dissolution studies

    Directory of Open Access Journals (Sweden)

    Mustafa Çelebier

    2010-12-01

    Full Text Available A simple, rapid and reproducible HPLC method was developed for the simultaneous determination of amlodipine and valsartan in their combined dosage forms, and for drug dissolution studies. A C18 column (ODS 2, 10 μm, 200 x 4.6 mm) and a mobile phase of phosphate buffer (pH 3.6, 0.01 mol L-1):acetonitrile:methanol (46:44:10, v/v/v) were used for separation and quantification. Analyses were run at a flow rate of 1 mL min-1 and at ambient temperature. The injection volume was 20 μL and the ultraviolet detector was set at 240 nm. Under these conditions, amlodipine and valsartan were eluted at 7.1 min and 3.4 min, respectively. Total run time was shorter than 9 min. The developed method was validated according to the literature and found to be linear within the range 0.1 - 50 μg mL-1 for amlodipine, and 0.05 - 50 μg mL-1 for valsartan. The developed method was applied successfully for the quality control assay of amlodipine and valsartan in their combination drug product and for in vitro dissolution studies.

  17. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  18. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam

    2000-08-01

    Though the use of the integrated force method for linear investigations is well recognised, no efforts had been made to extend this method to nonlinear structural analysis. This paper presents attempts to use this method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used in these problems are developed and the structures are then analysed. The results of the analysis are compared with the results of the displacement method. It has been demonstrated that the integrated force method is equally viable and efficient as compared to the displacement method.
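Whichever formulation is used, nonlinear structural analysis ultimately rests on an iterative equilibrium solve. As a minimal illustration (a one-degree-of-freedom hardening spring with assumed constants, not one of the paper's benchmark problems and not the integrated force method itself), Newton-Raphson iteration looks like:

```python
# One hardening spring with internal force N(u) = k*u + c*u**3 under
# external load P. All constants are made up for illustration.
k, c, P = 100.0, 50.0, 250.0

u = 0.0
for _ in range(20):
    residual = P - (k * u + c * u**3)   # out-of-balance force
    if abs(residual) < 1e-10:
        break
    tangent = k + 3.0 * c * u**2        # tangent stiffness dN/du
    u += residual / tangent             # Newton update

print(f"equilibrium displacement u = {u:.6f}")
```

The displacement method iterates on displacements as above; a force-method formulation instead carries forces as primary unknowns, but the same residual-and-tangent iteration structure applies.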

  19. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes from the phenomenological and experimental points of view in the ATLAS experiment at the LHC, in order to understand the impact of Sudakov logarithms on electroweak corrections and on the associated production of weak vector bosons and jets at the LHC. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets. We performed this computation with the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. This study shows that, for the inclusive jet process, a partial cancellation of the virtual weak corrections (due to weak bosons in loops) by the real electroweak corrections occurs, indicating that Bloch-Nordsieck violation is reduced for this process. We then participated in the measurement of the differential cross-sections for these processes in the ATLAS experiment at 7 TeV. In particular we were involved in technical aspects of the measurement, such as the study of the QCD background to the W+jets process in the muon channel. We then combined the different measurements in this channel to compare their behaviour. This tends to show that several effects give the electroweak corrections their relative importance, as the relative contribution of weak-boson-plus-jets processes to the inclusive jet process increases with the jet transverse momentum when the presence of electroweak bosons in the final state is explicitly required. This is currently only a preliminary study, intended to show that this approach can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter. This has allowed the development of a new way to detect problematic events using well-known theorems from statistics. This new method is able to detect bursts of noise and

  20. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as to the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those with the most significant impact on the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil-contaminated soil. A reduced analytical time would offer many benefits including improved laboratory reporting times, and overall improved clean up
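A 2^(6-3) fractional factorial screening of six two-level factors needs only eight runs instead of 64. The sketch below builds such a design matrix; the factor names follow the abstract, while the resolution III generators (D = AB, E = AC, F = BC) are a standard textbook construction assumed for illustration, not necessarily the authors' choice:

```python
from itertools import product

# Six GC factors from the abstract, coded -1 (low) / +1 (high).
factors = ["inj_volume", "inj_temp", "oven_ramp",
           "det_temp", "carrier_flow", "solvent_ratio"]

# Base factors A, B, C take all 2^3 combinations; the remaining three
# columns are generated as interaction products (assumed generators).
runs = []
for a, b, c in product((-1, 1), repeat=3):
    runs.append((a, b, c, a * b, a * c, b * c))

print(f"{len(runs)} runs instead of {2**6} full-factorial runs")
for run in runs:
    print(dict(zip(factors, run)))
```

Each column is balanced (equal numbers of high and low settings), which is what lets main effects be estimated from so few runs, at the cost of aliasing them with two-factor interactions.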

  1. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the respective guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data agreed well with the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except that the SR of decanoic acid, which has a larger size and a longer chain, was not consistent. It is therefore suggested that the TGA-based method is applicable for following the stoichiometric ratio of polycrystalline α-CD-based inclusion complexes with smaller, shorter-chain guests.
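The abstract mentions a series of deduced formulas without reproducing them; the generic mole-ratio arithmetic that any TGA-based SR estimate rests on can be sketched as follows. The molar masses are real, but the guest weight-loss fraction is a hypothetical illustration, not data from the paper.

```python
# Hypothetical illustration of the mole-ratio arithmetic behind a TGA-based
# stoichiometric ratio estimate. The guest weight-loss fraction is invented;
# the paper's own formulas and data are not reproduced here.
M_ALPHA_CD = 972.84   # g/mol, alpha-cyclodextrin (host)
M_4_CRESOL = 108.14   # g/mol, 4-cresol (guest)

def stoichiometric_ratio(guest_mass_fraction, m_guest, m_host):
    """Moles of guest per mole of host, from the guest weight-loss step."""
    moles_guest = guest_mass_fraction / m_guest
    moles_host = (1.0 - guest_mass_fraction) / m_host
    return moles_guest / moles_host

# A ~10% guest weight loss corresponds roughly to a 1:1 complex here:
sr = stoichiometric_ratio(0.10, M_4_CRESOL, M_ALPHA_CD)
print(round(sr, 2))  # prints 1.0
```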

  2. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    International Nuclear Information System (INIS)

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the respective guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data agreed well with the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except that the SR of decanoic acid, which has a larger size and a longer chain, was not consistent. It is therefore suggested that the TGA-based method is applicable for following the stoichiometric ratio of polycrystalline α-CD-based inclusion complexes with smaller, shorter-chain guests.

  3. Development of computation mechanics analysis method taking microscopic structure of nuclear power materials in consideration and the optimal design method for structure constitution

    International Nuclear Information System (INIS)

    When materials are subjected to neutron irradiation, their characteristics deteriorate through the formation of dislocation loops, voids, helium bubbles, segregation, precipitation and so on. In the structural design of the core of nuclear fusion reactors, in order to determine the applicable temperature limit of structural materials, the elucidation of the helium embrittlement mechanism and the development of materials with excellent resistance to helium embrittlement are important. In this paper, an example is shown of analyzing the form of bubbles at grain boundaries and the effect that the work-hardening index of materials exerts on the stress-strain curves. The formulation of a finite element method for evaluating grain boundary helium embrittlement is explained. It is supposed that helium embrittlement arises because bubbles exist at grain boundaries, and deformation concentrates near the grain boundaries that are perpendicular to the tensile stress axis. As the results of the calculations, the effects that bubble size, bubble density and the work-hardening index exert on the stress-strain curves are reported. (K.I.)

  4. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

    Full Text Available Biological samples present a range of complexities from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison to existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL; it is well-suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  5. Development of liquid chromatography methods coupled to mass spectrometry for the analysis of substances with a wide variety of polarity in meconium.

    Science.gov (United States)

    Meyer-Monath, Marie; Chatellier, Claudine; Cabooter, Deirdre; Rouget, Florence; Morel, Isabelle; Lestremau, Francois

    2015-06-01

    Meconium is the first fecal excretion of newborns. This complex accumulative matrix allows the exposure of the fetus to xenobiotics during the last 6 months of pregnancy to be assessed. To determine the possible effects of fetal exposure to micropollutants in this matrix, robust and sensitive analytical methods must be developed. This article describes the development of liquid chromatography methods coupled to triple quadrupole mass spectrometry for relevant pollutants. The 28 selected target compounds had different physico-chemical properties, from very polar (glyphosate) to non-polar molecules (pyrethroids). Tests were performed with three different types of columns: reversed phase, ion exchange and HILIC. As a single method could not be established for the simultaneous analysis of all compounds, three columns were selected and suitable chromatographic methods were optimized. Similar results were observed for the separation of the target compounds dissolved in either meconium extract or solvent for the reversed phase and ion exchange columns. However, for HILIC, the matrix had a significant influence on the peak shape and robustness of the method. Finally, the analytical methods were applied to "real" meconium samples.

  7. Developments in CTG analysis.

    Science.gov (United States)

    Van Geijn, H P

    1996-06-01

    FHR monitoring has been the subject of many debates. The technique, in itself, can be considered to be accurate and reliable both in the antenatal period, when using the Doppler signal in combination with autocorrelation techniques, and during the intrapartum period, in particular when the FHR signal can be obtained from a fetal ECG electrode placed on the presenting part. The major problems with FHR monitoring relate to the reading and interpretation of the CTG tracings. Since the FHR pattern is primarily an expression of the activity of the control by the central and peripheral nervous system over cardiovascular haemodynamics, it is possibly too indirect a signal. In other specialities such as neonatology, anaesthesiology and cardiology, monitoring and graphic display of heart rate patterns have not gained wide acceptance among clinicians. Digitized archiving, numerical analysis and even more advanced techniques, as described in this chapter, have primarily found a place in obstetrics. This can be easily explained, since the obstetrician is fully dependent on indirectly collected information regarding the fetal condition, such as (a) movements experienced by the mother, observed with ultrasound or recorded with kinetocardiotocography (Schmidt, 1994), (b) perfusion of various vessels, as assessed by Doppler velocimetry, (c) the amount of amniotic fluid or (d) changes reflected in the condition of the mother, such as the development of gestation-induced hypertension and (e) the easily, continuously obtainable FHR signal. It is of particular comfort to the obstetrician that a normal FHR tracing reliably predicts the birth of the infant in a good condition, which makes cardiotocography so attractive for widespread application. However, in the intrapartum period, many traces cannot fulfil the criteria of normality, especially in the second stage. In this respect, cardiotocography remains primarily a screening and not so much a diagnostic method. 

  8. Development of a data base system and concentration calculation for neutron activation analysis as per the k0 method

    International Nuclear Information System (INIS)

    Neutron activation analysis is one of the most important nuclear analytical techniques; it determines which elements are present in a sample and in what proportions. As a sample passes through the steps of the technique, information is generated at each phase of the process and ends up dispersed. This information must therefore be organized properly for effective use

  9. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    Science.gov (United States)

    Khristova, R.; Vanmen, M.

    1986-01-01

    Based on considerations of principles and experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from tenths of a milligram to one milligram, with a standard deviation of 0.04 mg. The specific method for the determination of molybdenum in molybdenite concentrates is presented.

  10. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates

    International Nuclear Information System (INIS)

    Based on considerations of principles and experimental data, the interference of sulfate ions in the potentiometric titration of EDTA with FeCl3 was confirmed. The back complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from tenths of a milligram to one milligram, with a standard deviation of 0.04 mg. The specific method for the determination of molybdenum in molybdenite concentrates is presented

  11. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace elemental concentrations of samples collected monthly at urban and rural sites were determined, and statistical calculations and factor analysis were then carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  12. Development of methods for multiresidue analysis of rice post-emergence herbicides in loam soil and their possible applications to soils of different composition.

    Science.gov (United States)

    Niell, Silvina; Pareja, Lucia; Asteggiante, Lucía Geis; Roehrs, Rafael; Pizzutti, Ionara R; García, Claudio; Heinzen, Horacio; Cesio, María Verónica

    2010-01-01

    Two simple and straightforward sample preparation methods were developed for the multiresidue analysis of post-emergence herbicides in loam soil that are commonly used in rice crop cultivation. A number of soil extraction and cleanup strategies were evaluated. The instrumental analysis was performed by HPLC with a diode array detector. The best compromise between recoveries (69-98%) and good repeatability (RSD) was obtained when the herbicides, including clomazone, were analyzed simultaneously. Quinclorac and bispyribac sodium were also assayed, but their recoveries were below 50%. Both methods had an LOD of 0.7 microg/kg and could accurately determine the residues at the 2 microg/kg level. These two methods could not be applied directly to other soil types, as the recoveries strongly depended on the soil composition. The developed methodologies were successfully applied in monitoring 87 real-world soil samples, in which only propanil (6 to 12 microg/kg) and clomazone (15 to 20 microg/kg) residues could be detected.

  13. Analysis of Human Serum and Whole Blood for Mineral Content by ICP-MS and ICP-OES: Development of a Mineralomics Method

    Science.gov (United States)

    Harrington, James M.; Young, Daniel J.; Essader, Amal S.; Sumner, Susan J.; Levine, Keith E.

    2014-01-01

    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This manuscript describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision of the mineral content of the blood (250 μl) and serum (250 μl) samples was measured for eight essential minerals, sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se), by plasma spectrometric methods, and ranged from 0.635-10.1% relative standard deviation (RSD) for serum and 0.348-5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples showed good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals, and could potentially be extended to other minerals. PMID:24917052
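The interday and intraday precision figures quoted above are percent relative standard deviations. As a reminder of the underlying arithmetic, a minimal sketch with hypothetical replicate values, not the study's data:

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation in percent: sample std dev / mean x 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical replicate Fe readings (mg/L) for a single serum sample:
replicates = [1.02, 1.00, 0.99, 1.01, 1.03]
rsd = percent_rsd(replicates)
print(round(rsd, 2))  # prints 1.57
```

A value in this low single-digit range is consistent with the serum precision window (0.635-10.1% RSD) reported above.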

  14. Development of an SDS-gel electrophoresis method on SU-8 microchips for protein separation with LIF detection: Application to the analysis of whey proteins.

    Science.gov (United States)

    Del Mar Barrios-Romero, Maria; Crevillén, Agustín G; Diez-Masa, José Carlos

    2013-08-01

    This work describes the development of an SDS-gel electrophoresis method for the analysis of major whey proteins (α-lactalbumin, β-lactoglobulin, and BSA) carried out in SU-8 microchips. The method uses a low-viscosity solution of dextran as a sieving polymer. A commercial coating agent (EOTrol LN) was added to the separation buffer to control the EOF of the chips. The potential of this coating agent to prevent protein adsorption on the walls of the SU-8 channels was also evaluated. Additionally, the fluorescence background of the SU-8 material was studied to improve the sensitivity of the method. By selecting an excitation wavelength of 532 nm at which the background fluorescence remains low and by replacing the mercury arc lamp by a laser in the detection system, an LOD in the nanomolar range was achieved for proteins derivatized with the fluorogenic reagent Chromeo P540. Finally, the method was applied to the analysis of milk samples, demonstrating the potential of SU-8 microchips for the analysis of proteins in complex food samples.

  15. Development and validation of AccuTOF-DART™ as a screening method for analysis of bank security device and pepper spray components.

    Science.gov (United States)

    Pfaff, Allison M; Steiner, Robert R

    2011-03-20

    Analysis of bank security devices, containing 1-methylaminoanthraquinone (MAAQ) and o-chlorobenzylidenemalononitrile (CS), and pepper sprays, containing capsaicin, is a lengthy process with no specific screening technique to aid in identifying samples of interest. Direct Analysis in Real Time (DART™) ionization coupled with an Accurate Time of Flight (AccuTOF) mass detector is a fast, ambient ionization source that could significantly reduce time spent on these cases and increase the specificity of the screening process. A new method for screening clothing for bank dye and pepper spray, using AccuTOF-DART™ analysis, has been developed. Detection of MAAQ, CS, and capsaicin was achieved via extraction of each compound onto cardstock paper, which was then sampled in the AccuTOF-DART™. All results were verified using gas chromatography coupled with electron impact mass spectrometry. PMID:20643521

  16. STABILITY INDICATING HPLC METHOD DEVELOPMENT: A REVIEW

    Directory of Open Access Journals (Sweden)

    Bhoomi P. Shah*, Suresh Jain, Krishna K. Prajapati and Nasimabanu Y. Mansuri

    2012-09-01

    Full Text Available High-performance liquid chromatography is one of the most accurate methods widely used for the quantitative and qualitative analysis of drug products and for determining drug product stability. Stability-indicating HPLC methods are used to separate the various drug-related impurities that are formed during the synthesis or manufacture of a drug product. This article discusses the strategies and issues regarding the development of stability-indicating HPLC systems for drug substances. A number of key chromatographic factors were evaluated in order to optimize the detection of all potentially relevant degradants. The method should be carefully examined for its ability to distinguish the primary drug components from the impurities. New chemical entities and drug products must undergo forced degradation studies, which are helpful in developing and demonstrating the specificity of such stability-indicating methods. Practical recommendations are provided for every stage of drug development to help avoid failures.

  17. Analysis and Development of Finite Element Methods for the Study of Nonlinear Thermomechanical Behavior of Structural Components

    Science.gov (United States)

    Oden, J. Tinsley

    1995-01-01

    Underintegrated methods are investigated with respect to their stability and convergence properties. The focus was on identifying regions where they work and regions where techniques such as hourglass viscosity and hourglass control can be used. Results obtained show that underintegrated methods typically lead to finite element stiffness matrices with spurious modes in the solution. However, problems exist (scalar elliptic boundary value problems) where underintegration with hourglass control yields convergent solutions. Also, stress averaging in underintegrated stiffness calculations does not necessarily lead to stable or convergent stress states.

  18. Development of liquid chromatography-tandem mass spectrometry method for analysis of polyphenolic compounds in liquid samples of grape juice, green tea and coffee.

    Science.gov (United States)

    Sapozhnikova, Yelena

    2014-05-01

    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation "dilute and shoot" approach, and LC-MS/MS quantification using genistein-d4 as an internal standard. The performance of six different syringeless filter devices was tested for sample preparation. The method was evaluated for recoveries of polyphenols at three spiking levels in juice, tea, and coffee samples. The recoveries of the majority of polyphenols were satisfactory (70-120%), but some varied significantly (20-138%) depending on the matrix. NIST Standard Reference Materials (SRM) 3257 Catechin Calibration Solutions and 3255 Camellia sinensis (Green Tea) Extract with certified concentrations of catechin and epicatechin were used for method validation. The measurement accuracy in two SRMs was 71-113%. The method was successfully applied to the analysis of liquid samples of grape juice, green tea, and coffee.

  19. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids

    DEFF Research Database (Denmark)

    Lue, Bena-Marie; Guo, Zheng; Xu, X.B.

    2008-01-01

    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with ELSD were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine-acetate) was found...

  20. Firm Analysis by Different Methods

    OpenAIRE

    Píbilová, Kateřina

    2012-01-01

    This Diploma Thesis deals with an analysis of the company made by selected methods. The external environment of the company is analysed using PESTLE analysis and Porter's five-factor model. The internal environment is analysed by means of the Kralicek Quick test and fundamental analysis. A SWOT analysis relates the opportunities and threats of the external environment to the strengths and weaknesses of the company. A proposal for improving the company's economic management is designed on the basis...

  1. Development of multidimensional liquid chromatographic methods hyphenated to mass spectrometry, preparation and analysis of complex biological samples

    OpenAIRE

    Delmotte, Nathanaël

    2007-01-01

    Immunoadsorbers based on monolithic epoxy-activated CIM disks have been developed to target biomarkers of heart diseases. The developed immunoadsorbers made it possible to selectively isolate myoglobin and NT-proBNP from human serum. Anti-NT-proBNP-CIM disks permitted quantitative isolation of NT-proBNP at concentrations down to 750 amol/µL in serum (R2 = 0.998). Six different restricted access materials have been evaluated with respect to their ability to remove hemoglobin from hemoly...

  2. Ethnographic Contributions to Method Development

    DEFF Research Database (Denmark)

    Leander, Anna

    2015-01-01

    Contrary to common assumptions, there is much to be learned about methods from constructivist/post-structuralist approaches to International Relations (IR) broadly speaking. This article develops this point by unpacking the contributions of one specific method—ethnography—as used in one subfield of IR—Critical Security Studies. Ethnographic research works with what has been termed a "strong" understanding of objectivity. When this understanding is taken seriously, it must lead to a refashioning of the processes of gathering, analyzing, and presenting data in ways that reverse many standard...

  3. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
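As a toy illustration of the power flow problem the book treats, a Newton-Raphson iteration for a two-bus lossless system can be written in a few lines. All numbers are hypothetical, and real solvers work with full multi-bus Jacobians rather than a single scalar equation:

```python
import math

# Minimal Newton-Raphson power flow sketch for a two-bus, lossless system
# (slack bus + load bus), solving only for the voltage angle delta.
# All values are hypothetical per-unit numbers for illustration.
V1, V2 = 1.0, 1.0      # per-unit voltage magnitudes (held fixed)
X = 0.5                # per-unit line reactance
P_LOAD = 0.8           # per-unit active power drawn at bus 2

def solve_angle(p_target, tol=1e-10, max_iter=20):
    """Solve p_target = (V1*V2/X) * sin(delta) by Newton's method."""
    delta = 0.0
    for _ in range(max_iter):
        mismatch = p_target - (V1 * V2 / X) * math.sin(delta)
        if abs(mismatch) < tol:
            return delta
        jacobian = (V1 * V2 / X) * math.cos(delta)  # dP/d(delta)
        delta += mismatch / jacobian
    raise RuntimeError("power flow did not converge")

delta = solve_angle(P_LOAD)
print(round(math.degrees(delta), 3))  # prints 23.578
```

Contingency analysis then amounts to re-solving systems like this after removing a line or generator and checking the resulting flows against limits.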

  4. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
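NESSUS/FPI implements fast probability integration. Purely to illustrate the quantity such a module estimates, namely the probability that a limit state g = R - S goes negative, here is a crude Monte Carlo sketch. It is deliberately not the FPI algorithm, and the distribution parameters are hypothetical:

```python
import random

# Crude Monte Carlo estimate of failure probability for a toy limit state
# g = R - S (resistance minus load), both normally distributed.
# NESSUS/FPI uses fast probability integration, not Monte Carlo; this only
# illustrates the target quantity P(g < 0). All parameters are hypothetical.
random.seed(42)

def monte_carlo_pf(n=200_000, r_mean=3.0, r_sd=0.3, s_mean=2.0, s_sd=0.4):
    failures = 0
    for _ in range(n):
        g = random.gauss(r_mean, r_sd) - random.gauss(s_mean, s_sd)
        if g < 0:
            failures += 1
    return failures / n

pf = monte_carlo_pf()
# Analytic check: g ~ N(1.0, sqrt(0.3^2 + 0.4^2)), so P(g < 0) = Phi(-2) ~ 0.0228
print(pf)
```

Methods like FPI reach the same answer with far fewer limit-state evaluations, which is what makes probabilistic analysis of expensive finite element models practical.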

  5. Method development for the redox speciation analysis of iron by ion chromatography-inductively coupled plasma mass spectrometry and carryover assessment using isotopically labeled analyte analogues.

    Science.gov (United States)

    Wolle, Mesay Mulugeta; Fahrenholz, Timothy; Rahman, G M Mizanur; Pamuku, Matt; Kingston, H M 'Skip'; Browne, Damien

    2014-06-20

    An ion chromatography-inductively coupled plasma mass spectrometry (IC-ICP-MS) method was developed for the redox speciation analysis of iron (Fe) based on in-column complexation of Fe(2+) and Fe(3+) by dipicolinic acid (DPA). The effects of column type, mobile phase composition and molecular ion interference were studied in the method optimization. The carryover of the target species in the IC-ICP-MS method was uniquely and effectively evaluated using isotopically enriched analogues of the analytes ((54)Fe(2+) and (57)Fe(3+)). Standard solutions of the enriched standards were injected into the system following analysis of a sample, and the ratios of the isotopes of iron in the enriched standards were calculated based on the chromatographic peak areas. The concentrations of the analytes carried over from the sample to the enriched standards were determined using the quantitative relationship in isotope dilution mass spectrometry (IDMS). In contrast to the routine way of evaluating carryover effect by injecting a blank solution after sample analysis, the use of isotopically enriched standards identified significant analyte carryover in the present method. Extensive experiments were carried out to systematically identify the source of the carryover and to eliminate the problem; the separation column was found to be the exclusive source. More than 95% of the analyte carryover was eliminated by reducing the length of the column. The detection limit of the IC-ICP-MS method (MDL) for the iron species was 2ngg(-1). The method was used to determine Fe(2+) and Fe(3+) in synthetic aqueous standard solutions and a beverage sample.
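The "quantitative relationship in isotope dilution mass spectrometry" used for the carryover estimate is not spelled out in the abstract; the basic single-IDMS balance it presumably builds on can be sketched as follows. The isotope labels, ratios, and spike amount below are hypothetical illustrations, not values from the paper:

```python
# Hedged sketch of the basic isotope dilution (IDMS) balance. For two
# isotopes a and b, with isotope-amount ratios R = n(a)/n(b) in the pure
# sample (r_sample), the pure spike (r_spike), and the measured blend
# (r_blend), the sample's contribution of isotope b follows from a mass
# balance on both isotopes. All numeric values here are hypothetical.

def sample_moles_of_ref_isotope(n_spike_b, r_sample, r_spike, r_blend):
    """Moles of reference isotope b contributed by the sample."""
    return n_spike_b * (r_spike - r_blend) / (r_blend - r_sample)

# Toy numbers: spike strongly enriched in isotope b (low a/b ratio),
# sample at a natural-like ratio, blend measured after mixing.
n_spike_b = 1.0e-6              # mol of isotope b added as spike
r_sample, r_spike = 15.0, 0.02
r_blend = 3.0
n_sample_b = sample_moles_of_ref_isotope(n_spike_b, r_sample, r_spike, r_blend)
print(n_sample_b)  # ~2.48e-07 mol
```

In the carryover check described above, the same relationship is applied to the enriched standard injected after a sample: any shift of its measured ratio toward the sample's ratio quantifies the analyte carried over.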

  6. Development and validation of a LC-MS/MS method for quantitative analysis of uraemic toxins p-cresol sulphate and indoxyl sulphate in saliva.

    Science.gov (United States)

    Giebułtowicz, Joanna; Korytowska, Natalia; Sankowski, Bartłomiej; Wroczyński, Piotr

    2016-04-01

    p-Cresol sulphate (pCS) and indoxyl sulphate (IS) are uraemic toxins whose concentrations in serum correlate with the stage of renal failure. The aim of this study was to develop and validate a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the analysis of pCS and IS in saliva. This is the first time, to our knowledge, that such a method has been developed using saliva. Unstimulated, fasting saliva was collected from healthy volunteers in the morning and pooled for the validation assay. The method was validated for linearity, precision, accuracy, stability (freeze/thaw stability, stability in the autosampler, short- and long-term stability, stock solution stability), dilution integrity and matrix effect. The analysed validation criteria were fulfilled. Neither salivary flow (pCS: p=0.678; IS: p=0.238) nor the type of swab in the Salivette device influenced the results. Finally, using the novel validated method, saliva samples from healthy people (n=70) of various ages were analysed. We observed a tendency for toxin concentrations in saliva to increase in the elderly. This could be a result of age-related diseases, e.g., diabetes, and of declining kidney function. We conclude that the novel LC-MS/MS method can be used for the determination of pCS and IS in human saliva. The results encourage the validation of saliva as a clinical sample for monitoring toxin levels in organisms.

  7. Analysis of urban metabolic processes based on input-output method: model development and a case study for Beijing

    Science.gov (United States)

    Zhang, Yan; Liu, Hong; Chen, Bin; Zheng, Hongmei; Li, Yating

    2014-06-01

    Discovering ways to increase the sustainability of the metabolic processes involved in urbanization has become an urgent task for urban design and management in China. As cities are analogous to living organisms, disorders of their metabolic processes can be regarded as the cause of "urban disease", so identifying these causes through analysis of metabolic processes and of the distribution of ecological elements among the urban ecosystem's compartments will be helpful. Using Beijing as an example, we compiled monetary input-output tables for 1997, 2000, 2002, 2005, and 2007 and calculated the embodied intensities of ecological elements to compile the corresponding implied physical input-output tables. We then divided Beijing's economy into 32 compartments and analyzed the direct and indirect ecological intensities embodied in the flows of ecological elements through urban metabolic processes. By combining input-output tables with ecological network analysis, we refined the description of the multiple ecological elements transferred among Beijing's industrial compartments and of their distribution. This hybrid approach can provide a more scientific basis for the management of urban resource flows. In addition, the distribution characteristics of the ecological elements may provide a basic data platform for exploring the metabolic mechanism of Beijing.
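The embodied-intensity calculation at the core of such hybrid input-output studies is the standard Leontief inverse. A minimal sketch with a toy two-sector economy (the coefficients are invented, not Beijing data):

```python
import numpy as np

# Technical coefficient matrix A: input from sector i per unit output of
# sector j (toy two-sector values for illustration).
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])

# Direct ecological intensity d: ecological element (e.g. embodied energy)
# consumed directly per unit output of each sector.
d = np.array([5.0, 2.0])

# Embodied (direct + indirect) intensity: eps = d (I - A)^-1,
# i.e. eps satisfies the fixed-point relation eps = d + eps @ A.
eps = d @ np.linalg.inv(np.eye(2) - A)
print(eps)  # ~[6.8, 3.73]: indirect flows raise intensities above d
```

Multiplying these intensities by monetary flows converts the monetary input-output table into the "implied physical" table the abstract describes.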

  8. Development of genetic diagnosing method for diabetes and cholecystitis based on gene analysis of CCK-A receptor

    International Nuclear Information System (INIS)

    Based on the gene analysis of the cholecystokinin type A receptor (CCKAR) from normal mouse and its sequence analysis in the previous year, a CCKAR knock-out construct which allows mRNA expression of the β-galactosidase gene instead of the CCKAR gene was prepared. Since abnormalities in the CCKAR gene are thought to be a causal factor of diabetes and cholecystitis, a knock-out mouse that expressed LacZ but not CCKAR was constructed to investigate the correlation between the clinical features of diabetes and cholecystitis and CCKAR gene abnormalities. F2 mice carrying mutations in the CCKAR gene were born according to Mendel's law. The expression of the CCKAR gene was investigated in detail based on the expression of the LacZ gene in various tissues of homozygous (-/-) and heterozygous (-/+) knockout mice. A comparative study of blood sugar level, blood insulin level, the formation of biliary calculi, etc. is underway with wild-type, heterozygous and homozygous knockout mice. (M.N.)

  9. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  10. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)
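One of the densitometric steps mentioned above (fat determination from whole-body density) is classically done with the two-compartment Siri equation. A minimal sketch; the coefficients are the standard Siri (1961) values, not figures taken from this review:

```python
def siri_fat_percent(body_density_g_per_cm3):
    """Two-compartment Siri (1961) estimate: %fat = 495/D - 450,
    where D is whole-body density in g/cm3 (e.g. from underwater weighing)."""
    return 495.0 / body_density_g_per_cm3 - 450.0

print(round(siri_fat_percent(1.05), 1))  # ~21.4 %fat at D = 1.05 g/cm3
```

The two-compartment assumption (constant fat and fat-free-mass densities) is exactly what the multi-component models discussed in the review aim to relax.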

  11. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  12. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Desai, Meera Jay [Iowa State Univ., Ames, IA (United States)

    2004-01-01

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reverse phase and polar organic modes for chiral LC-API/MS method development. Reverse phase methods containing high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane (ENFB), a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. 
Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  13. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Meera Jay Desai

    2004-12-19

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reverse phase and polar organic modes for chiral LC-API/MS method development. Reverse phase methods containing high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane (ENFB), a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. 
Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then

  14. Development of an ionic liquid based dispersive liquid-liquid microextraction method for the analysis of polycyclic aromatic hydrocarbons in water samples.

    Science.gov (United States)

    Pena, M Teresa; Casais, M Carmen; Mejuto, M Carmen; Cela, Rafael

    2009-09-01

    A simple, rapid and efficient method, ionic liquid based dispersive liquid-liquid microextraction (IL-DLLME), has been developed for the first time for the determination of 18 polycyclic aromatic hydrocarbons (PAHs) in water samples. The chemical affinity between the ionic liquid (1-octyl-3-methylimidazolium hexafluorophosphate) and the analytes permits the extraction of the PAHs from the sample matrix while also allowing their preconcentration. Thus, this technique combines extraction and concentration of the analytes into one step and avoids the use of toxic chlorinated solvents. The factors affecting the extraction efficiency, such as the type and volume of ionic liquid, type and volume of disperser solvent, extraction time, dispersion stage, centrifuging time and ionic strength, were optimised. Analysis of the extracts was performed by high performance liquid chromatography (HPLC) coupled with fluorescence detection. The optimised method exhibited a good precision level, with relative standard deviation values between 1.2% and 5.7%. Quantification limits obtained for all of the compounds considered (between 0.1 and 7 ng L⁻¹) were well below the limits recommended in the EU. The extraction yields for the different compounds obtained by IL-DLLME ranged from 90.3% to 103.8%. Furthermore, high enrichment factors (301-346) were also achieved. The extraction efficiency of the optimised method is compared with that achieved by liquid-liquid extraction. Finally, the proposed method was successfully applied to the analysis of PAHs in real water samples (tap, bottled, fountain, well, river, rainwater, treated and raw wastewater). PMID:19646707
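The enrichment factors and extraction recoveries reported for DLLME are linked through the phase-volume ratio. A minimal sketch of that arithmetic, with hypothetical concentrations and volumes (not values from this study):

```python
def enrichment_factor(c_sed, c_0):
    # EF: analyte concentration in the sedimented ionic-liquid phase
    # divided by the initial aqueous concentration.
    return c_sed / c_0

def extraction_recovery(ef, v_sed, v_aq):
    # ER% = EF * (V_sed / V_aq) * 100 -- the fraction of analyte
    # transferred into the small sedimented phase.
    return ef * (v_sed / v_aq) * 100.0

ef = enrichment_factor(c_sed=3200.0, c_0=10.0)        # hypothetical, -> 320
er = extraction_recovery(ef, v_sed=0.030, v_aq=10.0)  # 30 uL into 10 mL
print(ef, round(er, 1))
```

The relation shows why EFs of ~300 with a ~30 µL sedimented phase in ~10 mL of sample correspond to near-quantitative recoveries.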

  15. [Development of Determination Method of Fluoroquinolone Antibiotics in Sludge Based on Solid Phase Extraction and HPLC-Fluorescence Detection Analysis].

    Science.gov (United States)

    Dai, Xiao-hu; Xue, Yong-gang; Liu, Hua-jie; Dai, Ling-ling; Yan, Han; Li, Ning

    2016-04-15

    Fluoroquinolone antibiotics (FQs), common pharmaceuticals and personal care products (PPCPs), are widespread in the environment. FQs contained in wastewater are ultimately enriched in sludge, posing a potential threat to subsequent sludge utilization. To optimize an analytical method applicable to the determination of FQs in sludge, the authors selected ofloxacin (OFL), norfloxacin (NOR), ciprofloxacin (CIP) and lomefloxacin (LOM) as the target FQs, and established a method based on cell lysis, FQ extraction with triethylamine/methanol/water solution, solid phase extraction (SPE) and HPLC-fluorescence detection (FLD) determination. After investigation, phosphoric acid-triethylamine was chosen as the buffer salt, and methanol was chosen as the organic mobile phase. A gradient fluorescence scanning strategy also proved necessary for optimal detection. Furthermore, in designed orthogonal experiments, the effects of the extraction materials, pH, and the eluents on the efficiency of SPE extraction were evaluated, from which the optimal extraction conditions were determined. As a result, FQs in liquid samples could be analyzed using an HLB extraction cartridge, with recovery rates of the four FQs in the range of 82%-103%. For solid samples, the recovery rates of the four FQs reached 71%-101%. Finally, the adsorptivity of the sludge from the different tanks (anaerobic, anoxic and oxic tanks) was investigated, showing a gradual decrease in adsorption capacity, although all the sludges adsorbed over 90% of the FQs. This conclusion also confirmed that the 50% removal of FQs in the domestic wastewater treatment plant was realized by sludge adsorption. PMID:27548982

  16. [Development of Determination Method of Fluoroquinolone Antibiotics in Sludge Based on Solid Phase Extraction and HPLC-Fluorescence Detection Analysis].

    Science.gov (United States)

    Dai, Xiao-hu; Xue, Yong-gang; Liu, Hua-jie; Dai, Ling-ling; Yan, Han; Li, Ning

    2016-04-15

    Fluoroquinolone antibiotics (FQs), common pharmaceuticals and personal care products (PPCPs), are widespread in the environment. FQs contained in wastewater are ultimately enriched in sludge, posing a potential threat to subsequent sludge utilization. To optimize an analytical method applicable to the determination of FQs in sludge, the authors selected ofloxacin (OFL), norfloxacin (NOR), ciprofloxacin (CIP) and lomefloxacin (LOM) as the target FQs, and established a method based on cell lysis, FQ extraction with triethylamine/methanol/water solution, solid phase extraction (SPE) and HPLC-fluorescence detection (FLD) determination. After investigation, phosphoric acid-triethylamine was chosen as the buffer salt, and methanol was chosen as the organic mobile phase. A gradient fluorescence scanning strategy also proved necessary for optimal detection. Furthermore, in designed orthogonal experiments, the effects of the extraction materials, pH, and the eluents on the efficiency of SPE extraction were evaluated, from which the optimal extraction conditions were determined. As a result, FQs in liquid samples could be analyzed using an HLB extraction cartridge, with recovery rates of the four FQs in the range of 82%-103%. For solid samples, the recovery rates of the four FQs reached 71%-101%. Finally, the adsorptivity of the sludge from the different tanks (anaerobic, anoxic and oxic tanks) was investigated, showing a gradual decrease in adsorption capacity, although all the sludges adsorbed over 90% of the FQs. This conclusion also confirmed that the 50% removal of FQs in the domestic wastewater treatment plant was realized by sludge adsorption.

  17. A novel time-course cDNA microarray analysis method identifies genes associated with the development of cisplatin resistance.

    Science.gov (United States)

    Whiteside, Martin A; Chen, Dung-Tsa; Desmond, Renee A; Abdulkadir, Sarki A; Johanning, Gary L

    2004-01-22

    In recent years, most cDNA microarray studies of chemotherapeutic drug resistance have not considered the temporal pattern of gene expression. The objective of this study was to examine systematically changes in gene expression of NCI-H226 and NCI-H2170 lung cancer cells treated weekly with IC10 doses of cisplatin. NCI-H226 lung cancer cells were treated weekly with an IC10 dose of cisplatin. Candidate genes with a fold change of 2.0 or more were identified from this study. A second experiment was conducted by exposing NCI-H2170 cells to cisplatin doses that were increased in week 4 and decreased in week 5. Overall, 44 genes were differentially expressed in both the NCI-H226 and NCI-H2170 cell lines. In the NCI-H2170 cell line, 24 genes had a twofold gene expression change from weeks 3 to 4. Real-time PCR found a significant correlation of the gene expression changes for seven genes of interest. This small time-ordered series identified novel genes associated with cisplatin resistance. This kind of analysis should be viewed as a first step towards building gene-regulatory networks. PMID:14737109
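The twofold-change criterion used above to nominate candidate genes amounts to a simple ratio filter over paired expression values. A minimal sketch with hypothetical gene names and expression values (not the study's data):

```python
# Flag genes whose expression changes at least 2-fold between two time
# points (e.g. week 3 vs week 4). All values below are hypothetical.
expr = {
    "GENE_A": (120.0, 260.0),   # (week3, week4)
    "GENE_B": (80.0, 95.0),
    "GENE_C": (300.0, 140.0),
}

def fold_change(before, after):
    # Ratio > 1 means up-regulation; < 1 means down-regulation.
    return after / before

# Keep genes up >= 2-fold or down >= 2-fold (ratio <= 0.5).
candidates = [g for g, (b, a) in expr.items()
              if fold_change(b, a) >= 2.0 or fold_change(b, a) <= 0.5]
print(sorted(candidates))  # GENE_A up >2-fold, GENE_C down >2-fold
```

In a real time-course analysis this filter would be applied per interval, which is what lets the temporal pattern of resistance-associated expression emerge.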

  18. INTER-COUNTRY EFFICIENCY EVALUATION IN INNOVATION ACTIVITY ON THE BASIS OF METHOD FOR DATA ENVELOPMENT ANALYSIS AMONG COUNTRIES WITH DEVELOPED AND DEVELOPING ECONOMY, INCLUDING THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    I. V. Zhukovski

    2016-01-01

    The paper considers the problem of evaluating the efficiency of innovation activity in 63 countries with developed and developing economies using the method of data envelopment analysis. The following results of innovation activity have been used for calculation of an efficiency factor: export of high-technology products as a percentage of industrial product export, export of ICT services as a percentage of services export, and payments obtained from realization of intellectual property rights (in US dollars). An output-oriented data envelopment analysis model with variable returns to scale (output-oriented VRS model), directed at maximization of the obtained results, has been used for the analysis. The evaluation has shown that countries such as the USA, Israel and Sweden, among others, have maximum efficiency in transforming resources into innovative activity output. The analysis has also revealed that the Republic of Belarus has potential for improvement of its indices on innovation results.
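The output-oriented VRS model used above can be written as a small linear program per decision-making unit (DMU). A minimal sketch using `scipy.optimize.linprog` on a toy single-input/single-output dataset (not the paper's 63-country data):

```python
import numpy as np
from scipy.optimize import linprog

# Output-oriented VRS DEA for one DMU o:
#   max phi  s.t.  sum_j lam_j * x_j <= x_o,
#                  sum_j lam_j * y_j >= phi * y_o,
#                  sum_j lam_j = 1,  lam_j >= 0
x = np.array([2.0, 4.0, 6.0, 4.0])   # inputs (toy values)
y = np.array([2.0, 5.0, 6.0, 3.0])   # outputs (toy values)
o = 3                                 # evaluate the last DMU (x=4, y=3)

n = len(x)
c = np.r_[-1.0, np.zeros(n)]                 # linprog minimizes, so use -phi
A_ub = np.vstack([np.r_[0.0, x],             # input constraint
                  np.r_[y[o], -y]])          # phi*y_o - sum lam_j*y_j <= 0
b_ub = np.array([x[o], 0.0])
A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1) # VRS convexity: sum lam_j = 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])
phi = res.x[0]
print(round(phi, 3))  # ~1.667: this DMU could expand output to 5/3 of current
```

A DMU is efficient when phi = 1; here the evaluated unit is dominated by a peer producing y = 5 from the same input x = 4.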

  19. ARK methods: some recent developments

    Science.gov (United States)

    Moir, Nicolette

    2005-03-01

    Almost Runge-Kutta methods are a sub-class of the family of methods known as general linear methods, used for solving ordinary differential equations. They combine many of the favourable properties of traditional Runge-Kutta methods with some additional advantages. We will introduce these methods, concentrating on methods of order four, and present some recent results.

  20. SUBSURFACE CONSTRUCTION AND DEVELOPMENT ANALYSIS

    International Nuclear Information System (INIS)

    The purpose of this analysis is to identify appropriate construction methods and develop a feasible approach for construction and development of the repository subsurface facilities. The objective of this analysis is to support development of the subsurface repository layout for License Application (LA) design. The scope of the analysis for construction and development of the subsurface Repository facilities covers: (1) Excavation methods, including application of knowledge gained from construction of the Exploratory Studies Facility (ESF). (2) Muck removal from excavation headings to the surface. This task will examine ways of preventing interference with other subsurface construction activities. (3) The logistics and equipment for the construction and development rail haulage systems. (4) Impact of ground support installation on excavation and other construction activities. (5) Examination of how drift mapping will be accomplished. (6) Men and materials handling. (7) Installation and removal of construction utilities and ventilation systems. (8) Equipping and finishing of the emplacement drift mains and access ramps to fulfill waste emplacement operational needs. (9) Emplacement drift and access mains and ramps commissioning prior to handover for emplacement operations. (10) Examination of ways to structure the contracts for construction of the repository. (11) Discussion of different construction schemes and how to minimize the schedule risks implicit in those schemes. (12) Surface facilities needed for subsurface construction activities

  1. A Portfolio-Analysis Method for Selecting Armament Development Candidates

    Institute of Scientific and Technical Information of China (English)

    卜广志

    2011-01-01

    In the overarching design of armament development, it is important to construct and select an armament development scheme from among many candidates. A portfolio-analysis method is studied. Firstly, a series of candidate armament development schemes is constructed for detailed assessment. Secondly, various measures are selected to assess every candidate; the measures include the degree of mission accomplishment, the risk in executing multiple missions, the cost of the armament options and the risk in the development process. Finally, the decision maker ranks these candidates according to their objectives and intentions based on the assessment results. A concluding example verifies the applicability of the portfolio-analysis method for selecting armament development candidates.
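A weighted-sum ranking over the four measures named above could be sketched as follows; the scheme names, scores and weights are entirely hypothetical, and risk/cost criteria are assumed to be pre-inverted so that higher is always better:

```python
# Rank candidate armament-development portfolios by a weighted score.
# All names, weights and scores are hypothetical illustrations.
weights = {"mission": 0.4, "mission_risk": 0.2, "cost": 0.2, "dev_risk": 0.2}
portfolios = {
    "Scheme A": {"mission": 0.9, "mission_risk": 0.5, "cost": 0.4, "dev_risk": 0.6},
    "Scheme B": {"mission": 0.7, "mission_risk": 0.8, "cost": 0.7, "dev_risk": 0.7},
    "Scheme C": {"mission": 0.6, "mission_risk": 0.9, "cost": 0.9, "dev_risk": 0.5},
}

def score(p):
    # Simple additive value model over the four criteria.
    return sum(weights[k] * p[k] for k in weights)

ranking = sorted(portfolios, key=lambda name: score(portfolios[name]),
                 reverse=True)
print(ranking)
```

Varying the weights to reflect different decision-making priorities, as the abstract suggests, simply re-runs the same ranking under a new `weights` mapping.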

  2. The forensic analysis of office paper using carbon isotope ratio mass spectrometry--part 2: method development, validation and sample handling.

    Science.gov (United States)

    Jones, Kylie; Benson, Sarah; Roux, Claude

    2013-09-10

    This paper describes the development and validation of a method for the analysis of office papers by measuring carbon isotopes using isotope ratio mass spectrometry (IRMS). The method development phase included testing protocols for storage, sample materials and set-up of the analytical run, and examining the effects of other paper examination procedures on IRMS results. A method validation was performed so that the DeltaPlus XP IRMS instrument (Thermo Finnigan, Bremen, Germany) with Flash EA™ 1112 could be used to measure document paper samples for forensic casework. A validation protocol that would meet international standards for laboratory accreditation (international standard ISO 17025) was structured so that the instrument's performance characteristics could be observed. All performance characteristics measured were found to be within an acceptable range, and an expanded measurement uncertainty for the measurement of carbon isotopes in paper was calculated at 0.26‰, with a coverage factor of 2. This method was utilized in a large-scale study, published as part one of this series, which showed that IRMS of document papers is useful as a chemical comparison technique for 80 gsm white office papers. PMID:23810570
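An expanded uncertainty with a coverage factor, like the 0.26‰ (k = 2) quoted above, is obtained by combining standard-uncertainty components in quadrature in the usual GUM fashion. A minimal sketch with hypothetical per-mil components (not the paper's budget):

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine standard-uncertainty components in quadrature (GUM),
    then multiply by the coverage factor k (~95% coverage for k=2)."""
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# Hypothetical delta-13C components (per mil): repeatability,
# reference-standard uncertainty, blank/drift correction.
U = expanded_uncertainty([0.10, 0.05, 0.06])
print(round(U, 2))  # ~0.25 per mil at k = 2
```

Each component would itself come from replicate measurements or certificate values; the quadrature step assumes the components are uncorrelated.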

  3. Assessment of neutronic parameter's uncertainties obtained within the reactor dosimetry framework: Development and application of the stochastic methods of analysis

    International Nuclear Information System (INIS)

    One of the main objectives of reactor dosimetry is the determination of the physical parameters characterizing the neutron field in which the studied sample is irradiated. Knowledge of the associated uncertainties represents a significant stake for the nuclear industry, as shown by the high uncertainty value of 15% (k=1) commonly allowed for the calculated neutron flux (E > 1 MeV) on the vessel and internal structures. The study presented in this paper aims at determining, and then reducing, the uncertainties associated with the reactor dosimetry interpretation process. After a brief presentation of the interpretation process, input data uncertainties are identified and quantified, in particular with regard to covariances. Uncertainty propagation is then carried out and analyzed by deterministic and stochastic methods on a representative case. Finally, a Monte Carlo sensitivity study based on Sobol indices is performed on a case study to derive the most penalizing input uncertainties. The paper concludes by raising improvement axes to be studied concerning knowledge of the input data. It highlights, for example, the need for realistic variance-covariance matrices associated with the input data (cross section libraries, neutron computation code outputs, ...). Lastly, the methodology presented in this paper is general enough to be easily transposable to other measurement data interpretation processes. (authors)
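First-order Sobol indices of the kind used in the sensitivity study above can be estimated by a Saltelli-style pick-and-freeze Monte Carlo scheme. A minimal sketch on a toy linear model (for Y = 2·X1 + X2 with independent unit-variance inputs, the analytic indices are 0.8 and 0.2); this is a generic estimator, not the authors' code:

```python
import numpy as np

def first_order_sobol(model, dim, n=100_000, seed=0):
    """Saltelli-style pick-and-freeze estimate of first-order Sobol indices."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, dim))     # base sample
    B = rng.standard_normal((n, dim))     # independent resample
    fA, fB = model(A), model(B)
    var = np.concatenate([fA, fB]).var()  # total output variance
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]               # freeze column i from B
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

S = first_order_sobol(lambda X: 2.0 * X[:, 0] + X[:, 1], dim=2)
print(S.round(2))  # ~[0.8, 0.2]
```

Ranking inputs by these indices is what identifies the "most penalizing" input uncertainties the abstract refers to.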

  4. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis.

    Science.gov (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet

    2016-02-01

    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over- or under-controlled eating behaviours in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls, the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time, and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary, as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies. PMID:26695383

  5. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis.

    Science.gov (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet

    2016-02-01

    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over- or under-controlled eating behaviours in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls, the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time, and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary, as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies.

  6. Optimisation of isolation methods for the azaspiracid group of marine biotoxins and the development of accurate and precise methods of analysis

    OpenAIRE

    Kilcoyle, J.

    2015-01-01

    The two main groups of biotoxins which affect the Irish shellfish industry are azaspiracids (AZAs) and the okadaic acid (OA) group (OA, DTX2, DTX1 and their esters) toxins. Since AZAs were first identified in 1998, well over 30 analogues have been reported. Structural and toxicological data have been described for AZA1–5 (isolated from shellfish). LC-MS/MS is the EU reference method for detection of the AZAs (AZA1, -2 and -3) and the OA group toxins in raw shellfish with the regulatory limit ...

  7. Optimisation of Isolation Methods for the AZA Group of Marine Biotoxins and the Development of Accurate and Precise Methods of Analysis

    OpenAIRE

    Kilcoyne, Jane

    2015-01-01

    The two main groups of biotoxins which affect the Irish shellfish industry are azaspiracids (AZAs) and the okadaic acid (OA) group (OA, DTX2, DTX1 and their esters) toxins. Since AZAs were first identified in 1998, well over 30 analogues have been reported. Structural and toxicological data have been described for AZA1–5 (isolated from shellfish). LC-MS/MS is the EU reference method for detection of the AZAs (AZA1, -2 and -3) and the OA group toxins in raw shellfish with the regulatory limit ...

  8. The behavioral satiety sequence in pigeons (Columba livia). Description and development of a method for quantitative analysis.

    Science.gov (United States)

    Spudeit, William Anderson; Sulzbach, Natalia Saretta; Bittencourt, Myla de A; Duarte, Anita Maurício Camillo; Liang, Hua; Lino-de-Oliveira, Cilene; Marino-Neto, José

    2013-10-01

    The postprandial event known as the specific dynamic action is an evolutionarily conserved physiological set of metabolic responses to feeding. Its behavioral counterpart, a sequence of drinking, maintenance (e.g., grooming) and sleep-like behaviors known as the behavioral satiety sequence (BSS), has been thoroughly described in rodents and has enabled the refined evaluation of potential appetite modifiers. However, the presence and attributes of a BSS have not been systematically studied in non-mammalian species. Here, we describe the BSS induced in pigeons (Columba livia) by 1) the presentation of a palatable seed mixture (SM) food to free-feeding animals (SM+FF condition) and 2) re-feeding after a 24-h fasting period (FD24h+SM), which was examined by continuous behavioral recording for 2h. We then compare these patterns to those observed in free-feeding (FF) animals. A set of graphic representations and indexes, drawn from these behaviors (latency, time-to-peak, inter-peak intervals and the first intersection between feeding curves and those of other BSS-typical behaviors) were used to describe the temporal structure and sequential relationships between the pigeon's BSS components. Cramér-von Mises-based statistical procedures and bootstrapping-based methods to compare pairs of complex behavioral curves were described and used for comparisons among the behavioral profiles during the free-feeding recordings and after fasting- and SM-induced BSS. FD24h+SM- and SM+FF-induced feeding were consistently followed by a similar sequence of increased bouts of drinking, followed by preening and then sleep, which were significantly different from that of FF birds. The sequential and temporal patterns of the pigeon's BSS were not affected by differences in food intake or by dissimilarity in motivational content of feeding stimuli. The present data indicated that a BSS pattern can be reliably evoked in the pigeon, in a chronological succession and sequence that strongly

  9. The development of sequential separation methods for the analysis of actinides in sediments and biological materials using anion-exchange resins and extraction chromatography

    International Nuclear Information System (INIS)

    New, quantitative methods for the determination of actinides have been developed for application to marine environmental samples (e.g., sediment and fish). The procedures include aggressive dissolution, separation by anion-exchange resin, and separation and purification by extraction chromatography (e.g., TRU, TEVA and UTEVA resins), with measurement of the radionuclides by semiconductor alpha-spectrometry (SAS). Anion-exchange has proved to be a powerful tool for treating large-volume samples, and extraction chromatography shows excellent selectivity and a reduction in the amounts of acids required. The results of the analysis of uranium, thorium, plutonium and americium isotopes by this method in marine samples (IAEA-384, -385 and -414) showed excellent agreement with the recommended values, with good chemical recoveries. (author)

  10. Development of RP-HPLC method for Qualitative Analysis of Active Ingredient (Gallic acid) from Stem Bark of Dendrophthoe falcate Linn.

    Directory of Open Access Journals (Sweden)

    Hafsa Deshmukh

    2011-04-01

    Full Text Available A simple, precise and sensitive RP-HPLC method with UV detection at 271 nm was developed and validated for the qualitative determination of the active ingredient Gallic acid from the stem bark of Dendrophthoe falcate Linn. Separation was performed on a ThermoMOS 2 HYPERSIL C18 column (250 mm × 4.6 mm, 5 µm ODS 3) using a mobile phase comprising 0.1% orthophosphoric acid : acetonitrile (400 cm3 : 600 cm3) at a flow rate of 1 ml/minute, with a short run time of 13 minutes. The method was validated according to the regulatory guidelines with respect to linearity, system suitability, precision, solution stability, accuracy, robustness, assay and recovery. Detector response was linear for HPLC in the range of 0.04 to 0.16 mg/cm3. System suitability, precision, solution stability, accuracy, robustness, assay and recovery were assessed by calculating the % COV for each of these parameters, which was less than two as expected. The recovery of the method for Gallic acid was found to be 98.94%, which shows that the method is accurate. The described method has the advantage of being rapid and easy; hence it can be applied for routine quality control analysis of Gallic acid from Dendrophthoe falcate Linn.
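The two validation figures quoted above (a % COV below 2 for each parameter, and linearity over 0.04-0.16 mg/cm3) boil down to standard arithmetic, sketched here with invented replicate areas and calibration points rather than data from the study:

```python
import statistics

def percent_cov(values):
    """Coefficient of variation in percent: 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def r_squared(x, y):
    """Coefficient of determination of a straight-line fit (Pearson r squared)."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    syy = sum(b * b for b in y)
    r = (n * sxy - sx * sy) / ((n * sxx - sx * sx) ** 0.5 * (n * syy - sy * sy) ** 0.5)
    return r * r

# Invented example: six replicate peak areas and a five-point calibration
areas = [1023.1, 1019.8, 1025.4, 1021.2, 1020.6, 1024.0]
conc = [0.04, 0.07, 0.10, 0.13, 0.16]             # mg/cm3
response = [41.2, 71.9, 102.5, 133.4, 163.8]      # peak area (arbitrary units)
```

With these numbers the replicate % COV is well under 2 and the calibration r squared is close to 1, the kind of acceptance checks the abstract describes.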

  11. Development of the Method of Bacterial Leaching of Metals out of Low-Grade Ores, Rocks, and Industrial Wastes Using Neutron Activation Analysis

    CERN Document Server

    Tsertsvadze, L A; Petriashvili, Sh G; Chutkerashvili, D G; Kirkesali, E I; Frontasyeva, M V; Pavlov, S S; Gundorina, S F

    2001-01-01

    The results of preliminary investigations aimed at the development of an economical and easy-to-apply technique for the bacterial leaching of rare and valuable metals out of low-grade ores, complex-composition ores, rocks, and industrial wastes in Georgia are discussed. The main groups of the microbiological community of the peat suspension used in the experiments on bacterial leaching are investigated, and the activity of particular microorganisms in the leaching of probes with different mineral compositions is assessed. The element composition of the primary and processed samples was investigated by the epithermal neutron activation analysis method, and the enrichment/subtraction level is estimated for various elements. The efficiency of the developed technique to purify wastes, extract some scarce metals, and enrich ores or rocks in some elements, e.g. Au, U, Th, Cs, Sr, Rb, Sc, Zr, Hf, Ta, Gd, Er, Lu, Ce, etc., is demonstrated.

  12. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-01

    A platform analytical quality by design (AQbD) approach for methods development is presented in this paper. The approach is not limited to method development following the same logical AQbD process; it is also exploited across a range of applications in methods development with commonality in equipment and procedures. As demonstrated by the development process of three methods, the systematic approach offers a thorough understanding of the method's scientific strength. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method of the same protein. In addition, the platform AQbD strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes. In the commercial phase, high-quality data ensure timely data release, reduced regulatory risk, and lowered lab operational cost. Moreover, the large, reliable database and knowledge gained during AQbD method development provide strong justifications during regulatory filing for the selection of important parameters or parameter change needs for method validation, and help justify the removal of unnecessary tests used for product specifications.

  14. Development of the HS-SPME-GC-MS/MS method for analysis of chemical warfare agent and their degradation products in environmental samples.

    Science.gov (United States)

    Nawała, Jakub; Czupryński, Krzysztof; Popiel, Stanisław; Dziedzic, Daniel; Bełdowski, Jacek

    2016-08-24

    After World War II approximately 50,000 tons of chemical weapons were dumped in the Baltic Sea by the Soviet Union under the provisions of the Potsdam Conference on Disarmament. These dumped chemical warfare agents still pose a major threat to the marine environment and to human life; therefore, continued monitoring of these munitions is essential. In this work, we present the application of new solid phase microextraction fibers to the analysis of chemical warfare agents and their degradation products. It can be concluded that the best fiber for the analysis of sulfur mustard and its degradation products is butyl acrylate (BA), whereas for the analysis of organoarsenic compounds and chloroacetophenone the best fiber is a co-polymer of methyl acrylate and methyl methacrylate (MA/MMA). In order to achieve the lowest LOD and LOQ, the samples should be divided into two subsamples: one analyzed using a BA fiber, and the second using a MA/MMA fiber. When fast analysis is required, the microextraction should be performed with a butyl acrylate fiber, because the extraction efficiency of organoarsenic compounds for this fiber is acceptable. Next, we elaborated the HS-SPME-GC-MS/MS method for the analysis of CWA degradation products in environmental samples using the laboratory-obtained fibers. The analytical method for the analysis of organosulfur and organoarsenic compounds was optimized and validated. The LODs for all target chemicals were between 0.03 and 0.65 ppb. The analytical method developed by us was then used for the analysis of sediment and pore water samples from the Baltic Sea. During these studies, 80 samples were analyzed. It was found that 25 sediment and 5 pore water samples contained CWA degradation products such as 1,4-dithiane, 1,4-oxathiane or triphenylarsine, the latter being a component of arsine oil. The obtained data are evidence that the CWAs present in the Baltic Sea have leaked into the general marine environment.
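Detection limits like the 0.03-0.65 ppb values quoted above are commonly estimated from calibration data with the ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the blank or residual standard deviation and S the calibration slope. The numbers below are invented for illustration and do not come from the study:

```python
def lod_loq(sigma, slope):
    """ICH-style detection and quantification limits from a calibration:
    LOD = 3.3 * sigma / slope, LOQ = 10 * sigma / slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Invented example: blank SD of 0.01 signal units, slope of 1.1 units per ppb
lod, loq = lod_loq(0.01, 1.1)
```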

  16. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum.

    Science.gov (United States)

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie

    2016-03-11

    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, such as the aerospace, oil and gas, nuclear and electronic industries, but their separation is extremely difficult due to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media with alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. The method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the nature of the BGE counter-ion modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0°C in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized.

  18. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  19. Development of safety evaluation methods and analysis codes applied to the safety regulations for the design and construction stage of fast breeder reactor (Annual safety research report, JFY 2011)

    International Nuclear Information System (INIS)

    The purpose of this study is to develop the safety evaluation methods and analysis codes needed in the design and construction stage of the fast breeder reactor (FBR). In JFY 2011, the following results were obtained. As for the safety evaluation methods needed in the safety examination for the reactor establishment permission, the development of analysis codes such as the core seismic analysis code, the core safety analysis code and the core damage analysis code was carried out according to plan. As for the development of the safety evaluation method needed for risk-informed safety regulation, the quantification technique for event trees using the continuous Markov chain Monte Carlo method (CMMC method) was studied, and seismic PSA to evaluate residual risk was studied. (author)
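As a generic illustration of Monte-Carlo event-tree quantification (a plain Monte-Carlo sketch, not the CMMC method itself; the branch failure probabilities are invented):

```python
import random

def quantify_event_tree(failure_probs, n_trials=100_000, seed=7):
    """Monte-Carlo quantification of a minimal event tree: after an
    initiating event, each safety system fails independently with the
    given probability; the end state 'damage' requires all to fail."""
    rng = random.Random(seed)
    damage = sum(
        all(rng.random() < p for p in failure_probs)
        for _ in range(n_trials)
    )
    return damage / n_trials

# Two redundant systems with invented failure probabilities 0.1 and 0.2;
# the exact damage frequency is 0.1 * 0.2 = 0.02
p_damage = quantify_event_tree([0.1, 0.2])
```

The sampled frequency converges to the analytic branch product as the trial count grows, which is the basic check one applies before trusting such an estimator on trees too large to enumerate.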

  20. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: fukusaki@bio.eng.osaka-u.ac.jp [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-08-26

    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R² values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of the biological matrix compared to purification using only the Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.

  1. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede

    2000-01-01

    Full Text Available This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously, by virtue of cloth stretch and flexing. The sail shape determines the pressure distribution and, at the same time, the pressure distribution on the sail stretches and flexes the sail material, determining its shape. This characteristic non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. Firstly, pressure loads for a known sail shape are computed (aerodynamic analysis). Secondly, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained by using an iterative analysis process, which involves both the aerodynamic and the structural analysis. When the solution converges, it is possible to make design modifications.
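The iterative aerodynamic/structural coupling described above reduces to a fixed-point loop. The sketch below uses invented one-degree-of-freedom linear surrogates in place of the paper's pressure and finite element models, purely to show the alternating-analysis structure:

```python
def aeroelastic_equilibrium(stiffness=50.0, q=10.0, tol=1e-10, max_iter=200):
    """Alternate a toy aerodynamic analysis (load grows with deflection)
    with a toy structural analysis (deflection = load / stiffness)
    until the shape stops changing."""
    deflection = 0.0
    for iteration in range(1, max_iter + 1):
        load = q * (1.0 + 0.5 * deflection)    # aerodynamic analysis (toy)
        new_deflection = load / stiffness      # structural analysis (toy)
        if abs(new_deflection - deflection) < tol:
            return new_deflection, iteration
        deflection = new_deflection
    raise RuntimeError("aeroelastic iteration did not converge")
```

The loop converges here because the combined update is a contraction (factor 0.5·q/stiffness = 0.1 with the defaults); the analytic fixed point is q/(stiffness − 0.5·q).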

  2. Development and validation of a LC-MS/MS method for quantitative analysis of uraemic toxins p-cresol sulphate and indoxyl sulphate in saliva.

    Science.gov (United States)

    Giebułtowicz, Joanna; Korytowska, Natalia; Sankowski, Bartłomiej; Wroczyński, Piotr

    2016-04-01

    p-Cresol sulphate (pCS) and indoxyl sulphate (IS) are uraemic toxins whose serum concentrations correlate with the stage of renal failure. The aim of this study was to develop and validate a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the analysis of pCS and IS in saliva. This is the first time, to our knowledge, that such a method has been developed using saliva. Unstimulated, fasting saliva was collected from healthy volunteers in the morning and pooled for the validation assay. The method was validated for linearity, precision, accuracy, stability (freeze/thaw stability, stability in the autosampler, short- and long-term stability, stock solution stability), dilution integrity and matrix effect. The analysed validation criteria were fulfilled. No influence of salivary flow (pCS: p=0.678; IS: p=0.238) nor of the type of swab in the Salivette device was detected. Finally, using the novel validated method, saliva samples from healthy people (n=70) of various ages were analysed. We observed a tendency for the concentration of toxins in saliva to increase in the elderly, which could be a result of age-related diseases, e.g., diabetes, and of declining kidney function. We conclude that the novel LC-MS/MS method can be used for the determination of pCS and IS in human saliva. The results encourage the validation of saliva as a clinical sample for monitoring toxin levels.

  3. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We discuss the evolution of the object-oriented software development process based on software patterns. For developing mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. On the basis of an analysis of the requirement specification for the business application domain, we present an analysis method and a basic role model for software frameworks. We also elicit an analysis pattern for the framework architecture, and design basic role classes and their structure.

  4. Scientific methods for developing ultrastable structures

    International Nuclear Information System (INIS)

    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 μrad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs

  5. Development of a non-chromatographic method for the speciation analysis of inorganic antimony in mushroom samples by hydride generation atomic fluorescence spectrometry

    Science.gov (United States)

    Sousa Ferreira, Hadla; Costa Ferreira, Sergio Luis; Cervera, M. Luisa; de la Guardia, Miguel

    2009-06-01

    A simple and sensitive method has been developed for the direct determination of toxic species of antimony in mushroom samples by hydride generation atomic fluorescence spectrometry (HG AFS). The determination of Sb(III) and Sb(V) was based on the efficiency of hydride generation employing NaBH4, with and without a previous KI reduction, using proportional equations corresponding to the two different measurement conditions. The extraction efficiency of total antimony and the stability of Sb(III) and Sb(V) in different extraction media (nitric, sulfuric, hydrochloric and acetic acid, methanol and ethanol) were evaluated. Results demonstrated that, based on the extraction yield and the stability of the extracts, 0.5 mol L⁻¹ H2SO4 proved to be the best extracting solution for the speciation analysis of antimony in mushroom samples. The limits of detection of the developed methodology were 0.6 and 1.1 ng g⁻¹ for Sb(III) and Sb(V), respectively. The relative standard deviation was 3.8% (14.7 ng g⁻¹) for Sb(V) and 5.1% (4.6 ng g⁻¹) for Sb(III). The recovery values obtained for Sb(III) and Sb(V) varied from 94 to 106% and from 98 to 105%, respectively. The method has been applied to determine Sb(III), Sb(V) and total Sb in five different mushroom samples; the Sb(III) content varied from 4.6 to 11.4 ng g⁻¹ and Sb(V) from 14.7 to 21.2 ng g⁻¹. The accuracy of the method was confirmed by the analysis of a certified reference material of tomato leaves.
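The "proportional equations" step admits a compact sketch: the two measured signals (with and without KI pre-reduction) respond to Sb(III) and Sb(V) with different sensitivities, giving a 2×2 linear system in the two species. The sensitivity coefficients below are invented for illustration; in practice they come from calibration under each measurement condition:

```python
def solve_sb_speciation(sig_no_ki, sig_ki, k3_a, k5_a, k3_b, k5_b):
    """Solve for Sb(III) and Sb(V) from:
        sig_no_ki = k3_a * sb3 + k5_a * sb5   (no KI pre-reduction)
        sig_ki    = k3_b * sb3 + k5_b * sb5   (after KI pre-reduction)
    using Cramer's rule."""
    det = k3_a * k5_b - k5_a * k3_b
    sb3 = (sig_no_ki * k5_b - k5_a * sig_ki) / det
    sb5 = (k3_a * sig_ki - sig_no_ki * k3_b) / det
    return sb3, sb5

# Invented sensitivities: without KI, Sb(V) responds weakly (k5_a = 0.05);
# after KI reduction both species respond equally (k3_b = k5_b = 1.0)
sb3, sb5 = solve_sb_speciation(5.75, 20.0, 1.0, 0.05, 1.0, 1.0)
```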

  6. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.

    2013-07-15

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth', to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to those for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks growing until the formation of a stable leak. The calculation of the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, a restructuring of PROST was performed, including possibilities to combine damage mechanisms during a calculation. Furthermore, several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that with the provided flexible probabilistic analysis method quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as
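A toy Monte-Carlo estimate of a leak probability, of the kind such probabilistic codes compute, can be sketched as follows. The lognormal distributions and the linear growth law are invented stand-ins for PROST's actual fatigue, corrosion and stable-crack-growth models:

```python
import random

def leak_probability(wall_mm=10.0, years=40.0, n_trials=20_000, seed=3):
    """Monte-Carlo leak probability: sample an initial crack depth and a
    growth rate, grow the crack linearly over the service life, and count
    wall penetrations."""
    rng = random.Random(seed)
    leaks = 0
    for _ in range(n_trials):
        depth0 = rng.lognormvariate(0.0, 0.5)    # initial depth, mm (toy)
        rate = rng.lognormvariate(-2.0, 0.5)     # growth rate, mm/year (toy)
        if depth0 + rate * years >= wall_mm:
            leaks += 1
    return leaks / n_trials
```

The sketch reproduces the expected trend that thicker walls give lower leak probabilities; a real analysis replaces the growth law with crack-mechanics models and adds leak-detection boundaries.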

  7. Development of a mixed-mode solid phase extraction method and further gas chromatography mass spectrometry for the analysis of 3-alkyl-2-methoxypyrazines in wine.

    Science.gov (United States)

    López, Ricardo; Gracia-Moreno, Elisa; Cacho, Juan; Ferreira, Vicente

    2011-02-11

    A new method for analysing 3-isopropyl-2-methoxypyrazine, 3-sec-butyl-2-methoxypyrazine and 3-isobutyl-2-methoxypyrazine in wine has been developed and applied. The analytes are extracted from 25 mL of wine in a solid-phase extraction cartridge filled with 60 mg of cation-exchange mixed-mode sorbent. Analytes are recovered with triethylamine in dichloromethane and the organic extract is analysed by GC-SIM-MS using 3-isopropyl-2-ethoxypyrazine as internal standard. The detection limits of the method are in all cases under 1 ng/L, below the olfactory thresholds of the compounds in wine. The repeatability of the method is around 15% for levels in wine of 2 ng/L. Linearity is satisfactory and recoveries are in all cases close to 100%, with RSDs between 13% and 20%. The method has been applied to the analysis of 12 Chilean white and 8 Spanish red wines. The levels found suggest that 3-alkyl-2-methoxypyrazines can exert a significant sensory contribution to the aroma of Chilean Sauvignon Blanc wines, while most likely they play a nearly negligible role in traditional Ribera and Rioja Spanish red wines.
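Quantitation against an internal standard such as 3-isopropyl-2-ethoxypyrazine reduces to calibrating the analyte/IS area ratio against concentration. This generic sketch shows the arithmetic; the calibration levels and ratios are invented, not data from the study:

```python
def ratio_slope(concs, ratios):
    """Least-squares slope through the origin of (analyte/IS) area ratio
    versus concentration."""
    return sum(c * r for c, r in zip(concs, ratios)) / sum(c * c for c in concs)

def quantify(area_analyte, area_is, slope):
    """Concentration from the analyte/internal-standard area ratio."""
    return (area_analyte / area_is) / slope

# Invented calibration: ng/L levels and corresponding area ratios
concs = [2.0, 5.0, 10.0, 20.0, 50.0]
ratios = [0.021, 0.049, 0.102, 0.199, 0.501]
slope = ratio_slope(concs, ratios)
```

Ratioing against the internal standard cancels run-to-run variation in extraction recovery and injection volume, which is why such methods achieve good precision at ng/L levels.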

  8. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    Science.gov (United States)

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal polarity microemulsion electrokinetic chromatography (MEEKC) as operative mode, which allowed a good selectivity to be achieved in a low analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH, on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in a MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. DS was identified by risk of failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets.
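The Monte-Carlo flavor of such a design-space search, where set points are retained only if the probability of meeting a quality target exceeds π=90%, can be sketched as follows. The resolution model, fluctuation widths and acceptance threshold are invented toy values; the real response models come from the authors' experimental designs:

```python
import random

def acceptance_probability(temp_c, conc_mm, n_sim=5000, seed=42):
    """Monte-Carlo estimate of P(resolution >= 1.5) when temperature and
    buffer concentration fluctuate around their set points (toy model)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_sim):
        t = rng.gauss(temp_c, 0.5)       # deg C fluctuation
        c = rng.gauss(conc_mm, 1.0)      # mM fluctuation
        resolution = 3.0 - 0.2 * abs(t - 25.0) - 0.08 * abs(c - 30.0)
        ok += resolution >= 1.5
    return ok / n_sim

# Grid search: keep set points whose acceptance probability exceeds 90%
design_space = [(t, c) for t in range(20, 31) for c in range(20, 41, 5)
                if acceptance_probability(t, c) > 0.90]
```

Mapping the acceptance probability over the grid is exactly what a risk-of-failure map visualizes: the design space is the region where the method is predicted to meet its targets despite routine parameter fluctuations.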

  9. Novel methods for spectral analysis

    Science.gov (United States)

    Roy, R.; Sumpter, B. G.; Pfeffer, G. A.; Gray, S. K.; Noid, D. W.

    1991-06-01

    In this review article, various techniques for obtaining estimates of parameters related to the spectrum of an underlying process are discussed. These techniques include the conventional nonparametric FFT approach and more recently developed parametric techniques such as maximum entropy, MUSIC, and ESPRIT, the latter two being classified as signal-subspace or eigenvector techniques. These estimators span the spectrum of possible estimators in that extremes of a priori knowledge are assumed (nonparametric versus parametric) and extremes in the underlying model of the observed process (deterministic versus stochastic) are involved. The advantage of parametric techniques is their ability to provide very accurate estimates using data from extremely short time intervals. Several applications of these novel methods for frequency analysis of very short time data are presented. These include calculation of dispersion curves, and the density of vibrational states g(ω) for many-body systems, semiclassical transition frequencies, overtone linewidths, and resonance energies of the time-dependent Schrödinger equation for few-body problems.
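    As a concrete contrast to the parametric estimators, the conventional nonparametric route can be sketched with a plain DFT on a short record. The tone frequency and record length below are invented; the point is that the bin spacing fs/n (here about 15.6 Hz) limits the achievable accuracy on short data, which is exactly what motivates methods such as maximum entropy, MUSIC, and ESPRIT.

```python
import cmath
import math

def dft_power(signal):
    """Plain DFT power spectrum (the nonparametric FFT-style route)."""
    n = len(signal)
    spectrum = []
    for k in range(n // 2):
        s = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(signal))
        spectrum.append(abs(s) ** 2)
    return spectrum

# Short synthetic record: a single 50 Hz tone sampled at 1 kHz, 64 points.
fs = 1000.0
n = 64
signal = [math.cos(2 * math.pi * 50.0 * t / fs) for t in range(n)]

power = dft_power(signal)
peak_bin = max(range(len(power)), key=power.__getitem__)
estimate = peak_bin * fs / n   # quantized to the bin spacing fs/n
print(estimate)
```

    The estimate is off by a few hertz because 50 Hz falls between bins; a subspace estimator fitted to the same 64 samples could localize the tone far more precisely.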

  10. Development and validation of a generic nontarget method based on liquid chromatography - high resolution mass spectrometry analysis for the evaluation of different wastewater treatment options.

    Science.gov (United States)

    Nürenberg, Gudrun; Schulz, Manoj; Kunkel, Uwe; Ternes, Thomas A

    2015-12-24

    A comprehensive workflow for using nontarget approaches as process evaluation tools was implemented, including data acquisition based on an LC-HRMS (QTOF) system using direct injection and data post-processing for peak recognition in "full scan" data. Both parts of the approach were developed and validated not only in a conventional way, using suspect analysis of a set of spiked known micropollutants, but also via nontarget analysis of a wastewater treatment plant (WWTP) effluent itself, in order to cover a more environmentally relevant range of analytes. Special focus was placed on the minimization of false positive results (FPs) during peak recognition. The optimized data post-processing procedure reduced the percentage of FPs from 42% to 10-15%. Furthermore, the choice of a suitable chromatography for biologically treated wastewater was also discussed during the method development. The workflow also accounted for differences in the performance levels of the LC-HRMS system by implementing an adjustment for intensity variations between different measurement dates or different instruments. The application of this workflow to wastewater samples from a municipal WWTP revealed that more than 91% of compounds were eliminated by the biological treatment step and that the resulting effluent contained 55% newly formed potential transformation products. PMID:26654253

  11. Development of an LC-MS/MS method for analysis of interconvertible Z/E isomers of the novel anticancer agent, Bp4eT.

    Science.gov (United States)

    Stariat, Ján; Kovaríková, Petra; Klimes, Jirí; Kalinowski, Danuta S; Richardson, Des R

    2010-05-01

    This study focused on the development of a liquid chromatography/tandem mass spectrometry (LC/MS/MS) method for quantification of a novel potential anticancer agent, 2-benzoylpyridine 4-ethyl-3-thiosemicarbazone (Bp4eT), in aqueous media. Solid Bp4eT was found to consist predominantly of the Z isomer, while in aqueous media both isomers coexist. Sufficient separation of both isomers was achieved on a Synergi 4u Polar RP column with a mobile phase composed of 2 mM ammonium formate, acetonitrile, and methanol (30:63:7; v/v/v). Photodiode array analysis of the two isomers demonstrated different absorption spectra, which hindered UV-based quantification. However, an equal and reproducible response was found for both isomers using an MS detector, which enables the determination of the total content of Bp4eT (i.e., both E- and Z-isomeric forms) by summation of the peak areas of both isomers. 2-Hydroxy-1-naphthylaldehyde 4-methyl-3-thiosemicarbazone (N4mT) was selected as the internal standard. Quantification was performed in selected reaction monitoring mode using the main fragments of [M+H](+) (m/z 240 for Bp4eT and m/z 229 for N4mT). The method was validated over 20-600 ng/ml. This procedure was applied to a preformulation study to determine the proper vehicle for parenteral administration. It was found that Bp4eT was poorly soluble in aqueous media. However, the solubility can be effectively improved using pharmaceutical cosolvents. In fact, a 1:1 mixture of PEG 300/0.14 M saline markedly increased solubility and may be a useful drug formulation for intravenous administration. This investigation further accelerates development of novel anticancer thiosemicarbazones. The described methods will be useful for analogs currently under development that suffer from the same analytical issue. PMID:20127082
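    The summation-based quantification is easy to make concrete. The peak areas, calibration slope, and intercept below are all invented; only the idea of an equal isomer response feeding one internal-standard-normalized calibration line comes from the abstract.

```python
# Hypothetical peak areas from one injection (arbitrary units).
area_z, area_e = 41200.0, 18300.0   # Bp4eT Z- and E-isomer peaks
area_is = 52000.0                   # internal standard (N4mT) peak

# Equal MS response for both isomers lets total Bp4eT content be read
# from the summed area ratio against an IS-normalized calibration line.
ratio = (area_z + area_e) / area_is

slope, intercept = 0.0031, 0.005    # ratio per (ng/mL); invented values
conc_ng_ml = (ratio - intercept) / slope
print(round(conc_ng_ml, 1))
```

    With these invented numbers the result lands inside the validated 20-600 ng/ml range, which is the kind of check the real method would apply before reporting a value.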

  12. Development and validation of a method for the analysis of hydroxyzine hydrochloride in extracellular solution used in in vitro preclinical safety studies.

    Science.gov (United States)

    Briône, Willy; Brekelmans, Mari; Eijndhoven, Freek van; Schenkel, Eric; Noij, Theo

    2015-11-10

    In the process of drug development, preclinical safety studies are to be performed that require the analysis of the compound at very low concentrations, with high demands on the performance of the analytical methods. In the current study, a UPLC-MS/MS method was developed and validated to quantify hydroxyzine hydrochloride in an extracellular solution used in a hERG assay at concentrations ranging from 0.01 to 10 μM (4.5 ng/ml-4.5 μg/ml). Chromatographic separation was achieved isocratically on an Acquity BEH C18 analytical column. The assay was validated at concentrations of 0.11-1.1 ng/ml in end solution for hydroxyzine hydrochloride. Linearity was demonstrated over the concentration range of 0.06-0.17 ng/ml and over the range of 0.6-1.7 ng/ml in end solution, with a coefficient of correlation r>0.99. Accuracy of the achieved concentration, intra-run, and inter-run precision of the method were well within the acceptance criteria (mean recovery of 80-120% and relative standard deviation ≤10.0%). The limit of quantification in extracellular solution was 0.09 ng/ml. Hydroxyzine hydrochloride in extracellular solution proved to be stable when stored in the fridge at 4-8°C for at least 37 days, at room temperature for at least 16 days, and at +35°C for at least 16 days. The analytical method was successfully applied in the hERG assay. PMID:26163869
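    A linearity claim of the kind quoted (coefficient of correlation r>0.99 over a narrow ng/ml range) reduces to an ordinary least-squares fit of the calibration points. A minimal sketch with invented calibration data:

```python
import math

# Invented calibration points (ng/ml vs. detector response) spanning a
# low working range like the one in the abstract.
conc = [0.06, 0.08, 0.10, 0.13, 0.17]
resp = [610.0, 818.0, 1005.0, 1320.0, 1714.0]

n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
syy = sum((y - my) ** 2 for y in resp)

slope = sxy / sxx                     # response per ng/ml
intercept = my - slope * mx
r = sxy / math.sqrt(sxx * syy)        # Pearson correlation coefficient

print(round(slope, 1), round(intercept, 1), round(r, 4))
```

    The acceptance check is then simply whether r exceeds 0.99 over the validated range.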

  13. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    Full Text Available A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7:13:80, v/v/v) was used as a mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found as 0.1–18.5 μg/mL, 0.4–25.6 μg/mL, 0.3–15.5 μg/mL, and 0.3–22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and plasma of the patients using drugs including those substances.

  14. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma.

    Science.gov (United States)

    Kepekci Tekkeli, Serife Evrim

    2013-01-01

    A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7:13:80, v/v/v) was used as a mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found as 0.1-18.5 μg/mL, 0.4-25.6 μg/mL, 0.3-15.5 μg/mL, and 0.3-22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and plasma of the patients using drugs including those substances. PMID:23634320

  15. Validation of a Non-Targeted LC-MS Approach for Identifying Ancient Proteins: Method Development on Bone to Improve Artifact Residue Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Barker

    2015-09-01

    Full Text Available Identification of protein residues from prehistoric cooking pottery using mass spectrometry is challenging because proteins are removed from original tissues, are degraded from cooking, may be poorly preserved due to diagenesis, and occur in a palimpsest of exogenous soil proteins. In contrast, bone proteins are abundant and well preserved. This research is part of a larger method-development project for innovation and improvement of liquid chromatography – mass spectrometry analysis of protein residues from cooking pottery; here we validate the potential of our extraction and characterization approach via application to ancient bone proteins. Because of its preservation potential for proteins and given that our approach is destructive, ancient bone identified via skeletal morphology represents an appropriate verification target. Proteins were identified from zooarchaeological turkey (Meleagris gallopavo Linnaeus Phasianidae, rabbit (Lagomorpha, and squirrel (Sciuridae remains excavated from ancient pueblo archaeological sites in southwestern Colorado using a non-targeted LC-MS/MS approach. The data have been deposited to the ProteomeXchange Consortium with the dataset identifier PXD002440. Improvement of highly sensitive targeted LC-MS/MS approaches is an avenue for future method development related to the study of protein residues from artifacts such as stone tools and pottery.

  16. Research on Development Strategy of YO Company Based on SWOT Analysis Method

    Institute of Scientific and Technical Information of China (English)

    李连璋

    2016-01-01

    Current business operators attach great importance to strategic management, but their decision-making models are mostly command- and vision-driven, leaving employees little room for active participation. Primitive competitive means such as low prices, low costs, and broad product variety can no longer meet the diversified needs of modern society. Based on the SWOT analysis method and the actual situation of the YO construction company, this paper studies the company's development strategy from multiple dimensions and proposes optimization measures covering product pricing strategy, marketing channels, corporate culture, human resource management, and finance. A case study verifies the feasibility of the proposed development strategy.

  17. Development of a multianalyte method based on micro-matrix-solid-phase dispersion for the analysis of fragrance allergens and preservatives in personal care products.

    Science.gov (United States)

    Celeiro, Maria; Guerra, Eugenia; Lamas, J Pablo; Lores, Marta; Garcia-Jares, Carmen; Llompart, Maria

    2014-05-30

    An effective, simple and low cost sample preparation method based on matrix solid-phase dispersion (MSPD) followed by gas chromatography-mass spectrometry (GC-MS) or gas chromatography-triple quadrupole-mass spectrometry (GC-MS/MS) has been developed for the rapid simultaneous determination of 38 cosmetic ingredients: 25 fragrance allergens and 13 preservatives. All target substances are frequently used in cosmetics and personal care products, and they are subject to use restrictions or labeling requirements according to the EU Cosmetic Directive. The extraction procedure was optimized on real non-spiked rinse-off and leave-on cosmetic products by means of experimental designs. The final miniaturized process required the use of only 0.1 g of sample and 1 mL of organic solvent, yielding a final extract ready for analysis. The micro-MSPD method was validated, showing satisfactory performance by GC-MS and GC-MS/MS analysis. The use of GC coupled to triple quadrupole mass detection allowed very low detection limits (low ng g(-1)) to be reached while at the same time improving method selectivity. In an attempt to improve the chromatographic analysis of preservatives, the inclusion of a derivatization step was also assessed. The proposed method was applied to a broad range of cosmetics and personal care products (shampoos, body milk, moisturizing milk, toothpaste, hand creams, gloss lipstick, sunblock, deodorants and liquid soaps, among others), demonstrating the widespread use of these substances. Concentration levels ranged from sub-parts-per-million to parts-per-thousand. The number of target fragrance allergens per sample was quite high (up to 16). Several fragrances (linalool, farnesol, hexylcinnamal, and benzyl benzoate) have been detected at levels >0.1% (1,000 μg g(-1)). As regards preservatives, phenoxyethanol was the most frequently found additive, reaching quite high concentrations (>1,500 μg g(-1)) in five cosmetic products. BHT was detected in eight

  18. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique in wide use, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise the risk of applying it incorrectly would cause loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities of a subject. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  19. Development of sample preparation method for auxin analysis in plants by vacuum microwave-assisted extraction combined with molecularly imprinted clean-up procedure.

    Science.gov (United States)

    Hu, Yuling; Li, Yuanwen; Zhang, Yi; Li, Gongke; Chen, Yueqin

    2011-04-01

    A novel sample preparation method for auxin analysis in plant samples was developed by vacuum microwave-assisted extraction (VMAE) followed by molecularly imprinted clean-up procedure. The method was based on two steps. In the first one, conventional solvent extraction was replaced by VMAE for extraction of auxins from plant tissues. This step provided efficient extraction of 3-indole acetic acid (IAA) from plant with dramatically decreased extraction time, furthermore prevented auxins from degradation by creating a reduced oxygen environment under vacuum condition. In the second step, the raw extract of VMAE was further subjected to a clean-up procedure by magnetic molecularly imprinted polymer (MIP) beads. Owing to the high molecular recognition ability of the magnetic MIP beads for IAA and 3-indole-butyric acid (IBA), the two target auxins in plants can be selectively enriched and the interfering substance can be eliminated by dealing with a magnetic separation procedure. Both the VMAE and the molecularly imprinted clean-up conditions were investigated. The proposed sample preparation method was coupled with high-performance liquid chromatogram and fluorescence detection for determination of IAA and IBA in peas and rice. The detection limits obtained for IAA and IBA were 0.47 and 1.6 ng/mL and the relative standard deviation were 2.3% and 2.1%, respectively. The IAA contents in pea seeds, pea embryo, pea roots and rice seeds were determined. The recoveries were ranged from 70.0% to 85.6%. The proposed method was also applied to investigate the developmental profiles of IAA concentration in pea seeds and rice seeds during seed germination. PMID:20953778

  20. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.
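    As one concrete instance of the group-comparison methods surveyed above, a Welch two-sample t-test can be written with the standard library alone. The impedance readings below are invented; Welch's variant is a common first choice because it does not assume equal variances in the two groups.

```python
import math
import statistics

# Invented bioimpedance readings (ohms) for two groups of subjects.
group_a = [512.0, 498.0, 530.0, 505.0, 521.0, 495.0]
group_b = [468.0, 455.0, 480.0, 471.0, 459.0, 463.0]

ma, mb = statistics.mean(group_a), statistics.mean(group_b)
va, vb = statistics.variance(group_a), statistics.variance(group_b)
na, nb = len(group_a), len(group_b)

# Welch's t statistic for two independent samples.
se = math.sqrt(va / na + vb / nb)
t = (ma - mb) / se

# Welch-Satterthwaite approximation for the degrees of freedom.
df = (va / na + vb / nb) ** 2 / (
    (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1)
)
print(round(t, 2), round(df, 1))
```

    The t statistic would then be compared against the t distribution with df degrees of freedom to obtain a p-value; for these invented data the group difference is clearly large relative to the within-group spread.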

  1. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    The assessment of thorium analytical methods for mixed power fuel, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with lower accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, in general analytical methods without molecule reactions are better than those involving molecule reactions (author). 19 refs., 1 tab.

  2. Method of photon spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy γ rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.
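    The linear-least-squares spectral fitting step amounts to solving the normal equations for the amplitudes of known component shapes. The two reference shapes below are invented stand-ins for library responses, and the noiseless synthetic mixture is recovered exactly; a real fit would add a background term and count-rate weighting.

```python
# Two invented reference shapes (counts per unit activity) standing in
# for library responses, plus a synthetic "measured" spectrum built as
# a known mixture of them.
ref_u  = [0.0, 2.0, 9.0, 4.0, 1.0, 0.0]   # shape attributed to U L x rays
ref_am = [0.0, 0.0, 1.0, 6.0, 8.0, 2.0]   # shape attributed to Am
a_true, b_true = 3.0, 1.5
measured = [a_true * u + b_true * v for u, v in zip(ref_u, ref_am)]

# Normal equations for min_{a,b} || measured - a*ref_u - b*ref_am ||^2
suu = sum(u * u for u in ref_u)
svv = sum(v * v for v in ref_am)
suv = sum(u * v for u, v in zip(ref_u, ref_am))
smu = sum(m * u for m, u in zip(measured, ref_u))
smv = sum(m * v for m, v in zip(measured, ref_am))

det = suu * svv - suv * suv
a = (smu * svv - smv * suv) / det      # fitted amplitude of ref_u
b = (smv * suu - smu * suv) / det      # fitted amplitude of ref_am
print(a, b)
```

    The fitted amplitudes a and b play the role of the component activities extracted from the L x-ray region.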

  3. Method of photon spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  4. Development of unconventional forming methods

    Directory of Open Access Journals (Sweden)

    S. Rusz

    2012-10-01

    Full Text Available Purpose: The paper presents results of progress with the ECAP processing method for obtaining ultra-fine grained (UFG) structures. The properties and microstructure are influenced by technological factors during application of the ECAP method. Design/methodology/approach: A summary is presented of the methods studied at the Department of Technology of the Machining Faculty of VŠB-TU Ostrava in co-operation with the Institute of Engineering Materials and Biomaterials, Silesian University of Technology. Findings: Achievement of an ultra-fine grained structure in the initial material leads to a substantial increase of plasticity and makes it possible to form materials in conditions of a "superplastic state". Achievement of the required structure depends namely on the tool geometry, the number of passes through the matrix, the obtained deformation magnitude and strain rate, the process temperature and the lubrication conditions. High deformation at comparatively low homologous temperatures is an efficient method of production of ultra-fine grained solid materials. The new technologies, which use severe plastic deformation, comprise namely these techniques: High Pressure Torsion, Equal Channel Angular Pressing (ECAP), Cyclic Channel Die Compression (CCDC), Cyclic Extrusion Compression (CEC), Continuous Extrusion Forming (CONFORM), Accumulative Roll Bonding, and Constrained Groove Pressing. Research limitations/implications: The achieved hardness and microstructure characteristics will be determined by further research. Practical implications: The results may be used to relate the structure and properties of the investigated materials in future manufacturing processes. Originality/value: These results contribute to a complex evaluation of the properties of new metals after application of unconventional forming methods. The results of this paper are intended for researchers dealing with severe plastic deformation.

  5. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicology

  6. Statistical methods for bioimpedance analysis

    OpenAIRE

    Christian Tronstad; Are Hugo Pripp

    2014-01-01

    This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements a...

  7. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2003-01-01

    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  8. Development, validation, and application of a method for the GC-MS analysis of fipronil and three of its degradation products in samples of water, soil, and sediment.

    Science.gov (United States)

    de Toffoli, Ana L; da Mata, Kamilla; Bisinoti, Márcia C; Moreira, Altair B

    2015-01-01

    A method for the identification and quantification of pesticide residues in water, soil, and sediment samples has been developed, validated, and applied to the analysis of real samples. Specificity was determined by the retention time and by the confirmation and quantification of analyte ions. Linearity was demonstrated over the concentration range of 20 to 120 µg L(-1), and the correlation coefficients varied between 0.979 and 0.996, depending on the analyte. The recovery rates for all analytes in the studied matrix were between 86% and 112%. The intermediate precision and repeatability were determined at three concentration levels (40, 80, and 120 µg L(-1)), with the relative standard deviation for the intermediate precision between 1% and 5.3% and the repeatability varying between 2% and 13.4% for individual analytes. The limits of detection and quantification for fipronil, fipronil sulfide, fipronil-sulfone, and fipronil-desulfinyl were 6.2, 3.0, 6.6, and 4.0 ng L(-1) and 20.4, 9.0, 21.6, and 13.0 ng L(-1), respectively. The developed method was applied to water, soil, and sediment samples containing 2.1 mg L(-1) (water), 1.2% (soil), and 5.3% (sediment) of carbon, respectively. The recovery of pesticides in the environmental matrices varied from 88.26 to 109.63% for the lowest fortification level (40 and 100 µg kg(-1)), from 91.17 to 110.18% for the intermediate level (80 and 200 µg kg(-1)), and from 89.09 to 109.82% for the highest fortification level (120 and 300 µg kg(-1)). The relative standard deviation for the recovery of pesticides was under 15%. PMID:26357886
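    Recovery and precision figures of the kind reported here come from simple replicate statistics. The replicate results below are invented for a single 40 µg/L fortification level; only the 86-112% recovery window and the <15% RSD criterion echo the abstract.

```python
import statistics

def recovery_and_rsd(measured, spiked):
    """Mean recovery (%) and relative standard deviation (%) for one
    fortification level, as used in method validation."""
    recoveries = [100.0 * m / spiked for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd

# Invented replicate results (ug/L) for fipronil spiked at 40 ug/L.
measured = [38.1, 41.0, 39.4, 40.2, 37.8, 39.0]
mean_rec, rsd = recovery_and_rsd(measured, spiked=40.0)
print(round(mean_rec, 1), round(rsd, 1))
```

    The same computation would be repeated at each fortification level (40, 80, 120 µg L(-1)) and for each analyte before comparing against the acceptance criteria.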

  9. Developments in geophysical exploration methods

    CERN Document Server

    1982-01-01

    One of the themes in current geophysical development is the bringing together of the results of observations made on the surface and those made in the subsurface. Several benefits result from this association. The detailed geological knowledge obtained in the subsurface can be extrapolated for short distances with more confidence when the geological detail has been related to well-integrated subsurface and surface geophysical data. This is of value when assessing the characteristics of a partially developed petroleum reservoir. Interpretation of geophysical data is generally improved by the experience of seeing the surface and subsurface geophysical expression of a known geological configuration. On the theoretical side, the understanding of the geophysical processes themselves is furthered by the study of the phenomena in depth. As an example, the study of the progress of seismic wave trains downwards and upwards within the earth has proved most instructive. This set of original papers deals with some of ...

  10. Method development in automated mineralogy

    OpenAIRE

    Sandmann, Dirk

    2015-01-01

    The underlying research that resulted in this doctoral dissertation was performed at the Division of Economic Geology and Petrology of the Department of Mineralogy, TU Bergakademie Freiberg between 2011 and 2014. It was the primary aim of this thesis to develop and test novel applications for the technology of ‘Automated Mineralogy’ in the field of economic geology and geometallurgy. A “Mineral Liberation Analyser” (MLA) instrument of FEI Company was used to conduct most analytical studies. T...

  11. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications of the application of GC and HPLC for analysis of triazine herbicides residues in various samples. PMID:25849823

  13. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  14. Developing Word Analysis Skills.

    Science.gov (United States)

    Heilman, Arthur W.

    The importance of word analysis skills to reading ability is discussed, and methodologies for teaching such skills are examined. It is stated that a child cannot become proficient in reading if he does not master the skill of associating printed letter symbols with the sounds they represent. Instructional procedures which augment the alphabet with…

  15. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    The thesis comprises five chapters, all of which represent different projects that involved the analysis of massive amounts of sequence data, generated using next-generation sequencing (NGS) technologies, from either forensic (Chapter 1) or ancient (Chapters 2-5) materials. These chapters present projects very different in nature, reflecting the diversity of questions that have become possible to address in the ancient DNA field thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project. The work was supported by a Danmarks Grundforskningfond 'Centre of Excellence in GeoGenetics' grant, with additional funding provided by the Danish Council for Independent Research 'Sapere Aude' programme.

  16. Development of a method for screening spill and leakage of antibiotics on surfaces based on wipe sampling and HPLC-MS/MS analysis

    OpenAIRE

    Nygren, Olle; Lindahl, Roger

    2011-01-01

    A screening method for determination of spill and leakage of 12 different antibiotic substances has been developed. The method is based on wipe sampling where the sampling procedure has been simplified for screening purposes. After sample processing, the antibiotic substances are determined by liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS). Twelve antibiotic substances can be determined in the screening method: Cefadroxil, Cefalexin, Ciprofloxacin, Demeclocyklin HCl, D...

  17. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  18. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States)

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
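The adjoint-based, goal-oriented error estimation the project developed can be illustrated in its simplest setting: for a linear system A u = b with a quantity of interest q(u) = g·u, one adjoint solve turns the residual of an approximate solution into the exact QoI error. The 2×2 system below is invented purely for illustration:

```python
# Sketch of adjoint-weighted residual error estimation for a linear
# problem: solve Aᵀ φ = g, then q(u) - q(u_h) = φ·(b - A u_h).
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
g = np.array([1.0, 0.0])           # QoI: first component of u

u = np.linalg.solve(A, b)          # "exact" solution
u_h = u + np.array([0.01, -0.02])  # perturbed approximate solution

phi = np.linalg.solve(A.T, g)      # adjoint solve
residual = b - A @ u_h
eta = phi @ residual               # adjoint-weighted residual estimate

true_err = g @ u - g @ u_h
print(eta, true_err)               # equal (to roundoff) for a linear problem
```

For nonlinear, time-dependent multiphysics problems the same structure holds approximately, with linearized adjoints and residuals accumulated over the discretization.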

  19. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  20. Automating Object-Oriented Software Development Methods

    OpenAIRE

    Tekinerdogan, Bedir; SAEKI, Motoshi; Sunyé, Gerson; Broek, van den, E.; Hruby, Pavel; Frohner, Ákos

    2002-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software development methods have been defined. Nevertheless, methods often provide a complexity by their own due to their large number of artifacts, method rules and their complicated processes. We think that au...

  2. Moral counselling: a method in development.

    Science.gov (United States)

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  3. Analytical Eco-Scale for Assessing the Greenness of a Developed RP-HPLC Method Used for Simultaneous Analysis of Combined Antihypertensive Medications.

    Science.gov (United States)

    Mohamed, Heba M; Lamie, Nesrine T

    2016-09-01

    In the past few decades the analytical community has been focused on eliminating or reducing the usage of hazardous chemicals and solvents, in different analytical methodologies, that have been ascertained to be extremely dangerous to human health and environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment. PMID:27492952
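The analytical Eco-Scale used above scores a method by starting from an ideal 100 and subtracting penalty points for hazardous reagents, energy consumption, occupational risk, and waste. A minimal sketch; the penalty values below are illustrative placeholders, not those of the cited study:

```python
# Illustrative Eco-Scale scoring: 100 minus summed penalty points.
# Penalty assignments here are assumptions for the sketch.
penalties = {
    "acetonitrile (mobile phase)": 4,
    "instrument energy (> 1.5 kWh per sample)": 2,
    "waste (no treatment)": 3,
}

def eco_scale(penalty_points):
    return 100 - sum(penalty_points.values())

score = eco_scale(penalties)
if score > 75:
    verdict = "excellent green analysis"
elif score > 50:
    verdict = "acceptable green analysis"
else:
    verdict = "inadequate green analysis"
print(score, verdict)
```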

  4. Development and validation of a normal-phase HPTLC method for the simultaneous analysis of Lamivudine and Zidovudine in fixed-dose combination tablets

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    An HPTLC method for the simultaneous quantification of Lamivudine and Zidovudine in tablets was developed and validated. The chromatograms were developed using a mobile phase of toluene:ethyl acetate:methanol (4:4:2, v/v/v) on pre-coated silica gel GF aluminum TLC plates and quantified by densitometric absorbance mode at 276 nm. The Rf values were 0.41±0.03 and 0.60±0.04 for Lamivudine and Zidovudine, respectively. The linearity of the method was found to be within the concentration range of 50-250 ng/spot for ...

  5. Development and Validation of a Stability-Indicating HPTLC Method for Analysis of Rasagiline Mesylate in the Bulk Drug and Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    Singaram Kathirvel

    2012-01-01

    A simple and sensitive thin-layer chromatographic method has been established for analysis of rasagiline mesylate in pharmaceutical dosage form. Chromatography on silica gel 60 F254 plates with butanol:methanol:water (6:1:2, v/v/v) as mobile phase furnished compact spots at Rf 0.76±0.01. Densitometric analysis was performed at 254 nm. To show the specificity of the method, rasagiline mesylate was subjected to acid, base, and neutral hydrolysis, oxidation, photolysis, and thermal decomposition, and the peaks of the degradation products were well resolved from that of the pure drug. Linear regression analysis revealed a good linear relationship between peak area and amount of rasagiline mesylate in the range of 100–350 ng/band. The minimum amounts of rasagiline mesylate that could be reliably detected and quantified were 11.12 and 37.21 ng/band, respectively. The method was validated in accordance with ICH guidelines for precision, accuracy, and robustness. Since the method could effectively separate the drug from its degradation products, it can be regarded as stability indicating.
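Detection and quantification limits like those reported above are commonly derived from the calibration line as LOD = 3.3 σ / S and LOQ = 10 σ / S, where S is the slope and σ the residual standard deviation of the regression. A sketch with an invented calibration data set (not the study's data):

```python
# Least-squares calibration line, then LOD/LOQ from the residual
# standard deviation; amounts and areas are illustrative only.
amounts = [100, 150, 200, 250, 300, 350]   # ng/band
areas = [1205, 1798, 2411, 2995, 3604, 4190]

n = len(amounts)
mx = sum(amounts) / n
my = sum(areas) / n
slope = sum((x - mx) * (y - my) for x, y in zip(amounts, areas)) / \
        sum((x - mx) ** 2 for x in amounts)
intercept = my - slope * mx
residuals = [y - (intercept + slope * x) for x, y in zip(amounts, areas)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # regression SD

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.3f}, LOD={lod:.1f} ng/band, LOQ={loq:.1f} ng/band")
```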

  6. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and A Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ginting, Victor

    2014-03-15

    It was demonstrated that a posteriori analyses in general, and in particular those that use adjoint methods, can accurately and efficiently compute numerical error estimates and sensitivities for critical Quantities of Interest (QoIs) that depend on a large number of parameters. Activities included: analysis and implementation of several time integration techniques for solving systems of ODEs as typically obtained from spatial discretization of PDE systems; multirate integration methods for ordinary differential equations; formulation and analysis of an iterative multi-discretization Galerkin finite element method for multi-scale reaction-diffusion equations; investigation of an inexpensive postprocessing technique to estimate the error of finite element solutions of second-order quasi-linear elliptic problems measured in some global metrics; investigation of an application of residual-based a posteriori error estimates to the symmetric interior penalty discontinuous Galerkin method for solving a class of second-order quasi-linear elliptic problems; a posteriori analysis of explicit time integrations for systems of linear ordinary differential equations; derivation of accurate a posteriori goal-oriented error estimates for a user-defined quantity of interest for two classes of first- and second-order IMEX schemes for advection-diffusion-reaction problems; postprocessing of finite element solutions; and a Bayesian framework for uncertainty quantification of porous media flows.

  7. Practical Fourier analysis for multigrid methods

    CERN Document Server

    Wienands, Roman

    2004-01-01

    Before applying multigrid methods to a project, mathematicians, scientists, and engineers need to answer questions related to the quality of convergence, whether a development will pay out, whether multigrid will work for a particular application, and what the numerical properties are. Practical Fourier Analysis for Multigrid Methods uses a detailed and systematic description of local Fourier k-grid (k=1,2,3) analysis for general systems of partial differential equations to provide a framework that answers these questions.This volume contains software that confirms written statements about convergence and efficiency of algorithms and is easily adapted to new applications. Providing theoretical background and the linkage between theory and practice, the text and software quickly combine learning by reading and learning by doing. The book enables understanding of basic principles of multigrid and local Fourier analysis, and also describes the theory important to those who need to delve deeper into the detai...
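The local Fourier analysis the book systematizes can be demonstrated in its simplest form: the smoothing factor of damped Jacobi for the 1-D Poisson stencil [-1 2 -1]/h², whose Fourier symbol is S(θ) = 1 - ω(1 - cos θ), maximized over the high frequencies π/2 ≤ |θ| ≤ π that the coarse grid cannot represent. A sketch (this textbook example, not the book's software):

```python
# Smoothing-factor computation for damped Jacobi on 1-D Poisson.
import math

def smoothing_factor(omega, samples=10001):
    """max |1 - omega*(1 - cos theta)| over theta in [pi/2, pi]."""
    thetas = [math.pi / 2 + i * (math.pi / 2) / (samples - 1)
              for i in range(samples)]
    return max(abs(1.0 - omega * (1.0 - math.cos(t))) for t in thetas)

# omega = 2/3 is the classical optimum for this stencil: factor 1/3;
# undamped Jacobi (omega = 1) does not smooth at all (factor 1).
print(smoothing_factor(2.0 / 3.0), smoothing_factor(1.0))
```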

  8. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  9. Development of software safety analysis method for nuclear power plant I and C systems in requirement specification based on statechart and SCR

    International Nuclear Information System (INIS)

    In recent years, Instrumentation and Control (I and C) systems based on digital computer technology have been widely used throughout industry. Some industries, such as Nuclear Power Plants (NPPs), operate safety-critical systems; a safety-critical system must have sufficient quality to assure a safe and reliable design. In this work, a formal requirement analysis method for NPP I and C systems is proposed. This method uses the Statechart diagram, the Software Cost Reduction (SCR) formalism, and an ISO table newly suggested in this paper for formally checking the modeled systems. The combined method of utilizing Statechart, SCR, and the ISO table has the advantage of checking the system easily, visually, and formally. The method was applied to the Water Level Monitoring System (WLMS). As a result of the formal check, one reachability error was detected

  10. Regional Development Sustainability Analysis Concept

    Directory of Open Access Journals (Sweden)

    Janno Reiljan

    2014-08-01

    Problems associated with the qualitative analysis and quantitative measurement of sustainability are dealt with, along with opportunities for connecting the concept with the methodological basis of development assessment and the essence of the subject that values sustainability. The goal of the article is to work out a basis for analysing the regional development of a country in the terms and framework of the sustainability concept. The article starts by outlining the definition of sustainability, which is followed by an analysis of the nature of sustainability. The third subsection highlights the demands of the decision-making process in guaranteeing sustainability and then considers sustainability in a competitive environment. In the second part of the article, the sustainable development concept is applied to the analysis of regional development sustainability.

  11. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, such engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fl technique, is presented in detail.

  12. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    At present, risk management is an integral part of state policy in all countries with developed market economies. Companies dealing with consulting services and the implementation of risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from the level of a standardized process of construction company activity, which often leads to scandals and disapproval in the case of unprofessional implementation of projects. The authors present the results of an investigation of the modern understanding of the existing methodology classification and offer their own classification matrix of risk management methods. The matrix is based on analysis of each method in the context of incoming and outgoing transformed information, which may include different elements of the risk control stages. The offered approach thus allows the possibilities of each method to be analyzed.

  14. Development and validation of confirmatory method for analysis of nitrofuran metabolites in milk, honey, poultry meat and fish by liquid chromatography-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Fatih Alkan

    2016-03-01

    In this study we have developed and validated a confirmatory analysis method for nitrofuran metabolites, in accordance with European Commission Decision 2002/657/EC requirements. Nitrofuran metabolites in honey, milk, poultry meat, and fish samples were subjected to acid hydrolysis, followed by derivatisation with nitrobenzaldehyde and liquid-liquid extraction with ethyl acetate. The quantitative and confirmative determination of nitrofuran metabolites was performed by liquid chromatography/electrospray ionisation tandem mass spectrometry (LC/ESI-MS/MS) in the positive ion mode. In-house method validation was performed, and the validation data (specificity, linearity, recovery, CCα and CCβ) are reported. An advantage of this method is that it avoids clean-up by solid-phase extraction (SPE). Furthermore, low levels of nitrofuran metabolites are detectable and quantitatively confirmed at a rapid rate in all samples.

  15. Development of a Radial Deconsolidation Method

    Energy Technology Data Exchange (ETDEWEB)

    Helmreich, Grant W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Montgomery, Fred C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hunn, John D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  16. Development of a liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous analysis of intact glucosinolates and isothiocyanates in Brassicaceae seeds and functional foods.

    Science.gov (United States)

    Franco, P; Spinozzi, S; Pagnotta, E; Lazzeri, L; Ugolini, L; Camborata, C; Roda, A

    2016-01-01

    A new high-pressure liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous determination of glucosinolates, such as glucoraphanin and glucoerucin, and the corresponding isothiocyanates, such as sulforaphane and erucin, was developed and applied to quantify these compounds in Eruca sativa defatted seed meals and enriched functional foods. The method involved solvent extraction; separation was achieved in gradient mode using water with 0.5% formic acid and acetonitrile with 0.5% formic acid on a reversed-phase C18 column. The electrospray ion source operated in negative and positive mode for the detection of glucosinolates and isothiocyanates, respectively, and multiple reaction monitoring (MRM) was selected as the acquisition mode. The method was validated following the ICH guidelines. Replicate experiments demonstrated good accuracy (bias% < 10%) and precision (CV% < 10%). Detection and quantification limits are in the range of 1-400 ng/mL for each analyte. Calibration curves were validated on concentration ranges from 0.05 to 50 μg/mL. The method proved to be suitable for glucosinolate and isothiocyanate determination both in biomasses and in complex matrices such as food products enriched with glucosinolates, or nutraceutical bakery products. In addition, the developed method was applied to the simultaneous determination of glucosinolates and isothiocyanates in a bakery product enriched with glucosinolates, to evaluate their thermal stability after different industrial processes from cultivation to consumer processing.

  17. Development of RP-HPLC method for Qualitative Analysis of Active Ingredient (Gallic acid) from Stem Bark of Dendrophthoe falcate Linn.

    OpenAIRE

    Hafsa Deshmukh; Pradnya J. Prabhu

    2011-01-01

    A simple, precise, and sensitive RP-HPLC method with UV detection at 271 nm was developed and validated for qualitative determination of the active ingredient gallic acid from the stem bark of Dendrophthoe falcata Linn. Separation was performed on a ThermoMOS 2 HYPERSIL C18 column (250 mm × 4.6 mm, 5 µm ODS 3) using a mobile phase comprising 0.1% orthophosphoric acid:acetonitrile (400 cm3:600 cm3) at a flow rate of 1 mL/minute with a short run time of 13 minutes. The method was validated accordi...

  18. A DECOMPOSITION METHOD OF STRUCTURAL DECOMPOSITION ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    LI Jinghua

    2005-01-01

    Over the past two decades, structural decomposition analysis (SDA) has developed into a major analytical tool in the field of input-output (IO) techniques, but the method was found to suffer from one or more of the following problems. The decomposition forms, which are used to measure the contribution of a specific determinant, are not unique due to the existence of a multitude of equivalent forms, irrational due to the weights of different determinants not matching, and inexact due to the existence of large interaction terms. In this paper, a decomposition method is derived to overcome these deficiencies. We prove that the result of this approach is equal to the Shapley value in cooperative games, and so some properties of the method are obtained. Beyond that, the two approaches that have been used predominantly in the literature are proved to be approximate solutions of the method.
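The Shapley-value connection can be illustrated in the simplest SDA setting: an aggregate v = x · y with two determinants. Averaging the two "polar" decompositions (change x first vs. change y first) gives each determinant's Shapley contribution, and the parts sum exactly to Δv with no leftover interaction term. A sketch with invented numbers:

```python
# Two-determinant Shapley decomposition of a change in v = x * y.
x0, y0 = 10.0, 2.0   # base-period determinants (illustrative)
x1, y1 = 12.0, 2.5   # end-period determinants (illustrative)

dv = x1 * y1 - x0 * y0

# Average of the two polar forms = Shapley value of each determinant.
contrib_x = 0.5 * ((x1 - x0) * y0 + (x1 - x0) * y1)
contrib_y = 0.5 * (x0 * (y1 - y0) + x1 * (y1 - y0))

print(dv, contrib_x, contrib_y)  # contributions sum exactly to dv
```

With more determinants, the Shapley contribution of each factor averages over all orderings in which the factors can be switched from base to end values.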

  19. Generalized analysis method for neutron resonance transmission analysis

    International Nuclear Information System (INIS)

    Neutron resonance densitometry (NRD) is a non-destructive analysis method, which can be applied to quantify special nuclear materials (SNM) in small particle-like debris of melted fuel that are formed in severe accidents of nuclear reactors such as the Fukushima Daiichi nuclear power plants. NRD uses neutron resonance transmission analysis (NRTA) to quantify SNM and neutron resonance capture analysis (NRCA) to identify matrix materials and impurities. To apply NRD to the characterization of arbitrary-shaped thick materials, a generalized method for the analysis of NRTA data has been developed. The method has been applied to data resulting from transmission through thick samples with an irregular shape and an areal density of SNM up to 0.253 at/b (≈100 g/cm²). The investigation shows that NRD can be used to quantify SNM with high accuracy not only in inhomogeneous samples made of particle-like debris but also in samples made of large rocks with an irregular shape, by applying the generalized analysis method for NRTA. (author)
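At a single resonance energy, transmission analysis reduces to inverting the relation T = exp(−nσ) for the areal density n. A toy illustration with hypothetical values (a real NRTA analysis fits the full resonance shape, as the generalized method above does):

```python
import math

def areal_density(transmission, sigma_barns):
    """Invert T = exp(-n * sigma) for the areal density n in atoms/barn.

    Single-energy illustration only: a real NRTA fit uses the full
    resonance line shape rather than one transmission value."""
    return -math.log(transmission) / sigma_barns

# Hypothetical: transmission dips to 8% at a resonance with sigma = 10 b
n = areal_density(0.08, 10.0)   # areal density in atoms/barn
```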

  20. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through satisfaction of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
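A minimal instance of the building blocks named above: a first-order finite volume update for Burgers' equation with the monotone local Lax-Friedrichs flux, which conserves mass exactly on a periodic grid and respects a discrete maximum principle under a CFL restriction. Grid size and initial data are illustrative choices:

```python
import numpy as np

def fv_step(u, dt, dx):
    """One first-order finite volume step for Burgers' equation
    u_t + (u^2/2)_x = 0 on a periodic grid, using the (monotone)
    local Lax-Friedrichs numerical flux."""
    f = 0.5 * u**2
    up = np.roll(u, -1)                      # right neighbour u_{i+1}
    a = np.maximum(np.abs(u), np.abs(up))    # local wave speed bound
    # F_{i+1/2} = average flux minus dissipation proportional to jump
    flux = 0.5 * (f + 0.5 * up**2) - 0.5 * a * (up - u)
    return u - dt / dx * (flux - np.roll(flux, 1))

# Smooth periodic initial data on [0, 1)
n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2 * np.pi * x) + 1.5
dx = 1.0 / n
mass0 = u.sum() * dx
for _ in range(100):
    u = fv_step(u, dt=0.4 * dx / np.abs(u).max(), dx=dx)
mass1 = u.sum() * dx
```

The telescoping flux differences make local conservation exact up to round-off, and the monotone flux keeps the solution within its initial bounds even after a shock forms.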

  1. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    OpenAIRE

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation...

  2. Measuring sap flow and stem water content in trees: a critical analysis and development of a new heat pulse method (Sapflow+)

    OpenAIRE

    Vandegehuchte, Maurits

    2013-01-01

    Just as carbon, water is indispensable for plants to develop and grow. A lack of water causes turgor loss in plant cells which prevents further expansion of these cells and the coupled incorporation of carbon sources in the cell wall. This inhibits growth and, if this water scarcity continues, plant dimensions such as the stem diameter will start to decrease. Finally the plant will lose its vital functions and die. Worldwide, sap flow methods are applied to monitor plant water status and v...

  3. Development of a method for the analysis of four plant growth regulators (PGRs) residues in soybean sprouts and mung bean sprouts by liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Zhang, Fengzu; Zhao, Pengyue; Shan, Weili; Gong, Yong; Jian, Qiu; Pan, Canping

    2012-09-01

    A method has been developed for the simultaneous determination of four plant growth regulator (PGR) residues in soybean sprouts and mung bean sprouts. The sample preparation procedure was based on a QuEChERS method. The method showed excellent linearity (r² ≥ 0.9985) and precision (RSDs ≤ 13.0%). Average recoveries of the four PGRs ranged between 74.9% and 106.3% at spiking levels of 0.05, 0.5 and 1 mg kg⁻¹. The LODs and LOQs were in the ranges of 0.27-9.3 μg kg⁻¹ and 0.90-31 μg kg⁻¹, respectively. The procedure was applied to 18 bean sprout samples, and benzyladenine was found in some of the analyzed samples.
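Average-recovery and RSD figures like those above are computed from replicate spiked samples; a sketch with hypothetical replicate values at one spiking level:

```python
import statistics

def recovery_stats(measured, spiked_level):
    """Percent recoveries and relative standard deviation (RSD)
    for replicate spiked samples at one spiking level."""
    recs = [100.0 * m / spiked_level for m in measured]
    mean = statistics.mean(recs)
    rsd = 100.0 * statistics.stdev(recs) / mean   # sample stdev
    return mean, rsd

# Hypothetical replicate results (mg/kg) at a 0.5 mg/kg spiking level
mean_rec, rsd = recovery_stats([0.47, 0.49, 0.51, 0.48, 0.50], 0.5)
```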

  4. On Software Development of Characteristic Set Method

    Institute of Scientific and Technical Information of China (English)

    WU Yong-wei; WANG Ding-kang; YANG Hong; LIN Dong-dai

    2002-01-01

    The characteristic set method for polynomial equation solving has spread widely, and its software implementation has attracted growing attention in recent years. Several packages for the method are implemented in computer algebra systems such as REDUCE and Maple. In order to improve the efficiency of the method, we have developed a computer algebra system, "ELIMINO", written in C and implemented on the Linux operating system on a PC. The authors wish to share with the reader the knowledge and experience gained in designing and developing the software package for the characteristic set method.
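The core operation behind a characteristic set is successive pseudo-division to produce an ascending (triangular) chain. A small sketch using SymPy's `prem` (not ELIMINO) on a two-polynomial system, assuming the variable ordering x < y so that y is the main variable:

```python
from sympy import symbols, prem, expand

x, y = symbols('x y')

# Wu-style triangularization sketch for the system {f1, f2}
f1 = y**2 - x
f2 = y - x

# Pseudo-divide f1 by f2 with respect to y; the remainder
# eliminates y, leaving a polynomial in x alone.
r = expand(prem(f1, f2, y))

# Ascending chain (characteristic set): {x**2 - x, y - x};
# its zeros x=0,y=0 and x=1,y=1 are zeros of the input system.
chain = [r, f2]
```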

  5. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to the ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting debris survivability. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types and includes 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
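The Monte Carlo uncertainty analysis mentioned above can be illustrated with a toy model: propagate uncertainty in a release state through a simple vacuum ballistic-range formula and summarize the spread of the footprint. This is a stand-in for real reentry dynamics; all distributions and parameters are hypothetical:

```python
import math
import random

def mc_range(n=10000, seed=42):
    """Toy Monte Carlo uncertainty propagation: spread of a vacuum
    ballistic range R = v^2 sin(2*theta)/g under uncertain release
    speed and flight-path angle (a stand-in for the uncertain
    breakup state of a debris fragment)."""
    rng = random.Random(seed)
    g = 9.81
    ranges = []
    for _ in range(n):
        v = rng.gauss(250.0, 25.0)                     # m/s
        theta = rng.gauss(math.radians(30), math.radians(3))
        ranges.append(v * v * math.sin(2 * theta) / g)
    mean = sum(ranges) / n
    std = (sum((r - mean) ** 2 for r in ranges) / n) ** 0.5
    return mean, std

mean_r, std_r = mc_range()
```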

  6. Development of analysis method of material flow cost accounting using lean technique in food production: A case study of Universal Food Public (UFC) Co., Ltd.

    Directory of Open Access Journals (Sweden)

    Wichai Chattinnawat

    2015-06-01

    This research aims to apply the Lean technique in conjunction with Material Flow Cost Accounting (MFCA) analysis to the production process of canned sweet corn, in order to increase process efficiency, eliminate waste and reduce production cost. The research develops and presents a new type of MFCA analysis by incorporating value-added and non-value-added activities into the MFCA cost allocation process. According to the simulation-based measurement of process efficiency, integrated cost allocation based on activity types results in a higher proportion of negative product cost than that computed from conventional MFCA cost allocation. Thus, the types of activities and the process efficiency have great impact on the cost structure, especially on the negative product cost. The research leads to solutions that improve work procedures, eliminate waste and reduce production cost. The overall cost per unit decreases, with a higher proportion of positive product cost.
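MFCA splits an input cost into positive (product) and negative (material loss) costs by mass balance; weighting the positive share by an activity-efficiency factor, in the spirit of the integrated variant above, enlarges the negative share. A hypothetical sketch:

```python
def mfca_allocation(input_cost, good_mass, loss_mass, efficiency=1.0):
    """Split an input cost into positive (product) and negative
    (material loss) costs by mass balance, optionally scaling the
    positive share by a process-efficiency factor so that
    non-value-added activity increases the negative cost."""
    total = good_mass + loss_mass
    positive = input_cost * (good_mass / total) * efficiency
    negative = input_cost - positive
    return positive, negative

# Hypothetical canning step: 70 kg product, 30 kg losses, 1000 cost units
pos, neg = mfca_allocation(1000.0, 70.0, 30.0)            # mass-only split
pos_e, neg_e = mfca_allocation(1000.0, 70.0, 30.0, 0.9)   # 90% efficient
```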

  7. Development of a spatial analysis method using ground-based repeat photography to detect changes in the alpine treeline ecotone, Glacier National Park, Montana, U.S.A.

    Science.gov (United States)

    Roush, W.; Munroe, J.S.; Fagre, D.B.

    2007-01-01

    Repeat photography is a powerful tool for detection of landscape change over decadal timescales. Here a novel method is presented that applies spatial analysis software to digital photo-pairs, allowing vegetation change to be categorized and quantified. This method is applied to 12 sites within the alpine treeline ecotone of Glacier National Park, Montana, and is used to examine vegetation changes over timescales ranging from 71 to 93 years. Tree cover at the treeline ecotone increased in 10 out of the 12 photo-pairs (mean increase of 60%). Establishment occurred at all sites; infilling occurred at 11 sites. To demonstrate the utility of this method, patterns of tree establishment at treeline are described and the possible causes of changes within the treeline ecotone are discussed. Local factors undoubtedly affect the magnitude and type of the observed changes; however, the ubiquity of the increase in tree cover implies a common forcing mechanism. Mean minimum summer temperatures have increased by 1.5 °C over the past century and, coupled with variations in the amount of early spring snow water equivalent, likely account for much of the increase in tree cover at the treeline ecotone. Lastly, shortcomings of this method are presented along with possible solutions and areas for future research. © 2007 Regents of the University of Colorado.
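Once a photo-pair is registered and classified into binary tree-cover masks, change categories such as establishment and loss reduce to per-pixel set operations. A sketch on a hypothetical 3×3 mask pair (not the study's data):

```python
import numpy as np

def treeline_change(before, after):
    """Classify per-pixel change between two boolean tree-cover masks
    from a registered photo-pair and report percent change in cover."""
    establishment = (~before) & after   # newly treed pixels
    loss = before & (~after)            # pixels that lost tree cover
    pct_change = 100.0 * (after.sum() - before.sum()) / before.sum()
    return establishment, loss, pct_change

before = np.array([[0, 0, 1],
                   [0, 1, 1],
                   [0, 0, 0]], dtype=bool)
after = np.array([[0, 1, 1],
                  [1, 1, 1],
                  [0, 1, 0]], dtype=bool)
est, loss, pct = treeline_change(before, after)
```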

  8. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    This thesis describes the way in which at IRI instrumental neutron activation analysis (INAA) has been developed into an automated system for routine analysis. The basis of this work are 20 publications describing the development of INAA since 1968. (Auth.)

  9. Semantic Web Application Development Method and Example Analysis

    Institute of Scientific and Technical Information of China (English)

    李新龙; 黄映辉

    2013-01-01

    With the continuing development of semantic Web technology, semantic Web applications have attracted increasing attention. However, little domestic research has addressed them, and development methods are lacking. Based on a study of semantic Web applications and a comparison with conventional Web applications, this paper gives a definition, an architecture and a development method for semantic Web applications, and describes in detail the structural characteristics and construction process of a three-layer architecture comprising a data layer, a logic layer and a presentation layer. The proposed development method is then verified by constructing an example semantic Web application, achieving the expected result.

  10. Analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical foods using a zero control reference material (ZRM) as a method development tool.

    Science.gov (United States)

    Chase, G W; Eitenmiller, R R; Long, A R

    1999-01-01

    A liquid chromatographic method is described for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food. The vitamins are extracted in isopropyl alcohol and hexane-ethyl acetate without saponification and quantitated by normal-phase chromatography with fluorescence detection. All-rac-alpha-tocopheryl acetate and retinyl palmitate are chromatographed isocratically with mobile phases of 0.5% (v/v) and 0.125% (v/v) isopropyl alcohol in hexane, respectively. Recovery studies performed on a medical food zero control reference material (ZRM) fortified with the analytes averaged 99.7% (n = 25) for retinyl palmitate and 101% (n = 25) for all-rac-alpha-tocopheryl acetate. Coefficients of variation were 0.87-2.63% for retinyl palmitate and 1.42-3.20% for all-rac-alpha-tocopheryl acetate. The method provides a rapid, specific, and easily controlled assay for analysis of vitamin A and vitamin E in medical foods. Use of chlorinated solvents is avoided. PMID:10232898

  11. Strategic Options Development and Analysis

    Science.gov (United States)

    Ackermann, Fran; Eden, Colin

    Strategic Options Development and Analysis (SODA) enables a group or individual to construct a graphical representation of a problematic situation, and thus explore options and their ramifications with respect to a complex system of goals or objectives. In addition the method aims to help groups arrive at a negotiated agreement about how to act to resolve the situation. It is based upon the use of causal mapping - a formally constructed means-ends network - as the representation form. Because the picture has been constructed using the natural language of the problem owners, it becomes a model of the situation that is 'owned' by those who define the problem. The use of formalities for the construction of the model makes it amenable to a range of analyses as well as encouraging reflection and a deeper understanding. These analyses can be used in a 'rough and ready' manner by visual inspection or through the use of specialist causal mapping software (Decision Explorer). Each of the analyses helps a group or individual discover important features of the problem situation, and these features facilitate agreement on a good solution. The SODA process is aimed at helping a group learn about the situation they face before they reach agreements. Most significantly, the exploration through the causal map leads to a higher probability of more creative solutions, and promotes solutions that are more likely to be implemented because the problem construction process is wider and more likely to include richer social dimensions about the blockages to action and organizational change. The basic theories that inform SODA derive from cognitive psychology and social negotiation, where the model acts as a continuously changing representation of the problematic situation - changing as the views of a person or group shift through learning and exploration. This chapter, jointly written by two leading practitioner academics and the original developers of SODA, Colin Eden and Fran Ackermann

  12. METHOD DEVELOPMENT FOR THE ANALYSIS OF N-NITROSODIMETHYLAMINE AND OTHER N-NITROSAMINES IN DRINKING WATER AT LOW NANOGRAM/LITER CONCENTRATIONS USING SOLID PHASE EXTRACTION AND GAS CHROMATOGRAPHY WITH CHEMICAL IONIZATION TANDEM MASS SPECTROMETRY

    Science.gov (United States)

    N-Nitrosodimethylamine (NDMA) is a probable human carcinogen that has been identified as a drinking water contaminant of concern. United States Environmental Protection Agency (USEPA) Method 521 has been developed for the analysis of NDMA and six additional N-nitrosamines in dri...

  13. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. The analytic hierarchy process, in particular, has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...

  14. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and libraries have to know the profiles of their patrons in order to fulfil this role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and the implications obtained from these analyses. Our research is a first step towards a future in which library marketing is an indispensable tool.
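One elementary example of the circulation-record analyses mentioned above is tabulating loan counts by subject and patron group with a frequency counter. The records below are hypothetical:

```python
from collections import Counter

# Hypothetical circulation records: (patron_group, subject) per loan
records = [
    ("undergraduate", "computing"), ("undergraduate", "computing"),
    ("graduate", "physics"), ("graduate", "computing"),
    ("faculty", "physics"), ("undergraduate", "literature"),
]

by_subject = Counter(subject for _, subject in records)
by_group = Counter(group for group, _ in records)
top_subject, top_count = by_subject.most_common(1)[0]
```

Real library-marketing analyses would extend this to time-of-day patterns, co-borrowing, and patron segmentation, but the aggregation step is the same.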

  15. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  16. Rapid coal proximate analysis by thermogravimetric method

    Energy Technology Data Exchange (ETDEWEB)

    Mao Jianxiong; Yang Dezhong; Zhao Baozhong

    1987-09-01

    Rapid coal proximate analysis by thermogravimetric analysis (TGA) can be used as an alternative to the standard proximate analysis. This paper presents a program set up to rapidly perform coal proximate analysis using a thermal analyzer and TGA module. A comparison between coal proximate analyses by the standard method (GB) and by TGA is also given. It shows that most data from TGA fall within the tolerance limits of the standard method.
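Proximate analysis follows from the mass plateaus of a TGA run by simple differences: drying gives moisture, devolatilization gives volatile matter, burnout in air leaves ash, and fixed carbon is the remainder. A sketch with hypothetical plateau masses for a 20 mg sample:

```python
def proximate_from_tga(m0, m_after_drying, m_after_devol, m_ash):
    """Proximate analysis (wt%, as-received basis) from the mass
    plateaus of a TGA run: drying in N2, devolatilization in N2,
    then burnout in air leaving ash."""
    moisture = 100.0 * (m0 - m_after_drying) / m0
    volatile = 100.0 * (m_after_drying - m_after_devol) / m0
    ash = 100.0 * m_ash / m0
    fixed_carbon = 100.0 - moisture - volatile - ash
    return moisture, volatile, fixed_carbon, ash

# Hypothetical plateau masses (mg) for a 20 mg coal sample
m, v, fc, a = proximate_from_tga(20.0, 19.2, 13.0, 2.4)
```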

  17. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development on the preparation, characterization, and applications of gold nanoparticles (AuNPs) have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many remain? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs, including mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
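A common back-calculation linking the two measures discussed above converts a total gold concentration (e.g. from elemental analysis) into a particle number concentration, assuming monodisperse spheres of bulk gold density. The inputs here are hypothetical:

```python
import math

def aunp_number_conc(gold_mg_per_ml, diameter_nm, rho_g_cm3=19.3):
    """Particles per mL from the total gold concentration, assuming
    spherical monodisperse particles of bulk gold density
    (19.3 g/cm^3); relates elemental analysis to particle count."""
    r_cm = diameter_nm * 1e-7 / 2.0              # nm -> cm
    particle_mass_g = rho_g_cm3 * (4.0 / 3.0) * math.pi * r_cm**3
    return (gold_mg_per_ml * 1e-3) / particle_mass_g

# Hypothetical: 0.05 mg/mL of gold present as 20 nm particles
n_per_ml = aunp_number_conc(0.05, 20.0)
```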

  18. Parallel development of chromatographic and mass-spectrometric methods for quantitative analysis of glycation on an IgG1 monoclonal antibody.

    Science.gov (United States)

    Viski, Kornél; Gengeliczki, Zsolt; Lenkey, Krisztián; Baranyáné Ganzler, Katalin

    2016-10-01

    Monitoring post-translational modifications (PTMs) in biotherapeutics is of paramount importance. In the pharmaceutical industry, chromatography with optical detection is the standard choice for quantitation of product-related impurities, and mass spectrometry is used only for characterization. Parallel development of a boronate affinity chromatography (BAC) method and a mass spectrometric method for quantitative measurement of glycation on a monoclonal antibody (mAb) shed light on the importance of certain characteristics of the individual methods. Non-specific interactions in BAC have to be suppressed with the so-called shielding reagent. We have found that an excessive amount of shielding reagent in the chromatographic solvents may cause significant underestimation of glycation. Although contamination of the retained peak with non-glycated isoforms in BAC is unavoidable, our work shows that it can be characterized and quantitated by mass spectrometry. It has been demonstrated that glycation can be measured by mass spectrometry at the intact protein level with an LOQ of 3.0% and an error bar of ±0.5%. The BAC and MS methods have been found to provide equivalent results. These methods have not been compared from these points of view before.

  19. Development and validation of automatic HS-SPME with a gas chromatography-ion trap/mass spectrometry method for analysis of volatiles in wines.

    Science.gov (United States)

    Paula Barros, Elisabete; Moreira, Nathalie; Elias Pereira, Giuliano; Leite, Selma Gomes Ferreira; Moraes Rezende, Claudia; Guedes de Pinho, Paula

    2012-11-15

    An automated headspace solid-phase microextraction (HS-SPME) method combined with gas chromatography-ion trap/mass spectrometry (GC-IT/MS) was developed in order to quantify a large number of volatile compounds in wines, such as alcohols, esters, norisoprenoids and terpenes. The procedure was optimized for SPME fiber selection, pre-incubation temperature and time, extraction temperature and time, and salt addition. A central composite experimental design was used in the optimization of the extraction conditions. The volatile compounds showed optimal extraction using a DVB/CAR/PDMS fiber, incubation of 5 mL of wine with 2 g NaCl at 45 °C during 5 min, and subsequent extraction of 30 min at the same temperature. The method allowed the identification of 64 volatile compounds. Afterwards, the method was validated successfully for the most significant compounds and was applied to study the volatile composition of different white wines. PMID:23158309
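The central composite design used in this kind of optimization can be generated directly: 2^k factorial corners, 2k axial (star) points at ±α with α = (2^k)^(1/4) for rotatability, plus replicated centre points. A sketch for two coded factors (e.g. extraction temperature and time):

```python
import itertools

def central_composite(k, n_center=3):
    """Coded points of a rotatable central composite design for k
    factors: 2^k factorial corners, 2k axial points at +/-alpha,
    and replicated centre points; alpha = (2^k)^(1/4)."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers

# Two factors, e.g. coded extraction temperature and extraction time
design = central_composite(2)
```

The coded levels are then mapped onto real ranges (e.g. temperature in °C, time in minutes) before running the experiments and fitting the response surface.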

  20. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education.

    Science.gov (United States)

    Bland, Andrew J; Tobbell, Jane

    2015-11-01

    Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to pre-constructed categories imposed by some questionnaire and interview methods may only provide partial understanding. This is problematic in understanding the mechanisms of learning in simulation-based education as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures interaction of individuals within the simulation experience which can be analysed through multiple lenses, including context and through the lens of both researcher and learner. The study utilised a grounded theory approach involving 31 under-graduate third year student nurses. Data was collected and analysed through non-participant observation, digital recordings of simulation activity and focus group deconstruction of their recorded simulation by the participants and researcher. Focus group interviews enabled further clarification. The method revealed multiple levels of dynamic data, concluding that in order to better understand how students learn in social and active learning strategies, dynamic data is required enabling researchers and participants to unpack what is happening as it unfolds in action.

  1. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), the weight (under 100 lb, about 45 kg), and the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  2. Development of analysis methods for tobacco flavors

    Institute of Scientific and Technical Information of China (English)

    任志芹; 艾小勇; 张元; 王志; 付体鹏; 许成保; 张峰

    2014-01-01

    Public concern over the various flavors added to cigarettes has increased during the past few years. Tobacco flavors are indispensable to cigarette production and are an important factor affecting cigarette taste, yet some of the flavors added to cigarettes have known physiological toxicity, so determining the flavor components in tobacco is of clear significance. Because tobacco flavor formulations are chemically complex, analytical methods are generally selected according to the nature of each flavor compound. In this paper, pretreatment, separation and detection methods for tobacco flavors are reviewed and compared, and some new analytical techniques are briefly introduced, to provide a basis for the detection of flavor additives in cigarettes. To date, mass spectrometry is the most widely used method for analyzing flavor additives in cigarettes because of its high sensitivity and reproducibility.

  3. Evaluation methods of SWOT analysis

    OpenAIRE

    VANĚK, Michal; Mikoláš, Milan; Žváková, Kateřina

    2012-01-01

    Strategic management is an integral part of top management. By formulating the right strategy and subsequently implementing it, a managed organization can gain and retain a competitive advantage. In order to fulfil this expectation, the strategy has to be supported by relevant findings from the strategic analyses performed. The best known and probably the most common of these is the SWOT analysis. In practice, however, the analysis is often reduced to a mere presentation of influence factors, whic...

  4. Semiquantitative fluorescence method for bioconjugation analysis.

    Science.gov (United States)

    Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser

    2014-01-01

    Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytical, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details on a versatile method to confirm quantum dot-protein conjugation including the required materials and instrumentation in order to perform the step-by-step semiquantitative analysis of the bioconjugation efficiency by using fluorescence plate readings. Although the protocols to confirm the QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QDs-protein bioconjugates. PMID:25103803

  5. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. The analytic hierarchy process (AHP), in particular, has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks, and hybrid models, the latter developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed; the proposed approach combines AHP with the fuzzy comprehensive method.
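The AHP step referred to above derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks them with Saaty's consistency ratio (CR < 0.1 is conventionally acceptable). A sketch with a hypothetical 3×3 comparison of risk factors:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector, with Saaty's consistency ratio (CR)."""
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
    return w, ci / ri

# Hypothetical pairwise comparison of three risk factors
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])
w, cr = ahp_weights(A)
```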

  6. Development of Safety Analysis Technology for LMR

    International Nuclear Information System (INIS)

    In the safety analysis code system development area, an analysis code for flow blockage was completed through an integrated validation of MATRA-LMR-FB. The SSC-K safety analysis code has evolved through the addition of detailed reactivity models and a three-dimensional core T/H model, and through the development of a Windows version. A basic analysis module for SFR features has also been developed, incorporating a numerical method, best-estimate correlations, and a code structure module. For the analysis of the HCDA initiating phase, a sodium boiling model to be linked to SSC-K and a transient fuel performance/cladding failure model have been developed, together with a state-of-the-art review of molten fuel movement models. In addition, scoping analysis models for the post-accident heat removal phase have been developed. In the safety analysis area, the safety criteria for the KALIMER-600 have been established, and an internal flow channel blockage and local faults have been analyzed for the assembly safety evaluation. Key safety concepts of the KALIMER-600 have been investigated through analyses of ATWS as well as design basis accidents such as TOP and LOF, from which the inherent safety due to core reactivity feedback has been assessed. The HCDA analysis for the initiating phase and an estimation of the core energy release followed, together with the setup of safety criteria and a T/H analysis for the core catcher. The thermal-hydraulic behavior and the released radioactivity sources and dose rates in the containment have been analyzed for its performance evaluation. In the area of integrated technology development, a database of research products has been made available on the KALIMER Website, and detailed process planning with status analysis has been carried out.

  7. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    International Nuclear Information System (INIS)

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz., a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences, and refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
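The quantification described above rests on a linear calibration of the molybdenum-blue absorbance against phosphate concentration. As a hedged sketch of that step (the standard concentrations and absorbance readings below are illustrative assumptions, not data from the paper):

```python
def fit_line(x, y):
    # ordinary least-squares fit: returns (slope, intercept)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# illustrative standards spanning part of the 0.024-9.5 uM working range
conc_uM = [0.5, 1.0, 2.0, 4.0, 8.0]               # phosphate standards, uM
absorbance = [0.051, 0.099, 0.201, 0.398, 0.801]  # assumed detector readings

slope, intercept = fit_line(conc_uM, absorbance)
sample_uM = (0.300 - intercept) / slope  # invert the calibration for a sample
```

Inverting the fitted line in this way is how a measured absorbance is converted back to a concentration within the validated linear range.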

  8. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)

    2011-09-02

    Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz., a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences, and refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.

  9. Protein crystallography. Methodological development and comprehensive analysis

    International Nuclear Information System (INIS)

    There have been remarkable developments in the methodology for protein structure analysis over the past few decades. Currently, single-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-SAD) is used as a general method for determining protein structure, while the sulfur single-wavelength anomalous diffraction method (S-SAD) using native protein is evolving as a next-generation method. In this paper, we look back on the early applications of multi-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-MAD) and introduce the study of ribosomal proteins as an example of the comprehensive analysis that took place in the 1990s. Furthermore, we refer to the current state of development of the S-SAD method as well as automatic structure determination. (author)

  10. Metallurgical and chemical characterization of copper alloy reference materials within laser ablation inductively coupled plasma mass spectrometry: Method development for minimally-invasive analysis of ancient bronze objects

    Energy Technology Data Exchange (ETDEWEB)

    Walaszek, Damian, E-mail: damian.walaszek@empa.ch [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Senn, Marianne [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Faller, Markus [Laboratory for Jointing Technology and Corrosion, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Philippe, Laetitia [Laboratory for Mechanics of Materials and Nanostructures, Swiss Federal Laboratories for Materials Science and Technology, Feuerwerkstrasse 39, CH-3602 Thun (Switzerland); Wagner, Barbara; Bulska, Ewa [University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Ulrich, Andrea [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland)

    2013-01-01

    The chemical composition of ancient metal objects provides important information for manufacturing studies and authenticity verification of ancient copper or bronze artifacts. Non- or minimally-destructive analytical methods are preferred to mitigate visible damage. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) enables the determination of major elements as well as impurities down to low ppm levels; however, accuracy and precision of analysis strongly depend on the homogeneity of the reference materials used for calibration. Moreover, appropriate analytical procedures are required, e.g. in terms of ablation strategies (scan mode, spot size, etc.). This study reviews available copper alloy (certified) reference materials ((C)RMs) from different sources and contributes new metallurgical data on homogeneity and spatial elemental distribution. Investigations of the standards were performed by optical and scanning electron microscopy with X-ray spectrometry (SEM-EDX) for the following copper alloy and bronze (certified) reference materials: NIST 454, BAM 374, BAM 211, BAM 227, BAM 374, BAM 378, BAS 50.01-2, BAS 50.03-4, and BAS 50.04-4. Additionally, the influence of inhomogeneities on different ablation and calibration strategies is evaluated to define an optimum analytical strategy in terms of line scan versus single spot ablation, variation of spot size, and selection of the most appropriate RMs or minimum number of calibration reference materials. - Highlights: ► New metallographic data for copper alloy reference materials are provided. ► Influence of RM homogeneity on quality of LA-ICPMS analysis was evaluated. ► Ablation and calibration strategies were critically discussed. ► An LA-ICPMS method is proposed for analyzing most typical ancient copper alloys.

  11. Relativity Concept Inventory: Development, Analysis, and Results

    Science.gov (United States)

    Aslanides, J. S.; Savage, C. M.

    2013-01-01

    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  12. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
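The stated result (the combined method's order equals the smaller of the two original orders) can be illustrated empirically. A minimal sketch, assuming the test problem y' = y on [0, 1] and a combination that simply alternates explicit Euler (order 1) with classical RK4 (order 4), so the observed order should drop to about 1:

```python
import math

def euler_step(f, t, y, h):
    # explicit Euler: global order 1
    return y + h * f(t, y)

def rk4_step(f, t, y, h):
    # classical Runge-Kutta: global order 4
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def combined_solve(f, t0, y0, t_end, n):
    # one simple "combination": alternate the two methods step by step
    h = (t_end - t0) / n
    y = y0
    for i in range(n):
        t = t0 + i * h
        y = euler_step(f, t, y, h) if i % 2 == 0 else rk4_step(f, t, y, h)
    return y

f = lambda t, y: y  # y' = y, exact solution y(1) = e
err = lambda n: abs(combined_solve(f, 0.0, 1.0, 1.0, n) - math.e)
order = math.log2(err(100) / err(200))  # observed order, close to min(1, 4) = 1
```

Halving the step size halves the error, confirming that the Euler component dominates the combined method's convergence.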

  13. Development of a fast extraction method and optimization of liquid chromatography-mass spectrometry for the analysis of phenolic compounds in lentil seed coats.

    Science.gov (United States)

    Mirali, Mahla; Ambrose, Stephen J; Wood, Stephen A; Vandenberg, Albert; Purves, Randy W

    2014-10-15

    A systematic set of optimization experiments was conducted to design an efficient extraction and analysis protocol for screening six different sub-classes of phenolic compounds in the seed coat of various lentil (Lens culinaris Medik.) genotypes. Different compounds from anthocyanidins, flavan-3-ols, proanthocyanidins, flavanones, flavones, and flavonols sub-classes were first optimized for use as standards for liquid chromatography mass spectrometry (LC-MS) with UV detection. The effect of maceration duration, reconstitution solvent, and extraction solvent were investigated using lentil genotype CDC Maxim. Chromatographic conditions were optimized by examining column separation efficiencies, organic composition, and solvent gradient. The results showed that a 1h maceration step was sufficient and that non-acidified solvents were more appropriate; a 70:30 acetone: water (v/v) solvent was ultimately selected. Using a Kinetex PFP column, the organic concentration, gradient, and flow rate were optimized to maximize the resolution of phenolic compounds in a short 30-min analysis time. The optimized method was applied to three lentil genotypes with different phenolic compound profiles to provide information of value to breeding programs.

  14. Development, optimization and validation of an HPLC-ELSD method for the analysis of enzymatically generated lactulose and saccharide by-products.

    Science.gov (United States)

    Schmidt, Christian M; Zürn, Tanja; Thienel, Katharina J F; Hinrichs, Jörg

    2017-01-15

    The aim of this study was to develop an HPLC-ELSD method for the quantification of lactulose in complex sugar solutions. Lactulose is a well-known prebiotic and supports the alleviation of digestive disorders. The enzymatic generation of lactulose requires fructose as nucleophilic acceptor. By-products such as glucose and galactose are generated. Four amino-modified silica-columns were tested and compared. The most suitable column based on peak resolution was used to optimize the method. Furthermore, sample preparation was optimized for the recovery of analytes. During the validation step, the following parameters were determined (e.g. for lactulose): recovery (106±7%), precision (98%), correctness (99%), limit of detection (3.9mg/L), limit of quantification (13.4mg/L) and linearity (0.993). The validated method was applied to samples from an enzymatic process for the production of lactulose at the laboratory scale. A final lactulose concentration of 6.7±0.4g/L was determined. PMID:27542485
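The reported LOD and LOQ follow the common ICH Q2-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual (or blank) standard deviation and S the calibration slope. A minimal sketch with illustrative σ and S values (not taken from the paper):

```python
def detection_limits(sigma, slope):
    # ICH Q2-style limits from calibration data:
    # sigma: standard deviation of the response (signal units)
    # slope: calibration slope (signal units per mg/L)
    lod = 3.3 * sigma / slope   # limit of detection, mg/L
    loq = 10.0 * sigma / slope  # limit of quantification, mg/L
    return lod, loq

# illustrative values chosen so that the LOD lands near 3.9 mg/L
lod, loq = detection_limits(sigma=0.59, slope=0.5)
```

Note that this formula fixes the ratio LOQ/LOD at 10/3.3 ≈ 3.0; the slightly larger ratio reported above (13.4/3.9 ≈ 3.4) suggests the limits were determined experimentally rather than from a single σ.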

  15. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  16. Development of a technique using MCNPX code for determination of nitrogen content of explosive materials using prompt gamma neutron activation analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N., E-mail: mnnasrabadi@ast.ui.ac.ir [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of); Bakhshi, F.; Jalali, M.; Mohammadi, A. [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)

    2011-12-11

    Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used to detect the 10.8 MeV prompt gamma ray that follows radiative neutron capture by ¹⁴N nuclei. We aimed to study the feasibility of using field-portable prompt gamma neutron activation analysis (PGNAA) along with improved nuclear equipment to detect and identify explosives, illicit substances or landmines. A ²⁵²Cf radio-isotopic source was embedded in a cylinder made of high-density polyethylene (HDPE) and the cylinder was then placed in another cylindrical container filled with water. Measurements were performed on high nitrogen content compounds such as melamine (C₃H₆N₆). Melamine powder in an HDPE bottle was placed underneath the vessel containing water and the neutron source. Gamma rays were detected using two NaI(Tl) crystals. The results were simulated with MCNP4c code calculations. The theoretical calculations and experimental measurements were in good agreement, indicating that this method can be used for the detection of explosives and illicit drugs.

  17. Development and validation of a hydrophilic interaction chromatography method coupled with a charged aerosol detector for quantitative analysis of nonchromophoric α-hydroxyamines, organic impurities of metoprolol.

    Science.gov (United States)

    Xu, Qun; Tan, Shane; Petrova, Katya

    2016-01-25

    The European Pharmacopeia (EP) metoprolol impurities M and N are polar, nonchromophoric α-hydroxyamines, which are poorly retained in a conventional reversed-phase chromatographic system and are invisible to UV detection. Impurities M and N are currently analyzed by TLC methods in the EP as specified impurities and in the United States Pharmacopeia-National Formulary (USP-NF) as unspecified impurities. In order to modernize the USP monographs of metoprolol drug substances and related drug products, a hydrophilic interaction chromatography (HILIC) method coupled with a charged aerosol detector (CAD) was explored for the analysis of the two impurities. A comprehensive column screening that covers a variety of HILIC stationary phases (underivatized silica, amide, diol, amino, zwitterionic, polysuccinimide, cyclodextrin, and mixed-mode) and optimization of HPLC conditions led to the identification of a Halo Penta HILIC column (4.6 × 150 mm, 5 μm) and a mobile phase comprising 85% acetonitrile and 15% ammonium formate buffer (100 mM, pH 3.2). Efficient separations of metoprolol, succinic acid, and EP metoprolol impurities M and N were achieved within a short time frame. The method was applied to the analysis of the drug substance (metoprolol succinate) and drug products (metoprolol tartrate injection and metoprolol succinate extended-release tablets). PMID:26580821

  18. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  19. Root Cause Analysis: Methods and Mindsets.

    Science.gov (United States)

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  20. Development of test methods for textile composites

    Science.gov (United States)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  1. Cost Analysis: Methods and Realities.

    Science.gov (United States)

    Cummings, Martin M.

    1989-01-01

    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  2. Developing numerical methods for experimental data processing

    International Nuclear Information System (INIS)

    The study of materials relies on experimental measurements whose results are always affected by noise. To perform numerical data processing, for instance preparatory smoothing before numerical differentiation, it is necessary to avoid instabilities, which requires extracting the noise from the experimental data. When a large amount of data can be obtained, many noise-related problems can be solved by using statistical indicators. In the case of high-cost experiments or problems of a unique type, the task of extracting useful information about given material parameters is of paramount significance. The paper presents several numerical methods for processing experimental data developed at INR Pitesti. These were employed in treating the experimental data obtained in nuclear materials studies aimed at materials characterization and fabrication technology development. To refine these methods and determine their accuracy on real experimental data, computerized simulations were used extensively. The methods cover the transfer relations for important statistical indicators in the case of mediated measurements, techniques to increase the resolution of measurements carried out with linear detectors, and the numerical smoothing of experimental data. A figure is given with results obtained by applying the numerical smoothing method to experimental data from X-ray diffraction measurements on Zircaloy-4. The numerical methods developed were applied in studies of the structural materials used in the CANDU 600 reactor and of advanced CANDU-type fuels, as well as of natural uranium or thorium and thorium-uranium fuel pellets. These methods helped to increase the accuracy and confidence level of the measurements
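The report names numerical smoothing of experimental data among the methods but does not spell out an algorithm. As a hedged illustration of the simplest such scheme, a centered moving average (the window width here is an assumption for the example, not a value from the source):

```python
def moving_average(data, window=5):
    # centered moving average; the window shrinks near the edges
    # so the output has the same length as the input
    half = window // 2
    smoothed = []
    for i in range(len(data)):
        lo = max(0, i - half)
        hi = min(len(data), i + half + 1)
        smoothed.append(sum(data[lo:hi]) / (hi - lo))
    return smoothed

noisy = [0.0, 1.1, 0.4, 1.0, 0.5, 1.2, 0.3, 1.1]  # illustrative noisy series
smooth = moving_average(noisy, window=3)
```

Smoothing of this kind suppresses high-frequency noise before numerical differentiation, which would otherwise amplify it.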

  3. Analysis methods for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  4. Development of a nondestructive method for underglaze painted tiles--demonstrated by the analysis of Persian objects from the nineteenth century.

    Science.gov (United States)

    Reiche, Ina; Röhrs, Stefan; Salomon, Joseph; Kanngiesser, Birgit; Höhn, Yvonne; Malzer, Wolfgang; Voigt, Friederike

    2009-02-01

    The paper presents an analytical method developed for the nondestructive study of nineteenth-century Persian polychrome underglaze painted tiles. As an example, 9 tiles from French and German museum collections were investigated. Before this work was undertaken little was known about the materials used in pottery at that time, although the broad range of colors and shades, together with their brilliant glazes, made these objects stand out when compared with Iranian ceramics of the preceding periods and suggested the use of new pigments, colorants, and glaze compositions. These materials are thought to be related to provenance and as such appropriate criteria for art-historical attribution. The analytical method is based on the combination of different nondestructive spectroscopic techniques using microfocused beams such as proton-induced X-ray emission/proton-induced gamma-ray emission, X-ray fluorescence, 3D X-ray absorption near edge structure, and confocal Raman spectroscopy and also visible spectroscopy. It was established to address the specific difficulties these objects and the technique of underglaze painting raise. The exact definition of the colors observed on the tiles using the Natural Color System helped to attribute them to different colorants. It was possible to establish the presence of Cr- and U-based colorants as new materials in nineteenth-century Persian tilemaking. The difference in glaze composition (Pb, Sn, Na, and K contents) as well as the use of B and Sn were identified as a potential marker for different workshops. PMID:19030848

  5. Using numerical analysis to develop and evaluate the method of high temperature sous-vide to soften carrot texture in different-sized packages.

    Science.gov (United States)

    Hong, Yoon-Ki; Uhm, Joo-Tae; Yoon, Won Byong

    2014-04-01

    The high-temperature sous-vide (HTSV) method was developed to prepare carrots with a soft texture at the appropriate degree of pasteurization. The effect of heating conditions, such as temperature and time, was investigated on various package sizes. Heating temperatures of 70, 80, and 90 °C and heating times of 10 and 20 min were used to evaluate the HTSV method. A 3-dimensional conduction model and numerical simulations were used to estimate the temperature distribution and the rate of heat transfer to samples with various geometries. Four different-sized packages were prepared by stacking carrot sticks of identical size (9.6 × 9.6 × 90 mm) in a row. The sizes of the packages used were as follows: (1) 9.6 × 86.4 × 90, (2) 19.2 × 163.2 × 90, (3) 28.8 × 86.4 × 90, and (4) 38.4 × 86.4 × 90 mm. Although only a moderate change in color (L*, a*, and b*) was observed following HTSV cooking, there was a significant decrease in carrot hardness. The geometry of the package and the heating conditions significantly influenced the degree of pasteurization and the final texture of the carrots. Numerical simulations successfully described the effect of geometry on samples at different heating conditions.
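The abstract does not give the conduction model itself. As a hedged one-dimensional sketch of the idea, an explicit finite-difference solution of ∂T/∂t = α ∂²T/∂x² across the 9.6 mm thickness of a carrot stick, with both surfaces held at the bath temperature (the diffusivity, grid, and time values are illustrative assumptions, not parameters from the paper):

```python
def heat_1d(T0, T_bath, alpha, dx, dt, steps):
    # explicit finite-difference conduction in a slab:
    # T0: initial interior temperature (deg C), T_bath: water temperature,
    # alpha: thermal diffusivity (m^2/s), dx: grid spacing (m), dt: step (s)
    n = 11                    # grid points across the 9.6 mm stick
    T = [T0] * n
    T[0] = T[-1] = T_bath     # both surfaces held at bath temperature
    r = alpha * dt / dx ** 2  # must stay <= 0.5 for stability
    assert r <= 0.5
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = Tn
    return T

# assumed alpha ~ 1.4e-7 m^2/s (typical vegetable tissue), dx = 0.96 mm
T = heat_1d(T0=20.0, T_bath=90.0, alpha=1.4e-7, dx=0.96e-3, dt=1.0, steps=600)
center = T[5]  # center temperature after 10 min of heating
```

Running the same scheme for thicker stacks (larger n·dx) shows why the larger packages lag in temperature and hence in the degree of pasteurization.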

  6. Development and validation of a simple thin-layer chromatographic method for the analysis of p-chlorophenol in treated wastewater

    Directory of Open Access Journals (Sweden)

    Tešić Živoslav

    2012-01-01

    Full Text Available A thin-layer chromatographic method with densitometric detection was established for the quantification of p-chlorophenol in wastewater. The degradation efficiency of p-chlorophenol was monitored after each treatment of the wastewater samples. Degradation of p-chlorophenol was performed with advanced oxidation processes (AOPs) using UV, H2O2/UV, O3/H2O2/UV, O3 and O3/UV. The developed TLC procedure has been found to be simple, rapid and precise. The method was characterized by high sensitivity (limit of detection 11 ng per band and limit of quantification 35 ng per band), a linear range (from 75 to 500 ng per band, r = 0.9965), and high precision, accuracy (mean percentage recovery 98.6%), and specificity. Additionally, the efficiency of degradation was monitored using HPLC, giving results comparable with the RP-TLC measurements. [Acknowledgement. This work was performed within the framework of the research project No. 172017 supported by the Ministry of Education and Science of Serbia.]

  7. Using numerical analysis to develop and evaluate the method of high temperature sous-vide to soften carrot texture in different-sized packages.

    Science.gov (United States)

    Hong, Yoon-Ki; Uhm, Joo-Tae; Yoon, Won Byong

    2014-04-01

    The high-temperature sous-vide (HTSV) method was developed to prepare carrots with a soft texture at the appropriate degree of pasteurization. The effect of heating conditions, such as temperature and time, was investigated on various package sizes. Heating temperatures of 70, 80, and 90 °C and heating times of 10 and 20 min were used to evaluate the HTSV method. A 3-dimensional conduction model and numerical simulations were used to estimate the temperature distribution and the rate of heat transfer to samples with various geometries. Four different-sized packages were prepared by stacking carrot sticks of identical size (9.6 × 9.6 × 90 mm) in a row. The sizes of the packages used were as follows: (1) 9.6 × 86.4 × 90, (2) 19.2 × 163.2 × 90, (3) 28.8 × 86.4 × 90, and (4) 38.4 × 86.4 × 90 mm. Although only a moderate change in color (L*, a*, and b*) was observed following HTSV cooking, there was a significant decrease in carrot hardness. The geometry of the package and the heating conditions significantly influenced the degree of pasteurization and the final texture of the carrots. Numerical simulations successfully described the effect of geometry on samples at different heating conditions. PMID:24689882

  8. Method development for the determination of bromine in coal using high-resolution continuum source graphite furnace molecular absorption spectrometry and direct solid sample analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Éderson R.; Castilho, Ivan N.B. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Welz, Bernhard, E-mail: w.bernardo@terra.com.br [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Gois, Jefferson S. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Borges, Daniel L.G. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Carasek, Eduardo [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Andrade, Jailson B. de [Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil)

    2014-06-01

    This work reports a simple approach for Br determination in coal using direct solid sample analysis in a graphite tube furnace and high-resolution continuum source molecular absorption spectrometry. The molecular absorbance of the calcium mono-bromide (CaBr) molecule has been measured using the rotational line at 625.315 nm. Different chemical modifiers (zirconium, ruthenium, palladium and a mixture of palladium and magnesium nitrates) have been evaluated in order to increase the sensitivity of the CaBr absorption, and Zr showed the best overall performance. The pyrolysis and vaporization temperatures were 800 °C and 2200 °C, respectively. Accuracy and precision of the method have been evaluated using certified coal reference materials (BCR 181, BCR 182, NIST 1630a, and NIST 1632b) with good agreement (between 98 and 103%) with the informed values for Br. The detection limit was around 4 ng Br, which corresponds to about 1.5 μg g⁻¹ Br in coal, based on a sample mass of 3 mg. In addition, the results were in agreement with those obtained using electrothermal vaporization inductively coupled plasma mass spectrometry, based on a Student t-test at a 95% confidence level. A mechanism for the formation of the CaBr molecule is proposed, which might be considered for other diatomic molecules as well. - Highlights: • Bromine has been determined in coal using direct solid sample analysis. • Calibration has been carried out against aqueous standard solutions. • The coal samples and the molecule-forming reagent have been separated in order to avoid interferences. • The results make it possible to draw conclusions about the mechanisms of molecule formation.
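The conversion from the absolute detection limit to the concentration limit quoted above is a direct mass ratio (ng per mg equals μg per g). A minimal sketch:

```python
def relative_limit(absolute_ng, sample_mass_mg):
    # an absolute limit in ng divided by a sample mass in mg
    # is already a concentration in ug/g (both units scale by 1000)
    return absolute_ng / sample_mass_mg

limit_ug_per_g = relative_limit(4.0, 3.0)  # ~1.3 ug/g for a 3 mg sample
```

With both inputs being round approximations, this is in line with the "about 1.5 μg g⁻¹" quoted in the abstract.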

  9. Development of a sensitive and reliable high performance liquid chromatography method with fluorescence detection for high-throughput analysis of multi-class mycotoxins in Coix seed.

    Science.gov (United States)

    Kong, Wei-Jun; Li, Jun-Yuan; Qiu, Feng; Wei, Jian-He; Xiao, Xiao-He; Zheng, Yuguo; Yang, Mei-Hua

    2013-10-17

    As an edible and medicinal plant, Coix seed is readily contaminated by more than one group of mycotoxins resulting in potential risk to human health. A reliable and sensitive method has been developed to determine seven mycotoxins (aflatoxins B1, B2, G1, G2, zearalenone, α-zearalenol, and β-zearalenol) simultaneously in 10 batches of Coix seed marketed in China. The method is based on a rapid ultrasound-assisted solid-liquid extraction (USLE) using methanol/water (80/20) followed by immunoaffinity column (IAC) clean-up, on-line photochemical derivatization (PCD), and high performance liquid chromatography coupled with fluorescence detection (HPLC-FLD). Careful optimization of extraction, clean-up, separation and detection conditions was accomplished to increase sample throughput and to attain rapid separation and sensitive detection. Method validation was performed by analyzing samples spiked at three different concentrations for the seven mycotoxins. Recoveries were from 73.5% to 107.3%, with relative standard deviations (RSDs) lower than 7.7%. The intra- and inter-day precisions, expressed as RSDs, were lower than 4% for all studied analytes. Limits of detection and quantification ranged from 0.01 to 50.2 μg kg(-1), and from 0.04 to 125.5 μg kg(-1), respectively, which were below the tolerance levels for mycotoxins set by the European Union. Samples that tested positive were further analyzed by HPLC tandem electrospray ionization mass spectrometry for confirmatory purposes. This is the first application of USLE-IAC-HPLC-PCD-FLD for detecting the occurrence of multi-class mycotoxins in Coix seed. PMID:24091376
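The recovery and RSD figures used for validation follow standard definitions. A minimal sketch of computing both from replicate spiked measurements (the replicate values and spike level are illustrative, not the paper's data):

```python
import statistics

def recovery_percent(measured_mean, spiked_amount, blank_mean=0.0):
    # recovery (%) = (found - background) / added * 100
    return (measured_mean - blank_mean) / spiked_amount * 100.0

def rsd_percent(values):
    # relative standard deviation (%) of replicate measurements
    return statistics.stdev(values) / statistics.mean(values) * 100.0

replicates = [9.8, 10.3, 10.1, 9.9, 10.4]  # found amounts, ug/kg (illustrative)
rec = recovery_percent(statistics.mean(replicates), spiked_amount=10.0)
rsd = rsd_percent(replicates)
```

Spiking at three concentration levels, as done in the study, repeats this calculation per level to confirm that recovery and precision hold across the working range.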

  11. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  12. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    覃卓敏

    2008-01-01

    From the macroscopic angle of function, this paper clarifies the methods of discourse analysis and identifies two important methods in pragmatics that can be better applied to the understanding of discourse.

  13. Development of medical application methods using radiation. Radionuclide therapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul, (Korea, Republic of); Oh, B. H. [Seoul National University. Hospital, Seoul (Korea, Republic of); Hong, H. J. [Antibody Engineering Research Unit, Taejon (Korea, Republic of)

    1999-04-01

    In this project, we studied the following subjects: (1) development of monoclonal antibodies and radiopharmaceuticals; (2) clinical applications of radionuclide therapy; (3) radioimmunoguided surgery; (4) prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients and other diseases in a multi-center trial. (2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry, and other basic life science research. (3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research fields such as radiation oncology and radiation biology.

  14. Development of medical application methods using radiation. Radionuclide therapy

    International Nuclear Information System (INIS)

    In this project, we studied the following subjects: (1) development of monoclonal antibodies and radiopharmaceuticals; (2) clinical applications of radionuclide therapy; (3) radioimmunoguided surgery; (4) prevention of restenosis with intracoronary radiation. The results can be applied to the following objectives: 1) radionuclide therapy will be applied in clinical practice to treat cancer patients and other diseases in a multi-center trial. 2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry, and other basic life science research. 3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research fields such as radiation oncology and radiation biology

  15. Development of a Chemoenzymatic-like and Photoswitchable Method for the High-Throughput creation of Protein Microarrays. Application to the Analysis of the Protein/Protein Interactions Involved in the YOP Virulon from Yersinia pestis.

    Energy Technology Data Exchange (ETDEWEB)

    Camarero, J A

    2006-12-07

    Protein arrays are ideal tools for the rapid analysis of whole proteomes as well as for the development of reliable and cheap biosensors. The objective of this proposal is to develop a new ligand-assisted ligation method based on the naturally occurring protein trans-splicing process. This method has been used for the generation of spatially addressable arrays of multiple protein components by standard micro-lithographic techniques. Key to our approach is the use of the protein trans-splicing process. This naturally occurring process allows the development of a truly generic and highly efficient method for the covalent attachment of proteins through their C-termini to any solid support. This technology has been used for the creation of protein chips containing several virulence factors from the human pathogen Y. pestis.

  16. Developing a TQM quality management method model

    NARCIS (Netherlands)

    Zhang, Zhihai

    1997-01-01

    From an extensive review of the total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed.

  17. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality.

  18. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality.

  19. New Developments of the Shared Concern Method.

    Science.gov (United States)

    Pikas, Anatol

    2002-01-01

    Reviews and describes new developments in the Shared Concern method (SCm), a tool for tackling group bullying amongst teenagers by individual talks. The psychological mechanisms of healing in the bully group and what hinders the bully therapist in eliciting them have become better clarified. The most important recent advancement of the SCm…

  20. The development of spectrophotometric and electroanalytical methods for ascorbic acid and acetaminophen and their applications in the analysis of effervescent dosage forms.

    Science.gov (United States)

    Săndulescu, R; Mirel, S; Oprean, R

    2000-08-01

    The electroanalytical study of ascorbic acid, acetaminophen and of several mixtures of these compounds in different ratios has been made by using a carbon paste electrode (CPE-graphite:solid paraffin 2:1) as the working electrode and an Ag/AgCl reference electrode. The potential curves were recorded using different concentrations of ascorbic acid and acetaminophen by measuring samples between 10 and 50 microl. The oxidation reactions were studied in a potential range from -0.1 to +1.3 V with different sweep rates, at different current sensitivities, in stationary working conditions and with stirring before each replicate. The oxidation of ascorbic acid occurs at +0.31 +/- 0.02 V and the oxidation of acetaminophen at +0.60 +/- 0.05 V; meanwhile, the current varies linearly over the following concentration ranges: 10(-3)-10(-2) M for ascorbic acid and 3 x 10(-6)-7.5 x 10(-3) M for acetaminophen (r2 = 0.999 for both ascorbic acid and acetaminophen). The mixtures of ascorbic acid and acetaminophen were made as follows: 1:1, 1:2, 1:3, 2:1, and 3:1. The studies revealed alterations of the voltammograms, which were processed according to the validation methodology. The best potential range for different current sensitivities and the influences of the sweep rate, the solvent volume and the pH were studied. The mutual interferences of the compounds in the mixtures and of the electroactive compounds in the pharmaceutical dosage forms, especially effervescent ones, were also investigated. The same mixtures were studied using the direct spectrophotometric method, which revealed numerous spectral interferences. To solve this problem, an appropriate separation or an indirect spectrophotometric method (the apparent content curves method) was used. The spectrophotometric and voltammetric methods developed were used to determine ascorbic acid and acetaminophen in different dosage forms (vials, tablets, suppositories and effervescent dosage forms).

  1. Development of advanced nodal diffusion methods for modern computer architectures

    International Nuclear Information System (INIS)

    A family of highly efficient multidimensional multigroup advanced neutron-diffusion nodal methods, ILLICO, were implemented on sequential, vector, and vector-concurrent computers. Three-dimensional realistic benchmark problems can be solved in vectorized mode in less than 0.73 s (33.86 Mflops) on a Cray X-MP/48. Vector-concurrent implementations yield speedups as high as 9.19 on an Alliant FX/8. These results show that the ILLICO method preserves essentially all of its speed advantage over finite-difference methods. A self-consistent higher-order nodal diffusion method was developed and implemented. Nodal methods for global nuclear reactor multigroup diffusion calculations which account explicitly for heterogeneities in the assembly nuclear properties were developed and evaluated. A systematic analysis of the zero-order variable cross section nodal method was conducted. Analyzing the KWU PWR depletion benchmark problem, it is shown that when burnup heterogeneities arise, ordinary nodal methods, which do not explicitly treat the heterogeneities, suffer a significant systematic error that accumulates. A nodal method that treats explicitly the space dependence of diffusion coefficients was developed and implemented. A consistent burnup-correction method for nodal microscopic depletion analysis was developed

  2. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  3. Development of a novel two-stage liquid desiccant dehumidification system assisted by CaCl2 solution using exergy analysis method

    International Nuclear Information System (INIS)

    Air conditioning systems based on liquid desiccants have been recognized as efficient HVAC systems for independent control of air humidity. To improve the thermal coefficient of performance, a novel two-stage liquid desiccant dehumidification system assisted by calcium chloride (CaCl2) solution is developed through exergy analysis based on the second law of thermodynamics. Compared with the basic liquid desiccant dehumidification system, the proposed system is improved in two ways, i.e. by increasing the concentration variance and by the pre-dehumidification of CaCl2. The exergy loss in the desiccant-desiccant heat recovery process can be significantly reduced by increasing the desiccant concentration variance between the strong desiccant solution after regeneration and the weak desiccant solution after dehumidification. Meanwhile, the pre-dehumidification of CaCl2 solution can reduce the irreversibility in the regeneration/dehumidification process. Compared to the basic system, the thermal coefficient of performance and exergy efficiency of the proposed system are increased from 0.24 to 0.73 and from 6.8% to 23.0%, respectively, under the given conditions. The useful energy storage capacities of CaCl2 solution and LiCl solution at a concentration of 40% reach 237.8 and 395.1 MJ/m3, respectively. The effects of desiccant regeneration temperature, air mass flux, desiccant mass flux, etc., on the performance of the proposed system are also analyzed.
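
    Taking the performance figures quoted in this record at face value, the relative gains of the proposed system over the basic one can be checked with simple ratios (an illustrative sketch, not from the paper):

```python
# Ratio check on the abstract's figures: thermal COP rises from 0.24 to
# 0.73 and exergy efficiency from 6.8% to 23.0% (proposed vs basic system).
def improvement_factor(base: float, proposed: float) -> float:
    return proposed / base

print(round(improvement_factor(0.24, 0.73), 2))    # → 3.04 (thermal COP)
print(round(improvement_factor(0.068, 0.230), 2))  # → 3.38 (exergy efficiency)
```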

  4. Exploration of Enterprise Development Strategy Analysis Method Based on Enterprise Value System Methodology

    Institute of Scientific and Technical Information of China (English)

    陈向荣

    2014-01-01

    This paper first reviews value engineering theory and the historical development and theoretical system of enterprise value system methodology. It then proposes an enterprise development strategy analysis method based on enterprise value system methodology, which uses PEST analysis, Porter's five forces model, SWOT analysis and AHP for environmental analysis, uses RWFJ analysis to analyze the coupling relationships of key elements with activities and the environment, and applies Boston matrix analysis to formulate a business portfolio development strategy. Finally, conclusions and precautions are given.

  5. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted upon activation of luminescent luciferase. The photons measured by this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using our lab-made animal light imaging system (ALIS) and two different kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. Using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time remained constant even when different light sources were applied. The quantification function for linear response could be applicable to the standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, since constant light-emitting sources show a linear response to measurement time.
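
    The core idea of this record — that cumulative photon counts from a constant source grow linearly with measurement time, so the counts-to-time ratio is a source-specific constant — can be sketched as follows (hypothetical emission rate, not the ALIS data):

```python
# Sketch of the linear-response idea: for a constant-emission source the
# cumulative photon count grows linearly with measurement time, so the
# counts/time ratio is constant for that source. The rate is hypothetical.
def counts(rate_per_s: float, t_s: float) -> float:
    return rate_per_s * t_s

ratios = [counts(500.0, t) / t for t in (1.0, 5.0, 30.0)]
print(all(abs(r - ratios[0]) < 1e-9 for r in ratios))  # → True
```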

  6. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov

    2012-01-01

    The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of method application, the following criteria have been proposed: the volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to these criteria, a classification of vibration diagnostics methods is presented to determine their advantages and disadvantages and to identify directions for their development and improvement. The paper contains a comparative estimation of the methods in accordance with the proposed criteria. According to this estimation, the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
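
    The spectral analysis singled out as most efficient here reduces, at its simplest, to locating the dominant peak in the signal's spectrum. A minimal pure-Python sketch with a synthetic signal (real diagnostics would use an FFT and, for envelope analysis, demodulation):

```python
# Minimal sketch of spectral analysis for vibration diagnostics: a naive
# one-sided DFT locates the dominant frequency of a synthetic signal.
import cmath
import math

def dominant_bin(signal):
    n = len(signal)
    mags = []
    for k in range(n // 2):  # one-sided spectrum
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    return max(range(len(mags)), key=mags.__getitem__)

fs = n = 128  # 1 s of data at 128 Hz -> 1 Hz bin spacing
x = [math.sin(2 * math.pi * 20 * t / fs) for t in range(n)]  # 20 Hz tone
print(dominant_bin(x))  # → 20
```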

  7. The Qualitative Method of Impact Analysis.

    Science.gov (United States)

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  8. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn M.; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples.
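
    The three-tier classification described above can be pictured as a simple rule-based screen. The thresholds and scores below are invented for illustration and are not the USGS procedure's actual criteria.

```python
# Hypothetical sketch of a tiered screen like the one described: high
# occurrence or toxicity -> Tier 1, moderate -> Tier 2, otherwise Tier 3.
# Thresholds are invented, not the USGS criteria.
def tier(detection_freq: float, toxicity_score: float) -> int:
    if detection_freq >= 0.10 or toxicity_score >= 0.8:
        return 1  # high priority
    if detection_freq >= 0.01 or toxicity_score >= 0.5:
        return 2  # moderate priority
    return 3      # low priority

print([tier(0.25, 0.2), tier(0.02, 0.3), tier(0.001, 0.1)])  # → [1, 2, 3]
```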

  9. Development of Methods of Analysis and Synthesis for Conformal Antennas

    Institute of Scientific and Technical Information of China (English)

    刘元柱; 肖绍球; 唐明春; 王秉中

    2011-01-01

    Conformal antenna array technology is an important direction in the development of antenna technology and has broad application prospects in military and civil radar and communication systems; its analysis and synthesis are hot and difficult research topics in the antenna field. This paper summarizes in detail the development of analysis and synthesis methods for conformal antennas and forecasts future trends in conformal antenna research.

  10. The Application of the SWOT Analysis Method in the Construction and Development of the Provincial Tumor Hospital's New District Outpatient Service

    Institute of Scientific and Technical Information of China (English)

    江锦平; 张敬; 赵翠霞; 单保恩; 王士杰; 席彪

    2014-01-01

    Objective: To investigate the application of the SWOT analysis method in the construction and development of a hospital's new district outpatient service. Methods: The construction and development of the provincial tumor hospital's new district outpatient service were analyzed using the SWOT method. Results: The analysis identified the strengths, weaknesses, opportunities and threats of the new district outpatient service; corresponding SO, WO, ST and WT strategies were established and implemented, enabling sustained growth in outpatient volume. Conclusion: The SWOT analysis method can help a new district outpatient service formulate practical and feasible development strategies and promote the healthy development of the hospital.

  11. Probabilistic structural analysis by extremum methods

    Science.gov (United States)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

  12. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables
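
    The cost-effectiveness ratio defined in this record is straightforward arithmetic; a sketch with hypothetical numbers:

```python
# Sketch of the cost-effectiveness ratio defined in the abstract:
# incremental cost per avoided health effect, relative to the base case.
# The numbers are hypothetical.
def cost_effectiveness(incremental_cost_usd: float,
                       avoided_health_effects: float) -> float:
    return incremental_cost_usd / avoided_health_effects

# $12M of extra treatment/disposal cost that avoids 3 projected health
# effects over the 10,000-year evaluation period:
print(cost_effectiveness(12e6, 3.0))  # → 4000000.0 ($ per avoided effect)
```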

  13. Improving Software Development Processes with Multicriteria Methods

    CERN Document Server

    Kornyshova, Elena; Salinesi, Camille

    2009-01-01

    All software development processes include steps where several alternatives induce a choice, a decision-making. Sometimes, methodologies offer a way to make decisions. However, in a lot of cases, the arguments to carry out the decision are very poor and the choice is made in an intuitive and hazardous way. The aim of our work is to offer a scientifically founded way to guide the engineer through tactical choices with the application of multicriteria methods in software development processes. This approach is illustrated with three cases: risks, use cases and tools within Rational Unified Process.

  14. Transport Test Problems for Hybrid Methods Development

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  15. A liquid chromatographic method for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food using matrix solid-phase dispersion in conjunction with a zero reference material as a method development tool.

    Science.gov (United States)

    Chase, G W; Eitenmiller, R R; Long, A R

    1999-01-01

    A liquid chromatographic method is described for analysis of all-rac-alpha-tocopheryl acetate and retinyl palmitate in medical food. The vitamins are extracted from medical food without saponification by matrix solid-phase dispersion and chromatographed by normal-phase chromatography with fluorescence detection. Retinyl palmitate and all-rac-alpha-tocopheryl acetate are quantitated isocratically with a mobile phase of 0.125% (v/v) and 0.5% (v/v) isopropyl alcohol in hexane, respectively. Results compared favorably with label declarations on retail medical foods. Recoveries determined on an analyte-fortified zero reference material for a milk-based medical food averaged 98.3% (n = 25) for retinyl palmitate spikes and 95.7% (n = 25) for all-rac-alpha-tocopheryl acetate spikes. Five concentrations were examined for each analyte, and results were linear (r2 = 0.995 for retinyl palmitate and 0.9998 for all-rac-alpha-tocopheryl acetate) over the concentration range examined, with coefficients of variation in the range 0.81-4.22%. The method provides a rapid, specific, and easily controlled assay for analysis of retinyl palmitate and all-rac-alpha-tocopheryl acetate in fortified medical foods. PMID:10028678

  16. Development of Gocing Storage Method for Cocoyam

    OpenAIRE

    Chukwu, G.O; Nwosu, K.I; Madu, T.U; Chinaka, C; B.C. Okoye

    2008-01-01

    Lack of good storage reduces the shelf life of harvested cocoyam (Colocasia spp and Xanthosoma spp) corms and cormels. This is a major challenge facing cocoyam farmers, processors, and marketers in Nigeria. The National Root Crops Research Institute (NRCRI), Umudike, Nigeria, which has a national mandate to research into root and tuber crops of economic importance, has developed the 'Gocing Storage' for improved storage of cocoyam. The paper highlights this improved method of storing cocoyam.

  17. Improving Software Development Processes with Multicriteria Methods

    OpenAIRE

    Kornyshova, Elena; Deneckere, Rebecca; Salinesi, Camille

    2008-01-01

    All software development processes include steps where several alternatives induce a choice, a decision-making. Sometimes, methodologies offer a way to make decisions. However, in a lot of cases, the arguments to carry out the decision are very poor and the choice is made in an intuitive and hazardous way. The aim of our work is to offer a scientifically founded way to guide the engineer through tactical choices with the application of multicriteria methods in software development processes.

  18. Development of a hydraulic turbine design method

    Science.gov (United States)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
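
    The Bezier-surface parameterization mentioned above rests on de Casteljau evaluation; a minimal 2-D curve sketch (illustrative, not the authors' implementation) shows how a handful of control points describe a smooth profile such as a blade-angle distribution:

```python
# Illustrative de Casteljau evaluation of a 2-D Bezier curve: a few
# control points describe a smooth profile. Not the authors' code.
def de_casteljau(points, t):
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [((1 - t) * p[0] + t * q[0], (1 - t) * p[1] + t * q[1])
               for p, q in zip(pts, pts[1:])]
    return pts[0]

ctrl = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]  # quadratic example
print(de_casteljau(ctrl, 0.5))  # → (0.5, 0.5)
```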

  19. Development of an Aerodynamic Analysis Method and Database for the SLS Service Module Panel Jettison Event Utilizing Inviscid CFD and MATLAB

    Science.gov (United States)

    Applebaum, Michael P.; Hall, Leslie, H.; Eppard, William M.; Purinton, David C.; Campbell, John R.; Blevins, John A.

    2015-01-01

    This paper describes the development, testing, and utilization of an aerodynamic force and moment database for the Space Launch System (SLS) Service Module (SM) panel jettison event. The database is a combination of inviscid Computational Fluid Dynamic (CFD) data and MATLAB code written to query the data at input values of vehicle/SM panel parameters and return the aerodynamic force and moment coefficients of the panels as they are jettisoned from the vehicle. The database encompasses over 5000 CFD simulations with the panels either in the initial stages of separation where they are hinged to the vehicle, in close proximity to the vehicle, or far enough from the vehicle that body interference effects are neglected. A series of viscous CFD check cases were performed to assess the accuracy of the Euler solutions for this class of problem and good agreement was obtained. The ultimate goal of the panel jettison database was to create a tool that could be coupled with any 6-Degree-Of-Freedom (DOF) dynamics model to rapidly predict SM panel separation from the SLS vehicle in a quasi-unsteady manner. Results are presented for panel jettison simulations that utilize the database at various SLS flight conditions. These results compare favorably to an approach that directly couples a 6-DOF model with the Cart3D Euler flow solver and obtains solutions for the panels at exact locations. This paper demonstrates a method of using inviscid CFD simulations coupled with a 6-DOF model that provides adequate fidelity to capture the physics of this complex multiple moving-body panel separation event.
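
    The database-query idea — returning aerodynamic coefficients interpolated at input parameter values — can be sketched in one dimension. The table values below are invented; the real database spans thousands of CFD cases and many vehicle/panel parameters.

```python
# Hypothetical 1-D sketch of the database query: tabulated aerodynamic
# coefficients are linearly interpolated at an input flight condition.
# Table values are invented; the real database is multi-parameter.
def interp1(x, xs, ys):
    if x <= xs[0]:
        return ys[0]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]

mach = [0.8, 1.2, 2.0]     # tabulated flight conditions (invented)
ca   = [0.45, 0.62, 0.50]  # panel axial-force coefficient (invented)
print(round(interp1(1.6, mach, ca), 3))  # → 0.56
```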

  20. Matrix methods for bare resonator eigenvalue analysis.

    Science.gov (United States)

    Latham, W P; Dente, G C

    1980-05-15

    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.
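The Fox and Li iteration mentioned above is, in essence, a power iteration on the discretized round-trip propagation operator: repeated application of the operator converges to the dominant mode. A toy sketch with a small matrix standing in for that operator (the matrix is not a physical resonator kernel):

```python
# Sketch: Fox-Li-style power iteration on a small matrix standing in for the
# discretized round-trip propagation operator of a bare resonator.
def power_iteration(A, steps=200):
    """Return the dominant eigenvalue magnitude and eigenvector of matrix A."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(steps):
        w = [sum(A[i][k] * v[k] for k in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
        lam = norm  # eigenvalue estimate via the normalization factor
    return lam, v

A = [[2.0, 0.0, 0.0],   # toy operator with eigenvalues 2.0, 1.0, 0.5
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 0.5]]
lam, v = power_iteration(A)
print(round(lam, 6))  # dominant eigenvalue -> 2.0
```

Matrix methods such as those in the paper instead extract several eigenvalues at once from the same discretized operator, rather than only the dominant one.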

  1. Recent Developments in the Methods of Estimating Shooting Distance

    Directory of Open Access Journals (Sweden)

    Arie Zeichner

    2002-01-01

    Full Text Available A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  2. Recent Developments in the Methods of Estimating Shooting Distance

    OpenAIRE

    Arie Zeichner; Baruch Glattstein

    2002-01-01

    A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  3. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  4. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
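Before the multivariate statistics emphasized above, community profiles (e.g., TRFLP peak tables) are typically compared with a dissimilarity measure. A minimal sketch using Bray-Curtis dissimilarity, with invented relative-abundance profiles (this is a generic first step, not the chapter's specific pipeline):

```python
# Sketch: comparing microbial community profiles with Bray-Curtis
# dissimilarity (0 = identical profiles). All profiles are invented.
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance profiles."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den

cow = [0.40, 0.30, 0.20, 0.10]     # hypothetical host fecal community profile
gull = [0.05, 0.15, 0.30, 0.50]
water = [0.10, 0.18, 0.28, 0.44]   # unknown environmental water sample

# The water sample is attributed to the more similar source profile.
print(bray_curtis(water, cow), bray_curtis(water, gull))
```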

  5. Current situation and development of analysis methods of power system transient stability

    Institute of Scientific and Technical Information of China (English)

    李晨; 蒋德珑; 程生安

    2012-01-01

    As power systems develop, interconnected grids grow ever larger and transient stability problems become increasingly serious; reliable transient stability analysis is therefore one of the keys to the safe operation of a power system. This paper reviews the history and current status of power system transient stability analysis by introducing the methods in common use, compares the features and applicability of each method in detail, and on that basis discusses the prospects for the field. It concludes that wavelet analysis has broad room for development in transient stability analysis, especially for processing transient signals, making it a research direction of real practical value.
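Among the common methods such reviews cover, time-domain simulation rests on numerically integrating the swing equation for each machine. A minimal single-machine, infinite-bus sketch with invented per-unit parameters (not from the paper); a bounded rotor-angle trajectory after the disturbance indicates stability:

```python
# Sketch: time-domain transient stability analysis via the swing equation,
# M * d(omega)/dt = Pm - Pmax*sin(delta). All parameters are invented.
import math

H, f0 = 5.0, 50.0             # inertia constant (s) and system frequency (Hz)
Pm, Pmax = 0.8, 1.5           # mechanical input and peak electrical power (pu)
M = 2.0 * H / (2.0 * math.pi * f0)
dt = 0.001

delta0 = math.asin(Pm / Pmax)      # pre-fault equilibrium rotor angle (rad)
delta, omega = delta0 + 0.5, 0.0   # angle advanced 0.5 rad by a cleared fault
angles = []
for _ in range(2000):              # 2 s of post-fault simulation
    Pe = Pmax * math.sin(delta)    # electrical power output
    omega += (Pm - Pe) / M * dt    # update speed deviation (rad/s)
    delta += omega * dt            # semi-implicit Euler step (energy-stable)
    angles.append(delta)

# Rotor angle stays well inside +/- pi, so this disturbance is stable.
print(round(math.degrees(max(angles)), 1))
```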

  6. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
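The linear combination assumption at the heart of DSM can be made concrete in a few lines: if the mixture is m = λ·s + (1 − λ)·r, then the wanted distribution r is recovered by linear separation once the mixing weight λ is known or estimated. The toy distributions and weight below are invented:

```python
# Sketch of DSM's linear-combination assumption: recover the "relevance"
# distribution r from the mixture m and seed "irrelevance" distribution s.
# Distributions and mixing weight lam are invented toy values.
def separate(m, s, lam):
    """Recover r from mixture m = lam*s + (1-lam)*r by linear separation."""
    r = [(mi - lam * si) / (1.0 - lam) for mi, si in zip(m, s)]
    r = [max(x, 0.0) for x in r]   # clip tiny negatives from estimation noise
    z = sum(r)
    return [x / z for x in r]      # renormalize to a probability distribution

r_true = [0.5, 0.3, 0.2]           # "relevance" distribution (unknown in practice)
s = [0.1, 0.2, 0.7]                # seed "irrelevance" distribution
lam = 0.4
m = [lam * si + (1 - lam) * ri for si, ri in zip(s, r_true)]
print(separate(m, s, lam))         # recovers r_true up to float error
```

In practice λ itself must be estimated, which is where the correlation- and divergence-based analyses discussed in the article come in.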

  7. Research and Development of Evaluation Method for Interview Skills in Acupuncture-and-Moxibustion Medical Treatment : Analysis based on lecture evaluation of interview skills

    OpenAIRE

    Kaneda, Daigo

    2012-01-01

    This research centered on the objective structured clinical examination (OSCE), which was recently introduced into an acupuncture-and-moxibustion training school. A key problem in the OSCE was the evaluation at the medical interview station. This research examined the validity and internal consistency of the evaluation criteria as a whole using the Cronbach alpha coefficient and factor analysis. The evaluation criteria consisted of 20 items across four factors and were validated in the factor analysis...
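The Cronbach alpha coefficient used in the study measures the internal consistency of a set of rating items. A minimal sketch of the computation, with invented scores (4 respondents x 3 items) purely to exercise the formula:

```python
# Sketch: Cronbach's alpha = k/(k-1) * (1 - sum(item variances)/variance(totals)).
# The score matrix is invented; the real study used 20 items.
def variance(xs):
    """Sample variance (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1.0 - item_var / variance(totals))

items = [[3, 4, 4, 5],   # item 1 scores for four respondents
         [2, 4, 5, 5],   # item 2
         [3, 5, 4, 5]]   # item 3
print(round(cronbach_alpha(items), 3))
```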

  8. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed...

  9. DEVELOPMENT OF A RISK SCREENING METHOD FOR CREDITED OPERATOR ACTIONS

    International Nuclear Information System (INIS)

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors aspects of proposed license amendments that impact human actions credited in a plant's safety analysis. The staff is committed to a graded approach to these reviews that focuses resources on the most risk-important changes. A risk-informed screening method was therefore developed, based on an adaptation of existing guidance for risk-informed regulation and human factors. The method uses both quantitative and qualitative information to divide amendment requests into different levels of review, and it was evaluated using a variety of tests. This paper summarizes the development of the methodology and the evaluations performed to verify its usefulness.

  10. Development of a reliable analytical method for the precise extractive spectrophotometric determination of osmium(VIII) with 2-nitrobenzaldehydethiocarbohydrazone: Analysis of alloys and real sample.

    Science.gov (United States)

    Zanje, Sunil B; Kokare, Arjun N; Suryavanshi, Vishal J; Waghmode, Duryodhan P; Joshi, Sunil S; Anuse, Mansing A

    2016-12-01

    The proposed method demonstrates that osmium(VIII) forms a complex with 2-NBATCH from 0.8 mol L⁻¹ HCl at room temperature. The complex formed was extracted into 10 mL of chloroform with a 5 min equilibration time. The absorbance of the red colored complex was measured at 440 nm against the reagent blank. Beer's law was obeyed in the range of 5-25 μg mL⁻¹; the optimum concentration range, evaluated from Ringbom's plot, was 10-20 μg mL⁻¹ of osmium(VIII). The molar absorptivity and Sandell's sensitivity of the osmium(VIII)-2NBATCH complex in chloroform are 8.94×10³ L mol⁻¹ cm⁻¹ and 0.021 μg cm⁻², respectively. The composition of the osmium(VIII)-2NBATCH complex was found to be 1:2 by Job's method of continuous variation, the mole ratio method and the slope ratio method. The interference of diverse ions was studied, and masking agents were used wherever necessary. The present method was successfully applied to the determination of osmium(VIII) in binary, ternary and synthetic mixtures corresponding to alloys and in real samples. The validity of the method was confirmed by the relative standard deviation for five determinations, which was 0.29%. PMID:27380306
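Given the reported molar absorptivity, quantification follows the Beer-Lambert law A = εlc. A sketch assuming a 1 cm path length and an invented absorbance reading (only ε and the osmium context come from the abstract):

```python
# Sketch: Beer-Lambert quantification, A = epsilon * l * c, using the reported
# molar absorptivity of the Os(VIII)-2NBATCH complex. Path length is an
# assumed 1 cm; the absorbance value is invented for illustration.
EPSILON = 8.94e3      # L mol^-1 cm^-1 (from the abstract)
PATH = 1.0            # cm, assumed cuvette path length
M_OS = 190.23         # g mol^-1, molar mass of osmium

def concentration_ug_per_ml(absorbance):
    """Convert a measured absorbance to osmium concentration in ug mL^-1."""
    molar = absorbance / (EPSILON * PATH)   # mol L^-1
    return molar * M_OS * 1000.0            # g L^-1 -> ug mL^-1

print(round(concentration_ug_per_ml(0.70), 2))  # within the 5-25 ug/mL range
```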

  11. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt MATH ". . . carefully structured with many detailed worked examples." -The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to...

  12. Gap analysis: Concepts, methods, and recent results

    Science.gov (United States)

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  13. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺

    2002-01-01

    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  14. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation; Analyse et developpement de methodes de raffinement hp en espace pour l'equation de transport des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, D.

    2011-10-10

    The different neutronic parameters have to be calculated with higher accuracy in order to design 4th-generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution for solving the neutron transport equation. The neutron flux, the solution of this equation, depends on energy, angle and space. The different variables are discretized successively: the energy with a multigroup approach, which considers the various quantities to be constant within each group, and the angle by a collocation method known as the SN approximation. Once the energy and angle variables are discretized, a system of spatially dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make the development of hp-refinement methods possible. The accuracy of the solution can thus be improved by spatial refinement (h-refinement), in which a cell is subdivided into sub-cells, or by order refinement (p-refinement), in which the order of the polynomial basis is increased. In this thesis, the properties of these methods are analyzed, showing the importance of the regularity of the solution in choosing the type of refinement. Two error estimators are used to drive the refinement process. Whereas the first requires strong regularity hypotheses (an analytical solution), the second assumes only the minimal hypotheses required for the solution to exist. The two estimators are compared on benchmarks where the analytic solution is known through the method of manufactured solutions, so that the behaviour of the solution with respect to regularity can be studied. This leads to an hp-refinement method using the two estimators. A comparison is then made with other existing methods on simplified, but also realistic, benchmarks derived from nuclear cores. These adaptive methods considerably reduce the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed.
Actually, as the
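The h-versus-p decision the thesis analyzes can be caricatured in a few lines: cells whose error estimate exceeds a tolerance are refined, with the estimated local regularity of the solution choosing between subdivision (h) and a higher polynomial order (p). The cell records, error values and smoothness indicator below are invented, not the thesis's actual estimators:

```python
# Sketch of an hp-refinement decision loop: refine cells over tolerance,
# using a smoothness indicator to pick h- vs p-refinement. Toy data only.
def hp_refine(cells, tol, smooth_threshold=0.5):
    """cells: list of dicts with 'width', 'order', 'error', 'smoothness'
    (smoothness in [0, 1], 1 = locally analytic-looking solution)."""
    out = []
    for c in cells:
        if c["error"] <= tol:
            out.append(dict(c))                     # accurate enough: keep
        elif c["smoothness"] >= smooth_threshold:   # smooth: raise order (p)
            out.append({**c, "order": c["order"] + 1})
        else:                                       # non-smooth: split cell (h)
            for _ in range(2):
                out.append({**c, "width": c["width"] / 2})
    return out

mesh = [{"width": 1.0, "order": 1, "error": 1e-4, "smoothness": 0.9},
        {"width": 1.0, "order": 1, "error": 1e-1, "smoothness": 0.9},
        {"width": 1.0, "order": 1, "error": 1e-1, "smoothness": 0.1}]
print(hp_refine(mesh, tol=1e-2))
```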

  15. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming

  16. Comparison of extraction methods for analysis of flavonoids in onions

    OpenAIRE

    Soeltoft, Malene; Knuthsen, Pia; Nielsen, John

    2008-01-01

    Onions are known to contain high levels of flavonoids, and a comparison of the efficiency, reproducibility and detection limits of various extraction methods was made in order to develop fast and reliable analytical methods for the analysis of flavonoids in onions. Conventional and classical methods are time- and solvent-consuming, and the presence of light and oxygen during sample preparation facilitates degradation reactions. Thus, classical methods were compared with microwave (irradiatio...

  17. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity, in finance for instance, paralleled its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  18. REVIEW: Development of methods for body composition studies

    Science.gov (United States)

    Mattsson, Sören; Thomas, Brian J.

    2006-07-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease.
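The oldest technique in the list, fat estimation from body density, reduces to a one-line formula. A sketch using Siri's classical two-compartment equation (the density value is an invented example measurement, not from the review):

```python
# Sketch: body fat percentage from whole-body density via Siri's (1961)
# two-compartment equation, the classical densitometric approach.
def siri_fat_percent(density_g_per_cm3):
    """Siri's equation: %fat = 495 / density - 450."""
    return 495.0 / density_g_per_cm3 - 450.0

print(round(siri_fat_percent(1.05), 1))  # e.g. density 1.05 g/cm^3 -> ~21.4 %fat
```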

  19. Analysis within the systems development life-cycle

    CERN Document Server

    Rock-Evans, Rosemary

    1987-01-01

    Analysis within the Systems Development Life-Cycle: Book 4, Activity Analysis-The Methods describes the techniques and concepts for carrying out activity analysis within the systems development life-cycle. Reference is made to the deliverables of data analysis, and more than one method of analysis, each a viable alternative to the others, is discussed. The "bottom-up" and "top-down" methods are highlighted. Comprised of seven chapters, this book illustrates how dependent data and activities are on each other. This point is especially brought home when the task of inventing new busin...

  20. Causal Moderation Analysis Using Propensity Score Methods

    Science.gov (United States)

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  1. ANALYSIS OF MODERN CAR BODY STRAIGHTENING METHODS

    Directory of Open Access Journals (Sweden)

    Arhun, Sch.

    2013-01-01

    Full Text Available An analysis of modern car body panel straightening methods is carried out. Both traditional and alternative methods of straightening car body panels are described. The relevance of magnetic pulse technology is substantiated, and the main advantages of magnetic pulse car body straightening are determined.

  2. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy anal

  3. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric Toolbox. Empirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN. The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric...
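The right-censored data the book treats are conventionally summarized by the Kaplan-Meier estimator, the nonparametric baseline on which empirical likelihood methods build. A minimal sketch (in Python rather than the book's R) with an invented six-subject dataset:

```python
# Sketch: Kaplan-Meier survival estimator for right-censored data. With tied
# times, deaths are conventionally processed before censorings. Toy data.
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns [(time, S(t))] at each distinct observed event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= (n_at_risk - deaths) / n_at_risk
            out.append((t, surv))
        n_at_risk -= ties
        i += ties
    return out

times = [2, 3, 3, 5, 7, 8]
events = [1, 1, 0, 1, 0, 1]   # two subjects censored, four deaths observed
print(kaplan_meier(times, events))
```

Empirical likelihood then treats the jump sizes of such an estimator as free parameters and profiles a likelihood over them.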

  4. Development and validation of an in-house quantitative analysis method for cylindrospermopsin using hydrophilic interaction liquid chromatography-tandem mass spectrometry: Quantification demonstrated in 4 aquatic organisms.

    Science.gov (United States)

    Esterhuizen-Londt, Maranda; Kühn, Sandra; Pflugmacher, Stephan

    2015-12-01

    The cyanobacterial toxin cylindrospermopsin (CYN) is of great concern in aquatic environments because of its incidence, multiple toxicity endpoints, and, therefore, the severity of health implications. It may bioaccumulate in aquatic food webs, resulting in high exposure concentrations to higher-order trophic levels, particularly humans. Because of accumulation at primary levels resulting from exposure to trace amounts of toxin, a sensitive analytical technique with proven aquatic applications is required. In the present study, a hydrophilic interaction liquid chromatographic-tandem mass spectrometric method with a lower limit of detection of 200 fg on column (signal-to-noise ratio = 3, n = 9) and a lower limit of quantification of 1 pg on column (signal-to-noise ratio = 11, n = 9) with demonstrated application in 4 aquatic organisms is described. The analytical method was optimized and validated with a linear range (r² = 0.999) from 0.1 ng mL⁻¹ to 100 ng mL⁻¹ CYN. Mean recovery of the extraction method was 98 ± 2%. Application of the method was demonstrated by quantifying CYN uptake in Scenedesmus subspicatus (green algae), Egeria densa (Brazilian waterweed), Daphnia magna (water flea), and Lumbriculus variegatus (blackworm) after 24 h of static exposure to 50 μg L⁻¹ CYN. Uptake ranged from 0.05% to 0.11% of the nominal CYN exposure amount. This constitutes a sensitive and reproducible method for extraction and quantification of unconjugated CYN with demonstrated application in 4 aquatic organisms, which can be used in further aquatic toxicological investigations. PMID:26126753
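Converting an LC-MS/MS response to a concentration typically runs through a linear calibration curve plus a recovery correction. In the sketch below only the 98% mean recovery and the linear-range context come from the abstract; the slope, intercept and peak area are invented numbers:

```python
# Sketch: linear-calibration quantification with recovery correction, the
# generic back end of a validated LC-MS/MS assay. Calibration values invented.
SLOPE, INTERCEPT = 1250.0, 30.0   # hypothetical peak area per (ng mL^-1)
RECOVERY = 0.98                   # mean extraction recovery from the abstract

def quantify(peak_area):
    """Peak area -> recovery-corrected CYN concentration (ng mL^-1)."""
    conc = (peak_area - INTERCEPT) / SLOPE   # invert the calibration line
    return conc / RECOVERY                   # correct for extraction losses

print(round(quantify(12530.0), 2))  # an uncorrected 10 ng/mL reading -> ~10.2
```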

  6. A Multiple Testing of the ABC Method and the Development of a Second-Generation Model. Part II, Test Results and an Analysis of Recall Ratio

    Science.gov (United States)

    ALTMANN, BERTHOLD

    After a brief summary of the test program (described more fully in LI 000 318), the statistical results, tabulated as overall "ABC (approach by concept) relevance ratios" and "ABC recall figures," are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet sozialwissenschaftlicher und…

  7. Analysis of intestinal flora development in breast-fed and formula-fed infants by using molecular identification and detection methods

    NARCIS (Netherlands)

    Harmsen, HJM; Wildeboer-Veloo, ACM; Raangs, GC; Wagendorp, AA; Klijn, N; Bindels, JG; Welling, GW

    2000-01-01

    Background: An obvious difference between breast-fed and formula-fed newborn infants is the development of the intestinal flora, considered to be of importance for protection against harmful micro-organisms and for the maturation of the intestinal immune system. In this study, novel molecular identi

  8. Epistemological development and judgments and reasoning about teaching methods.

    Science.gov (United States)

    Spence, Sarah; Helwig, Charles C

    2013-01-01

    Children's, adolescents', and adults' (N = 96 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by lectures versus class discussions) were systematically varied. Epistemological development was assessed in the aesthetics, values, and physical truth domains. All participants took the domain, nature of the topic, and teaching method into consideration in ways that showed age-related variations. Epistemological development in the value domain alone was predictive of preferences for class discussions and a critical perspective on teacher-centered direct instruction, even when age was controlled in the analysis.

  9. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)
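
    The Matlab implementation itself is attached to the paper; purely as an illustration, the kind of quantitative parameters such an analysis reports (mean and extreme temperatures, area above a threshold) can be sketched in a few lines of Python. All names and values below are invented, not taken from the paper.

```python
# Hypothetical sketch (not the paper's Matlab code): extracting simple
# quantitative parameters from a thermal image, represented here as a
# 2-D list of temperatures in degrees Celsius.

def thermal_stats(image, hot_threshold=30.0):
    """Return mean/min/max temperature and the pixel count above a threshold."""
    pixels = [t for row in image for t in row]
    n_hot = sum(1 for t in pixels if t >= hot_threshold)
    return {
        "mean": sum(pixels) / len(pixels),
        "min": min(pixels),
        "max": max(pixels),
        "hot_pixels": n_hot,
    }

image = [
    [28.1, 29.5, 30.2],
    [27.9, 31.0, 30.8],
    [28.4, 29.9, 28.7],
]
stats = thermal_stats(image)
print(stats["max"])         # 31.0
print(stats["hot_pixels"])  # 3
```

In a real application the thresholded pixel count would be converted to an area using the camera's spatial calibration.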

  10. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard…

  11. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis and lead to incorrect conclusions. To remedy the problem of outliers, robust methods have been developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the "good" data to primarily…
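
    The idea behind such robust methods can be illustrated with a minimal sketch (not from the paper): replacing the mean and standard deviation with the median and the median absolute deviation (MAD), so that outlying points barely influence the estimate. The data and cutoff below are invented.

```python
# Robust outlier screening using median and MAD instead of mean/std. dev.
import statistics

def mad_outliers(values, cutoff=3.5):
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    scale = 1.4826 * mad  # standard factor making MAD consistent with std. dev. for normal data
    return [v for v in values if abs(v - med) > cutoff * scale]

data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 25.0]  # one gross outlier
print(mad_outliers(data))  # [25.0]
```

Because the median and MAD are computed from the bulk of the data, the single gross outlier does not distort its own detection threshold, which is exactly the failure mode of mean/standard-deviation screening.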

  12. Development of microbial biosensors for food analysis

    DEFF Research Database (Denmark)

    Lukasiak, Justyna

    Microbial biosensors have been developed in order to fulfill the needs of different fields, from environmental sciences to the food industry. Moreover, they can be an answer to the need for novel, less expensive and environmentally neutral methods of analysis, particularly in food ingredient assessment. The aim of this PhD thesis was to develop, optimize and characterize various reporter strains utilizing different signal transducers and targeting carbohydrate constituents of pectin and arabinoxylan. Additionally, the objective was to assess the potential suitability of microbial biosensors for food ingredient analysis. Pectin is a plant heteropolysaccharide commonly used in the food industry as a gelling agent and food stabilizer. The chemical analysis of the pectin carbohydrate composition is a significant issue during the study of its function and properties. Arabinoxylan is one of the main non-starch polysaccharides derived from the cell wall of cereal…

  13. Development and prototypical application of analysis methods for complex anion mixtures in waters and heavy metal organyls in sediments; Entwicklung und prototypische Anwendung von Analysenverfahren fuer komplexe Anionengemische in Waessern und Schwermetallorganylen in Sedimenten. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Anders, B.; Knoechel, A.; Potgeter, H.; Staub, S.; Stocker, M.

    2002-07-01

    When it comes to assessing the hazards emanating from heavy metal pollution in waters, mere elemental analysis provides too little information. Due to the great differences in toxicity and mobility, it is important to know the exact species in question. This is particularly true of the heavy metals that form stable organyls, specifically As, Pb, Sn and Hg, and also of alkylated arsenic acids, which need to be measured in complex anion mixtures. The purpose of the present project was to develop robust, powerful analysis methods and thus overcome the existing deficit in reliable analysis methods for these substances. An important approach in this connection is the use of coupled chromatography and detection systems for separation and analysis.

  14. Spatiotemporal analysis of an agent-based model of a colony of keratinocytes: a first approach for the development of validation methods

    OpenAIRE

    Pichardo-Almarza, C.; Smallwood, R.; Billings, S. A.

    2007-01-01

    Agent-based models are widely used for the simulation of systems from several domains (biology, economics, meteorology, etc.). In biology, agent-based models are very useful for predicting the social behaviour of systems; in particular, they seem well adapted to modelling the behaviour of a cell population. In this paper an agent-based model, developed to study normal human keratinocytes (tissue cells), will be investigated. This kind of model exhibits probabilistic behaviour…

  15. XRSW method, its application and development

    Energy Technology Data Exchange (ETDEWEB)

    Zheludeva, S.I.; Kovalchuk, M.V. [Russian Academy of Sciences, Institute of Crystallography, Moscow (Russian Federation)

    1996-09-01

    X-ray standing waves (XRSW) may be obtained under dynamical diffraction in periodic structures or under total external reflection (TR) conditions in a stratified medium. As the incident angle varies, the XRSW nodes and antinodes move in the direction perpendicular to the reflecting planes, leading to a drastic variation in the interaction of X-rays with matter and resulting in specific angular dependencies of secondary radiation yields (photoelectrons, fluorescence, internal photoeffect, photoluminescence, Compton and thermal diffuse scattering). The structural information, namely the position of the investigated atoms in the direction of XRSW movement (coherent position) and the distribution of atoms about this position (coherent fraction), is obtained with an accuracy of a few percent of the XRSW period D. The objects under investigation are semiconductor surface layers, heterostructures, multicomponent crystals, interfaces and adsorbed layers. In addition, developments of the XRSW method allow one to obtain structural, geometrical and optical parameters of ultrathin films (crystalline and disordered, organic and inorganic) and of nanostructures based on them.

  16. Real-time analysis of δ13C- and δD-CH4 in ambient air with laser spectroscopy: method development and first intercomparison results

    Directory of Open Access Journals (Sweden)

    S. Eyer

    2015-08-01

    Full Text Available In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called TRace gas EXtractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated (750 ppm, parts-per-million, μmole/mole) methane is 0.1 and 0.5 ‰ for δ13C- and δD-CH4 at 10 min averaging time. Based on replicate measurements of compressed air during a two-week intercomparison campaign, the repeatability of the TREX-QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique was compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX-QCLAS data and bag/flask sampling IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. Thus, the intercomparison also reveals the need for reference air samples with an accurately determined isotopic composition of CH4 to further improve interlaboratory compatibility.
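
    The δ notation used throughout this record has a simple definition: the sample isotope ratio expressed relative to a reference ratio, in per mil (‰). A minimal sketch, assuming the commonly cited VPDB reference ratio for 13C/12C; the sample ratio below is invented for illustration.

```python
# Delta notation: delta = (R_sample / R_reference - 1) * 1000, in per mil.
R_VPDB = 0.011180  # approximate 13C/12C ratio of the VPDB standard (assumed value)

def delta_permil(r_sample, r_reference):
    return (r_sample / r_reference - 1.0) * 1000.0

r_sample = 0.010657  # hypothetical measured 13C/12C ratio
print(round(delta_permil(r_sample, R_VPDB), 1))  # -46.8
```

A value near -47 ‰ is in the range typical of biogenic methane, which is why small differences of a few tenths of a per mil (the WMO goals quoted above) matter for source attribution.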

  17. Critical evaluation of the role of scientific analysis in UK local authority AQMA decision-making: method development and preliminary results.

    Science.gov (United States)

    Woodfield, N K; Longhurst, J W S; Beattie, C I; Laxen, D P H

    2003-07-20

    Over the past 4 years, local government in the UK has undertaken a process of scientific review and assessment of air quality, which has culminated in a suite of designated air quality management areas (AQMAs) in over 120 of the 403 local authorities in England (including London), Scotland and Wales. Methods to identify specific pollution hot-spots have involved the use of advanced and complex air-quality dispersion modelling and monitoring techniques, and the UK government has provided guidance on both the general and technical methods for undertaking local air quality review and assessments. Approaches to implementing UK air quality policy through the local air quality management (LAQM) process (Air Quality Strategy 2000) have not been uniform across the UK, an inevitable consequence of non-prescriptive guidelines. This has led to a variety of outcomes with respect to how different tools and techniques have been applied, how scientific uncertainty has been interpreted, and how caution has been exercised. A technique to appraise the scientific approaches undertaken by local government and a survey of local government officers involved in the LAQM process have been devised, and a conceptual model is proposed to identify the main influences in the process of determining AQMAs. Modelling tools, and the consideration of modelling uncertainty, error and model inputs, have played a significant role in AQMA decision-making in the majority of local authorities declaring AQMAs in the UK.

  18. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  19. Current Developments in Nuclear Density Functional Methods

    CERN Document Server

    Dobaczewski, J

    2010-01-01

    Density functional theory (DFT) became a universal approach to compute ground-state and excited configurations of many-electron systems held together by an external one-body potential in condensed-matter, atomic, and molecular physics. At present, the DFT strategy is also intensely studied and applied in the area of nuclear structure. The nuclear DFT, a natural extension of the self-consistent mean-field theory, is a tool of choice for computations of ground-state properties and low-lying excitations of medium-mass and heavy nuclei. Over the past thirty-odd years, a lot of experience was accumulated in implementing, adjusting, and using the density-functional methods in nuclei. This research direction is still extremely actively pursued. In particular, current developments concentrate on (i) attempts to improve the performance and precision delivered by the nuclear density-functional methods, (ii) derivations of density functionals from first principles rooted in the low-energy chromodynamics and effective th...

  20. Analysis of Maths Learning Activities Developed By Pre-service Teachers in Terms of the Components of Content, Purpose, Application Methods

    Directory of Open Access Journals (Sweden)

    Çağla Toprak

    2014-04-01

    Full Text Available Today, as reform efforts intended to keep the educational system in step with the times continue, studies have underlined the importance of teachers in students' learning and in achieving what is expected from the education system (Hazır & Bıkmaz, 2006). Teachers play a critical role both in preparing teaching materials and in using them (Stein & Smith, 1998b; Swan, 2007). When existing curriculums, in particular maths and geometry curriculums, are analyzed, it can be observed that activities are the most significant teaching materials (Bozkurt, 2012). In fact, it is possible to characterize the existing curriculums as activity-based (Report of Workshop Examining Content of Primary School Curriculums According to Branches, 2010; Epö, 2005). Therefore, what sorts of learning activities there are, what qualities they need to have, and how to design and apply them are topics that must be elaborated (Uğurel et al., 2010). At this point, our study to improve pre-service teachers' skills in developing activities was conducted with 27 pre-service teachers (19 female, 8 male) studying in their 4th year in the Mathematics Education Department at a state university in the Aegean Region. The activity designs the pre-service teachers developed, following the given patterns after a series of practice sessions, were analyzed through document analysis in terms of the aim of the design and the form of practice. As a result, it is observed that the pre-service teachers dealt with topics from the maths curriculum, and that these topics span different grade levels. The examination of the target component suggests that the developed activities aim primarily at providing learning, followed by reinforcing concepts already learned. The pre-service teachers mostly preferred small-group (cooperative) work in the activities they developed.

  1. Methods development for total organic carbon accountability

    Science.gov (United States)

    Benson, Brian L.; Kilgore, Melvin V., Jr.

    1991-01-01

    This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine the specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful for future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water-soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples. These included carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water-soluble compounds.

  2. The development of interdepartmental audit methods

    International Nuclear Information System (INIS)

    The UK Radiotherapy Physics Audit Network is now well-established, with seven network groups, co-ordinated by the IPSM. It is based on visits, using ion chambers as the measurement method, and audits at least machine calibration, single-field parameters and simple multi-field planned irradiations. In addition, procedural audit of dosimetry and quality control procedures and records is incorporated. The general approach has been to use interdepartmental audit involving mutual co-operation with peer professionals from other centres. The different groups have evolved at different paces and in rather different directions; however, the IPSM co-ordinating role ensures a basic common minimum for the system. The Scottish+ group has developed a semi-anatomical phantom for use in the audit stages that follow on from the basic single-field and geometric phantom dosimetry audit levels. This has been evaluated experimentally in one department before wider use. The Scottish+ audit system is briefly described. Results from levels 1 and 2 are summarised, and the design and testing of the semi-anatomical phantom are discussed. The current and future development of the audit system is presented.

  3. Interactive radio instruction: developing instructional methods.

    Science.gov (United States)

    Friend, J

    1989-01-01

    The USAID has, since 1972, funded the development of a new methodology for educational radio for young children through 3 projects: the Radio Mathematics Project of Nicaragua, the Radio Language Arts Project of Kenya, and the Radio Science Project of Papua New Guinea. These projects developed math programs for grades 1-4 and English as a second language for grades 1-3; programs to teach science in grades 4-6 are now being developed. Appropriate techniques were developed to engage young children actively in the learning process. Lessons are planned as a "conversation" between the children and the radio; scripts are written as one half of a dialogue, with pauses carefully timed so that students can contribute their half. Teaching techniques used in all 3 projects include choral responses, simultaneous individual seatwork, and activities using simple materials such as pebbles and rulers. Certain techniques were specific to the subject being taught, or to the circumstances in which the lessons were to be used. Patterned oral drill was used frequently in the English lessons, including sound-cued drills. "Deferred" oral responses were used often in the math lessons. In this method, the children are instructed to solve a problem silently, not giving the answer aloud until requested, thus allowing time for even the slower children to participate. "One-child" questions were used in both English and science: the radio asks a question to be answered by a single child, who is selected on the spot by the classroom teacher. This allows for open-ended questions, but also requires constant supervision by the classroom teacher. Songs and games were used in all programs, and extensively for didactic purposes in the teaching of English. Instructions for science activities are often more complex than in other courses, particularly when the children are using science apparatus, especially when they work in pairs to share scarce…

  4. Development and application of a risk assessment method for radioactive waste management. Volume III. Economic analysis; description and implementation of AMRAW-B model

    International Nuclear Information System (INIS)

    A Radioactive Waste Management Systems Model is presented. The systems model and associated computer code, called AMRAW (Assessment Method for Radioactive Waste), has two parts. The first part, AMRAW-A, consists of the Source Term (radioactive inventory versus time), the Release Model, and the Environmental Model. The second part of the systems model, AMRAW-B, is the Economic Model, which calculates health effects corresponding to the various organ dose rates from AMRAW-A, expresses these health effects in terms of economic costs, and attributes these costs to the radionuclides, decay groups, and elements initially in the waste inventory. This volume describes AMRAW-B, the Economics Model, and is in two parts: Part 1 presents a generic description of the AMRAW-B model, background economic theory and a description of the AMRAW-B computer code, and Part 2 presents implementation of the model with an application to terminal storage in a bedded salt reference repository.

  5. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence it is a statistical method. The probability method developed takes into account the probability distributions of the permitted levels of the relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.
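
    As a minimal illustration of the probability approach (a generic sketch, not the method of the paper): combining independent component reliabilities into a system reliability for series and redundant (parallel) arrangements. The component values are invented.

```python
# System reliability from component reliabilities, assuming independence.

def series(reliabilities):
    # A series system works only if every component works.
    r = 1.0
    for p in reliabilities:
        r *= p
    return r

def parallel(reliabilities):
    # A redundant (parallel) system works if at least one component works.
    q = 1.0
    for p in reliabilities:
        q *= (1.0 - p)
    return 1.0 - q

pumps = parallel([0.95, 0.95])        # redundant pump pair
system = series([0.99, pumps, 0.98])  # pumps in series with two other units
print(round(system, 4))  # 0.9678
```

Note how the redundant pair (0.9975) is far more reliable than either pump alone, while the series chain is weaker than its weakest link.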

  6. Development of a non-destructive micro-analytical method for stable carbon isotope analysis of transmission electron microscope (TEM) samples

    Science.gov (United States)

    Hode, Tomas; Kristiansson, Per; Elfman, Mikael; Hugo, Richard C.; Cady, Sherry L.

    2009-10-01

    The biogenicity of ancient morphological microfossil-like objects can be established by linking morphological (e.g. cell remnants and extracellular polymeric matrix) and chemical (e.g. isotopes, biomarkers and biominerals) evidence indicative of microorganisms or microbial activity. We have developed a non-destructive micro-analytical ion beam system capable of measuring with high spatial resolution the stable carbon isotope ratios of thin samples used for transmission electron microscopy. The technique is based on elastic scattering of alpha particles with an energy of 2.751 MeV. At this energy the 13C cross section is enhanced relative to the pure Rutherford cross section for 13C, whereas the 12C cross section is reduced relative to its pure Rutherford cross section. Here we report the initial results of this experimental approach used to characterize ultramicrotomed sections of sulfur-embedded graphite and microbial cells.

  7. A Simple Buckling Analysis Method for Airframe Composite Stiffened Panel by Finite Strip Method

    Science.gov (United States)

    Tanoue, Yoshitsugu

    Carbon fiber reinforced plastics (CFRP) have been used in structural components for newly developed aircraft and spacecraft. The main structures of an airframe, such as the fuselage and wings, are essentially composed of stiffened panels. Therefore, in the structural design of airframes, it is important to evaluate the buckling strength of composite stiffened panels. The widely used finite element method (FEM) can analyze any stiffened panel shape with various boundary conditions. However, in the early phase of airframe development, many studies are required in structural design prior to detail drawing. In this phase, performing structural analysis using only FEM may not be very efficient. This paper describes a simple buckling analysis method for composite stiffened panels based on the finite strip method. The method can deal with isotropic and anisotropic laminated plates and shells with several boundary conditions. Its accuracy was verified by comparing it with theoretical analysis and FEM analysis (NASTRAN). It has been observed that the buckling coefficients calculated via the present method are in agreement with results found by detailed analysis methods. Consequently, the method is designed to be an effective calculation tool for buckling analysis in the early phases of airframe design.
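
    For orientation, buckling-coefficient results of this kind are usually reported against the classical plate-buckling relation sigma_cr = k * pi^2 * E / (12 * (1 - nu^2)) * (t/b)^2, where k is the buckling coefficient. A sketch evaluating it directly; the material and geometry values are illustrative, not from the paper.

```python
# Classical critical buckling stress of a thin plate in compression.
import math

def critical_buckling_stress(k, E, nu, t, b):
    """k: buckling coefficient, E: Young's modulus [Pa], nu: Poisson's ratio,
    t: plate thickness [m], b: plate width [m]."""
    return k * math.pi**2 * E / (12.0 * (1.0 - nu**2)) * (t / b) ** 2

# Simply supported long plate in uniaxial compression: k is approximately 4.
sigma_cr = critical_buckling_stress(k=4.0, E=70e9, nu=0.33, t=0.002, b=0.200)
print(round(sigma_cr / 1e6, 1))  # 25.8 (MPa)
```

Methods like the finite strip method essentially compute k (and hence sigma_cr) for stiffened, anisotropic panels where no closed-form coefficient exists.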

  8. Dynamic analysis and assessment for sustainable development

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The assessment of sustainable development is crucial for constituting sustainable development strategies. Assessment methods that exist so far usually only use an indicator system for making sustainable judgement. These indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the social-economic system and in the eco-environmental system at different times. Besides the spatial character, sustainable development has a temporal character that can not be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. Firstly the dynamic characteristics of sustainable development are analyzed, which point to a track of sustainable development that is an upward undulating curve. According to the dynamic character and the development rules of a social, economic and ecological system, a flexible assessment approach that is based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  9. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  10. Development and validation of a capillary electrophoresis method with capacitively coupled contactless conductivity detection (CE-C(4)D) for the analysis of amikacin and its related substances.

    Science.gov (United States)

    El-Attug, Mohamed Nouri; Adams, Erwin; Van Schepdael, Ann

    2012-09-01

    Amikacin is a semisynthetic aminoglycoside antibiotic derived from kanamycin A that lacks a strong UV-absorbing chromophore or fluorophore. Due to the physicochemical properties of amikacin and its related substances, CE in combination with capacitively coupled contactless conductivity detection (CE-C(4)D) was chosen. The optimized separation method uses a BGE composed of 20 mM MES adjusted to pH 6.6 by L-histidine and 0.3 mM CTAB, added as flow modifier at a concentration below the CMC. Ammonium acetate 20 mg.L(-1) was used as internal standard. 30 kV was applied in reverse polarity on a fused silica capillary (73/48 cm; 75 μm id). The optimized separation was obtained in less than 6 min with good linearity (R(2) = 0.9996) for amikacin base. It shows good precision, expressed as RSD on relative peak areas, equal to 0.1 and 0.7% for intraday and interday measurements, respectively. The LOD and LOQ are 0.5 mg.L(-1) and 1.7 mg.L(-1), respectively. PMID:22965725
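
    The quoted precision figure, RSD on relative peak areas, is a standard calculation; a sketch with invented replicate data (not the paper's measurements):

```python
# Relative standard deviation (RSD, %) of replicate area ratios analyte/internal standard.
import statistics

def rsd_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

replicates = [1.003, 1.001, 0.999, 1.002, 1.000]  # hypothetical area ratios
print(round(rsd_percent(replicates), 2))  # 0.16
```

Ratioing each peak area to the internal standard (here ammonium acetate) before computing the RSD cancels run-to-run injection variability, which is why relative rather than absolute areas are used.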

  11. Microarray Analysis of the Developing Rat Mandible

    Institute of Scientific and Technical Information of China (English)

    Hideo KABURAGI; Naoyuki SUGANO; Maiko OSHIKAWA; Ryosuke KOSHI; Naoki SENDA; Kazuhiro KAWAMOTO; Koichi ITO

    2007-01-01

    To analyze the molecular events that occur in the developing mandible, we examined the expression of 8803 genes from samples taken at different time points during rat postnatal mandible development. Total RNA was extracted from the mandibles of 1-day-old, 1-week-old, and 2-week-old rats. Complementary RNA (cRNA) was synthesized from cDNA and biotinylated. Fragmented cRNA was hybridized to RGU34A GeneChip arrays. Among the 8803 genes tested, 4344 were detectable. We identified 148 genes with significantly increased expression and 19 genes with significantly decreased expression. A comprehensive analysis appears to be an effective method of studying the complex process of development.

  12. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    Directory of Open Access Journals (Sweden)

    Azadeh Manayi

    2015-01-01

    Full Text Available Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant has also attracted scientists' attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed beneficial effects of the plant on the patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed, mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies, which were controversial, revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered, and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods.

  13. Development of multiscale analysis and some applications

    Science.gov (United States)

    Wang, Lipo

    2014-11-01

    For most complex systems the interaction of different scales is among the most interesting and challenging features. Typically, different scale regimes have different physical properties. Commonly used analysis approaches such as structure functions and Fourier analysis have their respective limitations, for instance the mixing of large- and small-scale information, i.e. the so-called infrared and ultraviolet effects. To make improvements in this regard, a new method, segment structure analysis (SSA), has been developed to study multiscale statistics. The method detects regime scaling based on conditional extremal points, depicting the geometrical features directly in physical space. From standard test cases (e.g. fractional Brownian motion) to real turbulence data, results show that SSA can appropriately distinguish the different scale effects. A successful application is the scaling of the Lagrangian velocity structure function: this long-controversial topic has been confirmed using the present method. In principle, SSA can be applied to a wide variety of problems.
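
    A hypothetical sketch of the basic building block such a segment-based analysis needs: locating the local extrema of a 1-D signal and describing each segment between consecutive extrema by its length and height difference. This is a strong simplification of SSA; the function names and data are invented.

```python
# Segmenting a discrete signal at its local extrema.

def local_extrema(signal):
    idx = [0]
    for i in range(1, len(signal) - 1):
        # A sign change of the increment marks a local maximum or minimum.
        if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0:
            idx.append(i)
    idx.append(len(signal) - 1)
    return idx

def segments(signal):
    """(length, height difference) of each segment between consecutive extrema."""
    ex = local_extrema(signal)
    return [(b - a, signal[b] - signal[a]) for a, b in zip(ex, ex[1:])]

u = [0.0, 1.0, 3.0, 2.0, 2.5, 1.0, 0.5, 2.0]
print(segments(u))  # [(2, 3.0), (1, -1.0), (1, 0.5), (2, -2.0), (1, 1.5)]
```

Statistics over such segment lengths and heights stay local in physical space, which is the property the abstract contrasts with Fourier methods.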

  14. Quality by design in the chiral separation strategy for the determination of enantiomeric impurities: development of a capillary electrophoresis method based on dual cyclodextrin systems for the analysis of levosulpiride.

    Science.gov (United States)

    Orlandini, S; Pasquini, B; Del Bubba, M; Pinzauti, S; Furlanetto, S

    2015-02-01

    Quality by design (QbD) concepts, in accordance with International Conference on Harmonisation Pharmaceutical Development guideline Q8(R2), represent an innovative strategy for the development of analytical methods. In this paper QbD principles have been comprehensively applied to the set-up of a capillary electrophoresis method aimed at quantifying enantiomeric impurities. The test compound was the chiral drug substance levosulpiride (S-SUL), and the developed method was intended for routine analysis of the pharmaceutical product. The target of the analytical QbD approach is to establish a design space (DS) of critical process parameters (CPPs) where the critical quality attributes (CQAs) of the method are assured to fulfil the desired requirements with a selected probability. QbD can improve the understanding of the enantioseparation process, including both the electrophoretic behavior of the enantiomers and their separation, thereby enabling its control. The CQAs were enantioresolution and analysis time. The scouting phase made it possible to select a separation system composed of sulfated-β-cyclodextrin and a neutral cyclodextrin, operating in reverse polarity mode. The type of neutral cyclodextrin was included among the other CPPs, both instrumental and related to background electrolyte composition, which were evaluated in a screening phase by an asymmetric screening matrix. Response surface methodology was carried out by a Doehlert design and allowed the contour plots to be drawn, highlighting significant interactions between some of the CPPs. The DS was defined by applying Monte-Carlo simulations and corresponded to the following intervals: sulfated-β-cyclodextrin concentration, 9-12 mM; methyl-β-cyclodextrin concentration, 29-38 mM; Britton-Robinson buffer pH, 3.24-3.50; voltage, 12-14 kV. Robustness of the method was examined by a Plackett-Burman matrix and the obtained results, together with system repeatability data, led to define a method...
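    The Monte-Carlo step used to qualify a design space can be sketched as follows: sample the CPPs within candidate ranges, push them through a fitted response model, and estimate the probability that the CQA meets its specification. The linear response model and the resolution target below are purely hypothetical; only the CPP intervals are taken from the abstract.

```python
# Monte-Carlo estimate of the probability that a CQA specification is met
# across a candidate design space. The response model is invented for
# illustration, not taken from the paper.
import random

random.seed(1)

def predicted_resolution(cd_sulf, cd_methyl, ph, kv):
    # Hypothetical fitted model from a response-surface study.
    return (1.95 + 0.05 * (cd_sulf - 9) + 0.02 * (cd_methyl - 29)
            + 0.3 * (3.5 - ph) + 0.01 * (kv - 12))

def ds_probability(n=10_000):
    """Fraction of sampled CPP combinations meeting the (hypothetical) target Rs >= 2.0."""
    hits = 0
    for _ in range(n):
        cd_sulf = random.uniform(9, 12)      # sulfated-beta-CD, mM
        cd_methyl = random.uniform(29, 38)   # methyl-beta-CD, mM
        ph = random.uniform(3.24, 3.50)
        kv = random.uniform(12, 14)
        if predicted_resolution(cd_sulf, cd_methyl, ph, kv) >= 2.0:
            hits += 1
    return hits / n

print(ds_probability())
```

    In the QbD workflow, candidate intervals would be shrunk or shifted until this probability exceeds the selected assurance level.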

  15. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, J F

    2007-01-01

    Praise for the First Edition ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt Math ". . . carefully structured with many detailed worked examples . . ." -The Mathematical Gazette ". . . an up-to-date and user-friendly account . . ." -Mathematika. An Introduction to Numerical Methods and Analysis addresses the mathematics underlying approximation and scientific computing and successfully explains where approximation methods come from, why they sometimes work (or d...

  16. Development of colorimetric method for cephalexin in dosage forms

    Directory of Open Access Journals (Sweden)

    Priyanka P

    2008-01-01

    Full Text Available A simple, sensitive, accurate, rapid, and economical colorimetric-spectrophotometric method has been developed for the estimation of cephalexin in capsules. This method is based on the reaction of the drug with ferric chloride and potassium ferricyanide, giving a green-colored chromogen exhibiting maximum absorbance at 791 nm against reagent blank. Beer's law was obeyed in the concentration range of 1-6 µg/ml. Results of the analysis were validated statistically and by recovery studies.
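    The calibration underlying a Beer's-law method can be sketched as a least-squares line of absorbance against concentration, inverted to estimate an unknown. The absorbance readings below are invented for illustration; only the 1-6 µg/ml range mirrors the abstract.

```python
# Minimal Beer's-law calibration sketch: fit absorbance vs. concentration,
# then invert the line for a sample reading. Data are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [1, 2, 3, 4, 5, 6]                           # ug/ml standards
absorbance = [0.11, 0.22, 0.33, 0.44, 0.55, 0.66]   # hypothetical readings

slope, intercept = fit_line(conc, absorbance)
unknown = (0.385 - intercept) / slope                # invert for a sample
print(round(unknown, 2))  # 3.5
```

    In practice the fit would also be checked for linearity (correlation coefficient) before the inversion is trusted, which is what the statistical validation in the abstract refers to.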

  17. Developments and retrospectives in Lie theory algebraic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Algebraic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current research.  Mos...

  18. Methods for Rapid Screening in Woody Plant Herbicide Development

    Directory of Open Access Journals (Sweden)

    William Stanley

    2014-07-01

    Full Text Available Methods for woody plant herbicide screening were assayed with the goal of reducing the resources and time required to conduct preliminary screenings for new products. Rapid screening methods tested included greenhouse seedling screening, germinal screening, and seed screening. Triclopyr and eight experimental herbicides from Dow AgroSciences (DAS 313, 402, 534, 548, 602, 729, 779, and 896) were tested on black locust, loblolly pine, red maple, sweetgum, and water oak. Screening results detected differences in herbicide and species in all experiments in much less time (days to weeks) than traditional field screenings and consumed significantly fewer resources (<500 mg acid equivalent per herbicide per screening). Using regression analysis, various rapid screening methods were linked into a system capable of rapidly and inexpensively assessing herbicide efficacy and spectrum of activity. Implementation of such a system could streamline early-stage herbicide development leading to field trials, potentially freeing resources for use in development of beneficial new herbicide products.

  19. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  20. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviors of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  1. Development and validation of a single RP-HPLC assay method for analysis of bulk raw material batches of four parabens that are widely used as preservatives in pharmaceutical and cosmetic products.

    Science.gov (United States)

    Kumar, S; Mathkar, S; Romero, C; Rustum, A M

    2011-05-01

    A stability-indicating, robust, fast, and user-friendly reversed-phase high-performance liquid chromatographic (RP-HPLC) assay method has been developed and validated for the analysis of commercial raw material batches of methylparaben, ethylparaben, propylparaben, and butylparaben. These four parabens are widely used as preservatives in pharmaceutical and cosmetic products. An accurate assay value for each paraben in its commercial lots is critical to determine the correct weight of paraben needed to obtain the target concentration in a specific lot of pharmaceutical or cosmetic product. Currently, no single HPLC assay method (validated per ICH requirements) is available in the literature that can be used to analyze commercial lots of each of the four parabens. The analytical method reported herein analyzes all four parabens in less than 10 min. The method presented in this report was successfully validated as per ICH guidelines and can therefore be implemented in QC laboratories to analyze and assay commercial bulk lots of the four parabens.

  2. Analytical Method Development & Validation for Related Substances Method of Busulfan Injection by Ion Chromatography Method

    Directory of Open Access Journals (Sweden)

    Rewaria S

    2013-05-01

    Full Text Available A new simple, accurate, precise and reproducible ion chromatography method has been developed for the estimation of methanesulfonic acid in Busulfan injectable dosage. The developed method was also validated in complete compliance with the current regulatory guidelines, using well-established analytical method validation techniques and tools covering the usual validation parameters: linearity, LOD and LOQ determination, accuracy, method precision, specificity, system suitability, robustness, ruggedness, etc. With the current method, the linearity obtained is near to 0.999, showing that the method is capable of giving a good detector response, and the calculated recovery was within the range of 85% to 115% of the specification limits.
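    Two of the routine computations behind the figures quoted in the abstract, the correlation coefficient for linearity and percent recovery against an 85-115% acceptance window, can be sketched directly. All numeric data below are hypothetical.

```python
# Linearity (correlation coefficient) and percent-recovery checks of the
# kind used in method validation. The level/response data are invented.
import statistics

def correlation(x, y):
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def recovery_ok(added, found, low=85.0, high=115.0):
    """True when percent recovery falls inside the acceptance window."""
    pct = 100.0 * found / added
    return low <= pct <= high

levels = [50, 75, 100, 125, 150]            # percent of target concentration
response = [1010, 1540, 2020, 2510, 3050]   # hypothetical peak areas

print(round(correlation(levels, response), 3))   # 1.0 (r > 0.999)
print(recovery_ok(added=100.0, found=97.3))      # True
```

    A correlation near 0.999 and recoveries inside 85-115% are the acceptance criteria the abstract reports against.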

  3. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro...

  4. Numerical analysis in electromagnetics the TLM method

    CERN Document Server

    Saguet, Pierre

    2013-01-01

    The aim of this book is to give a broad overview of the TLM (Transmission Line Matrix) method, which is one of the "time-domain numerical methods". These methods are reputed for their significant reliance on computer resources; however, they have the advantage of being highly general. The TLM method has acquired a reputation as a powerful and effective tool among numerous teams and still benefits today from significant theoretical developments. In particular, in recent years, its ability to simulate various situations with excellent precision, including complex materials, has been...

  5. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm...

  6. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from the subcellular and cellular to the tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.
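    The kind of validation such a benchmark enables can be illustrated with two standard overlap scores, Dice and Jaccard, comparing a predicted segmentation mask against ground truth. The masks here are toy 1-D lists; real bioimage masks are 2-D or 3-D arrays scored the same way, and the abstract does not specify which metrics the benchmark uses.

```python
# Dice and Jaccard overlap scores between a predicted mask and ground truth.
# Masks are binary sequences of equal length.

def dice(pred, truth):
    inter = sum(p and t for p, t in zip(pred, truth))
    return 2.0 * inter / (sum(pred) + sum(truth))

def jaccard(pred, truth):
    inter = sum(p and t for p, t in zip(pred, truth))
    union = sum(p or t for p, t in zip(pred, truth))
    return inter / union

truth = [1, 1, 1, 1, 0, 0, 0, 0]
pred  = [0, 1, 1, 1, 1, 0, 0, 0]

print(dice(pred, truth))     # 0.75
print(jaccard(pred, truth))  # 0.6
```

    Scoring every method on the same masks and ground truth is what makes results from different segmentation algorithms directly comparable.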

  7. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
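    A crude stand-in for the smoothing-based sensitivity idea can be sketched with a binned-mean (step-function) smoother: the fraction of output variance captured by smoothing on a single input serves as that input's sensitivity index. LOESS, additive models, and the other techniques named above refine the same construction; the function below is an illustrative simplification, not the paper's procedure.

```python
# Estimate how much of the output variance one input explains by smoothing
# the output with binned means of that input (a step-function smoother).
import random
import statistics

random.seed(0)

def smoothed_variance_ratio(x, y, bins=10):
    """Variance of binned-mean predictions over total variance of y."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins
    groups = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        groups[min(int((xi - lo) / width), bins - 1)].append(yi)
    fitted = [statistics.fmean(groups[min(int((xi - lo) / width), bins - 1)])
              for xi in x]
    return statistics.pvariance(fitted) / statistics.pvariance(y)

# Toy model with a strong input x1 and an irrelevant input x2.
x1 = [random.uniform(0, 1) for _ in range(2000)]
x2 = [random.uniform(0, 1) for _ in range(2000)]
y = [3 * a + random.gauss(0, 0.1) for a in x1]

print(smoothed_variance_ratio(x1, y) > smoothed_variance_ratio(x2, y))  # True
```

    Because the smoother is nonparametric, a nonlinear (even non-monotone) influence of x1 would still be detected, which is the advantage over linear or rank regression noted in the abstract.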

  8. Integrated numerical methods for hypersonic aircraft cooling systems analysis

    Science.gov (United States)

    Petley, Dennis H.; Jones, Stuart C.; Dziedzic, William M.

    1992-01-01

    Numerical methods have been developed for the analysis of hypersonic aircraft cooling systems. A general purpose finite difference thermal analysis code is used to determine areas which must be cooled. Complex cooling networks of series and parallel flow can be analyzed using a finite difference computer program. Both internal fluid flow and heat transfer are analyzed, because increased heat flow causes a decrease in the flow of the coolant. The steady state solution is obtained by a successive point iterative method; the transient analysis uses implicit forward-backward differencing. Several examples of the use of the program in studies of hypersonic aircraft and rockets are provided.
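    The successive point iterative (Gauss-Seidel style) idea can be sketched on the simplest steady conduction problem: a 1-D rod with fixed end temperatures, where each interior node is repeatedly relaxed toward the mean of its neighbors until the updates stall. Real cooling-network codes couple this with the flow equations, which are omitted here.

```python
# Successive point iteration for steady 1-D conduction with fixed ends.
# Each interior node update uses the freshest neighbor values (Gauss-Seidel).

def solve_rod(t_left, t_right, nodes=9, tol=1e-10, max_iter=100_000):
    """Steady 1-D conduction: each interior node relaxes to its neighbors' mean."""
    t = [t_left] + [0.0] * nodes + [t_right]
    for _ in range(max_iter):
        delta = 0.0
        for i in range(1, len(t) - 1):
            new = 0.5 * (t[i - 1] + t[i + 1])   # successive point update
            delta = max(delta, abs(new - t[i]))
            t[i] = new
        if delta < tol:
            break
    return t

temps = solve_rod(100.0, 0.0)
print([round(v, 3) for v in temps])  # linear profile from 100.0 down to 0.0
```

    The exact solution is the straight line between the boundary temperatures, so the iteration's convergence is easy to verify by inspection.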

  9. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used in order to secure information. In this paper, the traditional spectral estimation methods are introduced, and the performance of each method is examined by comparing all of the spectral estimation methods. From those performance analyses, a brief summary of the pros and cons of the spectral estimation methods is given. We also give a steganography demo by hiding information in a sound signal and pulling the information (i.e., the true frequency of the information signal) back out of the sound by means of the spectral estimation methods.
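    The simplest of the estimators involved, the periodogram, can be sketched directly: recover a hidden tone's frequency from the squared magnitude of the DFT. A realistic comparison of the kind described would run several estimators (e.g. AR-model based ones) on the same task; this sketch covers only the nonparametric case.

```python
# Recover a tone's frequency via the periodogram (squared DFT magnitude).
# Brute-force O(n^2) DFT for clarity, not efficiency.
import cmath
import math

def periodogram_peak(signal, sample_rate):
    """Frequency (Hz) of the largest periodogram bin below Nyquist."""
    n = len(signal)
    best_k, best_power = 0, -1.0
    for k in range(1, n // 2):
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power = abs(coeff) ** 2
        if power > best_power:
            best_k, best_power = k, power
    return best_k * sample_rate / n

rate = 1000.0
tone = [math.sin(2 * math.pi * 50.0 * t / rate) for t in range(200)]
print(periodogram_peak(tone, rate))  # 50.0
```

    With the tone placed exactly on a DFT bin, the peak is unambiguous; estimators differ mainly in how they behave between bins and in noise, which is what the paper's comparison examines.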

  10. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at the IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  11. Heteroscedastic regression analysis method for mixed data

    Institute of Scientific and Technical Information of China (English)

    FU Hui-min; YUE Xiao-rui

    2011-01-01

    The heteroscedastic regression model was established and the heteroscedastic regression analysis method was presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimations of regression coefficients, as well as the confidence limits of the location parameter and scale parameter, were given. Furthermore, the point estimations and confidence limits of percentiles were obtained. Thus, the traditional multiple regression analysis method, which is only suitable for complete data from the normal distribution, can be extended to the cases of heteroscedastic mixed data and the location-scale distribution. So the presented method has a broad range of promising applications.
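    The basic device in any heteroscedastic regression is to weight each observation by the inverse of its variance. The sketch below fits a line by weighted least squares with known per-point variances; the censored-data and location-scale machinery of the paper is not reproduced, so this shows only the weighting idea.

```python
# Weighted least squares for heteroscedastic data: weights are 1/variance,
# so noisier points pull the fit less. Data and variances are invented.

def weighted_line_fit(x, y, var):
    """Closed-form WLS slope and intercept with weights 1/var."""
    w = [1.0 / v for v in var]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    slope = (sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
             / sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)))
    return slope, my - slope * mx

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]          # roughly y = 2x
var = [0.1, 0.1, 0.4, 0.4]        # later points are noisier
slope, intercept = weighted_line_fit(x, y, var)
print(round(slope, 2), round(intercept, 2))  # 1.93 0.13
```

    With equal variances this reduces to ordinary least squares, which is the sense in which the heteroscedastic method extends the traditional one.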

  12. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  13. A study of applicability of soil-structure interaction analysis method using boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, M. K. [KAERI, Taejon (Korea, Republic of); Kim, M. K. [Yonsei University, Seoul (Korea, Republic of)

    2003-07-01

    In this study, a numerical method for Soil-Structure Interaction (SSI) analysis using an FE-BE coupling method is developed. The total system is divided into two parts, the so-called far field and near field. The far field is modeled by a boundary element formulation using the multi-layered dynamic fundamental solution and coupled with the near field modeled by finite elements. In order to verify the seismic response analysis, the results are compared with those of another commercial code. Finally, several SSI analyses under seismic loading are performed to examine the dynamic behavior of the system. The results show that the developed method can be an efficient numerical method for SSI analysis.

  14. Unascertained Factor Method of Dynamic Characteristic Analysis for Antenna Structures

    Institute of Scientific and Technical Information of China (English)

    ZHU Zeng-qing; LIANG Zhen-tao; CHEN Jian-jun

    2008-01-01

    The dynamic characteristic analysis model of antenna structures is built, in which the structural physical parameters and geometrical dimensions are all considered as unascertained variables, and a structural dynamic characteristic analysis method based on the unascertained factor method is given. The computational expression of the structural characteristic is developed from the mathematical expression of the unascertained factor and the principles of unascertained rational number arithmetic. An example is given, in which the possible values and confidence degrees of the unascertained structural characteristics are obtained. The calculated results show that the method is feasible and effective.

  15. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel

    2011-01-01

    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.

  16. Further development of microparticle image velocimetry analysis for characterisation of gas streams as a novel method of fuel cell development. Final report; Weiterentwicklung des Mikro-Particle Image Velocimetry Analyseverfahrens zur Charakterisierung von Gasstroemungen als neuartige Entwicklungsmethodik fuer Brennstoffzellen. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The project aimed at a better understanding of the complex fluid-mechanical processes in the small ducts of bipolar plates. So far, an appropriate technology for in-situ measurement was lacking. The project therefore focused on the further development of microparticle image velocimetry in order to enable analyses of the local velocity distribution of a gas stream in a microduct. Further, measurements were carried out in the microducts of a fuel cell under the more difficult conditions of actual operation. (orig./AKB) [Translated from German] The motivation for the research project was the need to understand the complicated fluid-mechanical processes in the small channels of the bipolar plates. Until now, no measurement technique was available that allows the fluid-mechanical processes in the microchannels to be measured in situ under real conditions and correlated with the instantaneous cell power. The aim of the project was therefore to develop the micro-particle image velocimetry method further, so as to enable an analysis of the local velocity distribution of a gas flow in a microchannel. A second goal of the project was to carry out such a measurement in the microchannels of a cell under the more difficult conditions of an operating fuel cell.

  17. Multiscale Methods for Nuclear Reactor Analysis

    Science.gov (United States)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques, which use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly.

  18. Development of a simultaneous high resolution typing method for three SLA class II genes, SLA-DQA, SLA-DQB1, and SLA-DRB1 and the analysis of SLA class II haplotypes.

    Science.gov (United States)

    Le, MinhThong; Choi, Hojun; Choi, Min-Kyeung; Cho, Hyesun; Kim, Jin-Hoi; Seo, Han Geuk; Cha, Se-Yeon; Seo, Kunho; Dadi, Hailu; Park, Chankyu

    2015-06-15

    The characterization of the genetic variations of the major histocompatibility complex (MHC) is essential to understand the relationship between the genetic diversity of MHC molecules and disease resistance and susceptibility in adaptive immunity. We previously reported the development of high-resolution individual locus typing methods for three of the most polymorphic swine leukocyte antigen (SLA) class II loci, namely SLA-DQA, SLA-DQB1, and SLA-DRB1. In this study, we extensively modified our previous protocols and developed a method for the simultaneous amplification of the three SLA class II genes and subsequent analysis of individual loci using direct sequencing. The unbiased and simultaneous amplification of alleles from all three hyper-polymorphic, pseudogene-containing genes such as the MHC genes is extremely challenging. However, using this method, we demonstrated the successful typing of SLA-DQA, SLA-DQB1, and SLA-DRB1 for 31 selected individuals comprising 26 different SLA class II haplotypes, which had been identified from 700 animals using the single locus typing methods. The results were identical to the known genotypes from the individual locus typing. The new method has significant benefits over individual locus typing, including lower typing cost, use of less biomaterial, less effort and fewer errors in handling large samples for multiple loci. We also extensively characterized the haplotypes of SLA class II genes and report three new haplotypes. Our results should serve as a basis to investigate the possible association between polymorphisms of MHC class II and differences in immune responses to exogenous antigens.

  19. Chemical aspects of nuclear methods of analysis

    International Nuclear Information System (INIS)

    This final report includes papers which fall into three general areas: development of practical pre-analysis separation techniques, uranium/thorium separation from other elements for analytical and processing operations, and theory and mechanism of separation techniques. A separate abstract was prepared for each of the 9 papers.

  20. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  1. METHODIC OF DEVELOPMENT OF MOTOR GIFTEDNESS OF PRESCHOOL CHILDREN

    Directory of Open Access Journals (Sweden)

    Светлана Юрьевна Федорова

    2013-05-01

    Full Text Available The education and training of gifted children should today be considered an important strategic task of modern society. In this context, the purpose of the research is the development of motor giftedness, which is particularly relevant at the stage of preschool education owing to the age characteristics of preschoolers. Preschoolers' motor giftedness is considered by the author as a developing integrated quality, including psychomotor skills, inclinations, and increased motivation for motor activity. The following methods are used in the study: the study and analysis of the scientific and methodological literature, questioning, interviews, testing of physical fitness, and statistical data processing. The result of the research work is a methodic for the development of motor giftedness in physical education in preschool. The author's methodic consists of four steps: diagnostic, prognostic, practice-and-activity, and social-and-pedagogical. Each step determines the inclusion of preschool children in a sports and developmental environment that meets their abilities and needs through the creation of certain social and educational conditions. The results can be applied in preschools and in the system of improving teachers' professional skills. DOI: http://dx.doi.org/10.12731/2218-7405-2013-4-31

  3. Child development: analysis of a new concept

    Directory of Open Access Journals (Sweden)

    Juliana Martins de Souza

    2015-12-01

    Full Text Available Objectives: to perform a concept analysis of the term child development (CD) and submit it to review by experts. Method: concept analysis according to the hybrid model, in three phases: a theoretical phase, with a literature review; a field phase, with qualitative research involving professionals who care for children; and an analytical phase, articulating the data from the previous steps based on the bioecological theory of development. The new definition was analyzed by experts in a focus group. The project was approved by the Research Ethics Committee. Results: 256 articles from 12 databases and books were reviewed, and 10 professionals were interviewed. The analysis identified that the CD concept has as antecedents aspects of pregnancy, factors of the child, and contextual factors, highlighting relationships, child care, and social aspects; its consequences can be positive or negative, with impact on society; its attributes are the child's behaviors and abilities; and its definitions are based on maturation, on contextual perspectives, or on both. The new definition elaborated in the concept analysis was validated by nine experts in a focus group. It expresses the magnitude of the phenomenon and factors not presented in other definitions. Conclusion: the research produced a new definition of CD that can improve nursing classifications for comprehensive child care.

  4. New Developments at NASA's Instrument Synthesis and Analysis Laboratory

    Science.gov (United States)

    Wood, H. John; Herring, Ellen L.; Brown, Tammy L.

    2006-01-01

    NASA's Instrument Synthesis and Analysis Laboratory (ISAL) has developed new methods to provide an instrument study in one week's engineering time. The final product is recorded in oral presentations, models and the analyses which underlie the models.

  5. Developments in mycotoxin analysis: an update for 2009-2010

    NARCIS (Netherlands)

    Shephard, G.S.; Berthiller, F.; Burdaspal, P.; Crews, C.; Jonker, M.A.; Krska, R.; MacDonald, S.; Malone, B.; Maragos, C.; Sabino, M.; Solfrizzo, M.; Egmond, van H.P.; Whitaker, T.B.

    2011-01-01

    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2009 and mid-2010. It covers the major mycotoxins aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. New and improved methods for mycotoxins

  6. Power System Transient Stability Analysis through a Homotopy Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing early warning of power system instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To reduce computational cost, the paper proposes a quasi-analytical method that evaluates transient stability through the frequency sensitivities of time-domain periodic solutions with respect to initial values. First, dynamic systems described by classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities change sharply when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using this sensitivity. Third, homotopy analysis is introduced to extract the frequency information and evaluate the sensitivities from initial values alone, so that time-consuming numerical integration is avoided. Finally, a simple case demonstrates application of the proposed method, and simulation results show that it is promising.
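    The periodic-versus-expanded dichotomy the abstract describes can be illustrated, in a far simpler setting than the paper's homotopy machinery, with the textbook energy criterion for an undamped single-machine-infinite-bus (SMIB) model. The function and all numbers below are illustrative assumptions, not the authors' method:

```python
import math

def is_transiently_stable(p_m, p_max, delta0, omega0, m_inertia):
    """Energy screening for the undamped classical SMIB swing equation,

        M * d^2(delta)/dt^2 = p_m - p_max * sin(delta).

    Without damping, the motion is periodic (stable) exactly when the total
    energy stays below the potential barrier at the unstable equilibrium
    delta_u = pi - asin(p_m / p_max); otherwise the solution expands.
    """
    delta_s = math.asin(p_m / p_max)   # stable equilibrium angle
    delta_u = math.pi - delta_s        # unstable equilibrium angle

    def potential(delta):
        return -p_m * delta - p_max * math.cos(delta)

    energy = 0.5 * m_inertia * omega0 ** 2 + potential(delta0)
    return energy < potential(delta_u)

# A small initial speed deviation stays periodic; a large one expands.
delta_s = math.asin(0.8 / 2.0)
print(is_transiently_stable(0.8, 2.0, delta_s, 1.0, 0.1))   # -> True
print(is_transiently_stable(0.8, 2.0, delta_s, 10.0, 0.1))  # -> False
```

    The paper's contribution is precisely to avoid integrating such equations numerically for every N-1 contingency, replacing integration by sensitivities obtained through homotopy analysis.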

  7. Development on Vulnerability Assessment Methods of PPS

    Institute of Scientific and Technical Information of China (English)

    MIAO; Qiang; ZHANG; Wen-liang; BU; Li-xin; YIN; Hong-he; LI; Xin-jun; FANG; Xin

    2013-01-01

    Drawing on information from domestic and international sources, combined with domestic assessment experience, we present a set of physical protection system (PPS) vulnerability assessment methods for operating nuclear power plants and for nuclear facilities under design. The methods will help to strengthen and upgrade the security measures of nuclear facilities, improve the effectiveness and

  8. Development of RP-HPLC for analysis of human insulin

    OpenAIRE

    Rajan D; Gowda K; Mandal U; Ganesan.M; Bose A; Sarkar A; Pal T

    2006-01-01

    The objective of the present work was to develop a simple and sensitive method for the analysis of human insulin injection using reverse-phase high-performance liquid chromatography. A reverse-phase HPLC method with UV detection at room temperature was developed for the analysis of insulin in formulation. Hypersil BDS C-18 was used as the stationary phase, and the mobile phase consisted of 60 volumes of 1 mmol sodium sulphate and 0.2% triethylami...

  9. Alpha track analysis using nuclear emulsions as a preselecting method for safeguards environmental sample analysis

    International Nuclear Information System (INIS)

    Alpha track analysis in state-of-the-art nuclear emulsions was investigated to develop a preselecting method for environmental sampling for safeguards, based on counting alpha and fission tracks from nuclear material. We developed an automatic scanning system and software for reading out alpha tracks in the emulsions. Automatic analysis of alpha tracks from a uranium ore sample was demonstrated. - Highlights: • An automatic scanning system and software were developed for alpha track analysis. • The basic performance of alpha track readout in novel nuclear emulsions was investigated. • NIT was a promising candidate for alpha track analysis of nuclear material

  10. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    Science.gov (United States)

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  11. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
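    The core LOD computation is compact enough to sketch. A phase-known, fully informative pedigree is assumed for simplicity (real linkage software also handles unknown phase and missing data):

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """LOD score for a candidate recombination fraction theta.

    Compares the likelihood of the observed meioses under linkage
    (recombination fraction theta) against free recombination (theta = 0.5).
    """
    n = recombinants + nonrecombinants
    # Log-likelihood under linkage: R * log10(theta) + NR * log10(1 - theta)
    log_l_linked = (recombinants * math.log10(theta)
                    + nonrecombinants * math.log10(1 - theta))
    # Log-likelihood under no linkage: each meiosis has probability 0.5
    log_l_null = n * math.log10(0.5)
    return log_l_linked - log_l_null

# Example: 2 recombinants out of 10 informative meioses, tested at theta = 0.2
print(round(lod_score(2, 8, 0.2), 3))  # -> 0.837
```

    The sequential nature of the method shows up here directly: because LOD scores are log-likelihood ratios, scores from independent pedigrees simply add, which is what allows published reports to be pooled.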

  12. Agile Development Methods for Space Operations

    Science.gov (United States)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development, which allows fast response to customer inputs and produces higher-quality software at lower cost. How can we, the space operations community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies software in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  13. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  14. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    Full Text Available In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are considered in turn, discussing their functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, with particular drives being cases of the general model. Moreover, the method supports further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.
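    The ratio calculations that the contour graphs systematize can be checked against the classical Willis formula for a single planetary stage. The function below is an illustrative sketch of that formula, not the authors' graph-based algorithm:

```python
def planetary_ratio_sun_to_carrier(z_sun, z_ring):
    """Reduction ratio of a simple planetary set: input at the sun,
    output at the carrier, ring gear held fixed.

    From the Willis formula (w_s - w_c) / (w_r - w_c) = -z_ring / z_sun,
    setting w_r = 0 gives w_s / w_c = 1 + z_ring / z_sun.
    """
    return 1 + z_ring / z_sun

# A 30-tooth sun with a 90-tooth ring gives a 4:1 reduction.
print(planetary_ratio_sun_to_carrier(30, 90))  # -> 4.0
```

    The contour-graph approach generates the equivalent kinematic equations algorithmically for arbitrary compound drives, where hand application of the Willis formula becomes error-prone.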

  15. Text analysis devices, articles of manufacture, and text analysis methods

    Science.gov (United States)

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  16. Current trends of the development of chemical analysis

    Directory of Open Access Journals (Sweden)

    Rema Matakova

    2014-12-01

    Full Text Available This paper presents the dynamics of the development of all stages of chemical analysis over the last 15 years. Ways of improving the quality of chemical analysis, and its considerable advancement into the field of trace concentrations of substances, are shown. Features of the development of analytical methods, modern techniques for the concentration and separation of substances, and chemometric processing of results are analyzed. The great importance of computerization and automation of analysis is shown.

  17. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    International Nuclear Information System (INIS)

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments (PRAs) of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete, in that all events are considered; (2) by applying selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth, taking into account their unique features: hazard, fragility of structures and equipment, external-event-initiated accident sequences, etc. Based on these goals, external event analysis may be considered a three-stage process: Stage I, identification and initial screening of external events; Stage II, bounding analysis; Stage III, detailed risk analysis. The report first reviews published PRAs to highlight the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods are provided for external events not covered in detail in the NRC's PRA Procedures Guide. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical releases are described

  18. Development of a measuring and evaluation method for X-ray analysis of residual stresses in the surface region of polycrystalline materials; Entwicklung eines Mess- und Auswerteverfahrens zur roentgenographischen Analyse des Eigenspannungszustandes im Oberflaechenbereich vielkristalliner Werkstoffe

    Energy Technology Data Exchange (ETDEWEB)

    Genzel, C.

    2000-11-01

    The topic of this habilitation thesis is the development of an X-ray diffraction method for the measurement and depth-resolved analysis of residual stresses in the surface region of polycrystalline materials. The method relies on varying {tau}, the penetration depth of the X-rays into the material, by stepwise rotation of the specimen about the scattering vector g{sub {theta}{psi}}. Depth profiles of the lattice plane spacings d(hkl) can thus be determined for fixed azimuth and inclination angles {theta} and {psi} of the scattering vector in the specimen system. This makes it possible to identify individual components of the stress tensor in the basic equation of X-ray stress analysis and to analyze those components separately. For calculating the relevant residual stress distributions {sigma}{sub ij}({tau}) from the lattice spacing profiles, a self-consistent method is established that takes into account the high sensitivity of the derived stresses to the lattice spacing d{sub 0}(hkl) of the stress-free crystal. The evaluation yields both the depth profiles and the strain-free lattice spacing d{sub 0}(hkl), so that a quantitative analysis of triaxial residual stress states in the surface region of the material is possible. (orig./CB)
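    For orientation, the classical sin²ψ evaluation, which the depth-resolved scattering-vector method generalizes, can be sketched as follows. All numbers are synthetic and illustrative only:

```python
import numpy as np

def sin2psi_stress(d_hkl, psi_deg, d0, young_modulus, poisson_ratio):
    """Classical sin^2(psi) analysis: the in-plane stress component follows
    from the slope m of d(hkl) versus sin^2(psi),

        sigma = E / (1 + nu) * m / d0.
    """
    x = np.sin(np.radians(psi_deg)) ** 2
    slope, _ = np.polyfit(x, d_hkl, 1)   # linear fit of d vs sin^2(psi)
    return young_modulus / (1 + poisson_ratio) * slope / d0

# Synthetic data for a -400 MPa compressive stress
# (d0 in Angstrom, E in MPa; values loosely typical of steel)
d0, E, nu, sigma = 1.1702, 210e3, 0.28, -400.0
psi = np.array([0.0, 18.0, 27.0, 33.0, 39.0, 45.0])
d = d0 + sigma * d0 * (1 + nu) / E * np.sin(np.radians(psi)) ** 2
print(round(sin2psi_stress(d, psi, d0, E, nu), 1))  # -> -400.0
```

    The thesis addresses precisely the weak point visible here: the result scales with 1/d{sub 0}(hkl), so the self-consistent evaluation determines the strain-free lattice spacing together with the depth-resolved stress profiles.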

  19. Single-cell analysis - Methods and protocols

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2013-06-01

    Full Text Available This is certainly a timely volume in the Methods in Molecular Biology series: we have already entered the synthetic biology era, and thus we need to be aware of the new methodological advances that can fulfill the needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility of performing single-cell analysis allows researchers to capture single-cell responses....

  20. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
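    As a minimal illustration of the Bayesian framework such a review introduces (not a method from any specific paper it covers), a conjugate Beta-Binomial update for a candidate biomarker's sensitivity looks like this:

```python
def beta_binomial_update(prior_a, prior_b, successes, failures):
    """Conjugate Bayesian update for a biomarker's sensitivity.

    With a Beta(a, b) prior on sensitivity and `successes` of
    `successes + failures` diseased samples correctly flagged,
    the posterior is Beta(a + successes, b + failures).
    """
    a = prior_a + successes
    b = prior_b + failures
    mean = a / (a + b)   # posterior mean of the sensitivity
    return a, b, mean

# Flat Beta(1, 1) prior; the candidate marker flags 45 of 50 diseased samples.
a, b, mean = beta_binomial_update(1, 1, 45, 5)
print(a, b, round(mean, 3))  # -> 46 6 0.885
```

    The attraction in the biomarker setting is that the prior lets historical or cross-platform evidence be folded in explicitly, and the full posterior quantifies uncertainty rather than yielding a single point estimate.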