WorldWideScience

Sample records for samples analytical problems

  1. Recent bibliography on analytical and sampling problems of a PWR primary coolant

    International Nuclear Information System (INIS)

    Illy, H.

    1980-07-01

    An extensive bibliography on the problems of analysis and sampling of the primary cooling water of PWRs is presented. The aim was to collect the analytical methods for dissolved gases; sampling and sample preparation are also taken into account. The literature of the last 8-10 years is covered. The bibliography is arranged in alphabetical order by topic. The most important topics are as follows: boric acid, gas analysis, hydrogen isotopes, iodine, noble gases, radiation monitoring, sampling and preparation, water chemistry. (R.J.)

  2. Recent bibliography on analytical and sampling problems of a PWR primary coolant Suppl. 4

    International Nuclear Information System (INIS)

    Illy, H.

    1986-09-01

    The 4th supplement of a bibliographical series comprising the analytical and sampling problems of the primary coolant of PWR type reactors covers the literature from 1985 up to July 1986 (220 items). References are listed according to the following topics: boric acid; chloride, chlorine; general; hydrogen isotopes; iodine; iodide; noble gases; oxygen; other elements; radiation monitoring; reactor safety; sampling; water chemistry. (V.N.)

  3. Recent bibliography on analytical and sampling problems of a PWR primary coolant Pt. 1

    International Nuclear Information System (INIS)

    Illy, H.

    1981-12-01

    The first bibliography on analytical and sampling problems of a PWR primary coolant (KFKI Report-1980-48) was published in 1980 and it covered the literature published in the previous 8-10 years. The present supplement reviews the subsequent literature up till December 1981. It also includes some references overlooked in the first volume. The serial numbers are continued from the first bibliography. (author)

  4. Recent bibliography on analytical and sampling problems of a PWR primary coolant Suppl. 3

    International Nuclear Information System (INIS)

    Illy, H.

    1985-03-01

    The present supplement to the bibliography on analytical and sampling problems of PWR primary coolant covers the literature published in 1984 and includes some references overlooked in the previous volumes dealing with the publications of the last 10 years. References are divided into topics under the following headings: boric acid; chloride; chlorine; carbon dioxide; general; gas analysis; hydrogen isotopes; iodine; iodide; nitrogen; noble gases and radium; ammonia; ammonium; oxygen; other elements; radiation monitoring; reactor safety; sampling; water chemistry. Under a given subject, bibliographical information is listed in alphabetical order of the authors. (V.N.)

  5. Analytical and sampling problems in primary coolant circuits of PWR-type reactors

    International Nuclear Information System (INIS)

    Illy, H.

    1980-10-01

    Details of recent methods for the analysis and sampling of a PWR primary coolant are given in the following order: sampling and preparation; analysis of the gases dissolved in the water; monitoring of radioactive substances; checking of the boric acid concentration, which controls the reactivity. The bibliography of this work and directions for its use are published in a separate report: KFKI-80-48 (1980). (author)

  6. Problem-based learning on quantitative analytical chemistry course

    Science.gov (United States)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to a quantitative analytical chemistry course, the so-called "Analytical Chemistry II" course, especially as related to essential oil analysis. The learning outcomes of this course include aspects of understanding of lectures, the skills of applying course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent tasks and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course has been proven to improve students' knowledge, skills, abilities and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.

  7. Eco-analytical Methodology in Environmental Problems Monitoring

    Science.gov (United States)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation takes a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. It is thus expected to find the optimal trajectory of industrial development to prevent irreversible problems in the biosphere that could stop the progress of civilization.

  8. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single-shell and double-shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively-coupled plasma-atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample-preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from the different sample-preparation treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single-shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed.
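    The comparison logic described above (water leach vs. acid digestion vs. caustic fusion) can be sketched as follows. The threshold, units and values are hypothetical illustrations, not those used at Pacific Northwest Laboratory:

```python
# Sketch: inferring a compound's solubility class by comparing recoveries
# of the same analyte from three sample-preparation treatments.
# The 20% relative tolerance and the example data are hypothetical.

def classify_solubility(water_leach, acid_digest, fusion, rel_tol=0.2):
    """Classify an analyte by comparing recoveries (same units) from
    water leach, acid digestion, and caustic fusion of the same sample.
    Fusion is treated as the total (it dissolves refractory phases)."""
    if fusion <= 0:
        raise ValueError("fusion result must be positive")
    if water_leach >= (1 - rel_tol) * fusion:
        return "water-soluble"
    if acid_digest >= (1 - rel_tol) * fusion:
        return "acid-soluble"
    return "refractory"

# Hypothetical results (ug/g) for one element in a sludge sample:
print(classify_solubility(water_leach=950, acid_digest=980, fusion=1000))  # water-soluble
print(classify_solubility(water_leach=50, acid_digest=940, fusion=1000))   # acid-soluble
print(classify_solubility(water_leach=40, acid_digest=120, fusion=1000))   # refractory
```

    The design choice here mirrors the abstract: caustic fusion recovers everything, so large deficits in the milder treatments are evidence of acid-soluble or refractory phases.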

  9. Analytical methods for heat transfer and fluid flow problems

    CERN Document Server

    Weigand, Bernhard

    2015-01-01

    This book describes useful analytical methods by applying them to real-world problems rather than to the usual over-simplified classroom problems. The book demonstrates the applicability of analytical methods even for complex problems and guides the reader to a more intuitive understanding of approaches and solutions. Although the solution of partial differential equations by numerical methods is standard practice in industry, analytical methods are still important for the critical assessment of results derived from advanced computer simulations and for the improvement of the underlying numerical techniques. Literature devoted to analytical methods, however, often focuses on theoretical and mathematical aspects and is therefore useless to most engineers. Analytical Methods for Heat Transfer and Fluid Flow Problems addresses engineers and engineering students. The second edition has been updated: the chapters on non-linear problems and on axial heat conduction problems were extended, and worked-out exam...

  10. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

    BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about their post-analytical stability in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, initial concentrations of analytes were measured in duplicate (t = 0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t = 0). Internal acceptance criteria for bias and total error were used...
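    The normalization step described above can be sketched as follows. The concentrations and the 10% acceptance limit are illustrative assumptions, not the study's actual data or criteria:

```python
# Sketch of post-analytical stability checking: each re-measured
# concentration is expressed relative to the t = 0 baseline and
# compared against an acceptance limit. Values are hypothetical.

def normalized_deviation(c_t, c_0):
    """Percent deviation of a re-analysis result from the baseline."""
    return 100.0 * (c_t - c_0) / c_0

baseline = 5.2          # mmol/L, mean of the duplicate t = 0 measurements
reanalyses = {2: 5.1, 4: 5.3, 6: 5.0, 8: 4.9, 10: 4.6}  # hours -> result

for hours, c in sorted(reanalyses.items()):
    dev = normalized_deviation(c, baseline)
    status = "OK" if abs(dev) <= 10.0 else "unstable"
    print(f"t = {hours:2d} h: {dev:+.1f}%  {status}")
```

    A per-analyte stability limit would in practice be derived from bias and total-error criteria rather than a single fixed percentage.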

  11. [Final goal and problems in clinical chemistry examination measured by advanced analytical instruments].

    Science.gov (United States)

    Sasaki, M; Hashimoto, E

    1993-07-01

    In the field of clinical chemistry in Japan, the automation of analytical instruments first appeared in the 1960's with the rapid developments in the electronics industry. After a series of improvements and modifications over the past thirty years, these analytical instruments have become excellent, with multiple functions. As a result of these developments, it is now well recognized that automated analytical instruments are indispensable for managing the modern clinical laboratory. On the other hand, these automated analytical instruments have uncovered various problems which had hitherto been undetected when manually operated instruments were used. For instance, the variation of commercially available standard solutions, due to the lack of government control, causes different values to be obtained in different institutions. In addition, there are many problems such as a shortage of medical technologists, complications in sample handling and increased labor costs. Furthermore, inadequacies in maintenance activities cause frequent erroneous reports of laboratory findings despite the latest and most efficient analytical instruments being installed. Thus, the working process in the clinical laboratory must be systematized to create rapidity and effectiveness. In the present report, we review the developmental history of automation systems for analytical instruments, discuss the problems in creating an effective clinical laboratory, and explore ways to deal with these emerging issues in automation technology for the clinical laboratory.

  12. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples, i.e. samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.
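    A minimal sketch of between-method bias assessment with a commutable sample follows. The results, the use of the all-method mean as target, and the 3% allowable bias are hypothetical assumptions, not the AACB's actual data or limits:

```python
# Sketch: each method measures the same commutable specimen; bias is
# each method's deviation from the all-method mean, compared against
# an allowable bias limit. All numbers here are hypothetical.

method_results = {"A": 4.92, "B": 5.05, "C": 5.11}  # mmol/L
allowable_bias_pct = 3.0

target = sum(method_results.values()) / len(method_results)
for method, x in sorted(method_results.items()):
    bias_pct = 100.0 * (x - target) / target
    verdict = "harmonisable" if abs(bias_pct) <= allowable_bias_pct else "too biased"
    print(f"method {method}: bias {bias_pct:+.2f}%  ({verdict})")
```

    In practice the target would come from a reference measurement procedure where one exists, and the allowable bias from biological-variation or clinical-outcome criteria.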

  13. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1998-01-01

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs

  14. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  15. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low-energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample-preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO3) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented.

  16. The problems of accountable and analytical procuring of enterprise management

    Directory of Open Access Journals (Sweden)

    Kovalova Tatiana Volodymyrivna

    2016-02-01

    Full Text Available This article investigates the main aspects of the accountable and analytical procuring of enterprise management: its essence, purpose, functions and tasks. The main elements and essence of accountable and analytical information are determined, taking into consideration the needs of modern management. The article exposes the structural elements of accountable and analytical procuring and forms conceptual approaches to building the accountable and analytical procuring of enterprise management. The main problems of improving the accountable and analytical informational procuring of managerial decision-making are analyzed, with the aim of solving economic problems arising from the current situation of the national economy.

  17. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A. (Lincoln...). Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
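    The "snowball" expansion described for random area sampling can be sketched as follows, assuming a simple adjacency-list graph; the seed count and expansion depth are illustrative parameters, not those of the paper:

```python
# Sketch of "random area" (snowball) graph sampling: pick random seed
# vertices, then grow an area around each seed by following edges.
import random

def random_area_sample(adj, n_seeds, depth, rng):
    """Return the set of vertices reached by expanding `depth` hops
    around `n_seeds` randomly chosen seed vertices of graph `adj`."""
    seeds = rng.sample(sorted(adj), n_seeds)
    sampled = set(seeds)
    frontier = set(seeds)
    for _ in range(depth):            # expand every area one hop at a time
        frontier = {v for u in frontier for v in adj[u]} - sampled
        sampled |= frontier
    return sampled

# Tiny example graph: a path 0-1-2-3-4-5
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(sorted(random_area_sample(adj, n_seeds=1, depth=1, rng=random.Random(0))))
```

    The appeal of such methods for anticipatory analytics is that the sampled subgraph preserves local structure around the seeds while being far cheaper to analyze than the full graph.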

  18. Analytical Chemistry Division's sample transaction system

    International Nuclear Information System (INIS)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters which cover all of the information about one aspect of the Analytical Chemistry Division's computer processing

  19. Analytical vs. Simulation Solution Techniques for Pulse Problems in Non-linear Stochastic Dynamics

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.

    Advantages and disadvantages of available analytical and simulation techniques for pulse problems in non-linear stochastic dynamics are discussed. First, random pulse problems, both those which do and do not lead to Markov theory, are presented. Next, the analytical and analytically-numerical tec...

  20. On accuracy problems for semi-analytical sensitivity analyses

    DEFF Research Database (Denmark)

    Pedersen, P.; Cheng, G.; Rasmussen, John

    1989-01-01

    The semi-analytical method of sensitivity analysis combines ease of implementation with computational efficiency. A major drawback to this method, however, is that severe accuracy problems have recently been reported. A complete error analysis for a beam problem with changing length is carried out... pseudo loads in order to obtain general load equilibrium with rigid body motions. Such a method would be readily applicable for any element type, whether analytical expressions for the element stiffnesses are available or not. This topic is postponed for a future study.

  1. ANALYTICAL ANARCHISM: THE PROBLEM OF DEFINITION AND DEMARCATION

    OpenAIRE

    Konstantinov M.S.

    2012-01-01

    In this paper, for the first time in our country's scholarship, a new trend of anarchist thought is considered: analytical anarchism. Critical analysis of the key propositions of the basic versions of this trend, the anarcho-capitalist and the egalitarian, is used as a methodological tool. The study proposes a classification of the discernible trends within analytical anarchism on the basis of value criteria, and identifies conceptual and methodological problems of the definition of analytical anarchism and its ...

  2. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    Science.gov (United States)

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are among the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, this literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, this work aims to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from that hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Full Text Available Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for trace analyte detection, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate in combination with highly water-absorbent materials. Analyte concentration as well as sample pretreatment, including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system, which does not require any equipment.

  4. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk and particle analysis, as well as screening, of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper summarizes the course of technical development in environmental sample analysis at JAEA and refers to recent trends of research and development in this field. (author)

  5. 40 CFR 141.22 - Turbidity sampling and analytical requirements.

    Science.gov (United States)

    2010-07-01

    § 141.22 Turbidity sampling and analytical requirements. The requirements in this section apply to... the water distribution system at least once per day, for the purposes of making turbidity measurements...

  6. Hanford analytical sample projections FY 1996 - FY 2001. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1997-07-02

    This document summarizes the biannual Hanford sample projections for fiscal years 1997-2001. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems, Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition to this revision, details on laboratory-scale technology (development), sample management, and data management activities were requested. This information will be used by the Hanford Analytical Services program and the Sample Management Working Group to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  7. An analytical statistical approach to the 3D reconstruction problem

    Energy Technology Data Exchange (ETDEWEB)

    Cierniak, Robert [Czestochowa Univ. of Technology (Poland). Inst. of Computer Engineering

    2011-07-01

    The approach presented here concerns the reconstruction problem for 3D spiral X-ray tomography. The reconstruction problem is formulated taking into consideration the statistical properties of the signals obtained in X-ray CT. Additionally, the image processing performed in our approach follows an analytical methodology. This conception significantly improves the quality of the reconstructed images and decreases the complexity of the reconstruction problem in comparison with other approaches. Computer simulations proved that the reconstruction algorithm described here outperforms conventional analytical methods in image quality. (orig.)

  8. Contemporary sample stacking in analytical electrophoresis

    Czech Academy of Sciences Publication Activity Database

    Šlampová, Andrea; Malá, Zdeňka; Pantůčková, Pavla; Gebauer, Petr; Boček, Petr

    2013-01-01

    Roč. 34, č. 1 (2013), s. 3-18 ISSN 0173-0835 R&D Projects: GA ČR GAP206/10/1219 Institutional support: RVO:68081715 Keywords : biological samples * stacking * trace analysis * zone electrophoresis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.161, year: 2013

  9. Contemporary sample stacking in analytical electrophoresis

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Šlampová, Andrea; Křivánková, Ludmila; Gebauer, Petr; Boček, Petr

    2015-01-01

    Roč. 36, č. 1 (2015), s. 15-35 ISSN 0173-0835 R&D Projects: GA ČR(CZ) GA13-05762S Institutional support: RVO:68081715 Keywords : biological samples * stacking * trace analysis * zone electrophoresis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.482, year: 2015

  10. Analytic Solution to Shell Boundary – Value Problems

    Directory of Open Access Journals (Sweden)

    Yu. I. Vinogradov

    2015-01-01

    Full Text Available The object of this research is to find analytical solutions to shell boundary-value problems, i.e. to consider the solution for a class of problems concerning the mechanics of strain of closed hoop shells. The objective of the work is to create an analytical method to define the stress-strain state of shells under non-axisymmetric loading. Thus, the main goal is to derive formula solutions of linear ordinary differential equations with continuous variable coefficients. The partial differential equations of the mechanics of shell strain are reduced by Fourier's method of separation of variables to a system of differential equations with ordinary derivatives. The paper presents the obtained formulas defining solutions of the homogeneous differential equations and, on their basis, formulas defining a particular solution depending on the type of the right-hand sides of the differential equations. The analytical algorithm for the solution of a boundary problem transfers the boundary conditions to an arbitrarily chosen point of the interval of the independent variable through the solution of the canonical matrix ordinary differential equation, with the subsequent solution of a system of algebraic equations for compatibility of the boundary conditions at this point. The efficiency of the algorithm rests on the fact that the solution of the ordinary differential equations is defined as the values of Cauchy-Krylov functions, which satisfy arbitrary initial conditions. The results of the research presented in this work are useful to experts in the field of computational mathematics dealing with the solution of systems of linear ordinary differential equations and the creation of effective analytical computing methods to solve shell boundary-value problems.
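    The transfer idea described (moving boundary conditions through the solution of a matrix ordinary differential equation, then solving a small algebraic system) can be sketched for a toy linear problem. This is a minimal numerical illustration, not the paper's Cauchy-Krylov formulation:

```python
# Sketch: for the two-point boundary-value problem y'' = -y with
# y(0) = 0, y(L) = 1, integrate the fundamental matrix Phi(x) of the
# first-order system Y' = A Y, transfer the far boundary condition
# through Phi(L), and solve for the unknown initial state.
import numpy as np

A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # y1' = y2, y2' = -y1
L = np.pi / 2

def fundamental_matrix(A, L, steps=20000):
    """Integrate Phi' = A Phi, Phi(0) = I, with classical RK4."""
    h = L / steps
    Phi = np.eye(len(A))
    for _ in range(steps):
        k1 = A @ Phi
        k2 = A @ (Phi + 0.5 * h * k1)
        k3 = A @ (Phi + 0.5 * h * k2)
        k4 = A @ (Phi + h * k3)
        Phi = Phi + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return Phi

Phi_L = fundamental_matrix(A, L)
# Boundary conditions: y1(0) = 0 and y1(L) = 1, with y(L) = Phi(L) y(0).
M = np.vstack([[1.0, 0.0], Phi_L[0]])
y0 = np.linalg.solve(M, np.array([0.0, 1.0]))
print(y0)   # the exact solution y = sin(x) has initial state (0, 1)
```

    Here the fundamental matrix plays the role the Cauchy-Krylov functions play in the paper: it propagates any initial state along the interval, so boundary conditions posed at different points can be combined into one algebraic system.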

  11. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed.

  12. Non-linear analytic and coanalytic problems (Lp-theory, Clifford analysis, examples)

    International Nuclear Information System (INIS)

    Dubinskii, Yu A; Osipenko, A S

    2000-01-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the 'orthogonal' sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented

  13. ITOUGH2 sample problems

    International Nuclear Information System (INIS)

    Finsterle, S.

    1997-11-01

    This report contains a collection of ITOUGH2 sample problems. It complements the ITOUGH2 User's Guide [Finsterle, 1997a] and the ITOUGH2 Command Reference [Finsterle, 1997b]. ITOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media [Pruess, 1987, 1991a]. The report ITOUGH2 User's Guide [Finsterle, 1997a] describes the inverse modeling framework and provides the theoretical background. The report ITOUGH2 Command Reference [Finsterle, 1997b] contains the syntax of all ITOUGH2 commands. This report describes a variety of sample problems solved by ITOUGH2. Table 1.1 contains a short description of the seven sample problems discussed in this report. The TOUGH2 equation-of-state (EOS) module that needs to be linked to ITOUGH2 is also indicated. Each sample problem focuses on a few selected issues shown in Table 1.2. ITOUGH2 input features and the usage of program options are described. Furthermore, interpretations of selected inverse modeling results are given. Problem 1 is a multipart tutorial describing basic ITOUGH2 input files for the main ITOUGH2 application modes; no interpretation of results is given. Problem 2 focuses on non-uniqueness, residual analysis, and correlation structure. Problem 3 illustrates a variety of parameter and observation types, and describes parameter selection strategies. Problem 4 compares the performance of minimization algorithms and discusses model identification. Problem 5 explains how to set up a combined inversion of steady-state and transient data. Problem 6 provides a detailed residual and error analysis. Finally, Problem 7 illustrates how the estimation of model-related parameters may help compensate for errors in that model.

  14. Analytical laboratory and mobile sampling platform

    International Nuclear Information System (INIS)

    Stetzenbach, K.; Smiecinski, A.

    1996-01-01

    This is the final report for the Analytical Laboratory and Mobile Sampling Platform project. This report contains only major findings and conclusions resulting from this project. Detailed reports of all activities performed for this project were provided to the Project Office every quarter since the beginning of the project. This report contains water chemistry data for samples collected in the Nevada section of Death Valley National Park (Triangle Area Springs), Nevada Test Site springs, Pahranagat Valley springs, Nevada Test Site wells, Spring Mountain springs and Crater Flat and Amargosa Valley wells

  15. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  16. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  17. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    Science.gov (United States)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  18. The boundary value problem for discrete analytic functions

    KAUST Repository

    Skopenkov, Mikhail

    2013-06-01

    This paper is on further development of discrete complex analysis introduced by R. Isaacs, J. Ferrand, R. Duffin, and C. Mercat. We consider a graph lying in the complex plane and having quadrilateral faces. A function on the vertices is called discrete analytic if for each face the difference quotients along the two diagonals are equal. We prove that the Dirichlet boundary value problem for the real part of a discrete analytic function has a unique solution. In the case when each face has orthogonal diagonals we prove that this solution uniformly converges to a harmonic function in the scaling limit. This solves a problem of S. Smirnov from 2010. This was proved earlier by R. Courant, K. Friedrichs, and H. Lewy and by L. Lusternik for square lattices, and by D. Chelkak and S. Smirnov (and implicitly by P. G. Ciarlet and P.-A. Raviart) for rhombic lattices. In particular, our result implies uniform convergence of the finite element method on Delaunay triangulations. This solves a problem of A. Bobenko from 2011. The methodology is based on energy estimates inspired by alternating-current network theory. © 2013 Elsevier Ltd.
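The defining condition above (equal difference quotients along the two diagonals of each quadrilateral face) is simple to state concretely. A minimal Python sketch follows; the function name and tolerance are illustrative choices, not from the paper:

```python
# Discrete analyticity on one quadrilateral face with vertices z1..z4
# listed in cyclic order: the difference quotients along the diagonals
# (z1, z3) and (z2, z4) must be equal.

def is_discrete_analytic(z, f, tol=1e-12):
    """z, f: complex values at the four vertices of one face, in cyclic order."""
    z1, z2, z3, z4 = z
    f1, f2, f3, f4 = f
    return abs((f1 - f3) / (z1 - z3) - (f2 - f4) / (z2 - z4)) < tol

# Example: the identity f(z) = z is discrete analytic on any face.
face = [0 + 0j, 1 + 0j, 1 + 1j, 0 + 1j]
print(is_discrete_analytic(face, face))  # True
```

By contrast, an anti-holomorphic sample such as f(z) = conj(z) fails the condition on the same face.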

  19. Analytical derivation: An epistemic game for solving mathematically based physics problems

    Science.gov (United States)

    Bajracharya, Rabindra R.; Thompson, John R.

    2016-06-01

    Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the analytical derivation game. This game involves deriving an equation through symbolic manipulations and routine mathematical operations, usually without any physical interpretation of the processes. This game often creates cognitive obstacles in students, preventing them from using alternative resources or better approaches during problem solving. We conducted hour-long, semi-structured, individual interviews with fourteen introductory physics students. Students were asked to solve four "pseudophysics" problems containing algebraic and graphical representations. The problems required the application of the fundamental theorem of calculus (FTC), which is one of the most frequently used mathematical concepts in physics problem solving. We show that the analytical derivation game is necessary, but not sufficient, to solve mathematically based physics problems, specifically those involving graphical representations.

  20. Sample problem calculations related to two-phase flow transients in a PWR relief-piping network

    International Nuclear Information System (INIS)

    Shin, Y.W.; Wiedermann, A.H.

    1981-03-01

    Two sample problems related to the fast transients of water/steam flow in the relief line of a PWR pressurizer were calculated with the network-flow analysis computer code STAC (System Transient-Flow Analysis Code). The sample problems were supplied by EPRI and are designed to test whether computer codes or computational methods have the basic capability to handle the important flow features present in a typical relief line of a PWR pressurizer. It was found necessary to implement a number of additional boundary conditions into the STAC code in order to calculate the sample problems. These include the dynamics of the fluid interface, which is treated as a moving boundary. This report describes the methodologies adopted for handling the newly implemented boundary conditions and the computational results of the two sample problems. To demonstrate the accuracy achieved by the STAC code, analytical solutions are also obtained and used as a basis for comparison.

  1. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

    Because toxic solvents and other hazardous materials are commonly used in analytical methods, characterization procedures result in significant and costly amounts of waste. We are developing alternative analytical methods in the radiological and organic areas to reduce the volume or form of the hazardous waste produced during sample analysis. For the radiological area, we have examined high-pressure, closed-vessel microwave digestion as a way to minimize waste from sample preparation operations. Heated solutions of strong mineral acids can be avoided for sample digestion by using the microwave approach. Because reactivity increases with pressure, we examined the use of less hazardous solvents to leach selected contaminants from soil for subsequent analysis. We demonstrated the feasibility of this approach by extracting plutonium from a NIST reference material using citric and tartaric acids with microwave digestion. Analytical results were comparable to traditional digestion methods, while hazardous waste was reduced by a factor of ten. We also evaluated the suitability of other natural acids, determined the extraction performance on a wider variety of soil types, and examined the extraction efficiency of other contaminants. For the organic area, we examined ways to minimize the wastes associated with the determination of polychlorinated biphenyls (PCBs) in environmental samples. Conventional methods for analyzing semivolatile organic compounds are labor intensive and require copious amounts of hazardous solvents. For soil and sediment samples, we have a method to analyze PCBs that is based on microscale extraction using benign solvents (e.g., water or hexane). The extraction is performed at elevated temperatures in stainless steel cells containing the sample and solvent. Gas chromatography-mass spectrometry (GC/MS) was used to quantitate the analytes in the isolated extract.
More recently, we developed a method utilizing solid-phase microextraction (SPME) for natural

  2. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA for Environmental Risk Management

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-12-01

    Full Text Available With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  3. Analytical solutions of the electrostatically actuated curled beam problem

    KAUST Repository

    Younis, Mohammad I.

    2014-01-01

    This works presents analytical expressions of the electrostatically actuated initially deformed cantilever beam problem. The formulation is based on the continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We

  4. The limited relevance of analytical ethics to the problems of bioethics.

    Science.gov (United States)

    Holmes, R L

    1990-04-01

    Philosophical ethics comprises metaethics, normative ethics and applied ethics. These have characteristically received analytic treatment by twentieth-century Anglo-American philosophy. But there has been disagreement over their interrelationship to one another and the relationship of analytical ethics to substantive morality--the making of moral judgments. I contend that the expertise philosophers have in either theoretical or applied ethics does not equip them to make sounder moral judgments on the problems of bioethics than nonphilosophers. One cannot "apply" theories like Kantianism or consequentialism to get solutions to practical moral problems unless one knows which theory is correct, and that is a metaethical question over which there is no consensus. On the other hand, to presume to be able to reach solutions through neutral analysis of problems is unavoidably to beg controversial theoretical issues in the process. Thus, while analytical ethics can play an important clarificatory role in bioethics, it can neither provide, nor substitute for, moral wisdom.

  5. Simple and Accurate Analytical Solutions of the Electrostatically Actuated Curled Beam Problem

    KAUST Repository

    Younis, Mohammad I.

    2014-01-01

    We present analytical solutions of the electrostatically actuated initially deformed cantilever beam problem. We use a continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions

  6. A semi-analytical iterative technique for solving chemistry problems

    Directory of Open Access Journals (Sweden)

    Majeed Ahmed AL-Jawary

    2017-07-01

    Full Text Available The main aim and contribution of the current paper is to implement the semi-analytical iterative method suggested by Temimi and Ansari in 2011, namely the TAM, to solve two chemical problems. Approximate solutions obtained by the TAM converge quickly. The problems considered are the absorption of carbon dioxide into phenyl glycidyl ether and a chemical kinetics problem. These problems are represented by systems of nonlinear ordinary differential equations with boundary and initial conditions. Error analysis of the approximate solutions is carried out using the error remainder and the maximal error remainder, and an exponential rate of convergence is observed. For both problems the results of the TAM are compared with results obtained by previous methods available in the literature. The results demonstrate that the method has many merits: it is derivative-free, and it avoids the difficulty of calculating Adomian polynomials to handle the nonlinear terms, as in the Adomian Decomposition Method (ADM). It does not require calculating a Lagrange multiplier, as in the Variational Iteration Method (VIM), in which the terms of the sequence become complex after several iterations, so that analytical evaluation of terms becomes very difficult or impossible. Nor is there any need to construct a homotopy and solve the corresponding algebraic equations, as in the Homotopy Perturbation Method (HPM). The MATHEMATICA® 9 software was used to evaluate terms in the iterative process.
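As a rough illustration of this kind of successive-approximation scheme (not the authors' TAM implementation), one can split the operator as L(u) = u', N(u) = u², and repeatedly solve L(u_{n+1}) = -N(u_n) with the initial condition reapplied at each step. The sketch below applies this to u' + u² = 0, u(0) = 1, whose exact solution is 1/(1+t); the grid, iteration count, and trapezoidal quadrature are illustrative choices:

```python
# Iterative solution of u' + u^2 = 0, u(0) = 1 (exact: 1/(1+t)),
# in the spirit of a semi-analytical iterative scheme: iterate
#     u_{n+1}(t) = 1 - integral_0^t u_n(s)^2 ds,
# evaluating each integral on a grid with the trapezoidal rule.

def iterate_scheme(n_iter=20, t_max=0.5, n_pts=501):
    h = t_max / (n_pts - 1)
    u = [1.0] * n_pts                    # u_0 solves L(u_0) = 0, u_0(0) = 1
    for _ in range(n_iter):
        rhs = [-v * v for v in u]        # right-hand side -N(u_n)
        new, acc = [1.0], 0.0
        for i in range(1, n_pts):
            acc += 0.5 * h * (rhs[i - 1] + rhs[i])   # trapezoidal step
            new.append(1.0 + acc)
        u = new
    return u

u = iterate_scheme()
print(abs(u[-1] - 1.0 / 1.5) < 1e-4)  # True: close to the exact value at t = 0.5
```

Each pass matches one more term of the exact Taylor series, which is the exponential-style convergence the abstract describes.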

  7. An analytical approach for a nodal scheme of two-dimensional neutron transport problems

    International Nuclear Information System (INIS)

    Barichello, L.B.; Cabrera, L.C.; Prolo Filho, J.F.

    2011-01-01

    Research highlights: → Nodal equations for a two-dimensional neutron transport problem. → Analytical Discrete Ordinates Method. → Numerical results compared with the literature. - Abstract: In this work, a solution for a two-dimensional neutron transport problem in Cartesian geometry is proposed on the basis of nodal schemes. In this context, one-dimensional equations are generated by an integration process of the multidimensional problem. Here, the integration is performed over the whole domain, so that no iterative procedure between nodes is needed. The ADO method is used to develop an analytical discrete ordinates solution for the one-dimensional integrated equations, such that final solutions are analytical in terms of the spatial variables. The ADO approach, along with a level-symmetric quadrature scheme, leads to a significant order reduction of the associated eigenvalue problems. Relations between the averaged fluxes and the unknown fluxes at the boundary are introduced as the auxiliary equations usually needed in nodal schemes. Numerical results are presented and compared with test problems.

  8. Analytic semigroups and optimal regularity in parabolic problems

    CERN Document Server

    Lunardi, Alessandra

    2012-01-01

    The book shows how the abstract methods of analytic semigroups and evolution equations in Banach spaces can be fruitfully applied to the study of parabolic problems. Particular attention is paid to optimal regularity results in linear equations. Furthermore, these results are used to study several other problems, especially fully nonlinear ones. Owing to the new unified approach chosen, known theorems are presented from a novel perspective and new results are derived. The book is self-contained. It is addressed to PhD students and researchers interested in abstract evolution equations and in p

  9. Analytical Lie-algebraic solution of a 3D sound propagation problem in the ocean

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, P.S., E-mail: petrov@poi.dvo.ru [Il' ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Prants, S.V., E-mail: prants@poi.dvo.ru [Il' ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Petrova, T.N., E-mail: petrova.tn@dvfu.ru [Far Eastern Federal University, 8 Sukhanova str., 690950, Vladivostok (Russian Federation)

    2017-06-21

    The problem of sound propagation in a shallow sea with variable bottom slope is considered. The sound pressure field produced by a time-harmonic point source in such an inhomogeneous 3D waveguide is expressed in the form of a modal expansion. The expansion coefficients are computed using the adiabatic mode parabolic equation theory. The mode parabolic equations are solved explicitly, and the analytical expressions for the modal coefficients are obtained using a Lie-algebraic technique. - Highlights: • A group-theoretical approach is applied to a problem of sound propagation in a shallow sea with variable bottom slope. • An analytical solution of this problem is obtained in the form of a modal expansion with analytical expressions for the coefficients. • Our result is the only analytical solution of a 3D sound propagation problem with no translational invariance. • This solution can be used for the validation of numerical propagation models.

  10. Inverse problems with non-trivial priors: efficient solution through sequential Gibbs sampling

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Mosegaard, Klaus

    2012-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample solutions to non-linear inverse problems. In principle, these methods allow incorporation of prior information of arbitrary complexity. If an analytical closed-form description of the prior is available, which is the case when the prior can be described by a multidimensional Gaussian distribution, such prior information can easily be considered. In reality, prior information is often more complex than can be described by the Gaussian model, and no closed-form expression of the prior can be given. We propose an algorithm, called sequential Gibbs sampling, allowing the Metropolis algorithm to efficiently incorporate complex priors into the solution of an inverse problem, also for the case where no closed-form description of the prior exists. First, we lay out the theoretical background...
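When the prior does have a closed form, the plain Metropolis algorithm mentioned above reduces to a few lines. Below is a toy 1D sketch (Gaussian prior, Gaussian likelihood around a single datum, identity forward model), not the sequential Gibbs sampler of the paper; all parameter values are made up for illustration:

```python
# Toy Metropolis sampler for a 1D inverse problem: posterior proportional
# to prior(m) * likelihood(d | m), with prior N(0, 2^2) and one datum
# d = 3.0 observed through the forward model g(m) = m with noise 0.5.
import math
import random

def log_prior(m):
    return -0.5 * (m / 2.0) ** 2

def log_likelihood(m, d=3.0, sigma=0.5):
    return -0.5 * ((d - m) / sigma) ** 2

def metropolis(n_steps=20000, step=0.5, seed=0):
    rng = random.Random(seed)
    m = 0.0
    lp = log_prior(m) + log_likelihood(m)
    samples = []
    for _ in range(n_steps):
        cand = m + rng.gauss(0.0, step)          # random-walk proposal
        lp_cand = log_prior(cand) + log_likelihood(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject
            m, lp = cand, lp_cand
        samples.append(m)
    return samples

samples = metropolis()
posterior_mean = sum(samples) / len(samples)
# For these conjugate Gaussians the exact posterior mean is 12/4.25 ~ 2.82.
```

The sampled mean should sit near the analytic conjugate-Gaussian value, which is a convenient sanity check before moving to priors with no closed form.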

  11. Stability of purgeable VOCs in water samples during pre-analytical holding: Part 1, Analysis by a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.L.; Scarborough, S.S. [Oak Ridge National Lab., TN (United States); Bottrell, D.W. [USDOE, Washington, DC (United States)

    1996-10-01

    This study was undertaken to examine the hypothesis that prevalent and priority purgeable VOCs in properly preserved water samples are stable for at least 28 days. (VOCs are considered stable if concentrations do not change by more than 10%.) Surface water was spiked with 44 purgeable VOCs. Results showed that the measurement of 35 out of 44 purgeable VOCs in properly preserved water samples (4 °C, 250 mg NaHSO{sub 4}, no headspace in 40 mL VOC vials with 0.010-in. Teflon-lined silicone septum caps) will not be affected by sample storage for 28 days. Larger changes (>10%) and low practical reporting times were observed for a few analytes, e.g., acrolein, CS{sub 2}, vinyl acetate, etc.; these also involve other analytical problems. Advantages of a 28-day (compared to 14-day) holding time are pointed out.
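The ±10% stability criterion used in this study is straightforward to encode. A minimal sketch with made-up concentrations (the function name is illustrative):

```python
# VOC stability check per the study's criterion: an analyte is "stable"
# over the holding period if its concentration changes by at most 10%.

def is_stable(initial_conc, final_conc, limit=0.10):
    """True if the relative change from initial to final is within the limit."""
    return abs(final_conc - initial_conc) <= limit * initial_conc

print(is_stable(100.0, 93.0))  # True: 7% change
print(is_stable(100.0, 88.0))  # False: 12% change
```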

  12. Identification of clinical biomarkers for pre-analytical quality control of blood samples.

    Science.gov (United States)

    Kang, Hyun Ju; Jeon, Soon Young; Park, Jae-Sun; Yun, Ji Young; Kil, Han Na; Hong, Won Kyung; Lee, Mee-Hee; Kim, Jun-Woo; Jeon, Jae-Pil; Han, Bok Ghee

    2013-04-01

    Pre-analytical conditions are key factors in maintaining the high quality of biospecimens. They are necessary for accurate reproducibility of experiments in the field of biomarker discovery as well as achieving optimal specificity of laboratory tests for clinical diagnosis. In research at the National Biobank of Korea, we evaluated the impact of pre-analytical conditions on the stability of biobanked blood samples by measuring biochemical analytes commonly used in clinical laboratory tests. We measured 10 routine laboratory analytes in serum and plasma samples from healthy donors (n = 50) with a chemistry autoanalyzer (Hitachi 7600-110). The analyte measurements were made at different time courses based on delay of blood fractionation, freezing delay of fractionated serum and plasma samples, and at different cycles (0, 1, 3, 6, 9) of freeze-thawing. Statistically significant changes from the reference sample mean were determined using the repeated-measures ANOVA and the significant change limit (SCL). The serum levels of GGT and LDH were changed significantly depending on both the time interval between blood collection and fractionation and the time interval between fractionation and freezing of serum and plasma samples. The glucose level was most sensitive only to the elapsed time between blood collection and centrifugation for blood fractionation. Based on these findings, a simple formula (glucose decrease by 1.387 mg/dL per hour) was derived to estimate the length of time delay after blood collection. In addition, AST, BUN, GGT, and LDH showed sensitive responses to repeated freeze-thaw cycles of serum and plasma samples. These results suggest that GGT and LDH measurements can be used as quality control markers for certain pre-analytical conditions (eg, delayed processing or repeated freeze-thawing) of blood samples which are either directly used in the laboratory tests or stored for future research in the biobank.
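The paper's rule of thumb (glucose falling by 1.387 mg/dL per hour before fractionation) can be inverted to estimate a processing delay. A sketch assuming the baseline glucose level is known; function and constant names are illustrative:

```python
# Estimate the delay between blood collection and fractionation from the
# observed glucose drop, using the reported rate of 1.387 mg/dL per hour.

GLUCOSE_DECAY_RATE = 1.387  # mg/dL per hour, from the study

def estimate_delay_hours(baseline_mg_dl, measured_mg_dl):
    """Hours of processing delay implied by the glucose decrease."""
    drop = baseline_mg_dl - measured_mg_dl
    if drop < 0:
        raise ValueError("measured glucose exceeds baseline")
    return drop / GLUCOSE_DECAY_RATE

print(round(estimate_delay_hours(95.0, 90.8), 1))  # 3.0
```

This simple linear model only covers the pre-centrifugation window the study measured; it says nothing about losses from freeze-thaw cycles, which the study tracked separately.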

  13. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    Science.gov (United States)

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening, and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine representative modes of ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased the detection sensitivity. We believe that these advances will promote routine use of ambient mass spectrometry. Graphical abstract: Scheme of sampling strategies for ambient mass spectrometry.

  14. Advanced Curation: Solving Current and Future Sample Return Problems

    Science.gov (United States)

    Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.

    2015-01-01

    Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample-related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current curation practices, input from curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST, and other institutions. Additionally, new technologies are adopted on the basis of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges, since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples.
The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon

  15. Multidimensional integral representations problems of analytic continuation

    CERN Document Server

    Kytmanov, Alexander M

    2015-01-01

    The monograph is devoted to integral representations for holomorphic functions in several complex variables, such as the Bochner-Martinelli, Cauchy-Fantappiè, Koppelman and multidimensional logarithmic residue representations, and their boundary properties. The applications considered are problems of analytic continuation of functions from the boundary of a bounded domain in C^n. In contrast to the well-known Hartogs-Bochner theorem, this book investigates functions with the one-dimensional property of holomorphic extension along complex lines, and includes the problem of obtaining multidimensional boundary analogs of the Morera theorem. This book is a valuable resource for specialists in complex analysis and theoretical physics, as well as for graduate and postgraduate students with an understanding of standard university courses in complex, real and functional analysis, as well as algebra and geometry.

  16. Hyperbolic systems with analytic coefficients well-posedness of the Cauchy problem

    CERN Document Server

    Nishitani, Tatsuo

    2014-01-01

    This monograph focuses on the well-posedness of the Cauchy problem for linear hyperbolic systems with matrix coefficients. Mainly two questions are discussed: (A) Under which conditions on lower order terms is the Cauchy problem well posed? (B) When is the Cauchy problem well posed for any lower order term? For first order two by two systems with two independent variables with real analytic coefficients, we present complete answers for both (A) and (B). For first order systems with real analytic coefficients we prove general necessary conditions for question (B) in terms of minors of the principal symbols. With regard to sufficient conditions for (B), we introduce hyperbolic systems with nondegenerate characteristics, which contains strictly hyperbolic systems, and prove that the Cauchy problem for hyperbolic systems with nondegenerate characteristics is well posed for any lower order term. We also prove that any hyperbolic system which is close to a hyperbolic system with a nondegenerate characteristic of mu...

  17. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts, and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders but also as additives. The literature contains a variety of GC-MS analytical procedures for characterizing saccharide materials in paint samples; however, the resulting chromatographic profiles are often extremely different, making it impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different GC-MS-based analytical procedures for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and at the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also includes data from materials that contain only a minor saccharide fraction. 
This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  18. Toxaphene: a challenging analytical problem

    NARCIS (Netherlands)

    de Geus, H.J.; Wester, P.G.; Schelvis, A.; de Boer, J.; Brinkman, U.A.T.

    2000-01-01

    The analysis of toxaphene, a highly complex mixture of chlorinated bornanes, bornenes and camphenes, is a challenging problem, especially as individual congeners are present at trace levels in biota and other relevant samples. The complicated nomenclature of the compounds of interest is briefly

  19. Green sample preparation for liquid chromatography and capillary electrophoresis of anionic and cationic analytes.

    Science.gov (United States)

    Wuethrich, Alain; Haddad, Paul R; Quirino, Joselito P

    2015-04-21

    A sample preparation device for the simultaneous enrichment and separation of cationic and anionic analytes was designed and implemented in an eight-channel configuration. The device is based on the use of an electric field to transfer the analytes from a large volume of sample into small volumes of electrolyte suspended in two glass micropipettes by means of a conductive hydrogel. This simple, economical, fast, and green (no organic solvent required) sample preparation scheme was evaluated using cationic and anionic herbicides as test analytes in water. The analytical figures of merit and ecological aspects were evaluated against the state-of-the-art sample preparation technique, solid-phase extraction. A drastic reduction in both sample preparation time (94% faster) and resources (99% less consumables used) was observed. Finally, the technique, in combination with high-performance liquid chromatography and capillary electrophoresis, was applied to the analysis of quaternary ammonium and phenoxypropionic acid herbicides in fortified river water as well as drinking water (at levels relevant to Australian guidelines). The presented sustainable sample preparation approach could easily be applied to other charged analytes or adopted by other laboratories.
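    As a rough back-of-envelope sketch of why such a device enriches analytes, the attainable enrichment factor of an electro-driven transfer is bounded by the sample-to-acceptor volume ratio. The volumes and efficiency below are hypothetical illustrations, not figures from the paper:

```python
# Hypothetical model: if a fraction of the analyte in a large sample volume is
# driven electrokinetically into a small acceptor plug, the concentration gain
# is at most (sample volume / acceptor volume), scaled by transfer efficiency.

def enrichment_factor(sample_volume_ul, acceptor_volume_ul, transfer_efficiency):
    """Concentration gain for a given transfer efficiency in [0, 1]."""
    return transfer_efficiency * sample_volume_ul / acceptor_volume_ul

# e.g. 1 ml of river water concentrated into a 10 µl hydrogel-supported plug
print(enrichment_factor(1000.0, 10.0, transfer_efficiency=0.8))  # -> 80.0
```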

  20. Analytical solutions to matrix diffusion problems

    Energy Technology Data Exchange (ETDEWEB)

    Kekäläinen, Pekka, E-mail: pekka.kekalainen@helsinki.fi [Laboratory of Radiochemistry, Department of Chemistry, P.O. Box 55, FIN-00014 University of Helsinki (Finland)

    2014-10-06

    We report an analytical method to solve, in a few cases of practical interest, the equations which have traditionally been proposed for the matrix diffusion problem. In matrix diffusion, elements dissolved in ground water can penetrate the porous rock surrounding the advective flow paths. In the context of radioactive waste repositories this phenomenon provides a mechanism by which the area of rock surface in contact with advecting elements is greatly enhanced, and can thus be an important delay mechanism. The cases solved are relevant for laboratory as well as for in situ experiments. Solutions are given as integral representations well suited for easy numerical evaluation.
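    For orientation, one common form of the coupled equations usually meant by "the matrix diffusion problem" is sketched below (a standard single-fracture model with fracture concentration C_f, matrix pore-water concentration C_m, advection velocity v, fracture half-aperture b, matrix porosity \epsilon and effective diffusivity D_e; this is our illustration, not necessarily the exact system solved in the paper):

```latex
\begin{aligned}
\frac{\partial C_f}{\partial t} + v\,\frac{\partial C_f}{\partial x}
  &= \frac{\epsilon D_e}{b}\left.\frac{\partial C_m}{\partial z}\right|_{z=0},
  &&\text{(advective path, fed by matrix diffusion)}\\
\frac{\partial C_m}{\partial t}
  &= \frac{D_e}{\epsilon R}\,\frac{\partial^2 C_m}{\partial z^2},
  &&\text{(diffusion into the rock matrix, retardation } R\text{)}\\
C_m(x, 0, t) &= C_f(x, t)
  &&\text{(continuity at the fracture wall).}
\end{aligned}
```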

  1. A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS

    NARCIS (Netherlands)

    Lou, X.; Waal, de B.F.M.; Milroy, L.G.; Dongen, van J.L.J.

    2015-01-01

    In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte

  2. Assessment of Two Analytical Methods in Solving the Linear and Nonlinear Elastic Beam Deformation Problems

    DEFF Research Database (Denmark)

    Barari, Amin; Ganjavi, B.; Jeloudar, M. Ghanbari

    2010-01-01

    Purpose – In the last two decades, with the rapid development of nonlinear science, there has appeared ever-increasing interest of scientists and engineers in analytical techniques for nonlinear problems. This paper considers linear and nonlinear systems that are not only regarded as general boundary value problems in structural engineering and fluid mechanics... Design/methodology/approach – Two new but powerful analytical methods, namely, He's VIM and HPM, are introduced to solve some boundary value problems in structural engineering and fluid mechanics. Findings – Analytical solutions often fit under classical perturbation methods. However, as with other analytical techniques, certain limitations restrict the wide application of perturbation methods, most important of which is the dependence of these methods on the existence of a small parameter in the equation. Disappointingly, the majority of nonlinear problems have no small parameter at all...

  3. Recent advances in computational-analytical integral transforms for convection-diffusion problems

    Science.gov (United States)

    Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.; Almeida, A. P.

    2017-10-01

    A unifying overview of the Generalized Integral Transform Technique (GITT) as a computational-analytical approach for solving convection-diffusion problems is presented. This work is aimed at bringing together some of the most recent developments on both accuracy and convergence improvements on this well-established hybrid numerical-analytical methodology for partial differential equations. Special emphasis is given to novel algorithm implementations, all directly connected to enhancing the eigenfunction expansion basis, such as a single domain reformulation strategy for handling complex geometries, an integral balance scheme for dealing with multiscale problems, the adoption of convective eigenvalue problems in formulations with significant convection effects, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Selected examples are then presented that illustrate the improvement achieved in each class of extension, in terms of convergence acceleration and accuracy gain, related to conjugated heat transfer in complex or multiscale microchannel-substrate geometries, the multidimensional Burgers equation model, and diffusive metal extraction through polymeric hollow fiber membranes. Numerical results are reported for each application and, where appropriate, critically compared against the traditional GITT scheme without convergence enhancement schemes and against commercial or dedicated purely numerical approaches.

  4. Integrated assessment of the global warming problem: A decision-analytical approach

    International Nuclear Information System (INIS)

    Van Lenthe, J.; Hendrickx, L.; Vlek, C.A.J.

    1994-12-01

    The multi-disciplinary character of the global warming problem calls for an integrated assessment approach for ordering and combining the various physical, ecological, economic, and sociological results. The Netherlands initiated its own National Research Program on Global Air Pollution and Climate Change (NRP). The first phase (NRP-1) identified integration as one of five central research themes. The second phase (NRP-2) shows a growing concern for integrated assessment issues. The current two-year research project 'Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy options', which started in September 1993, comes under the integrated assessment part of the Dutch NRP. The first part of the interim report describes the search for an integrated assessment methodology. It starts by emphasizing the need for integrated assessment at a relatively high level of aggregation and from a policy point of view. The conclusion is that a decision-analytical approach might fit the purpose of policy-oriented integrated modeling of the global warming problem. The discussion proceeds with an account of decision analysis and its explicit incorporation and analysis of uncertainty. Then influence diagrams, a relatively recent development in decision analysis, are introduced as a useful decision-analytical approach for integrated assessment. Finally, a software environment for creating and analyzing complex influence diagram models is discussed. The second part of the interim report provides a first, provisional integrated modeling of the global warming problem, emphasizing the illustration of the decision-analytical approach. Major problem elements are identified and an initial problem structure is developed. The problem structure is described in terms of hierarchical influence diagrams. At some places the qualitative structure is filled with quantitative data

  5. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for the determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed.

  6. Analytical solutions of the electrostatically actuated curled beam problem

    KAUST Repository

    Younis, Mohammad I.

    2014-07-24

    This work presents analytical expressions for the electrostatically actuated, initially deformed cantilever beam problem. The formulation is based on the continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions for two commonly observed deformed beam configurations: the curled and tilted configurations. The derived analytical formulas are validated by comparing their results to experimental data and to numerical results of a multi-mode reduced order model. The derived expressions do not involve any complicated integrals or complex terms and can be conveniently used by designers for quick, yet accurate, estimations. The formulas are found to yield accurate results for most commonly encountered microbeams with initial tip deflections of a few microns. For largely deformed beams, we found that these formulas yield less accurate results due to the limitations of the single-mode approximation. In such cases, multi-mode reduced order models are shown to yield accurate results. © 2014 Springer-Verlag Berlin Heidelberg.

  7. OPTIMAL METHOD FOR PREPARATION OF SILICATE ROCK SAMPLES FOR ANALYTICAL PURPOSES

    Directory of Open Access Journals (Sweden)

    Maja Vrkljan

    2004-12-01

    The purpose of this study was to determine an optimal dissolution method for silicate rock samples for further analytical purposes. An analytical FAAS method determining the cobalt, chromium, copper, nickel, lead and zinc content in a gabbro sample and in the geochemical standard AGV-1 was applied for verification. Dissolution in mixtures of various inorganic acids was tested, as well as the Na2CO3 fusion technique. The results obtained by the different methods were compared, and dissolution in the mixture of HNO3 + HF is recommended as optimal.

  8. Growing geometric reasoning in solving problems of analytical geometry through the mathematical communication problems to state Islamic university students

    Science.gov (United States)

    Mujiasih; Waluya, S. B.; Kartono; Mariani

    2018-03-01

    Skill in solving geometry problems depends strongly on competence in geometric reasoning. As teacher candidates, State Islamic University (UIN) students need to have this competence. When geometric reasoning in solving geometry problems has developed well, students are expected to be able to write their ideas communicatively for the reader. A student's mathematical communication ability can therefore serve as a marker of the growth of their geometric reasoning. Thus, the growth of geometric reasoning in solving analytic geometry problems should be characterized by growing mathematical communication abilities whose written work is complete, correct and sequential. Using a qualitative approach, this article reports a study exploring the question: can the growth of geometric reasoning in solving analytic geometry problems be characterized by the growth of mathematical communication abilities? The main activities in this research were: (1) the lecturer trained the students on analytic geometry problems that were not routine or algorithmic, but instead required high-level reasoning and were divergent/open-ended; (2) students were asked to work the problems independently, in detail, completely, in order, and correctly; (3) student answers were then corrected at each stage; (4) six students were selected as research subjects; (5) the research subjects were interviewed and the researchers conducted triangulation. The results of this research: (1) Mathematics Education students of UIN Semarang had adequate mathematical communication ability; (2) this mathematical communication ability can be a marker of geometric reasoning in problem solving; and (3) the geometric reasoning of the UIN students had grown into a category that tends to be good.

  9. Calculation of sample problems related to two-phase flow blowdown transients in pressure relief piping of a PWR pressurizer

    International Nuclear Information System (INIS)

    Shin, Y.W.; Wiedermann, A.H.

    1984-02-01

    A method based on the integral method of characteristics was published, by which the junction and boundary conditions needed in the computation of flow in a piping network can be accurately formulated. This formulation of the junction and boundary conditions, together with the two-step Lax-Wendroff scheme, is used in a computer program; the program, in turn, is used here to calculate sample problems related to the blowdown transient of a two-phase flow in the piping network downstream of a PWR pressurizer. Independent, nearly exact analytical solutions are also obtained for the sample problems. Comparison of the results obtained by the hybrid numerical technique with the analytical solutions showed generally good agreement. The good numerical accuracy shown by the results suggests that the hybrid numerical technique is suitable for both benchmark and design calculations of PWR pressurizer blowdown transients

  10. Analytical study on the determination of boron in environmental water samples

    International Nuclear Information System (INIS)

    Lopez, F.J.; Gimenez, E.; Hernandez, F.

    1993-01-01

    An analytical study on the determination of boron in environmental water samples was carried out. The curcumin and carmine standard methods were compared with the more recent Azomethine-H method in order to evaluate their analytical characteristics and feasibility for the analysis of boron in water samples. Analyses of synthetic water, ground water, sea water and waste water samples were carried out and a statistical evaluation of the results was made. The Azomethine-H method was found to be the most sensitive (detection limit 0.02 mg/l) and selective (no interference of commonly occurring ions in water was observed), showing also the best precision (relative standard deviation lower than 4%). Moreover, it gave good results for all types of samples analyzed. The accuracy of this method was tested by the addition of known amounts of standard solutions to different types of water samples. The slopes of the standard additions and direct calibration graphs were similar, and recoveries of added boron ranged from 99 to 107%. (orig.)
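    The standard-additions procedure mentioned above can be sketched numerically: spikes of known concentration are added to aliquots of one sample, a line is fitted to signal versus added concentration, and the unknown concentration is read from the x-intercept. The absorbance values below are hypothetical, not data from the study:

```python
# Illustrative standard-additions calculation (hypothetical absorbance data).

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

added  = [0.0, 0.5, 1.0, 2.0]        # mg/l boron added to each aliquot
signal = [0.10, 0.15, 0.20, 0.30]    # measured absorbance (hypothetical)

slope, intercept = fit_line(added, signal)
c_sample = intercept / slope          # x-intercept magnitude = sample concentration
print(f"estimated boron: {c_sample:.2f} mg/l")  # -> estimated boron: 1.00 mg/l
```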

  11. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity of their production facilities. High regularity increases profitability, reduces the workload on the offshore organisation and, most importantly, reduces discharge to air and sea. There are a number of mechanisms and tools available for achieving high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. For all of these tools, however, they will only be effective if quick and proper analysis of fluids and deposits is carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems are presented: (1) deposition in a complex platform processing system; (2) contaminated production chemicals; (3) improved monitoring of scale inhibitor, suspended solids and ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author)

  12. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    Science.gov (United States)

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  13. Analytical Evaluation of Beam Deformation Problem Using Approximate Methods

    DEFF Research Database (Denmark)

    Barari, Amin; Kimiaeifar, A.; Domairry, G.

    2010-01-01

    The beam deformation equation has very wide applications in structural engineering. As a differential equation, it has its own problems concerning existence, uniqueness and methods of solution. Often, the original forms of governing differential equations used in engineering problems are simplified, and this process produces noise in the obtained answers. This paper deals with the solution of the second-order differential equation governing beam deformation using four approximate analytical methods, namely the Perturbation Method, the Homotopy Perturbation Method (HPM), the Homotopy Analysis Method (HAM) and the Variational Iteration Method (VIM). The comparisons of the results reveal that these methods are very effective, convenient and quite accurate for systems of non-linear differential equations.

  14. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    Science.gov (United States)

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    In modern analytical chemistry researchers pursue novel materials to meet analytical challenges such as improvements in sensitivity, selectivity, and detection limit. Metal-organic frameworks (MOFs) are an emerging class of microporous materials, and their unusual properties such as high surface area, good thermal stability, uniform structured nanoscale cavities, and the availability of in-pore functionality and outer-surface modification are attractive for diverse analytical applications. This Account summarizes our research on the analytical applications of MOFs ranging from sampling to chromatographic separation. MOFs have been either directly used or engineered to meet the demands of various analytical applications. Bulk MOFs with microsized crystals are convenient sorbents for direct application to in-field sampling and solid-phase extraction. Quartz tubes packed with MOF-5 have shown excellent stability, adsorption efficiency, and reproducibility for in-field sampling and trapping of atmospheric formaldehyde. The 2D copper(II) isonicotinate packed microcolumn has demonstrated large enhancement factors and good shape- and size-selectivity when applied to on-line solid-phase extraction of polycyclic aromatic hydrocarbons in water samples. We have explored the molecular sieving effect of MOFs for the efficient enrichment of peptides with simultaneous exclusion of proteins from biological fluids. These results show promise for the future of MOFs in peptidomics research. Moreover, nanosized MOFs and engineered thin films of MOFs are promising materials as novel coatings for solid-phase microextraction. We have developed an in situ hydrothermal growth approach to fabricate thin films of MOF-199 on etched stainless steel wire for solid-phase microextraction of volatile benzene homologues with large enhancement factors and wide linearity. Their high thermal stability and easy-to-engineer nanocrystals make MOFs attractive as new stationary phases to fabricate MOF

  15. Approximate Analytic Solutions for the Two-Phase Stefan Problem Using the Adomian Decomposition Method

    Directory of Open Access Journals (Sweden)

    Xiao-Ying Qin

    2014-01-01

    An Adomian decomposition method (ADM) is applied to solve a two-phase Stefan problem that describes the pure metal solidification process. In contrast to traditional analytical methods, ADM avoids complex mathematical derivations and does not require coordinate transformation to eliminate the unknown moving boundary. Based on polynomial approximations for some known and unknown boundary functions, approximate analytic solutions for the model with undetermined coefficients are obtained using ADM. Substitution of these expressions into the other equations and boundary conditions of the model generates function identities with the undetermined coefficients. By determining these coefficients, approximate analytic solutions for the model are obtained. A concrete example of the solution shows that this method can easily be implemented in MATLAB and has a fast convergence rate. This is an efficient method for finding approximate analytic solutions for the Stefan and the inverse Stefan problems.
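    For reference, the classical two-phase Stefan formulation to which such methods are applied has the following shape (our sketch, using one common sign convention with phase temperatures T_1, T_2, diffusivities \alpha_i, conductivities k_i, latent heat L, density \rho and moving front s(t); the paper's exact formulation may differ):

```latex
\begin{aligned}
\frac{\partial T_i}{\partial t} &= \alpha_i \frac{\partial^2 T_i}{\partial x^2},
  && i = 1, 2 \text{ (solid, liquid)},\\
T_1(s(t), t) &= T_2(s(t), t) = T_m,
  && \text{(melting temperature at the front)},\\
\rho L \,\frac{ds}{dt}
  &= k_1 \left.\frac{\partial T_1}{\partial x}\right|_{x = s(t)^-}
   - k_2 \left.\frac{\partial T_2}{\partial x}\right|_{x = s(t)^+},
  && \text{(Stefan condition).}
\end{aligned}
```

ADM then expands the temperatures and the front position in series whose terms are computed recursively, which is why no coordinate transformation for the moving boundary is needed.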

  16. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S{sub m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S{sub 1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights, as the sequence of their weights in their order of appearance is needed and studied.
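    The sampling experiment described above is easy to simulate; the sketch below (our illustration, not the paper's analysis) breaks a unit stick into n fragments with i.i.d. uniform weights and counts the draws needed before every fragment is visited, i.e. question (2), the coupon-collector variant:

```python
import bisect
import random

def random_partition(n, rng):
    """n fragment sizes proportional to i.i.d. Uniform(0,1) weights; sums to 1."""
    w = [rng.random() for _ in range(n)]
    total = sum(w)
    return [x / total for x in w]

def draws_until_all_visited(sizes, rng):
    """Uniform points on [0,1) drawn until every fragment is hit at least once."""
    bounds, acc = [], 0.0
    for s in sizes:                 # cumulative right boundaries of the fragments
        acc += s
        bounds.append(acc)
    visited, draws = set(), 0
    while len(visited) < len(sizes):
        u = rng.random()
        draws += 1
        # index of the fragment containing u (clamped against float round-off)
        visited.add(min(bisect.bisect_right(bounds, u), len(sizes) - 1))
    return draws

rng = random.Random(1)
sizes = random_partition(5, rng)
print(draws_until_all_visited(sizes, rng))  # at least 5; more if sizes are skewed
```

Averaging `draws_until_all_visited` over many random partitions estimates the expected sample size for exhausting the species list.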

  17. [Patient identification errors and biological samples in the analytical process: Is it possible to improve patient safety?].

    Science.gov (United States)

    Cuadrado-Cenzual, M A; García Briñón, M; de Gracia Hills, Y; González Estecha, M; Collado Yurrita, L; de Pedro Moro, J A; Fernández Pérez, C; Arroyo Fernández, M

    2015-01-01

    Patient identification errors and biological sample errors are among the problems with the highest risk of causing an adverse event for the patient. The aims were to detect and analyse the causes of patient identification errors in analytical requests (PIEAR) from emergency departments, and to develop improvement strategies. A process and protocol were designed, to be followed by all professionals involved in the requesting and performing of laboratory tests. Evaluation and monitoring indicators of PIEAR were determined before and after the implementation of these improvement measures (years 2010-2014). A total of 316 PIEAR were detected in a total of 483,254 emergency service requests during the study period, representing a mean of 6.80/10,000 requests. Patient identification failure was the most frequent error in all the six-monthly periods assessed, with a significant difference (P…) from the other errors. However, we must continue working with this strategy, promoting a culture of safety for all the professionals involved, and trying to achieve the goal that 100% of the analytical requests and samples are properly identified. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  18. Biased sampling, over-identified parameter problems and beyond

    CERN Document Server

    Qin, Jing

    2017-01-01

    This book is devoted to biased sampling problems (also called choice-based sampling in Econometrics parlance) and over-identified parameter estimation problems. Biased sampling problems appear in many areas of research, including Medicine, Epidemiology and Public Health, the Social Sciences and Economics. The book addresses a range of important topics, including case and control studies, causal inference, missing data problems, meta-analysis, renewal process and length biased sampling problems, capture and recapture problems, case cohort studies, exponential tilting genetic mixture models, etc. The goal of this book is to make it easier for Ph.D. students and new researchers to get started in this research area. It will be of interest to all those who work in the health, biological, social and physical sciences, as well as those who are interested in survey methodology and other areas of statistical science, among others.

  19. Determination of 237Np in environmental and nuclear samples: A review of the analytical method

    International Nuclear Information System (INIS)

    Thakur, P.; Mulholland, G.P.

    2012-01-01

    A number of analytical methods have been developed and used for the determination of neptunium in environmental and nuclear fuel samples, using alpha spectrometry, ICP–MS, and other analytical techniques. This review summarizes and discusses the development of radiochemical procedures for the separation of neptunium (Np) since the beginning of the nuclear industry, followed by a more detailed discussion of recent trends in the separation of neptunium. This article also highlights the progress in analytical methods and issues associated with the determination of neptunium in environmental samples. - Highlights: ► Determination of Np in environmental and nuclear samples is reviewed. ► Various analytical methods used for the determination of Np are listed. ► Progress and issues associated with the determination of Np are discussed.

  20. Pre-analytical sample quality: metabolite ratios as an intrinsic marker for prolonged room temperature exposure of serum samples.

    Directory of Open Access Journals (Sweden)

    Gabriele Anton

    Full Text Available Advances in the "omics" field bring about the need for a high number of good quality samples. Many omics studies take advantage of biobanked samples to meet this need. Most laboratory errors occur in the pre-analytical phase. Therefore, evidence-based standard operating procedures for the pre-analytical phase, as well as markers to distinguish between 'good' and 'bad' quality samples taking into account the desired downstream analysis, are urgently needed. We studied concentration changes of metabolites in serum samples due to pre-storage handling conditions as well as due to repeated freeze-thaw cycles. We collected fasting serum samples and subjected aliquots to up to four freeze-thaw cycles and to pre-storage handling delays of 12, 24 and 36 hours at room temperature (RT) and on wet and dry ice. For each treated aliquot, we quantified 127 metabolites through a targeted metabolomics approach. We found a clear signature of degradation in samples kept at RT. Storage on wet ice led to less pronounced concentration changes. 24 metabolites showed significant concentration changes at RT. In 22 of these, changes were already visible after only 12 hours of storage delay. Especially pronounced were increases in lysophosphatidylcholines and decreases in phosphatidylcholines. We showed that the ratio between the concentrations of these molecule classes could serve as a measure to distinguish between 'good' and 'bad' quality samples in our study. In contrast, we found quite stable metabolite concentrations during up to four freeze-thaw cycles. We concluded that pre-analytical RT handling of serum samples should be strictly avoided and serum samples should always be handled on wet ice or in cooling devices after centrifugation. Moreover, serum samples should be frozen at or below -80°C as soon as possible after centrifugation.
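The quality marker described in this record can be sketched in a few lines: a ratio of summed lysophosphatidylcholine (lyso-PC) to phosphatidylcholine (PC) concentrations that rises as a sample degrades at room temperature. The metabolite names and the cutoff value below are hypothetical placeholders, not the study's values.

```python
# Illustrative sketch (not the authors' code) of a lyso-PC/PC ratio
# as an intrinsic pre-analytical quality marker.

def lyso_pc_ratio(concentrations):
    """Ratio of total lyso-PC to total PC concentration for one sample."""
    lyso = sum(v for k, v in concentrations.items() if k.startswith("lysoPC"))
    pc = sum(v for k, v in concentrations.items() if k.startswith("PC"))
    return lyso / pc

def flag_quality(sample, cutoff=0.15):
    """Flag a sample as 'bad' when lyso-PC/PC exceeds a study-derived cutoff."""
    return "bad" if lyso_pc_ratio(sample) > cutoff else "good"

# Degradation at RT raises lyso-PC and lowers PC, so the ratio increases
# in poorly handled samples. Concentrations are arbitrary example values.
fresh = {"lysoPC a C16:0": 1.0, "lysoPC a C18:2": 0.5, "PC aa C36:2": 20.0}
degraded = {"lysoPC a C16:0": 3.0, "lysoPC a C18:2": 1.5, "PC aa C36:2": 14.0}
print(flag_quality(fresh), flag_quality(degraded))  # → good bad
```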

  1. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface-enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results under complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small-volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurement can be obtained in continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  2. Analytical simulation of RBS spectra of nanowire samples

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, Nuno P., E-mail: nunoni@ctn.ist.utl.pt [Centro de Ciências e Tecnologias Nucleares, Instituto Superior Técnico, Universidade de Lisboa, E.N. 10 ao km 139,7, 2695-066 Bobadela LRS (Portugal); García Núñez, C. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Redondo-Cubero, A. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Centro de Micro-Análisis de Materiales, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Shen, G.; Kung, P. [Department of Electrical and Computer Engineering, The University of Alabama, AL 35487 (United States); Pau, J.L. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain)

    2016-03-15

    Almost all, if not all, general-purpose codes for analysis of Ion Beam Analysis data were originally developed to handle laterally homogeneous samples only. This is the case for RUMP, NDF, SIMNRA, and even the Monte Carlo code Corteo. General-purpose codes usually include only limited support for lateral inhomogeneity. In this work, we show analytical simulations of samples that consist of a layer of parallel, oriented nanowires on a substrate, using a model implemented in NDF. We apply the code to real samples, made of vertical ZnO nanowires on a sapphire substrate. Two configurations of the nanowires were studied: 40 nm diameter, 4.1 μm height, 3.5% surface coverage; and 55 nm diameter, 1.1 μm height, 42% surface coverage. We discuss the accuracy and limits of applicability of the analysis.

  3. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 μg C/g dry weight). Appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCB) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document

  4. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by means of chromatographic or electrophoretic methods, many difficulties must be tackled in their analysis. Recent developments in analytical techniques place increasing emphasis on the use of green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. An analytical nodal method for time-dependent one-dimensional discrete ordinates problems

    International Nuclear Information System (INIS)

    Barros, R.C. de

    1992-01-01

    In recent years, relatively little work has been done in developing time-dependent discrete ordinates (S_N) computer codes. Therefore, the topic of time integration methods certainly deserves further attention. In this paper, we describe a new coarse-mesh method for time-dependent monoenergetic S_N transport problems in slab geometry. This numerical method preserves the analytic solution of the transverse-integrated S_N nodal equations by constants, so we call our method the analytical constant nodal (ACN) method. For time-independent S_N problems in finite slab geometry and for time-dependent infinite-medium S_N problems, the ACN method generates numerical solutions that are completely free of truncation errors. Based on this positive feature, we expect the ACN method to be more accurate than conventional numerical methods for S_N transport calculations on coarse space-time grids

  6. An analytic solution of the static problem of inclined risers conveying fluid

    KAUST Repository

    Alfosail, Feras; Nayfeh, Ali H.; Younis, Mohammad I.

    2016-01-01

    We use the method of matched asymptotic expansion to develop an analytic solution to the static problem of clamped–clamped inclined risers conveying fluid. The inclined riser is modeled as an Euler–Bernoulli beam taking into account its self

  7. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    Full Text Available We here present a code for performing analytic continuation of fermionic Green's functions and self-energies, as well as bosonic susceptibilities, on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented for two different GPUs in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.
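The sampling method this record builds on can be caricatured in plain Python (no GPU): represent the spectral function as equal-weight delta peaks and Metropolis-sample their positions to fit Matsubara-axis data. This is a minimal sketch under invented parameters, not the paper's CUDA implementation; the Gaussian test spectrum echoes the model-Gaussian examples mentioned in the abstract.

```python
import math, random

random.seed(0)
beta, n_mats, n_peaks = 10.0, 20, 40
# fermionic Matsubara frequencies w_n = (2n+1) pi / beta
wn = [(2 * n + 1) * math.pi / beta for n in range(n_mats)]

def green(peaks):
    """G(i w_n) generated by equal-weight delta peaks at the given positions."""
    return [sum(1.0 / (1j * w - x) for x in peaks) / len(peaks) for w in wn]

# Synthetic "QMC data" from a Gaussian spectrum centred at 0.5
g_data = green([random.gauss(0.5, 0.3) for _ in range(200)])

def chi2(peaks):
    """Misfit between the trial spectrum's Green's function and the data."""
    return sum(abs(a - b) ** 2 for a, b in zip(green(peaks), g_data))

pos = [random.uniform(-2.0, 2.0) for _ in range(n_peaks)]
cost0 = cost = chi2(pos)
temp = 1e-3  # fictitious sampling temperature
for _ in range(2000):
    i = random.randrange(n_peaks)
    old = pos[i]
    pos[i] += random.gauss(0.0, 0.2)      # propose moving one peak
    new_cost = chi2(pos)
    # Metropolis acceptance: always downhill, occasionally uphill
    if new_cost < cost or random.random() < math.exp((cost - new_cost) / temp):
        cost = new_cost
    else:
        pos[i] = old
print("chi^2:", cost0, "->", cost)
```

The GPU version in the record parallelizes exactly this kind of inner loop (many trial spectra and misfit evaluations at once), which is why the scaling tests matter.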

  8. 40 CFR 90.421 - Dilute gaseous exhaust sampling and analytical system description.

    Science.gov (United States)

    2010-07-01

    ... gas mixture temperature, measured at a point immediately ahead of the critical flow venturi, must be... analytical system description. (a) General. The exhaust gas sampling system described in this section is... requirements are as follows: (1) This sampling system requires the use of a Positive Displacement Pump—Constant...

  9. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Full Text Available Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing/eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with the Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  10. Interpolation and sampling in spaces of analytic functions

    CERN Document Server

    Seip, Kristian

    2004-01-01

    The book is about understanding the geometry of interpolating and sampling sequences in classical spaces of analytic functions. The subject can be viewed as arising from three classical topics: Nevanlinna-Pick interpolation, Carleson's interpolation theorem for H^\\infty, and the sampling theorem, also known as the Whittaker-Kotelnikov-Shannon theorem. The book aims at clarifying how certain basic properties of the space at hand are reflected in the geometry of interpolating and sampling sequences. Key words for the geometric descriptions are Carleson measures, Beurling densities, the Nyquist rate, and the Helson-Szegő condition. The book is based on six lectures given by the author at the University of Michigan. This is reflected in the exposition, which is a blend of informal explanations with technical details. The book is essentially self-contained. There is an underlying assumption that the reader has a basic knowledge of complex and functional analysis. Beyond that, the reader should have some familiari...

  11. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

    The Book of Abstracts contains short descriptions of the lectures, communications and posters presented during the 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). The scientific programme consisted of: basic analytical problems, preparation of samples, chemometrics and metrology, miniaturization of analytical procedures, environmental analysis, medical analyses, industrial analyses, food analyses, biochemical analyses, and analysis of relicts of the past. Several posters were devoted to radiochemical separations, radiochemical analysis, the environmental behaviour of elements important for nuclear science, and proficiency tests.

  12. Asbestos quantification in track ballast, a complex analytical problem

    Science.gov (United States)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load from the railroad ties, to facilitate water drainage, and also to keep down vegetation. It is typically made of angular crushed stone, with a grain size between 30 and 60 mm, with good mechanical properties (high compressive strength, freeze-thaw resistance, resistance to fragmentation). The most common rock types are represented by basalts, porphyries, orthogneisses, some carbonatic rocks and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). Especially the "green stones" may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite-actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt of railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g. Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is established at 1000 ppm: if the value is below this threshold, the material can be reused; otherwise it must be disposed of as hazardous waste, at very high cost. The quantitative determination of asbestos in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible, or really expensive and time consuming. Another problem is represented by the discrimination of asbestiform minerals (e.g. chrysotile, asbestiform amphiboles) from the common acicular pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  13. Means of introducing an analyte into liquid sampling atmospheric pressure glow discharge

    Science.gov (United States)

    Marcus, R. Kenneth; Quarles, Jr., Charles Derrick; Russo, Richard E.; Koppenaal, David W.; Barinaga, Charles J.; Carado, Anthony J.

    2017-01-03

    A liquid sampling, atmospheric pressure, glow discharge (LS-APGD) device as well as systems that incorporate the device and methods for using the device and systems are described. The LS-APGD includes a hollow capillary for delivering an electrolyte solution to a glow discharge space. The device also includes a counter electrode in the form of a second hollow capillary that can deliver the analyte into the glow discharge space. A voltage across the electrolyte solution and the counter electrode creates the microplasma within the glow discharge space that interacts with the analyte to move it to a higher energy state (vaporization, excitation, and/or ionization of the analyte).

  14. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo, 1992) is applied to location-scale models with any regular sampling density. A number of two-sample problems are analyzed in this general context, extending the difference, ratio and product of Normal means problems outside Normality, while explicitly

  15. Analytic solution of the relativistic Coulomb problem for a spinless Salpeter equation

    International Nuclear Information System (INIS)

    Durand, B.; Durand, L.

    1983-01-01

    We construct an analytic solution to the spinless S-wave Salpeter equation for two quarks interacting via a Coulomb potential, [2(−∇² + m²)^(1/2) − M − α/r] ψ(r) = 0, by transforming the momentum-space form of the equation into a mapping or boundary-value problem for analytic functions. The principal part of the three-dimensional wave function is identical to the solution of a one-dimensional Salpeter equation found by one of us and discussed here. The remainder of the wave function can be constructed by the iterative solution of an inhomogeneous singular integral equation. We show that the exact bound-state eigenvalues for the Coulomb problem are M_n = 2m/(1 + α²/4n²)^(1/2), n = 1,2,..., and that the wave function for the static interaction diverges for r→0 as C(mr)^(−ν), where ν = (α/π)(1 + α/π + ...) is known exactly
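The closed-form spectrum quoted in this record is easy to evaluate numerically. A minimal sketch (natural units; the values of m and α below are arbitrary, not from the paper):

```python
# Bound-state masses of the spinless Salpeter Coulomb problem,
# M_n = 2m / (1 + alpha^2 / (4 n^2))**0.5, as quoted in the abstract.

def salpeter_mass(n, m=1.0, alpha=0.1):
    """Exact bound-state mass for principal quantum number n."""
    return 2.0 * m / (1.0 + alpha**2 / (4.0 * n**2)) ** 0.5

for n in (1, 2, 3):
    print(n, salpeter_mass(n))

# For small alpha, M_n ≈ 2m (1 - alpha^2 / (8 n^2)): the binding energy
# reproduces the nonrelativistic Coulomb (Bohr) spectrum to leading order.
```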

  16. Electromagnetic wave theory for boundary-value problems an advanced course on analytical methods

    CERN Document Server

    Eom, Hyo J

    2004-01-01

    Electromagnetic wave theory is based on Maxwell's equations, and electromagnetic boundary-value problems must be solved to understand electromagnetic scattering, propagation, and radiation. Electromagnetic theory finds practical applications in wireless telecommunications and microwave engineering. This book is written as a text for a two-semester graduate course on electromagnetic wave theory. As such, Electromagnetic Wave Theory for Boundary-Value Problems is intended to help students enhance analytic skills by solving pertinent boundary-value problems. In particular, the techniques of Fourier transform, mode matching, and residue calculus are utilized to solve some canonical scattering and radiation problems.

  17. Problem of the Moving Boundary in Continuous Casting Solved by The Analytic-Numerical Method

    Directory of Open Access Journals (Sweden)

    Grzymkowski R.

    2013-03-01

    Full Text Available Mathematical modeling of thermal processes combined with reversible phase transitions of the type solid phase - liquid phase leads to formulation of a parabolic or elliptic moving boundary problem. Solution of such a problem most often requires sophisticated numerical techniques and far advanced mathematical tools. The paper presents an analytic-numerical method, especially attractive from the engineer's point of view, applied for finding approximate solutions of a selected class of problems which can be reduced to the one-phase solidification problem of a plate with the a priori unknown, time-varying boundary of the region in which the solution is sought. The proposed method is based on the known formalism of initial expansion of the sought function, describing the field of temperature, into a power series, some coefficients of which are determined with the aid of boundary conditions, and on approximation of the function defining the freezing front location with a broken line, the parameters of which are determined numerically. The method represents a combination of analytical and numerical techniques and seems to be an effective and relatively easy-to-use tool for solving problems of the considered kind.


  19. Rapid Gamma Screening of Shipments of Analytical Samples to Meet DOT Regulations

    International Nuclear Information System (INIS)

    Wojtaszek, P.A.; Remington, D.L.; Ideker-Mulligan, V.

    2006-01-01

    The accelerated closure program at Rocky Flats required the capacity to ship up to 1000 analytical samples per week to off-site commercial laboratories, and to conduct such shipment within 24 hours of sample collection. During a period of near peak activity in the closure project, a regulatory change significantly increased the level of radionuclide data required for shipment of each package. In order to meet these dual challenges, a centralized and streamlined sample management program was developed which channeled analytical samples through a single, high-throughput radiological screening facility. This trailerized facility utilized high purity germanium (HPGe) gamma spectrometers to conduct screening measurements of entire packages of samples at once, greatly increasing throughput compared to previous methods. The In Situ Object Counting System (ISOCS) was employed to calibrate the HPGe systems to accommodate the widely varied sample matrices and packing configurations encountered. Optimum modeling and configuration parameters were determined. Accuracy of the measurements of grouped sample jars was confirmed with blind samples in multiple configurations. Levels of radionuclides not observable by gamma spectroscopy were calculated utilizing a spreadsheet program that can accommodate isotopic ratios for large numbers of different waste streams based upon acceptable knowledge. This program integrated all radionuclide data and output all information required for shipment, including the shipping class of the package. (authors)

  20. Analytic approximations to nonlinear boundary value problems modeling beam-type nano-electromechanical systems

    Energy Technology Data Exchange (ETDEWEB)

    Zou, Li [Dalian Univ. of Technology, Dalian City (China). State Key Lab. of Structural Analysis for Industrial Equipment; Liang, Songxin; Li, Yawei [Dalian Univ. of Technology, Dalian City (China). School of Mathematical Sciences; Jeffrey, David J. [Univ. of Western Ontario, London (Canada). Dept. of Applied Mathematics

    2017-06-01

    Nonlinear boundary value problems arise frequently in physical and mechanical sciences. An effective analytic approach with two parameters is first proposed for solving nonlinear boundary value problems. It is demonstrated that solutions given by the two-parameter method are more accurate than solutions given by the Adomian decomposition method (ADM). It is further demonstrated that solutions given by the ADM can also be recovered from the solutions given by the two-parameter method. The effectiveness of this method is demonstrated by solving some nonlinear boundary value problems modeling beam-type nano-electromechanical systems.

  1. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)
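The activation-analysis sensitivity this record mentions follows from the standard activation relation A = N·σ·φ·(1 − e^(−λt)). A minimal sketch (the flux and sample mass are arbitrary illustrative values; the cross section and half-life are the commonly tabulated ones for the Co-59 → Co-60 case):

```python
import math

N_A = 6.02214076e23  # Avogadro's number, 1/mol

def induced_activity(n_atoms, sigma_barn, flux, half_life_s, t_irr_s):
    """Activity (Bq) induced by irradiating n_atoms at neutron flux `flux`
    (n/cm^2/s) for t_irr_s seconds; saturation follows 1 - exp(-lambda*t)."""
    lam = math.log(2) / half_life_s
    sigma_cm2 = sigma_barn * 1e-24  # 1 barn = 1e-24 cm^2
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# 1 microgram of Co-59 (sigma ≈ 37 b), 1 hour in a 1e13 n/cm^2/s flux
n = 1e-6 / 59.0 * N_A
print(induced_activity(n, 37.0, 1e13, 5.27 * 365.25 * 24 * 3600, 3600))
```

Tens of becquerels from a microgram after one hour is why instrumental neutron activation analysis reaches trace-element levels in environmental samples.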

  2. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends consideration of the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  3. Big Data Analytics as Input for Problem Definition and Idea Generation in Technological Design

    OpenAIRE

    Escandón-Quintanilla , Ma-Lorena; Gardoni , Mickaël; Cohendet , Patrick

    2016-01-01

    Part 10: Big Data Analytics and Business Intelligence; International audience; Big data analytics enables organizations to process massive amounts of data in less time and with more understanding than ever before. Many uses have been found for these tools and techniques, especially for decision making. However, few applications have been found in the first stages of innovation, namely problem definition and idea generation. This paper discusses how big data an...

  4. Analytic continuation of quantum Monte Carlo data. Stochastic sampling method

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Khaldoon; Koch, Erik [Institute for Advanced Simulation, Forschungszentrum Juelich, 52425 Juelich (Germany)

    2016-07-01

    We apply Bayesian inference to the analytic continuation of quantum Monte Carlo (QMC) data from the imaginary axis to the real axis. Demanding a proper functional Bayesian formulation of any analytic continuation method leads naturally to the stochastic sampling method (StochS) as the Bayesian method with the simplest prior, while it excludes the maximum entropy method and Tikhonov regularization. We present a new efficient algorithm for performing StochS that reduces computational times by orders of magnitude in comparison to earlier StochS methods. We apply the new algorithm to a wide variety of typical test cases: spectral functions and susceptibilities from DMFT and lattice QMC calculations. Results show that StochS performs well and is able to resolve sharp features in the spectrum.

  5. An approximate and an analytical solution to the carousel-pendulum problem

    Energy Technology Data Exchange (ETDEWEB)

    Vial, Alexandre [Pole Physique, Mecanique, Materiaux et Nanotechnologies, Universite de technologie de Troyes, 12, rue Marie Curie BP-2060, F-10010 Troyes Cedex (France)], E-mail: alexandre.vial@utt.fr

    2009-09-15

    We show that an improved solution to the carousel-pendulum problem can easily be obtained through a first-order Taylor expansion, and its accuracy is determined after obtaining an exact analytical solution that is unusable in practice and is advantageously replaced by a numerical one. It is shown that the accuracy is unexpectedly high, even when the ratio of the pendulum length to the carousel radius approaches unity. (letters and comments)

  6. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  7. Analytical solution for heat conduction problem in composite slab and its implementation in constructal solution for cooling of electronics

    International Nuclear Information System (INIS)

    Kuddusi, Luetfullah; Denton, Jesse C.

    2007-01-01

    The constructal solution for cooling of electronics requires solution of a fundamental heat conduction problem in a composite slab composed of a heat generating slab and a thin strip of high conductivity material that is responsible for discharging the generated heat to a heat sink located at one end of the strip. The fundamental 2D heat conduction problem is solved analytically by applying an integral transform method. The analytical solution is then employed in a constructal solution, following Bejan, for cooling of electronics. The temperature and heat flux distributions of the elemental heat generating slabs are assumed to be the same as those of the analytical solution in all the elemental volumes and the high conductivity strips distributed in the different constructs. Although the analytical solution of the fundamental 2D heat conduction problem improves the accuracy of the distributions in the elemental slabs, the results following Bejan's strategy do not affirm the accuracy of Bejan's constructal solution itself as applied to this problem of cooling of electronics. Several different strategies are possible for developing a constructal solution to this problem as is indicated

  8. Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging

    Science.gov (United States)

    Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee

    2017-08-01

    Corn seed tissue sections were prepared by the tape support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape support and the thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape support method avoids heat, which can cause migration of the analytes on the sample, and therefore does not distort the image. By applying the tape-support method, the corn seed tissue was prepared without structural damage and MSI with accurate spatial information of analytes was successfully performed.

  9. Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging.

    Science.gov (United States)

    Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee

    2017-08-01

    Corn seed tissue sections were prepared by the tape support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape support and the thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape support method avoids heat, which can cause migration of the analytes on the sample, and therefore does not distort the image. By applying the tape-support method, the corn seed tissue was prepared without structural damage and MSI with accurate spatial information of analytes was successfully performed.

  10. MoonDB — A Data System for Analytical Data of Lunar Samples

    Science.gov (United States)

    Lehnert, K.; Ji, P.; Cai, M.; Evans, C.; Zeigler, R.

    2018-04-01

    MoonDB is a data system that makes analytical data from the Apollo lunar sample collection and lunar meteorites accessible by synthesizing published and unpublished datasets in a relational database with an online search interface.

  11. Sampling practices and analytical techniques used in the monitoring of steam and water in CEGB nuclear boilers

    International Nuclear Information System (INIS)

    Goodfellow, G.I.

    1978-01-01

    The steam and water in CEGB Magnox and AGR nuclear boilers are continuously monitored, using both laboratory techniques and on-line instrumentation, in order to maintain the chemical quality within pre-determined limits. The sampling systems in use and some of the difficulties associated with sampling requirements are discussed. The relative merits of chemical instruments installed either locally in various parts of the plant or in centralized instrument rooms are reviewed. The quality of water in nuclear boilers, as with all high-pressure steam-raising plant, is extremely high; consequently very sensitive analytical procedures are required, particularly for monitoring the feed-water of 'once-through boiler' systems. Considerable progress has been made in this field and examples are given of some of the techniques developed for analyses at the μg/kg level, together with some of the current problems. (author)

  12. An Analytical Model for Multilayer Well Production Evaluation to Overcome Cross-Flow Problem

    KAUST Repository

    Hakiki, Farizal; Wibowo, Aris T.; Rahmawati, Silvya D.; Yasutra, Amega; Sukarno, Pudjo

    2017-01-01

    One of the major concerns in a multi-layer system is that interlayer cross-flow may occur if reservoir fluids are produced from commingled layers that have unequal initial pressures. Average reservoir pressure (pore fluid pressure) commonly increases with depth, but reservoir productivity or injectivity does not necessarily follow the same trend. A layer with comparatively low average pressure and high injectivity tends to experience cross-flow: fluid from a bottom layer flows into an upper layer, restricting the upper layer's fluid from reaching the wellbore, as if the bottom layer were performing an injection treatment. Because this is a production well, the study takes the productivity index, rather than the injectivity index, as the approach parameter accounting for the cross-flow problem. The analytical study models the multilayer reservoir so as to avoid the cross-flow problem, and the model is tested against both hypothetical and real field data. The scope of this study is: (a) to develop a mathematical solution that determines the production rate from each layer; (b) to assess different scenarios for optimizing the production rate, namely the pump setting depth and the performance of an in-situ choke (ISC) installation. The ISC acts much like an inflow control device (ICD), helping to reduce cross-flow. A macro program was used to write the code and develop the interface, and the analytical model is solved by a fast iterative procedure. Comparisons show that the mathematical solution is in good agreement with results derived from commercial software.

  13. An Analytical Model for Multilayer Well Production Evaluation to Overcome Cross-Flow Problem

    KAUST Repository

    Hakiki, Farizal

    2017-10-17

    One of the major concerns in a multi-layer system is that interlayer cross-flow may occur if reservoir fluids are produced from commingled layers that have unequal initial pressures. Average reservoir pressure (pore fluid pressure) commonly increases with depth, but reservoir productivity or injectivity does not necessarily follow the same trend. A layer with comparatively low average pressure and high injectivity tends to experience cross-flow: fluid from a bottom layer flows into an upper layer, restricting the upper layer's fluid from reaching the wellbore, as if the bottom layer were performing an injection treatment. Because this is a production well, the study takes the productivity index, rather than the injectivity index, as the approach parameter accounting for the cross-flow problem. The analytical study models the multilayer reservoir so as to avoid the cross-flow problem, and the model is tested against both hypothetical and real field data. The scope of this study is: (a) to develop a mathematical solution that determines the production rate from each layer; (b) to assess different scenarios for optimizing the production rate, namely the pump setting depth and the performance of an in-situ choke (ISC) installation. The ISC acts much like an inflow control device (ICD), helping to reduce cross-flow. A macro program was used to write the code and develop the interface, and the analytical model is solved by a fast iterative procedure. Comparisons show that the mathematical solution is in good agreement with results derived from commercial software.

  14. Analytical Solution of Nonlinear Problems in Classical Dynamics by Means of Lagrange-Ham

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Mahdavi, S. H; Rabbani, A.

    2011-01-01

    In this work, a powerful analytical method, called Homotopy Analysis Methods (HAM) is coupled with Lagrange method to obtain the exact solution for nonlinear problems in classic dynamics. In this work, the governing equations are obtained by using Lagrange method, and then the nonlinear governing...

  15. Simple and Accurate Analytical Solutions of the Electrostatically Actuated Curled Beam Problem

    KAUST Repository

    Younis, Mohammad I.

    2014-08-17

    We present analytical solutions of the electrostatically actuated initially deformed cantilever beam problem. We use a continuous Euler-Bernoulli beam model combined with a single-mode Galerkin approximation. We derive simple analytical expressions for two commonly observed deformed beam configurations: the curled and tilted configurations. The derived analytical formulas are validated by comparing their results to experimental data in the literature and numerical results of a multi-mode reduced order model. The derived expressions do not involve any complicated integrals or complex terms and can be conveniently used by designers for quick, yet accurate, estimations. The formulas are found to yield accurate results for most commonly encountered microbeams with initial tip deflections of a few microns. For largely deformed beams, we found that these formulas yield less accurate results due to the limitations of the single-mode approximations they are based on. In such cases, multi-mode reduced order models need to be utilized.

  16. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems in the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  17. An analytical examination of distortions in power spectra due to sampling errors

    International Nuclear Information System (INIS)

    Njau, E.C.

    1982-06-01

    Distortions introduced into spectral energy densities of sinusoid signals as well as those of more complex signals through different forms of errors in signal sampling are developed and shown analytically. The approach we have adopted in doing this involves, firstly, developing for each type of signal and for the corresponding form of sampling errors an analytical expression that gives the faulty digitization process involved in terms of the features of the particular signal. Secondly, we take advantage of a method described elsewhere [IC/82/44] to relate, as much as possible, the true spectral energy density of the signal and the corresponding spectral energy density of the faulty digitization process. Thirdly, we then develop expressions which reveal the distortions that are formed in the directly computed spectral energy density of the digitized signal. It is evident from the formulations developed herein that the types of sampling errors taken into consideration may create false peaks and other distortions that are of non-negligible concern in computed power spectra. (author)
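    The kind of distortion this abstract describes can be demonstrated with a hypothetical numerical example (not the author's formulation): a periodic timing error in the sampling instants, when the samples are nevertheless processed as if uniformly spaced, creates false sideband peaks in the computed power spectrum. The frequencies and error amplitude below are invented for illustration.

    ```python
    import numpy as np

    fs, N = 100.0, 1000            # nominal sampling rate (Hz) and record length (10 s)
    f0, fm, a = 10.0, 2.0, 0.005   # signal freq, timing-error freq (Hz), error amplitude (s)

    n = np.arange(N)
    t_nominal = n / fs                                             # where we *think* we sampled
    t_actual = t_nominal + a * np.sin(2 * np.pi * fm * t_nominal)  # where we actually sampled

    x = np.sin(2 * np.pi * f0 * t_actual)     # digitized signal with sampling errors
    P = np.abs(np.fft.rfft(x)) ** 2           # spectrum computed assuming uniform sampling
    freqs = np.fft.rfftfreq(N, 1 / fs)        # bin k corresponds to k * 0.1 Hz
    ```

    The result is the true peak at 10 Hz plus spurious phase-modulation sidebands at f0 ± fm = 8 and 12 Hz, i.e. false peaks of exactly the kind the paper's formulations predict.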

  18. Sampling analytical tests and destructive tests for quality assurance

    International Nuclear Information System (INIS)

    Saas, A.; Pasquini, S.; Jouan, A.; Angelis, de; Hreen Taywood, H.; Odoj, R.

    1990-01-01

    In the context of the third programme of the European Communities on the monitoring of radioactive waste, various methods have been developed for the performance of sampling and measuring tests on encapsulated waste of low and medium level activity, on the one hand, and of high level activity, on the other hand. The purpose was to provide better quality assurance for products to be stored on an interim or long-term basis. Various testing sampling means are proposed such as: - sampling of raw waste before conditioning and determination of the representative aliquot, - sampling of encapsulated waste on process output, - sampling of core specimens subjected to measurement before and after cutting. Equipment suitable for these sampling procedures have been developed and, in the case of core samples, a comparison of techniques has been made. The results are described for the various analytical tests carried out on the samples such as: - mechanical tests, - radiation resistance, - fire resistance, - lixiviation, - determination of free water, - biodegradation, - water resistance, - chemical and radiochemical analysis. Every time it was possible, these tests were compared with non-destructive tests on full-scale packages and some correlations are given. This word has made if possible to improve and clarify sample optimization, with fine sampling techniques and methodologies and draw up characterization procedures. It also provided an occasion for a first collaboration between the laboratories responsible for these studies and which will be furthered in the scope of the 1990-1994 programme

  19. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Directory of Open Access Journals (Sweden)

    Brady T West

    Full Text Available Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.

  20. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Science.gov (United States)

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817
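    A minimal, hypothetical illustration of the kind of analytic error at issue: when a design deliberately oversamples one stratum, the unweighted mean is biased, while the design-weighted estimator recovers the population quantity. The strata, sampling rates, and values below are invented for the example and are not from the SESTAT analyses.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical design: stratum A is 50% of the population but 80% of the
    # sample; stratum B is 50% of the population but only 20% of the sample.
    y_A = rng.normal(10.0, 2.0, 800)
    y_B = rng.normal(20.0, 2.0, 200)
    y = np.concatenate([y_A, y_B])

    # Design weight for each unit: population share / sample share of its stratum.
    w = np.concatenate([np.full(800, 0.5 / 0.8), np.full(200, 0.5 / 0.2)])

    naive = y.mean()                     # ignores the design; pulled toward stratum A
    weighted = np.average(y, weights=w)  # design-based estimate of the population mean
    ```

    Here the naive mean lands near 12 although the population mean is 15. A correct analysis would also use design-consistent standard errors (e.g. Taylor-series linearization) rather than the simple-random-sampling formula, which is the second failure mode the article documents.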

  1. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    Directory of Open Access Journals (Sweden)

    Zoraida Sosa-Ferrera

    2013-01-01

    Full Text Available Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented.

  2. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    Full Text Available This work describes a validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most of the analytes. The extraction efficiency of fortified sediments varied from 65.1 to 105.6% and 59.7 to 97.8% for n-alkanes and PAH in the ranges C16 - C32 and fluoranthene - benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  3. Simulating quantum correlations as a distributed sampling problem

    International Nuclear Information System (INIS)

    Degorre, Julien; Laplante, Sophie; Roland, Jeremie

    2005-01-01

    It is known that quantum correlations exhibited by a maximally entangled qubit pair can be simulated with the help of shared randomness, supplemented with additional resources, such as communication, postselection or nonlocal boxes. For instance, in the case of projective measurements, it is possible to solve this problem with protocols using one bit of communication or making one use of a nonlocal box. We show that this problem reduces to a distributed sampling problem. We give a new method to obtain samples from a biased distribution, starting with shared random variables following a uniform distribution, and use it to build distributed sampling protocols. This approach allows us to derive, in a simpler and unified way, many existing protocols for projective measurements, and extend them to positive operator valued measurements. Moreover, this approach naturally leads to a local hidden variable model for Werner states

  4. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    International Nuclear Information System (INIS)

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-01-01

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants

  5. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.
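    The core idea described in this abstract (recursive binary partition of the input space, then proportional draws from each cell) can be sketched as below. This is an illustrative reconstruction under those stated assumptions, not the authors' algorithm, and it omits their theoretical error bounds and streaming variant; the function name and parameters are invented.

    ```python
    import numpy as np

    def stratified_sample(X, m, depth=6, seed=0):
        """Draw roughly m rows of X by recursively halving the widest dimension
        at its median, then sampling each leaf cell proportionally to its size."""
        rng = np.random.default_rng(seed)

        def split(idx, d):
            if d == 0 or len(idx) < 2:
                return [idx]
            spans = X[idx].max(axis=0) - X[idx].min(axis=0)
            dim = int(np.argmax(spans))          # widest dimension of this cell
            med = np.median(X[idx, dim])
            left = idx[X[idx, dim] <= med]
            right = idx[X[idx, dim] > med]
            if len(left) == 0 or len(right) == 0:  # degenerate split (ties)
                return [idx]
            return split(left, d - 1) + split(right, d - 1)

        chosen = []
        for leaf in split(np.arange(len(X)), depth):
            k = max(1, int(round(m * len(leaf) / len(X))))
            chosen.extend(rng.choice(leaf, size=min(k, len(leaf)), replace=False))
        return np.asarray(chosen)
    ```

    Because each leaf contributes in proportion to its share of the data, the sample's empirical distribution stays close to the original data's; the per-leaf rounding means the returned size is only approximately m.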

  6. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Nebraska, Omaha). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Nebraska, Omaha) as a powerful tool in radiogenic and non-traditional isotope research.

  7. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma-mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g. N+, ArO+), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, using a genetic algorithm for objective feature selection. Non-analyte elemental signals and non-analyte signals were compared for diagnosing matrix effects, and both were found to be suitable for estimating matrix effects. Individual analyte matrix effect estimation was compared with the overall matrix effect prediction, and models used to diagnose overall matrix effects were more accurate than individual analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way, and were able to successfully diagnose matrix effects with at least an 80% success rate

  8. Analytical artefacts in the speciation of arsenic in clinical samples

    International Nuclear Information System (INIS)

    Slejkovec, Zdenka; Falnoga, Ingrid; Goessler, Walter; Elteren, Johannes T. van; Raml, Reingard; Podgornik, Helena; Cernelc, Peter

    2008-01-01

    Urine and blood samples of cancer patients treated with high doses of arsenic trioxide were analysed for arsenic species using HPLC-HGAFS and, in some cases, HPLC-ICPMS. Total arsenic was determined with either flow injection-HGAFS in urine or radiochemical neutron activation analysis in blood fractions (in serum/plasma, blood cells). The total arsenic concentrations (during prolonged, daily/weekly arsenic trioxide therapy) were in the μg mL-1 range for urine and in the ng g-1 range for blood fractions. The main arsenic species found in urine were As(III), MA and DMA, and in blood As(V), MA and DMA. With proper sample preparation and storage of urine (no preservation agents/storage in liquid nitrogen) no analytical artefacts were observed and the absence of significant amounts of alleged trivalent metabolites was proven. In blood samples, by contrast, a certain amount of arsenic can get lost in the speciation procedure, which was especially noticeable for the blood cells, although plasma/serum also gave rise to some disappearance of arsenic. The latter losses may be attributed to precipitation of As(III)-containing proteins/peptides during the methanol/water extraction procedure, whereas the former losses were due to loss of specific As(III)-complexing proteins/peptides (e.g. cysteine, metallothionein, reduced GSH, ferritin) on the column (Hamilton PRP-X100) during the separation procedure. Contemporary analytical protocols are not able to completely avoid artefacts from the sampling stage through to the detection stage, so care is recommended in interpreting results, particularly regarding metabolic and pharmacokinetic interpretations, and one should always aim to compare the sum of species with the total arsenic concentration determined independently

  9. Analytic Approximations to the Free Boundary and Multi-dimensional Problems in Financial Derivatives Pricing

    Science.gov (United States)

    Lau, Chun Sing

    This thesis studies two types of problems in financial derivatives pricing. The first type is the free boundary problem, which can be formulated as a partial differential equation (PDE) subject to a set of free boundary condition. Although the functional form of the free boundary condition is given explicitly, the location of the free boundary is unknown and can only be determined implicitly by imposing continuity conditions on the solution. Two specific problems are studied in details, namely the valuation of fixed-rate mortgages and CEV American options. The second type is the multi-dimensional problem, which involves multiple correlated stochastic variables and their governing PDE. One typical problem we focus on is the valuation of basket-spread options, whose underlying asset prices are driven by correlated geometric Brownian motions (GBMs). Analytic approximate solutions are derived for each of these three problems. For each of the two free boundary problems, we propose a parametric moving boundary to approximate the unknown free boundary, so that the original problem transforms into a moving boundary problem which can be solved analytically. The governing parameter of the moving boundary is determined by imposing the first derivative continuity condition on the solution. The analytic form of the solution allows the price and the hedging parameters to be computed very efficiently. When compared against the benchmark finite-difference method, the computational time is significantly reduced without compromising the accuracy. The multi-stage scheme further allows the approximate results to systematically converge to the benchmark results as one recasts the moving boundary into a piecewise smooth continuous function. For the multi-dimensional problem, we generalize the Kirk (1995) approximate two-asset spread option formula to the case of multi-asset basket-spread option. 
Since the final formula is in closed form, all the hedging parameters can also be derived in
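The two-asset Kirk (1995) formula that the thesis generalizes can be sketched in a few lines. This is the standard two-asset version only (the multi-asset basket-spread extension is the thesis's contribution and is not reproduced here); the parameter values are purely illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, T, r):
    """Kirk (1995) approximation for a European call on the spread F1 - F2
    with strike K, written on forward prices with lognormal dynamics.
    The second leg F2 + K is treated as approximately lognormal."""
    w = F2 / (F2 + K)  # weight of the second asset in the shifted leg
    sigma = math.sqrt(sigma1**2 - 2.0 * rho * sigma1 * sigma2 * w + (sigma2 * w)**2)
    d1 = (math.log(F1 / (F2 + K)) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F1 * norm_cdf(d1) - (F2 + K) * norm_cdf(d2))

# Illustrative inputs, not taken from the thesis
price = kirk_spread_call(F1=110.0, F2=100.0, K=5.0,
                         sigma1=0.2, sigma2=0.2, rho=0.5, T=1.0, r=0.05)
```

Because the formula is in closed form, the Greeks follow by direct differentiation, which is what makes this family of approximations attractive for hedging.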

  10. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    Science.gov (United States)

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of a paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With all these merits, multiplex analyses of ions, proteins, and microbes have all been realized on this platform, paving the way to a higher level of analysis on μPADs.

  11. Comment on 'analytic solution of the relativistic Coulomb problem for a spinless Salpeter equation'

    International Nuclear Information System (INIS)

    Lucha, W.; Schoeberl, F.F.

    1994-01-01

    We demonstrate that the analytic solution for the set of energy eigenvalues of the semi-relativistic Coulomb problem reported by B. and L. Durand is in clear conflict with an upper bound on the ground-state energy level derived by a straightforward variational procedure. (authors)

  12. Time and temperature dependent analytical stability of dry-collected Evalyn HPV self-sampling brush for cervical cancer screening

    DEFF Research Database (Denmark)

    Ejegod, Ditte Møller; Pedersen, Helle; Alzua, Garazi Peña

    2018-01-01

    As a new initiative, HPV self-sampling to non-attenders using the dry Evalyn self-sampling brush is offered in the Capital Region of Denmark. The use of a dry brush is largely uncharted territory in terms of analytical stability. In this study we aim to provide evidence on the analytical quality...

  13. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for uncertainty parameters of measurement, the simulation results support the conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
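The exact, approximation-free attributes sample size that the simulation approach targets can be illustrated with a hypergeometric sketch: draw n items without replacement and require that at least one of the r falsified items is found with the stated detection probability. The population size, falsification count, and detection probability below are illustrative, not taken from the paper.

```python
from math import comb

def attributes_sample_size(N, r, detect_prob):
    """Smallest n such that sampling n of N items without replacement
    detects at least one of r falsified items with probability
    >= detect_prob (exact hypergeometric model, no approximations)."""
    for n in range(1, N + 1):
        # Probability that the sample contains no falsified item
        p_miss = comb(N - r, n) / comb(N, n) if n <= N - r else 0.0
        if 1.0 - p_miss >= detect_prob:
            return n
    return N

# Illustrative case: 100 items, 10 falsified, 95% detection probability
n = attributes_sample_size(N=100, r=10, detect_prob=0.95)
```

Conservative approximations (e.g. treating the draws as independent) would return a somewhat larger n here, which is exactly the cost the paper's simulation results highlight.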

  14. Abstracts book of 4. Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements

    International Nuclear Information System (INIS)

    1995-01-01

    The 4th Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements was held in Poznan, 27-28 April 1995. New versions of analytical methods for the quantitative determination of trace elements in biological, environmental and geological materials were presented. A number of special sample preparation techniques that enable the best precision of analytical results were also shown and discussed.

  15. Sampling methodology and PCB analysis

    International Nuclear Information System (INIS)

    Dominelli, N.

    1995-01-01

    As a class of compounds, PCBs are extremely stable and resist chemical and biological decomposition. Diluted solutions exposed to a range of environmental conditions will undergo some preferential degradation, and the resulting mixture may differ considerably from the original PCB used as insulating fluid in electrical equipment. The structure of mixtures of PCBs (synthetic compounds prepared by direct chlorination of biphenyl with chlorine gas) is extremely complex and presents a formidable analytical problem, further complicated by the presence of PCBs as contaminants in matrices ranging from oils to soils to water. This paper provides guidance on sampling and analytical procedures; it also points out various potential problems encountered during these processes. The guidelines provided deal with sample collection, storage and handling, sample stability, laboratory analysis (usually gas chromatography), determination of PCB concentration, calculation of total PCB content, and quality assurance. 1 fig

  16. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    Science.gov (United States)

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal systems of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods that allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. SOLUTION OF SIGNAL UNCERTAINTY PROBLEM AT ANALYTICAL DESIGN OF CONSECUTIVE COMPENSATOR IN PIEZO ACTUATOR CONTROL

    Directory of Open Access Journals (Sweden)

    S.V. Bystrov

    2016-05-01

    Full Text Available Subject of Research. We present research results for the signal uncertainty problem that naturally arises for the developers of servomechanisms, including the analytical design of serial compensators delivering the required quality indexes for servomechanisms. Method. The problem was solved with the use of the Besekerskiy engineering approach, formulated in 1958. This gives the possibility to reduce requirements on the input signal composition of servomechanisms by using only two of its quantitative characteristics, namely maximum speed and maximum acceleration. Information about the input signal's maximum speed and acceleration allows introducing an equivalent harmonic input signal with calculated amplitude and frequency. In combination with requirements for maximum tracking error, the amplitude and frequency of the equivalent harmonic effect make it possible to estimate analytically the amplitude characteristic of the system by error and then convert it to the amplitude characteristic of the open-loop system transfer function. While the Besekerskiy approach was previously applied mainly to the apparatus of logarithmic characteristics, we use it for the analytical synthesis of consecutive compensators. Main Results. The proposed technique is used to create analytical representations of "input–output" and "error–output" polynomial dynamic models of the designed system. In turn, the desired model of the designed system in the "error–output" form of analytical representation of transfer functions is the basis for the design of a consecutive compensator that delivers the desired placement of state matrix eigenvalues and, consequently, the necessary set of dynamic indexes for the designed system. The given procedure of consecutive compensator analytical design on the basis of the Besekerskiy engineering approach under conditions of signal uncertainty is illustrated by an example. Practical Relevance. The obtained theoretical results are

  18. Statistical Mechanics of a Simplified Bipartite Matching Problem: An Analytical Treatment

    Science.gov (United States)

    Dell'Erba, Matías Germán

    2012-03-01

    We perform an analytical study of a simplified bipartite matching problem in which there exists a constant matching energy, and both heterosexual and homosexual pairings are allowed. We obtain the partition function in closed analytical form and calculate the corresponding thermodynamic functions of this model. We conclude that the model is favored at high temperatures, for which the probabilities of heterosexual and homosexual pairs tend to become equal. In the limits of low and high temperatures the system is extensive; however, this property is lost in the general case. There exists a relation between the matching energies for which the system becomes more stable under external (thermal) perturbations. As the difference between the energies of the two possible matches increases, the system becomes more ordered, while the maximum of entropy is achieved when these energies are equal. In this limit, there is a first-order phase transition between two phases with constant entropy.

  19. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    Science.gov (United States)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured this velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin functionalized polystyrene beads with our automated software through real-time image processing that monitors the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as little as 80 molecules of avidin per biotin functionalized bead. This technology can be applied in the detection and quantification of rare analytes that can be useful in the diagnosis and the treatment of diseases, such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.

  20. Review of Analytes of Concern and Sample Methods for Closure of DOE High Level Waste Storage Tanks

    International Nuclear Information System (INIS)

    Thomas, T.R.

    2002-01-01

    Sampling residual waste after tank cleaning and analysis for analytes of concern to support closure and cleaning targets of large underground tanks used for storage of legacy high level radioactive waste (HLW) at Department of Energy (DOE) sites has been underway since about 1995. The DOE Tanks Focus Area (TFA) has been working with DOE tank sites to develop new sampling plans, and sampling methods for assessment of residual waste inventories. This paper discusses regulatory analytes of concern, sampling plans, and sampling methods that support closure and cleaning target activities for large storage tanks at the Hanford Site, the Savannah River Site (SRS), the Idaho National Engineering and Environmental Laboratory (INEEL), and the West Valley Demonstration Project (WVDP)

  1. An analytical method for the inverse Cauchy problem of Lame equation in a rectangle

    Science.gov (United States)

    Grigor’ev, Yu

    2018-04-01

    In this paper, we present an analytical computational method for the inverse Cauchy problem of Lame equation in the elasticity theory. A rectangular domain is frequently used in engineering structures and we only consider the analytical solution in a two-dimensional rectangle, wherein a missing boundary condition is recovered from the full measurement of stresses and displacements on an accessible boundary. The essence of the method consists in solving three independent Cauchy problems for the Laplace and Poisson equations. For each of them, the Fourier series is used to formulate a first-kind Fredholm integral equation for the unknown function of data. Then, we use a Lavrentiev regularization method, and the termwise separable property of kernel function allows us to obtain a closed-form regularized solution. As a result, for the displacement components, we obtain solutions in the form of a sum of series with three regularization parameters. The uniform convergence and error estimation of the regularized solutions are proved.
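The Lavrentiev step described above, replacing the ill-posed first-kind equation Kf = g with the well-posed (αI + K)f_α = g, can be illustrated on a toy separable kernel. The kernel and exact solution below are invented for illustration (a rank-one kernel whose termwise-separable structure admits the same closed-form treatment), not the paper's Lamé setup.

```python
import numpy as np

# Discretize (K f)(x) = g(x) on [0, 1] with the rank-one separable kernel
# k(x, t) = sin(pi x) sin(pi t). For g(x) = 0.5 sin(pi x) the exact
# solution of K f = g is f(t) = sin(pi t).
n = 200
t = (np.arange(n) + 0.5) / n          # midpoint quadrature nodes
w = 1.0 / n                           # uniform quadrature weight
K = np.sin(np.pi * t)[:, None] * np.sin(np.pi * t)[None, :] * w
g = 0.5 * np.sin(np.pi * t)

# Lavrentiev regularization: solve (alpha I + K) f_alpha = g instead of K f = g
alpha = 1e-4
f_alpha = np.linalg.solve(alpha * np.eye(n) + K, g)

err = np.max(np.abs(f_alpha - np.sin(np.pi * t)))
```

As α is decreased the regularized solution converges to the true one, at the price of growing sensitivity to noise in g; choosing α is the usual trade-off.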

  2. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)

  3. Characterization of carbon nanotubes and analytical methods for their determination in environmental and biological samples: A review

    Energy Technology Data Exchange (ETDEWEB)

    Herrero-Latorre, C., E-mail: carlos.herrero@usc.es; Álvarez-Méndez, J.; Barciela-García, J.; García-Martín, S.; Peña-Crecente, R.M.

    2015-01-01

    Highlights: • Analytical techniques for characterization of CNTs: classification, description and examples. • Determination methods for CNTs in biological and environmental samples. • Future trends and perspectives for characterization and determination of CNTs. - Abstract: In the present paper, a critical overview of the most commonly used techniques for the characterization and the determination of carbon nanotubes (CNTs) is given on the basis of 170 references (2000–2014). The analytical techniques used for CNT characterization (including microscopic and diffraction, spectroscopic, thermal and separation techniques) are classified, described, and illustrated with applied examples. Furthermore, the performance of sampling procedures as well as the available methods for the determination of CNTs in real biological and environmental samples are reviewed and discussed according to their analytical characteristics. In addition, future trends and perspectives in this field of work are critically presented.

  4. Application of the invariant embedding method to analytically solvable transport problems

    Energy Technology Data Exchange (ETDEWEB)

    Wahlberg, Malin

    2005-05-01

    The applicability and performance of the invariant embedding method for calculating various transport quantities is investigated in this thesis. The invariant embedding method is a technique to calculate the reflected or transmitted fluxes in homogeneous half-spaces and slabs, without the need for solving for the flux inside the medium. In return, the embedding equations become non-linear, and in practical cases they need to be solved by numerical methods. There are, however, fast and effective iterative methods available for this purpose. The objective of this thesis is to investigate the performance of these iterative methods in model problems, in which an exact analytical solution can also be obtained. Some of these analytical solutions are themselves new, hence their derivation constitutes a part of the thesis work. The cases investigated in the thesis all concern the calculation of reflected fluxes from half-spaces. The first problem treated was the calculation of the energy spectrum of reflected (sputtered) particles from a multiplying medium, where the multiplication arises from recoil production (i.e. like binary fission), when bombarded by a flux of monoenergetic particles of the same type. Both constant cross sections and energy-dependent cross sections with a power-law dependence were used in the calculations. The second class of problems concerned the calculation of the path-length distribution of reflected particles from a medium without multiplication. It is an interesting new observation that the distribution of the path length travelled in the medium before reflection can be calculated with invariant embedding methods, which do not actually solve the flux distribution in the medium. We have tested the accuracy and the convergence properties of the embedding method also for this case.
Finally, very recently a theory of connecting the infinite and half-space medium solutions by embedding-like integral equations was developed and reported in the literature

  5. Application of the invariant embedding method to analytically solvable transport problems

    International Nuclear Information System (INIS)

    Wahlberg, Malin

    2005-05-01

    The applicability and performance of the invariant embedding method for calculating various transport quantities is investigated in this thesis. The invariant embedding method is a technique to calculate the reflected or transmitted fluxes in homogeneous half-spaces and slabs, without the need for solving for the flux inside the medium. In return, the embedding equations become non-linear, and in practical cases they need to be solved by numerical methods. There are, however, fast and effective iterative methods available for this purpose. The objective of this thesis is to investigate the performance of these iterative methods in model problems, in which an exact analytical solution can also be obtained. Some of these analytical solutions are themselves new, hence their derivation constitutes a part of the thesis work. The cases investigated in the thesis all concern the calculation of reflected fluxes from half-spaces. The first problem treated was the calculation of the energy spectrum of reflected (sputtered) particles from a multiplying medium, where the multiplication arises from recoil production (i.e. like binary fission), when bombarded by a flux of monoenergetic particles of the same type. Both constant cross sections and energy-dependent cross sections with a power-law dependence were used in the calculations. The second class of problems concerned the calculation of the path-length distribution of reflected particles from a medium without multiplication. It is an interesting new observation that the distribution of the path length travelled in the medium before reflection can be calculated with invariant embedding methods, which do not actually solve the flux distribution in the medium. We have tested the accuracy and the convergence properties of the embedding method also for this case.
Finally, very recently a theory of connecting the infinite and half-space medium solutions by embedding-like integral equations was developed and reported in the literature

  6. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996 to verify declared and to detect undeclared activity. The environmental sampling program has brought a new series of analytical challenges, and driven a need for advances in verification technology. Environmental swipe samples are often extremely low in concentration of analyte (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method of determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass separated and analyzed using magnetic sector instruments due to their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material present to material actually detected) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, increasing the sensitivity of environmental sampling
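The quoted 0.2% ionization efficiency translates directly into the number of ions reaching the detector per unit sample mass, which is why even a modest efficiency gain matters so much at the nanogram level. A back-of-envelope sketch (the sample mass is illustrative; the efficiency is the figure quoted in the abstract):

```python
AVOGADRO = 6.022e23
U_MOLAR_MASS = 238.0  # g/mol, taking U-238 for the estimate

def ions_detected(mass_g, efficiency):
    """Ions reaching the detector from a sample of the given mass at the
    given overall ionization/transmission efficiency."""
    atoms = mass_g / U_MOLAR_MASS * AVOGADRO
    return atoms * efficiency

# 1 ng of uranium at the quoted 0.2% efficiency: about 5e9 detectable ions
n_standard = ions_detected(1e-9, 0.002)
```

Counting statistics scale as the square root of the detected ions, so multiplying the efficiency by ten improves the attainable isotope-ratio precision by roughly a factor of three at fixed sample size, or allows a tenfold smaller sample at fixed precision.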

  7. High-Order Analytic Expansion of Disturbing Function for Doubly Averaged Circular Restricted Three-Body Problem

    Directory of Open Access Journals (Sweden)

    Takashi Ito

    2016-01-01

    Full Text Available Terms in the analytic expansion of the doubly averaged disturbing function for the circular restricted three-body problem using the Legendre polynomials are explicitly calculated up to the fourteenth order of the semimajor axis ratio α between perturbed and perturbing bodies in the inner case (α < 1). The expansion outcome is compared with results from numerical quadrature on an equipotential surface. Comparison with direct numerical integration of equations of motion is also presented. Overall, the high-order analytic expansion of the doubly averaged disturbing function yields a result that agrees well with the numerical quadrature and with the numerical integration. Local extrema of the doubly averaged disturbing function are quantitatively reproduced by the high-order analytic expansion even when α is large. Although the analytic expansion is not applicable in some circumstances, such as when the orbits of the perturbed and perturbing bodies cross or when a strong mean motion resonance is at work, our expansion result will be useful for analytically understanding the long-term dynamical behavior of perturbed bodies in circular restricted three-body systems.
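The Legendre-polynomial machinery underlying such expansions can be illustrated on the classical inverse-distance series 1/Δ = (1/r') Σₙ αⁿ Pₙ(cos ψ) with α = r/r' < 1, which is the starting point before any averaging. A minimal sketch truncated at the fourteenth order mentioned in the abstract (the test values are illustrative):

```python
import math

def legendre_P(n, x):
    """Values P_0(x)..P_n(x) via the Bonnet three-term recurrence."""
    p = [1.0, x]
    for k in range(1, n):
        p.append(((2 * k + 1) * x * p[k] - k * p[k - 1]) / (k + 1))
    return p[: n + 1]

def inverse_distance_series(alpha, cospsi, order):
    """Truncated expansion of 1/Delta with r' = 1 and alpha = r/r' < 1."""
    p = legendre_P(order, cospsi)
    return sum(alpha**n * p[n] for n in range(order + 1))

alpha, cospsi = 0.3, 0.5
exact = 1.0 / math.sqrt(1.0 + alpha**2 - 2.0 * alpha * cospsi)
approx = inverse_distance_series(alpha, cospsi, order=14)
```

The truncation error behaves like α^15/(1 − α), which is why the series degrades, and more terms are needed, as α approaches 1.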

  8. A combined analytic-numeric approach for some boundary-value problems

    Directory of Open Access Journals (Sweden)

    Mustafa Turkyilmazoglu

    2016-02-01

    Full Text Available A combined analytic-numeric approach is undertaken in the present work for the solution of boundary-value problems in finite or semi-infinite domains. The equations to be treated arise specifically from the boundary layer analysis of some two- and three-dimensional flows in fluid mechanics. The purpose is to find quick but sufficiently accurate solutions. Taylor expansions at one boundary are computed and then matched to the asymptotic or exact conditions at the other boundary. The technique is applied to the well-known Blasius and Karman flows. Solutions obtained in terms of series compare favorably with existing ones in the literature.
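For context, the benchmark Blasius solution against which such series methods are compared solves f''' + ½ f f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1, and is usually obtained by shooting. A minimal RK4/bisection sketch (step size and integration range are illustrative choices) recovering the well-known wall shear value f''(0) ≈ 0.332:

```python
def rhs(y):
    """Blasius equation f''' = -0.5 f f'' as a first-order system."""
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def shoot(fpp0, eta_max=10.0, h=0.01):
    """RK4-integrate from eta = 0 with guessed f''(0); return f'(eta_max)."""
    y = [0.0, 0.0, fpp0]
    for _ in range(int(eta_max / h)):
        k1 = rhs(y)
        k2 = rhs([y[i] + 0.5 * h * k1[i] for i in range(3)])
        k3 = rhs([y[i] + 0.5 * h * k2[i] for i in range(3)])
        k4 = rhs([y[i] + h * k3[i] for i in range(3)])
        y = [y[i] + h * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]) / 6.0
             for i in range(3)]
    return y[1]

# Bisect on f''(0) so that f'(eta_max) hits the far-field condition f' = 1
lo, hi = 0.1, 1.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if shoot(mid) < 1.0:
        lo = mid
    else:
        hi = mid
fpp0 = 0.5 * (lo + hi)   # converges to about 0.332
```

A matched Taylor/asymptotic series of the kind the paper proposes aims to reproduce this value without any numerical marching.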

  9. Numerical and analytical solutions for problems relevant for quantum computers

    International Nuclear Information System (INIS)

    Spoerl, Andreas

    2008-01-01

    Quantum computers are one of the next technological steps in modern computer science. Some of the relevant questions that arise when it comes to the implementation of quantum operations (as building blocks in a quantum algorithm) or the simulation of quantum systems are studied. Numerical results are gathered for a variety of systems, e.g. NMR systems, Josephson junctions and others. To study quantum operations (e.g. the quantum Fourier transform, swap operations or multiply-controlled NOT operations) on systems containing many qubits, a parallel C++ code was developed and optimised. In addition to performing high-quality operations, a closer look was given to the minimal times required to implement certain quantum operations. These times represent an interesting quantity for the experimenter as well as for the mathematician. The former tries to fight dissipative effects with fast implementations, while the latter draws conclusions in the form of analytical solutions. Dissipative effects can even be included in the optimisation. The resulting solutions are relaxation- and time-optimised. For systems containing 3 linearly coupled spin-1/2 qubits, analytical solutions are known for several problems, e.g. indirect Ising couplings and trilinear operations. A further study was made to investigate whether there exists a sufficient set of criteria to identify systems with dynamics which are invertible under local operations. Finally, a full quantum algorithm to distinguish between two knots was implemented on a spin-1/2 system. All operations for this experiment were calculated analytically. The experimental results coincide with the theoretical expectations. (orig.)
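As a small illustration of one of the operations named above: on n qubits the quantum Fourier transform is simply the unitary DFT matrix of dimension 2^n. A sketch (sign convention assumed; circuit decomposition into Hadamard and controlled-phase gates omitted) that builds it and verifies unitarity:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Dense unitary of the quantum Fourier transform on n_qubits:
    F[j, k] = exp(2*pi*i*j*k / N) / sqrt(N) with N = 2**n_qubits."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)
# Unitarity check: F F^dagger should be the identity
unitarity_err = np.max(np.abs(F @ F.conj().T - np.eye(8)))
```

The dense matrix grows as 4^n, which is precisely why parallel simulation codes of the kind described in the thesis are needed for many-qubit studies.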

  10. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples.

    Science.gov (United States)

    Artigues, Margalida; Abellà, Jordi; Colominas, Sergi

    2017-11-14

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO₂NTAs) has been evaluated. The GOx-Chitosan/TiO₂NTAs biosensor showed a sensitivity of 5.46 μA·mM⁻¹ with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95-105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were obtained. In addition, the storage stability was further examined; after 30 days, the GOx-Chitosan/TiO₂NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC values. In the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated.
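The reported sensitivity and linear range imply a straight-line calibration that is inverted to read concentration from current. A sketch with hypothetical standard readings consistent with roughly 5.46 μA/mM over 0.3-1.5 mM (the data points are invented for illustration, not taken from the paper):

```python
# Hypothetical calibration standards (mM) and steady-state currents (uA)
standards = [0.3, 0.6, 0.9, 1.2, 1.5]
currents  = [1.64, 3.28, 4.91, 6.55, 8.19]

# Ordinary least-squares fit of current = slope * concentration + intercept
n = len(standards)
sx, sy = sum(standards), sum(currents)
sxx = sum(x * x for x in standards)
sxy = sum(x * y for x, y in zip(standards, currents))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # sensitivity, uA/mM
intercept = (sy - slope * sx) / n

def glucose_mM(current_uA):
    """Invert the calibration line to read concentration from current."""
    return (current_uA - intercept) / slope
```

Quantities such as repeatability and reproducibility (the RSD figures quoted above) are then statistics of repeated readings passed through this same inversion.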

  11. SALE, Quality Control of Analytical Chemical Measurements

    International Nuclear Information System (INIS)

    Bush, W.J.; Gentillon, C.D.

    1985-01-01

    1 - Description of problem or function: The Safeguards Analytical Laboratory Evaluation (SALE) program is a statistical analysis program written to analyze the data received from laboratories participating in the SALE quality control and evaluation program. The system is aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated and participants are informed of the accuracy and precision of their results. 2 - Method of solution: Various statistical techniques produce the SALE output. Assuming an unbalanced nested design, an analysis of variance is performed, resulting in a test of significance for time and analyst effects. A trend test is performed. Both within- laboratory and between-laboratory standard deviations are calculated. 3 - Restrictions on the complexity of the problem: Up to 1500 pieces of data for each nuclear material sampled by a maximum of 75 laboratories may be analyzed
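The within- and between-laboratory standard deviations that SALE reports follow from a one-way random-effects ANOVA with laboratories as the random factor. A minimal balanced-design sketch with invented measurements (the real program handles unbalanced nested designs, trend tests, and significance tests for time and analyst effects, none of which are reproduced here):

```python
import math

def lab_variance_components(groups):
    """Within- and between-laboratory standard deviations from a balanced
    one-way random-effects ANOVA. `groups` is a list of per-lab result lists
    of equal length."""
    k = len(groups)
    n = len(groups[0])                      # assumes a balanced design
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    # Within-lab mean square: pooled variance of results about lab means
    msw = sum(sum((x - m) ** 2 for x in g)
              for g, m in zip(groups, means)) / (k * (n - 1))
    # Between-lab mean square from the spread of lab means
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    s_within = math.sqrt(msw)
    s_between = math.sqrt(max(0.0, (msb - msw) / n))
    return s_within, s_between

# Invented results: three labs, two determinations each
s_w, s_b = lab_variance_components([[10, 12], [14, 16], [12, 14]])
```

The max(0, ...) guard reflects that the method-of-moments estimate of the between-lab variance component can come out negative when labs agree more closely than their internal scatter would predict.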

  12. Analytical procedures for determining Pb and Sr isotopic compositions in water samples by ID-TIMS

    Directory of Open Access Journals (Sweden)

    Veridiana Martins

    2008-01-01

    Full Text Available Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.

  13. The analytic solution of the firm's cost-minimization problem with box constraints and the Cobb-Douglas model

    Science.gov (United States)

    Bayón, L.; Grau, J. M.; Ruiz, M. M.; Suárez, P. M.

    2012-12-01

    One of the most well-known problems in the field of Microeconomics is the Firm's Cost-Minimization Problem. In this paper we establish the analytical expression for the cost function using the Cobb-Douglas model and considering maximum constraints for the inputs. Moreover we prove that it belongs to the class C1.
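For the interior case, when no box constraint binds, the cost-minimizing input bundle for a two-input Cobb-Douglas technology has a well-known closed form from the first-order conditions. A sketch (the paper's contribution is the constrained, piecewise C¹ cost function, which is not reproduced here; the numeric inputs are illustrative):

```python
def cobb_douglas_interior_optimum(w1, w2, a, b, A, y):
    """Interior cost-minimizing bundle for output y = A * x1**a * x2**b
    at input prices w1, w2; box constraints would clip these values.
    Derived from the tangency condition w1/w2 = (a*x2)/(b*x1)."""
    x1 = (y / A) ** (1.0 / (a + b)) * (a * w2 / (b * w1)) ** (b / (a + b))
    x2 = (b * w1) / (a * w2) * x1
    return x1, x2, w1 * x1 + w2 * x2

x1, x2, cost = cobb_douglas_interior_optimum(w1=2.0, w2=3.0,
                                             a=0.5, b=0.5, A=1.0, y=10.0)
```

At the optimum the marginal product per dollar is equalized across inputs (w1·x1/a = w2·x2/b), which is the condition the closed form encodes.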

  14. A review of electroanalytical determinations of some important elements (Zn, Se, As) in environmental samples

    International Nuclear Information System (INIS)

    Lichiang; James, B.D.; Magee, R.J.

    1991-01-01

    This review covers electroanalytical methods reported in the literature for the determination of zinc, cadmium, selenium and arsenic in environmental and biological samples. A comprehensive survey of the electroanalytical techniques used for the determination of these four important elements is reported herein, with 322 references up to 1990. (Orig./A.B.)

  15. Intimacy Is a Transdiagnostic Problem for Cognitive Behavior Therapy: Functional Analytical Psychotherapy Is a Solution

    Science.gov (United States)

    Wetterneck, Chad T.; Hart, John M.

    2012-01-01

    Problems with intimacy and interpersonal issues are exhibited across most psychiatric disorders. However, most of the targets in Cognitive Behavioral Therapy are primarily intrapersonal in nature, with few directly involved in interpersonal functioning and effective intimacy. Functional Analytic Psychotherapy (FAP) provides a behavioral basis for…

  16. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Full Text Available Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  17. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  18. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples does not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.
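
    The decontamination factor quoted above is simply the ratio of Cs activity in the feed to that in the decontaminated product. A minimal sketch; the function name and the activity values are illustrative, not taken from the report:

```python
def decontamination_factor(feed_activity, product_activity):
    """Ratio of Cs-137 activity in the feed to that in the decontaminated
    product; larger values mean more complete Cs removal."""
    return feed_activity / product_activity

# Illustrative activities (arbitrary units); a ratio of 25700 matches the
# average DF reported for Salt Batch 9.
df = decontamination_factor(2.57e7, 1.0e3)
print(df)  # 25700.0
```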

  19. Problems involved in sampling within and outside zones of emission

    Energy Technology Data Exchange (ETDEWEB)

    Oelschlaeger, W

    1973-01-01

    Problems involved in the sampling of plant materials both inside and outside emission zones are considered, especially in regard to trace element analysis. The basic problem revolves around obtaining as accurately as possible an average sample of actual composition. Elimination of error possibilities requires a knowledge of such factors as botanical composition, vegetation states, rains, mass losses in leaf and blossom parts, contamination through the soil, and gaseous or particulate emissions. Sampling and preparation of samples is also considered with respect to quantitative aspects of trace element analysis.

  20. Nuclear analytical methods for platinum group elements

    International Nuclear Information System (INIS)

    2005-04-01

    Platinum group elements (PGE) are of special interest for analytical research due to their economic importance, their chemical peculiarities as catalysts, their medical applications as anticancer drugs, and their possible detrimental environmental impact as exhaust from automobile catalyzers. Natural levels of PGE are so low in concentration that most current analytical techniques approach their limit of detection capacity. In addition, Ru, Rh, Pd, Re, Os, Ir, and Pt analyses still constitute a challenge in accuracy and precision of quantification in natural matrices. Nuclear analytical techniques, such as neutron activation analysis, X ray fluorescence, or proton-induced X ray emission (PIXE), which are generally considered as reference methods for many analytical problems, are useful as well. However, due to methodological restrictions, they can, in most cases, only be applied after pre-concentration and under special irradiation conditions. This report was prepared following a coordinated research project and a consultants meeting addressing the subject from different viewpoints. The experts involved suggested discussing the issue according to (1) the application, hence the concentration levels encountered, and (2) the method applied for analysis. Each of the different fields of application needs special consideration for sample preparation, PGE pre-concentration, and determination. Additionally, each analytical method requires special attention regarding the sensitivity and sample type. Quality assurance/quality control aspects are considered towards the end of the report. It is intended to provide the reader of this publication with state-of-the-art information on the various aspects of PGE analysis and to advise which technique might be most suitable for a particular analytical problem related to platinum group elements. In particular, many case studies described in detail from the authors' laboratory experience might help to decide which way to go. As in many cases

  1. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples

    Directory of Open Access Journals (Sweden)

    Margalida Artigues

    2017-11-01

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (Chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO2NTAs) has been evaluated. The GOx–Chitosan/TiO2NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95–105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were obtained. In addition, the storage stability was further examined; after 30 days, the GOx–Chitosan/TiO2NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst scenario, a deviation smaller than 10% was obtained among the 20 samples evaluated.

  2. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples

    Science.gov (United States)

    2017-01-01

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (Chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO2NTAs) has been evaluated. The GOx–Chitosan/TiO2NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95–105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were obtained. In addition, the storage stability was further examined; after 30 days, the GOx–Chitosan/TiO2NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst scenario, a deviation smaller than 10% was obtained among the 20 samples evaluated. PMID:29135931
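
    Given the reported slope of 5.46 μA·mM−1 and linear range of 0.3–1.5 mM, converting a measured current back to a glucose concentration is a one-line inversion of the calibration line. A sketch assuming a zero intercept (the abstract does not state one):

```python
SENSITIVITY_UA_PER_MM = 5.46   # reported slope of the calibration line
LINEAR_RANGE_MM = (0.3, 1.5)   # reported linear range

def glucose_mm(current_ua, intercept_ua=0.0):
    """Invert the linear calibration i = slope * c + intercept to get c (mM)."""
    conc = (current_ua - intercept_ua) / SENSITIVITY_UA_PER_MM
    if not LINEAR_RANGE_MM[0] <= conc <= LINEAR_RANGE_MM[1]:
        raise ValueError("current falls outside the validated linear range")
    return conc

print(round(glucose_mm(5.46), 3))  # 1.0
```

    A real determination would also subtract a blank current and use a fitted intercept rather than assuming zero.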

  3. Intercalibration of analytical methods on marine environmental samples

    International Nuclear Information System (INIS)

    1988-06-01

    The pollution of the seas by various chemical substances constitutes nowadays one of the principal concerns of mankind. The International Atomic Energy Agency has organized in past years several intercomparison exercises in the framework of its Analytical Quality Control Service. The present intercomparison had a double aim: first, to give laboratories participating in this intercomparison an opportunity to check their analytical performance; second, to produce on the basis of the results of this intercomparison a reference material made of fish tissue which would be accurately certified with respect to many trace elements. Such a material could be used by analytical chemists to check the validity of new analytical procedures. In total, 53 laboratories from 29 countries reported results (585 laboratory means for 48 elements). 5 refs, 52 tabs

  4. Numerical and analytical approaches to an advection-diffusion problem at small Reynolds number and large Péclet number

    Science.gov (United States)

    Fuller, Nathaniel J.; Licata, Nicholas A.

    2018-05-01

    Obtaining a detailed understanding of the physical interactions between a cell and its environment often requires information about the flow of fluid surrounding the cell. Cells must be able to effectively absorb and discard material in order to survive. Strategies for nutrient acquisition and toxin disposal, which have been evolutionarily selected for their efficacy, should reflect knowledge of the physics underlying this mass transport problem. Motivated by these considerations, in this paper we discuss the results from an undergraduate research project on the advection-diffusion equation at small Reynolds number and large Péclet number. In particular, we consider the problem of mass transport for a Stokesian spherical swimmer. We approach the problem numerically and analytically through a rescaling of the concentration boundary layer. A biophysically motivated first-passage problem for the absorption of material by the swimming cell demonstrates quantitative agreement between the numerical and analytical approaches. We conclude by discussing the connections between our results and the design of smart toxin disposal systems.

  5. Voltammetric technique, a panacea for analytical examination of environmental samples

    International Nuclear Information System (INIS)

    Zahir, E.; Mohiuddin, S.; Naqvi, I.I.

    2012-01-01

    Voltammetric methods for trace metal analysis in environmental samples of marine origin like mangrove, sediments and shrimps are generally recommended. Three different electro-analytical techniques, i.e. polarography, anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (ADSV), have been used. Cd2+, Pb2+, Cu2+ and Mn2+ were determined through ASV, Cr6+ was analyzed by ADSV, and Fe2+, Zn2+, Ni2+ and Co2+ were determined through polarography. Of these, the pairs Fe2+/Zn2+ and Ni2+/Co2+ were determined in two separate runs, while Cd2+, Pb2+ and Cu2+ were analyzed in a single ASV run. The sensitivity and speciation capabilities of voltammetric methods were exploited. Analysis conditions were optimized, including the selection of supporting electrolyte, pH, working electrodes, sweep rate, etc. Stripping voltammetry was adopted for analysis at ultra-trace levels. Statistical parameters for analytical method development, such as selectivity factor, interference, repeatability (0.0065-0.130 µg/g), reproducibility (0.08125-1.625 µg/g), detection limits (0.032-5.06 µg/g), limits of quantification (0.081-12.652 µg/g) and sensitivities (5.636-2.15 nA mL µg-1), were also determined. The percentage recoveries were found to be between 95-105% using certified reference materials. Real samples from the complex marine environment of the Karachi coastline were also analyzed. The standard addition method was employed wherever any matrix effect was evidenced. (author)
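
    Detection limits and limits of quantification like those reported are conventionally derived from the blank noise and the calibration slope (LOD = 3σ/m, LOQ = 10σ/m). A generic sketch of that convention; the numeric values below are illustrative, not taken from the study:

```python
def detection_limit(sigma_blank, slope, k=3.0):
    """Concentration limit k * sigma_blank / slope.
    k = 3 gives the conventional LOD; k = 10 gives the LOQ."""
    return k * sigma_blank / slope

# Illustrative blank standard deviation (signal units) and sensitivity
# (signal units per concentration unit).
sigma, m = 0.3, 3.0
lod = detection_limit(sigma, m)            # 3 * 0.3 / 3.0
loq = detection_limit(sigma, m, k=10.0)    # 10 * 0.3 / 3.0
```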

  6. An analytical approximation scheme to two-point boundary value problems of ordinary differential equations

    International Nuclear Information System (INIS)

    Boisseau, Bruno; Forgacs, Peter; Giacomini, Hector

    2007-01-01

    A new (algebraic) approximation scheme to find global solutions of two-point boundary value problems of ordinary differential equations (ODEs) is presented. The method is applicable for both linear and nonlinear (coupled) ODEs whose solutions are analytic near one of the boundary points. It is based on replacing the original ODEs by a sequence of auxiliary first-order polynomial ODEs with constant coefficients. The coefficients in the auxiliary ODEs are uniquely determined from the local behaviour of the solution in the neighbourhood of one of the boundary points. The problem of obtaining the parameters of the global (connecting) solutions, analytic at one of the boundary points, reduces to find the appropriate zeros of algebraic equations. The power of the method is illustrated by computing the approximate values of the 'connecting parameters' for a number of nonlinear ODEs arising in various problems in field theory. We treat in particular the static and rotationally symmetric global vortex, the skyrmion, the Abrikosov-Nielsen-Olesen vortex, as well as the 't Hooft-Polyakov magnetic monopole. The total energy of the skyrmion and of the monopole is also computed by the new method. We also consider some ODEs coming from the exact renormalization group. The ground-state energy level of the anharmonic oscillator is also computed for arbitrary coupling strengths with good precision. (fast track communication)

  7. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  8. An analytic solution of the static problem of inclined risers conveying fluid

    KAUST Repository

    Alfosail, Feras

    2016-05-28

    We use the method of matched asymptotic expansion to develop an analytic solution to the static problem of clamped–clamped inclined risers conveying fluid. The inclined riser is modeled as an Euler–Bernoulli beam taking into account its self-weight, mid-plane stretching, an applied axial tension, and the internal fluid velocity. The solution consists of three parts: an outer solution valid away from the two boundaries and two inner solutions valid near the two ends. The three solutions are then matched and combined into a so-called composite expansion. A Newton–Raphson method is used to determine the value of the mid-plane stretching corresponding to each applied tension and internal velocity. The analytic solution is in good agreement with those obtained with other solution methods for large values of applied tensions. Therefore, it can be used to replace other mathematical solution methods that suffer numerical limitations and high computational cost. © 2016 Springer Science+Business Media Dordrecht
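
    The Newton–Raphson iteration used above to pin down the mid-plane stretching is the standard root-finding scheme x_(n+1) = x_n − f(x_n)/f′(x_n). A generic sketch on a toy equation; the riser equations themselves are not reproduced here:

```python
def newton_raphson(f, dfdx, x0, tol=1e-12, max_iter=50):
    """Iterate x -> x - f(x)/f'(x) until the step size drops below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Toy example: solve x**2 - 2 = 0 starting from x = 1, i.e. find sqrt(2).
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

    In the paper's setting, f would be the compatibility condition relating mid-plane stretching to the applied tension and internal flow velocity.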

  9. A review of analytical techniques for the determination of carbon-14 in environmental samples

    International Nuclear Information System (INIS)

    Milton, G.M.; Brown, R.M.

    1993-11-01

    This report contains a brief summary of analytical techniques commonly used for the determination of radiocarbon in a variety of environmental samples. Details of the applicable procedures developed and tested in the Environmental Research Branch at Chalk River Laboratories are appended

  10. The problem of large samples. An activation analysis study of electronic waste material

    International Nuclear Information System (INIS)

    Segebade, C.; Goerner, W.; Bode, P.

    2007-01-01

    Large-volume instrumental photon activation analysis (IPAA) was used for the investigation of shredded electronic waste material. Sample masses from 1 to 150 grams were analyzed to obtain an estimate of the minimum sample size to be taken to achieve a representativeness of the results that is satisfactory for a defined investigation task. Furthermore, the influence of irradiation and measurement parameters upon the quality of the analytical results was studied. Finally, the analytical data obtained from IPAA and instrumental neutron activation analysis (INAA), both carried out in a large-volume mode, were compared. Only some of the values were found to be in satisfactory agreement. (author)

  11. An analytical study of the Q(s, S) policy applied to the joint replenishment problem

    DEFF Research Database (Denmark)

    Nielsen, Christina; Larsen, Christian

    2005-01-01

    be considered supply chain management problems. The paper uses Markov decision theory to work out an analytical solution procedure to evaluate the costs of a particular Q(s,S) policy, and thereby a method for computing the optimal Q(s,S) policy, under the assumption that demands follow a Poisson Process...

  12. An analytical study of the Q(s,S) policy applied on the joint replenishment problem

    DEFF Research Database (Denmark)

    Nielsen, Christina; Larsen, Christian

    2002-01-01

    be considered supply chain management problems. The paper uses Markov decision theory to work out an analytical solution procedure to evaluate the costs of a particular Q(s,S) policy, and thereby a method to compute the optimal Q(s,S) policy, under the assumption that demands follow a Poisson process...

  13. Combining project based learning with exercises in problem solving in order to train analytical mathematical skills

    DEFF Research Database (Denmark)

    Friesel, Anna

    2013-01-01

    This paper presents the contents and the teaching methods used in the fourth semester course REG4E, an important subject in engineering, namely Control Theory and Dynamical Systems. Control Theory courses in engineering education are usually related to exercises in the laboratory or to projects. However, in order to understand the complexity of control systems, the students need to possess an analytical understanding of abstract mathematical problems. Our main goal is to illustrate the theory through the robot project, but at the same time we force our students to train their analytical skills

  14. Adaptive sampling method in deep-penetration particle transport problem

    International Nuclear Information System (INIS)

    Wang Ruihong; Ji Zhicheng; Pei Lucheng

    2012-01-01

    The deep-penetration problem has been one of the difficult problems in shielding calculations with the Monte Carlo method for several decades. In this paper, a particle transport random-walk system that treats the emission point as a sampling station is built. An adaptive sampling scheme is then derived that uses the accumulated information to improve the solution. The main advantage of the adaptive scheme is that it chooses the most suitable number of samples from the emission-point station so as to minimize the total cost of the random walk. Further, a related importance sampling method is introduced. Its main principle is to define an importance function based on the particle state and to ensure that the number of samples of the emission particle is proportional to the importance function. The numerical results show that the adaptive scheme with the emission point as a station mitigates, to some degree, the tendency to underestimate the result, and the adaptive importance sampling method gives satisfactory results as well. (authors)
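
    The importance-sampling principle described, drawing samples in proportion to an importance function and reweighting by the ratio of true to biased densities, can be sketched on a textbook deep-penetration analogue: estimating the small tail probability P(X > a) for X ~ Exp(1). Sampling the true density wastes almost every history; sampling from a density shifted into the tail and carrying the weight p(x)/q(x) recovers the answer with, in this idealized case, zero variance. The function and parameters are illustrative, not from the paper:

```python
import math
import random

def tail_probability(n, a=4.0, seed=0):
    """Importance-sampling estimate of P(X > a) for X ~ Exp(1).
    Histories are drawn from the biased density q(x) = exp(-(x - a)) on
    [a, inf), so every history reaches the 'deep' region; each carries the
    weight p(x)/q(x), which here is the constant exp(-a)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = a + rng.expovariate(1.0)                 # draw from q
        weight = math.exp(-x) / math.exp(-(x - a))   # p(x)/q(x) == exp(-a)
        total += weight                              # indicator(x > a) is 1 here
    return total / n

estimate = tail_probability(1000)   # close to exp(-4), about 0.0183
```

    Analogue sampling would need on the order of exp(a) histories per tail hit; here every history contributes, which is the underestimation-avoiding effect the paper targets.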

  15. Thermo Techno Modern Analytical Equipment for Research and Industrial Laboratories

    Directory of Open Access Journals (Sweden)

    Khokhlov, S.V.

    2014-03-01

    A brief overview of some models of Thermo Techno analytical equipment and possible areas of their application is given. The Thermo Techno Company was created in 2000 as part of the representative office of the international corporation Thermo Fisher Scientific, a world leader in the manufacture of analytical equipment. Thermo Techno is unique in its integrated approach to solving the problems of the user, which includes a series of steps: setting the analytical task, selection of effective analysis methods, sample delivery and preparation, as well as data transmission and archiving.

  16. An analytical solution to the heat transfer problem in thick-walled hunt flow

    International Nuclear Information System (INIS)

    Bluck, Michael J; Wolfendale, Michael J

    2017-01-01

    Highlights: • Convective heat transfer in Hunt type flow of a liquid metal in a rectangular duct. • Analytical solution to the H1 constant peripheral temperature in a rectangular duct. • New H1 result demonstrating the enhancement of heat transfer due to flow distortion by the applied magnetic field. • Analytical solution to the H2 constant peripheral heat flux in a rectangular duct. • New H2 result demonstrating the reduction of heat transfer due to flow distortion by the applied magnetic field. • Results are important for validation of CFD in magnetohydrodynamics and for implementation of systems code approaches. - Abstract: The flow of a liquid metal in a rectangular duct, subject to a strong transverse magnetic field, is of interest in a number of applications. An important application of such flows is in the context of coolants in fusion reactors, where heat is transferred to a lead-lithium eutectic. It is vital, therefore, that the heat transfer mechanisms are understood. Forced convection heat transfer is strongly dependent on the flow profile. In the hydrodynamic case, Nusselt numbers and the like have long been well characterised in duct geometries. In the case of liquid metals in strong magnetic fields (magnetohydrodynamics), the flow profiles are very different and one can expect a concomitant effect on convective heat transfer. For fully developed laminar flows, the magnetohydrodynamic problem can be characterised in terms of two coupled partial differential equations. The problem of heat transfer for perfectly electrically insulating boundaries (Shercliff case) has been studied previously (Bluck et al., 2015). In this paper, we demonstrate corresponding analytical solutions for the case of conducting Hartmann walls of arbitrary thickness. The flow is very different from the Shercliff case, exhibiting jets near the side walls and core flow suppression which have profound effects on heat transfer.

  17. Different Analytical Procedures for the Study of Organic Residues in Archeological Ceramic Samples with the Use of Gas Chromatography-mass Spectrometry.

    Science.gov (United States)

    Kałużna-Czaplińska, Joanna; Rosiak, Angelina; Kwapińska, Marzena; Kwapiński, Witold

    2016-01-01

    The analysis of the composition of organic residues present in pottery is an important source of information for historians and archeologists. Chemical characterization of the materials provides information on diets, habits, technologies, and original use of the vessels. This review presents the problem of analytical studies of archeological materials with a special emphasis on organic residues. Current methods used in the determination of different organic compounds in archeological ceramics are presented. Particular attention is paid to the procedures of analysis of archeological ceramic samples used before gas chromatography-mass spectrometry. Advantages and disadvantages of different extraction methods and application of proper quality assurance/quality control procedures are discussed.

  18. New high temperature plasmas and sample introduction systems for analytical atomic emission and mass spectrometry

    International Nuclear Information System (INIS)

    Montaser, A.

    1993-01-01

    In this research, new high-temperature plasmas and new sample introduction systems are explored for rapid elemental and isotopic analysis of gases, solutions, and solids using mass spectrometry and atomic emission spectrometry. During the period January 1993--December 1993, emphasis was placed on (a) analytical investigations of atmospheric-pressure helium inductively coupled plasma (He ICP) that are suitable for atomization, excitation, and ionization of elements possessing high excitation and ionization energies; (b) simulation and computer modeling of plasma sources to predict their structure and fundamental and analytical properties without incurring the enormous cost of experimental studies; (c) spectrosopic imaging and diagnostic studies of high-temperature plasmas; (d) fundamental studies of He ICP discharges and argon-nitrogen plasma by high-resolution Fourier transform spectrometry; and (e) fundamental and analytical investigation of new, low-cost devices as sample introduction systems for atomic spectrometry and examination of new diagnostic techniques for probing aerosols. Only the most important achievements are included in this report to illustrate progress and obstacles. Detailed descriptions of the authors' investigations are outlined in the reprints and preprints that accompany this report. The technical progress expected next year is briefly described at the end of this report

  19. Analytical methodologies for the determination of benzodiazepines in biological samples.

    Science.gov (United States)

    Persona, Karolina; Madej, Katarzyna; Knihnicki, Paweł; Piekoszewski, Wojciech

    2015-09-10

    Benzodiazepine drugs belong to the most important and most widely used medicaments. They demonstrate such therapeutic properties as anxiolytic, sedative, somnifacient, anticonvulsant, diastolic and muscle relaxant effects. However, despite the fact that benzodiazepines possess a high therapeutic index and are considered to be relatively safe, their use can be dangerous when: (1) co-administered with alcohol, (2) co-administered with other medicaments like sedatives, antidepressants, neuroleptics or morphine-like substances, (3) driving under their influence, (4) used non-therapeutically as drugs of abuse or in drug-facilitated crimes. For these reasons benzodiazepines are still studied and determined in a variety of biological materials. In this article, sample preparation techniques which have been applied in the analysis of benzodiazepine drugs in biological samples are reviewed and presented. The next part of the article focuses on a review of analytical methods which have been employed for pharmacological, toxicological or forensic study of this group of drugs in biological matrices. The review is preceded by a description of the physicochemical properties of the selected benzodiazepines and of two sedative-hypnotic drugs that very often coexist in the same analyzed samples.

  20. A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.

    Science.gov (United States)

    Feo, M L; Eljarrat, E; Barceló, D

    2010-04-09

    A simple, efficient and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as the extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L(-1), with low RSD values and correlation coefficients ≥ 0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied to the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L(-1).
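
    The enrichment factor of 200 reported for a 20 mL aliquot follows from the ratio of sample volume to final extract volume, which would imply roughly 0.1 mL of chloroform phase. A sketch of that relation; the extract volume is inferred for illustration, not stated in the abstract:

```python
def enrichment_factor(sample_volume_ml, extract_volume_ml):
    """Volume-based preconcentration factor for a liquid-liquid extraction,
    assuming quantitative transfer of the analyte into the extract phase."""
    return sample_volume_ml / extract_volume_ml

# 20 mL of spiked water concentrated into ~0.1 mL of chloroform gives
# an enrichment factor of about 200, matching the reported value.
ef = enrichment_factor(20.0, 0.1)
```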

  1. Chance constrained problems: penalty reformulation and performance of sample approximation technique

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin

    2012-01-01

    Roč. 48, č. 1 (2012), s. 105-122 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional research plan: CEZ:AV0Z10750506 Keywords : chance constrained problems * penalty functions * asymptotic equivalence * sample approximation technique * investment problem Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.619, year: 2012 http://library.utia.cas.cz/separaty/2012/E/branda-chance constrained problems penalty reformulation and performance of sample approximation technique.pdf

  2. Analytic Lorentz integral transform of an arbitrary response function and its application to the inversion problem

    International Nuclear Information System (INIS)

    Barnea, N.; Liverts, E.

    2010-01-01

    In this paper we present an analytic expression for the Lorentz integral transform of an arbitrary response function expressed as a polynomial times a decaying exponent. The resulting expression is applied to the inversion problem of the Lorentz integral transform (LIT), simplifying the inversion procedure and improving its accuracy. We present analytic formulae for a family of basis functions often used in the inversion of the LIT; these formulae allow for an efficient and accurate inversion. The quality and stability of the resulting inversions were demonstrated through two different examples yielding outstanding results. (author)

  3. Exact Analytical Solutions in Three-Body Problems and Model of Neutrino Generator

    Directory of Open Access Journals (Sweden)

    Takibayev N.Zh.

    2010-04-01

    Exact analytic solutions are obtained in the three-body problem for the scattering of a light particle on a subsystem of two fixed centers in the case when the pair potentials have a separable form. The solutions show the appearance of new resonance states and the dependence of resonance energy and width on the distance between the two fixed centers. The approach is extended to cases where the two-body scattering amplitudes have the Breit-Wigner form, and is employed to describe neutron resonance scattering on a subsystem of two heavy nuclei fixed in the nodes of a crystalline lattice. It is shown that some resonance states have widths close to zero at certain values of the distance between the two heavy scattering centers; this gives the possibility of transitions between states. One of these transitions between three-body resonance states could be connected with the capture of an electron by a proton, with formation of a neutron and emission of a neutrino. This exoenergic process, which leads to the cooling of a star without nuclear reactions, is discussed.

  4. Optimal resampling for the noisy OneMax problem

    OpenAIRE

    Liu, Jialin; Fairbank, Michael; Pérez-Liébana, Diego; Lucas, Simon M.

    2016-01-01

    The OneMax problem is a standard benchmark optimisation problem for a binary search space. Recent work on applying a Bandit-Based Random Mutation Hill-Climbing algorithm to the noisy OneMax problem showed that it is important to choose a good value for the resampling number, making a careful trade-off between taking more samples in order to reduce noise and taking fewer samples to reduce the total computational cost. This paper extends that observation, by deriving an analytical expression f...
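The resampling trade-off described above can be sketched with a plain Random Mutation Hill-Climber on noisy OneMax; the Gaussian noise model, evaluation budget, and resampling number below are illustrative assumptions, not the paper's exact setup:

```python
import random

def noisy_onemax(x, sigma, rng):
    """True fitness is the number of 1-bits; the observation adds Gaussian noise."""
    return sum(x) + rng.gauss(0.0, sigma)

def evaluate(x, resamples, sigma, rng):
    """Average several noisy evaluations: less noise, but more cost per point."""
    return sum(noisy_onemax(x, sigma, rng) for _ in range(resamples)) / resamples

def rmhc(n=30, budget=6000, resamples=5, sigma=1.0, seed=0):
    """Random Mutation Hill-Climbing under a fixed resampling number.

    The budget counts individual fitness evaluations, so a larger resampling
    number means fewer search steps (illustrative parameters, not the paper's).
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = evaluate(x, resamples, sigma, rng)
    evals = resamples
    while evals + resamples <= budget:
        y = x[:]
        y[rng.randrange(n)] ^= 1      # flip one random bit
        fy = evaluate(y, resamples, sigma, rng)
        evals += resamples
        if fy >= fx:                  # accept ties to keep moving under noise
            x, fx = y, fy
    return sum(x)                     # true number of 1-bits reached

best = rmhc()
```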

  5. Marine anthropogenic radiotracers in the Southern Hemisphere: New sampling and analytical strategies

    Science.gov (United States)

    Levy, I.; Povinec, P. P.; Aoyama, M.; Hirose, K.; Sanchez-Cabeza, J. A.; Comanducci, J.-F.; Gastaud, J.; Eriksson, M.; Hamajima, Y.; Kim, C. S.; Komura, K.; Osvath, I.; Roos, P.; Yim, S. A.

    2011-04-01

    The Japan Agency for Marine-Earth Science and Technology conducted the Blue Earth Global Expedition (BEAGLE2003) around the Southern Hemisphere oceans in 2003-2004, a rare opportunity to collect many seawater samples for anthropogenic radionuclide studies. We describe here sampling and analytical methodologies based on radiochemical separations of Cs and Pu from seawater, as well as radiometric and mass spectrometry measurements. Several laboratories took part in the radionuclide analyses using different techniques; the intercomparison exercises and analyses of certified reference materials showed reasonable agreement between the participating laboratories. The data obtained on the distribution of 137Cs and plutonium isotopes in seawater represent the most comprehensive results available for the Southern Hemisphere oceans.

  6. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    Science.gov (United States)

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. This trend is attractive for analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimum for the system is a complex problem, and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates Analytical Target Cascading (ATC) as a technique to coordinate the optimization of loosely coupled sub-problems, each of which may be modularly formulated by a different department and solved by modular analytical services. The results demonstrate that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular execution, while allowing easier management of the problem formulation.

  7. Parameter sampling capabilities of sequential and simultaneous data assimilation: I. Analytical comparison

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess the parameter sampling capabilities of some Bayesian, ensemble-based, joint state-parameter (JS) estimation methods. The forward model is assumed to be non-chaotic and have nonlinear components, and the emphasis is on results obtained for the parameters in the state-parameter vector. A variety of approximate sampling methods exist, and a number of numerical comparisons between such methods have been performed. Often, more than one of the defining characteristics vary from one method to another, so it can be difficult to point out which characteristic of the more successful method in such a comparison was decisive. In this study, we single out one defining characteristic for comparison: whether data are assimilated sequentially or simultaneously. The current paper is concerned with analytical investigations into this issue. We carefully select one sequential and one simultaneous JS method for the comparison. We also design a corresponding pair of pure parameter estimation methods, and we show how the JS methods and the parameter estimation methods are pairwise related. It is shown that the sequential and the simultaneous parameter estimation methods are equivalent for one particular combination of observations with different degrees of nonlinearity. Strong indications are presented for why one may expect the sequential parameter estimation method to outperform the simultaneous parameter estimation method for all other combinations of observations. Finally, the conditions under which similar relations can be expected to hold between the corresponding JS methods are discussed. A companion paper, part II (Fossum and Mannseth 2014 Inverse Problems 30 114003), is concerned with statistical analysis of results from a range of numerical experiments involving sequential and simultaneous JS estimation, where the design of the numerical investigation is motivated by our findings in the current paper. (paper)
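The kind of sequential-versus-simultaneous equivalence discussed above can be illustrated in the classical linear-Gaussian scalar-parameter case (a toy sketch, not the paper's nonlinear forward model), where assimilating observations one at a time or all at once gives identical posteriors:

```python
def scalar_update(m, v, h, r, d):
    """Bayes/Kalman update for observation d = h*theta + N(0, r), prior N(m, v)."""
    v_new = 1.0 / (1.0 / v + h * h / r)
    m_new = v_new * (m / v + h * d / r)
    return m_new, v_new

def simultaneous_update(m, v, obs):
    """Assimilate all (h, r, d) observations at once, in information form."""
    prec = 1.0 / v + sum(h * h / r for h, r, d in obs)
    mean = (m / v + sum(h * d / r for h, r, d in obs)) / prec
    return mean, 1.0 / prec

# two hypothetical linear observations of the same scalar parameter
obs = [(1.0, 0.5, 1.2), (2.0, 0.5, 2.1)]

# sequential: one observation at a time
m_seq, v_seq = 0.0, 4.0
for h, r, d in obs:
    m_seq, v_seq = scalar_update(m_seq, v_seq, h, r, d)

# simultaneous: both observations in a single update
m_sim, v_sim = simultaneous_update(0.0, 4.0, obs)
```

In this linear-Gaussian setting the two orders agree exactly; the paper's point is precisely that with nonlinear observations this equivalence can break.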

  8. Destructive analytical services to safeguards: Activity report 1984-1987

    International Nuclear Information System (INIS)

    Bagliano, G.

    1987-08-01

    The report gives an evaluation of the volume, delays and quality of destructive analytical services provided in 84/01/01-87/06/30 in support of Agency Safeguards. General observations are also made. Problems and improvements are identified and trends indicated. The main features of the considered period are as follows: a) About 4,100 inspection samples were received and analyzed in the period 84/01/01-87/06/30 by SAL and NWAL. The samples represented 20 different types of materials arising from different nuclear fuel cycles. b) 496 out of 895 spent fuel samples analyzed in this period were distributed to NWAL. c) The chemical and isotopic analyses of Uranium, Thorium and Plutonium requested by the Inspectors (and occasionally of Americium) resulted in the report of a total of about 14,200 analytical results. d) The calibration of on-site measurements, the certification of Working Reference Materials for Destructive Analysis, and the maintenance and improvement of Destructive Analysis required additional analytical services, which were provided. e) At present, compared to 1983, the total verification delays by DA were shortened by 16%, 57% and 55% respectively for Uranium, Plutonium and spent fuel materials. f) Timely detection of abrupt diversion is achievable for the low-enriched Uranium samples and is close to being achievable for the Plutonium and spent fuel samples. g) Concepts, procedures, and materials for Analytical Quality Control Programmes of the measurements performed at SAL and NWAL were prepared. The analyses performed can be adequately accurate. h) Problems continue however to be encountered in the quality of the overall DA verification system. i) Major upgrading of equipment and procedures was undertaken at SAL. j) Other selective chemical assays were being tested, and Isotope Dilution Mass Spectrometry assays have been successfully set up for the analysis of 3 mg-sized Plutonium product samples. k) Close contacts have been kept with NWAL via Consultants

  9. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods; EDXRF spectrometry is thus applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function, so the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison served as a cross-check of the analysis results and as a way to overcome the limitations of the three methods. The results showed that Ca found in food using EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also performed. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
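A cross-check like the one above rests on paired results from two methods; a minimal sketch using the plain Pearson formula (the paired Ca values below are invented for illustration, not the study's data):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired result lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# hypothetical paired Ca results (mg/100 g) from two methods on the same foods
edxrf = [120.1, 88.4, 200.5, 45.2, 310.8, 150.3]
aas   = [118.9, 90.1, 198.7, 46.0, 308.2, 152.0]

r = pearson_r(edxrf, aas)
diffs = [a - b for a, b in zip(edxrf, aas)]
mean_diff = statistics.mean(diffs)  # near zero when the methods agree
```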

  10. Functions with disconnected spectrum sampling, interpolation, translates

    CERN Document Server

    Olevskii, Alexander M

    2016-01-01

    The classical sampling problem is to reconstruct entire functions with given spectrum S from their values on a discrete set L. From the geometric point of view, the possibility of such reconstruction is equivalent to determining for which sets L the exponential system with frequencies in L forms a frame in the space L^2(S). The book also treats the problem of interpolation of discrete functions by analytic ones with spectrum in S and the problem of completeness of discrete translates. The size and arithmetic structure of both the spectrum S and the discrete set L play a crucial role in these problems. After an elementary introduction, the authors give a new presentation of classical results due to Beurling, Kahane, and Landau. The main part of the book focuses on recent progress in the area, such as construction of universal sampling sets, high-dimensional and non-analytic phenomena. The reader will see how methods of harmonic and complex analysis interplay with various important concepts in different areas, ...

  11. Effect of the sample matrix on measurement uncertainty in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Morgenstern, P.; Brueggemann, L.; Wennrich, R.

    2005-01-01

    The estimation of measurement uncertainty, with reference to univariate calibration functions, is discussed in detail in the Eurachem Guide 'Quantifying Uncertainty in Analytical Measurement'. Adopting these recommendations for quantitative X-ray fluorescence analysis (XRF) involves basic problems which are above all due to the strong influence of the sample matrix on the analytical response. In XRF analysis, the proposed recommendations are consequently applicable only to the matrix-corrected response, and their application is also restricted with regard to both the matrices and the analyte concentrations. In this context, the present studies address the problem of predicting measurement uncertainty for more variable sample compositions. The corresponding investigations focus on the use of the intensity of the Compton-scattered tube line as an internal standard to assess the effect of the individual sample matrix on the analytical response relative to a reference matrix. Based on this concept, the measurement uncertainty of an analyte present in an unknown specimen can be predicted in consideration of the data obtained under defined matrix conditions
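The Compton internal-standard concept can be sketched as a simple ratio correction in which the Compton line tracks overall matrix absorption (a hypothetical first-order model, not the paper's actual calibration):

```python
def compton_corrected(i_analyte, i_compton, i_compton_ref):
    """Normalize an analyte line intensity by the Compton-scattered tube line.

    The Compton peak intensity tracks overall matrix absorption, so scaling by
    the reference matrix's Compton intensity acts as a first-order matrix
    correction. This simple ratio model is a hypothetical illustration.
    """
    return i_analyte * (i_compton_ref / i_compton)

# hypothetical counts: same analyte content, but a heavier matrix absorbs more
light_matrix = compton_corrected(10000, 50000, 50000)  # reference-like sample
heavy_matrix = compton_corrected(8000, 40000, 50000)   # stronger absorption
```

After correction, both samples report the same effective intensity, which is the point of the internal standard.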

  12. Sampling and Analytical Method for Alpha-Dicarbonyl Flavoring Compounds via Derivatization with o-Phenylenediamine and Analysis Using GC-NPD

    Directory of Open Access Journals (Sweden)

    Stephanie M. Pendergrass

    2016-01-01

    A novel methodology is described for the sampling and analysis of diacetyl, 2,3-pentanedione, 2,3-hexanedione, and 2,3-heptanedione. These analytes were collected on o-phenylenediamine-treated silica gel tubes and quantitatively recovered as the corresponding quinoxaline derivatives. After derivatization, the sorbent was desorbed in 3 mL of ethanol and analyzed using gas chromatography with nitrogen-phosphorus detection (GC/NPD). The limits of detection (LOD) achieved for each analyte were in the range of 5-10 nanograms/sample. Evaluation of the on-tube derivatization procedure indicated that it is unaffected by humidities ranging from 20% to 80% and that the derivatization was quantitative for analyte amounts ranging from 0.1 μg to approximately 500 μg per sample. Storage stability studies indicated that the derivatives were stable for 30 days when stored at both ambient and refrigerated temperatures. Additional studies showed that the quinoxaline derivatives were quantitatively recovered when sampling up to a total volume of 72 L at a sampling rate of 50 cc/min. This method will be important for evaluating and monitoring worker exposures in the food and flavoring industry. Samples can be collected over an 8-hour shift with up to 288 L total volume collected regardless of time, sampling rate, and/or the effects of humidity.

  13. Development of an analytical procedure for plutonium in the concentration range of femtogram/gram and its application to environmental samples

    International Nuclear Information System (INIS)

    Schuettelkopf, H.

    1981-09-01

    To study the behaviour of plutonium in the environment and to measure plutonium in the vicinity of nuclear facilities, a quick, sensitive analytical method is required which can be applied to all sample materials found in the environment. For a sediment contaminated with plutonium, a boiling-out method using first HNO3/HF and subsequently HNO3/Al(NO3)3 proved successful. The leaching solution was then extracted with TOPO and the plutonium back-extracted with ascorbic acid/HCl. Several purification steps and finally electroplating from ammonium oxalate yielded an optimum sample for α-spectroscopic determination of plutonium. The resulting analytical method can be applied to all materials found in the environment. The sample size is 100 g but can also be much greater. The average chemical yield is between 70 and 80%. The detection limit is 0.1 fCi/g for soil samples and 0.5 fCi/g for plant samples. One technician can perform eight analyses per working day. The analytical procedure was applied to a large number of environmental samples, and the results of these analyses are reported. (orig./RB)

  14. Analytical results and sample locality map for rock, stream-sediment, and soil samples, Northern and Eastern Colorado Desert BLM Resource Area, Imperial, Riverside, and San Bernardino Counties, California

    Science.gov (United States)

    King, Harley D.; Chaffee, Maurice A.

    2000-01-01

    INTRODUCTION In 1996-1998 the U.S. Geological Survey (USGS) conducted a geochemical study of the Bureau of Land Management's (BLM) 5.5 million-acre Northern and Eastern Colorado Desert Resource Area (usually referred to as the NECD in this report), Imperial, Riverside, and San Bernardino Counties, southeastern California (figure 1). This study was done in support of the BLM's Coordinated Management Plan for the area. This report presents analytical data from this study. To provide comprehensive coverage of the NECD, we compiled and examined all available geochemical data, in digital form, from previous studies in the area, and made sample-site plots to aid in determining where sample-site coverage and analyses were sufficient, which samples should be re-analyzed, and where additional sampling was needed. Previous investigations conducted in parts of the current study area included the National Uranium Resource Evaluation (NURE) program studies of the Needles and Salton Sea 1° x 2° quadrangles; USGS studies of 12 BLM Wilderness Study Areas (WSAs) (Big Maria Mountains, Chemehuevi Mountains, Chuckwalla Mountains, Coxcomb Mountains, Mecca Hills, Orocopia Mountains, Palen-McCoy, Picacho Peak, Riverside Mountains, Sheephole Valley (also known as Sheep Hole/Cadiz), Turtle Mountains, and Whipple Mountains); and USGS studies in the Needles and El Centro 1° x 2° quadrangles done during the early 1990s as part of a project to identify the regional geochemistry of southern California. Areas where we did new sampling of rocks and stream sediments are mainly in the Chocolate Mountain Aerial Gunnery Range and in Joshua Tree National Park, which extends into the west-central part of the NECD, as shown in figure 1 and figure 2. This report contains analytical data for 132 rock samples and 1,245 stream-sediment samples collected by the USGS, and 362 stream-sediment samples and 189 soil samples collected during the NURE program.
All samples are from the Northern and Eastern Colorado

  15. Limitless Analytic Elements

    Science.gov (United States)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.

  16. Numerical and analytical solutions for problems relevant for quantum computers; Numerische und analytische Loesungen fuer Quanteninformatisch-relevante Probleme

    Energy Technology Data Exchange (ETDEWEB)

    Spoerl, Andreas

    2008-06-05

    Quantum computers are one of the next technological steps in modern computer science. Some of the relevant questions that arise when it comes to the implementation of quantum operations (as building blocks in a quantum algorithm) or the simulation of quantum systems are studied. Numerical results are gathered for a variety of systems, e.g. NMR systems, Josephson junctions and others. To study quantum operations (e.g. the quantum Fourier transform, swap operations or multiply-controlled NOT operations) on systems containing many qubits, a parallel C++ code was developed and optimised. In addition to performing high-quality operations, a closer look was given to the minimal times required to implement certain quantum operations. These times represent an interesting quantity for the experimenter as well as for the mathematician: the former tries to fight dissipative effects with fast implementations, while the latter draws conclusions in the form of analytical solutions. Dissipative effects can even be included in the optimisation; the resulting solutions are relaxation- and time-optimised. For systems containing 3 linearly coupled spin-1/2 qubits, analytical solutions are known for several problems, e.g. indirect Ising couplings and trilinear operations. A further study investigated whether there exists a sufficient set of criteria to identify systems with dynamics which are invertible under local operations. Finally, a full quantum algorithm to distinguish between two knots was implemented on a spin-1/2 system. All operations for this experiment were calculated analytically. The experimental results coincide with the theoretical expectations. (orig.)

  17. Solving probabilistic inverse problems rapidly with prior samples

    NARCIS (Netherlands)

    Käufl, Paul; Valentine, Andrew P.; de Wit, Ralph W.; Trampert, Jeannot

    2016-01-01

    Owing to the increasing availability of computational resources, in recent years the probabilistic solution of non-linear, geophysical inverse problems by means of sampling methods has become increasingly feasible. Nevertheless, we still face situations in which a Monte Carlo approach is not

  18. Final Report on the Analytical Results for Tank Farm Samples in Support of Salt Dissolution Evaluation

    International Nuclear Information System (INIS)

    Hobbs, D.T.

    1996-01-01

    Recent processing of dilute solutions through the 2H-Evaporator system caused dissolution of salt in Tank 38H, the concentrate receipt tank. This report documents the analytical results for samples taken from this evaporator system.

  19. Current status of JAERI program on development of ultra-trace-analytical technology for safeguards environmental samples

    International Nuclear Information System (INIS)

    Adachi, T.; Usuda, S.; Watanabe, K.

    2001-01-01

    Full text: In order to contribute to the strengthened safeguards system based on the Program 93+2 of the IAEA, the Japan Atomic Energy Research Institute (JAERI) is developing analytical technology for ultra-trace amounts of nuclear materials in environmental samples, and constructed the CLEAR facility (Clean Laboratory for Environmental Analysis and Research) for this purpose. The development of the technology is carried out, at existing laboratories for the time being, in the following fields: screening, bulk analysis and particle analysis. The screening aims at estimating the amounts of nuclear materials in environmental samples to be introduced into the clean rooms, and is the first step in avoiding cross-contamination among the samples and contamination of the clean rooms themselves. In addition to ordinary radiation spectrometry, a Compton-suppression technique was applied to low-energy γ- and X-ray measurements, and a sufficient reduction in background level has been demonstrated. Another technique under examination is the imaging-plate method, a kind of autoradiography suitable for determining the distribution of radioactive particles in the samples as well as for semiquantitative determination. As for the bulk analysis, efforts are currently concentrated on uranium in swipe samples. Preliminary examination for optimization of sample pre-treatment conditions is in progress; at present, ashing by the low-temperature-plasma method gives better results than high-temperature ashing or acid leaching. For the isotopic-ratio measurement, the instrumental performance of inductively coupled plasma mass spectrometry (ICP-MS) is mainly examined because sample preparation for ICP-MS is simpler than that for thermal ionization mass spectrometry (TIMS). Our measurements found that the swipe material (TexWipe TX304, usually used by the IAEA) contains a non-negligible uranium blank with large deviation (2-6 ng/sheet). This would introduce significant uncertainty in the trace analysis. 
JAERI

  20. The analytical solution to the problem on the temperature field in a structural element of rectangular profile for third kind boundary conditions

    International Nuclear Information System (INIS)

    Kulich, N.V.; Nemtsev, V.A.

    1986-01-01

    The analytical solution to the problem of the stationary temperature field in an infinite structural element of rectangular profile, characteristic of the conjugation points of a vessel and a tube sheet of a heat exchanger (or of a finned surface), under third-kind boundary conditions has been obtained by methods of the theory of functions of a complex variable. Using the obtained analytical dependences, calculations for this structural element were performed and compared with known data. The proposed analytical solution can be used effectively in calculations of temperature fields in finned surfaces and structural elements of power equipment of the considered profile, and the method can be applied to similar problems

  1. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure itself, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, about 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  2. Recommendations for sampling for prevention of hazards in civil defense. On analytics of chemical, biological and radioactive contaminations. Brief instruction for the CBRN (chemical, biological, radioactive, nuclear) sampling

    International Nuclear Information System (INIS)

    Bachmann, Udo; Biederbick, Walter; Derakshani, Nahid

    2010-01-01

    The recommendation for sampling for prevention of hazards in civil defense describes the analytics of chemical, biological and radioactive contaminations and includes detailed information on sampling, protocol preparation and documentation procedures. The volume includes a separate brief instruction for CBRN (chemical, biological, radioactive, nuclear) sampling.

  3. Testing Homogeneity in a Semiparametric Two-Sample Problem

    Directory of Open Access Journals (Sweden)

    Yukun Liu

    2012-01-01

    We study a two-sample homogeneity testing problem in which one sample comes from a population with density f(x) and the other is from a mixture population with mixture density (1−λ)f(x)+λg(x). This problem arises naturally in many statistical applications, such as tests for partial differential gene expression in microarray studies or genetic studies of gene mutation. Under the semiparametric assumption g(x) = f(x)e^(α+βx), a penalized empirical likelihood ratio test can be constructed, but its implementation is hindered by the fact that there is neither a feasible algorithm for computing the test statistic nor available research on its theoretical properties. To circumvent these difficulties, we propose an EM test based on the penalized empirical likelihood. We prove that the EM test has a simple chi-square limiting distribution, and we also demonstrate its competitive testing performance by simulations. A real-data example is used to illustrate the proposed methodology.
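The exponential-tilt assumption g(x) = f(x)e^(α+βx) can be checked numerically: when f is the standard normal density, choosing α = −β²/2 makes g integrate to 1 (g is then the N(β, 1) density). A small sketch:

```python
import math

def f(x):
    """Standard normal density, playing the role of the base sample's density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def g(x, beta):
    """Exponentially tilted density g(x) = f(x) * exp(alpha + beta*x).

    For standard normal f, alpha = -beta**2 / 2 is the normalizing constant
    that makes g a proper density (in fact the N(beta, 1) density).
    """
    alpha = -beta * beta / 2
    return f(x) * math.exp(alpha + beta * x)

# numerical check that g integrates to 1 for beta = 0.7 (Riemann sum on [-10, 10])
beta = 0.7
step = 0.001
total = sum(g(-10 + k * step, beta) * step for k in range(int(20 / step)))
```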

  4. Analytical Chemistry Laboratory Progress Report for FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Boparai, A.S.; Bowers, D.L. [and others]

    1994-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  5. Advantages of Analytical Transformations in Monte Carlo Methods for Radiation Transport

    International Nuclear Information System (INIS)

    McKinley, M S; Brooks III, E D; Daffin, F

    2004-01-01

    Monte Carlo methods for radiation transport typically attempt to solve an integral by directly sampling analog or weighted particles, which are treated as physical entities. Improvements to the methods involve better sampling, probability games, or physical intuition about the problem. We show that significant improvements can be achieved by recasting the equations with an analytical transform to solve for new, non-physical entities or fields. This paper examines one such transform, the difference formulation for thermal photon transport, and shows a significant advantage for Monte Carlo solution of the equations for time-dependent transport. Other related areas that may also realize significant benefits from similar analytical transformations are discussed.

  6. Optimized Analytical Method to Determine Gallic and Picric Acids in Pyrotechnic Samples by Using HPLC/UV (Reverse Phase)

    International Nuclear Information System (INIS)

    Garcia Alonso, S.; Perez Pastor, R. M.

    2013-01-01

    A study on the optimization and development of a chromatographic method for the determination of gallic and picric acids in pyrotechnic samples is presented. To achieve this, both the HPLC analytical conditions with diode detection and the extraction step for a selected sample were studied. (Author)

  7. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results, of which 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability plots and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results outside the expected range were identified, and it was suggested that laboratories check their calculations and procedures for these results.
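
The evaluation described above reduces to checking each reported result against a band of three precision units around the reference value. A minimal sketch of that bookkeeping (the function names and example numbers are invented for illustration, not taken from the QAP):

```python
# Illustrative sketch (not the QAP's actual code): flag reported results
# against 3-sigma control limits around a reference ("known") value.

def within_control_limits(reported, known, sigma):
    """Return True if a reported result falls inside known +/- 3*sigma."""
    return abs(reported - known) <= 3.0 * sigma

def fraction_in_control(results):
    """results: list of (reported, known, sigma) tuples."""
    hits = sum(within_control_limits(r, k, s) for r, k, s in results)
    return hits / len(results)

if __name__ == "__main__":
    results = [
        (10.2, 10.0, 0.1),   # within 3 sigma
        (11.0, 10.0, 0.1),   # outside: deviation is 10 sigma
        (9.75, 10.0, 0.1),   # within
        (5.0, 5.6, 0.2),     # exactly on the 3-sigma boundary
    ]
    print(fraction_in_control(results))  # 0.75
```

Applied to the 419 reported results, this kind of tally yields the 84% in-control figure quoted in the abstract.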

  8. Problem and Pathological Gambling in a Sample of Casino Patrons

    OpenAIRE

    Fong, Timothy W.; Campos, Michael D.; Brecht, Mary-Lynn; Davis, Alice; Marco, Adrienne; Pecanha, Viviane; Rosenthal, Richard J.

    2010-01-01

    Relatively few studies have examined gambling problems among individuals in a casino setting. The current study sought to examine the prevalence of gambling problems among a sample of casino patrons and examine alcohol and tobacco use, health status, and quality of life by gambling problem status. To these ends, 176 casino patrons were recruited by going to a Southern California casino and requesting that they complete an anonymous survey. Results indicated the following lifetime rates for at...

  9. On the problems of PPS sampling in multi-character surveys ...

    African Journals Online (AJOL)

    This paper, which is on the problems of PPS sampling in multi-character surveys, compares the efficiency of some estimators used in PPSWR sampling for multiple characteristics. From a superpopulation model, we computed the expected variances of the different estimators for each of the first two finite populations ...
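
The estimators compared in PPSWR (probability-proportional-to-size with replacement) sampling are typically variants of the classical Hansen-Hurwitz estimator, which averages y_i/p_i over the draws. A minimal sketch with an invented toy population (the abstract does not specify its estimators, so this is illustrative only):

```python
# Illustrative sketch: the Hansen-Hurwitz estimator of a population total
# under PPS sampling with replacement. Each draw selects unit i with
# probability p_i; the estimator averages y_i / p_i over the n draws.

def hansen_hurwitz(sample_ys, sample_ps):
    """Estimate the population total from one PPSWR sample."""
    n = len(sample_ys)
    return sum(y / p for y, p in zip(sample_ys, sample_ps)) / n

def exact_expectation(ys, ps):
    """E[y_i/p_i] over a single draw: sum_i p_i * (y_i/p_i) = true total."""
    return sum(p * (y / p) for y, p in zip(ys, ps))

ys = [12.0, 30.0, 8.0]   # toy population values (invented)
ps = [0.2, 0.5, 0.3]     # selection probabilities, proportional to size

print(exact_expectation(ys, ps))                 # 50.0, the true total
print(hansen_hurwitz([30.0, 8.0], [0.5, 0.3]))   # one two-draw sample's estimate
```

The exact-expectation check shows why the estimator is unbiased for any single characteristic; the multi-character problem arises because one set of p_i cannot be proportional to every characteristic at once.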

  10. Nuclear and nuclear related analytical methods applied in environmental research

    International Nuclear Information System (INIS)

    Popescu, Ion V.; Gheboianu, Anca; Bancuta, Iulian; Cimpoca, G. V; Stihi, Claudia; Radulescu, Cristiana; Oros Calin; Frontasyeva, Marina; Petre, Marian; Dulama, Ioana; Vlaicu, G.

    2010-01-01

    Nuclear analytical methods can be used in environmental research activities such as water quality assessment, pesticide residue analysis, global (transboundary) climate change, pollution, and remediation. Heavy metal pollution is a problem associated with areas of intensive industrial activity. In this work the moss biomonitoring technique was employed to study atmospheric deposition in Dambovita County, Romania. Complementary nuclear and atomic analytical methods were also used: Neutron Activation Analysis (NAA), Atomic Absorption Spectrometry (AAS), and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). These high-sensitivity analytical methods were used to determine the chemical composition of moss samples placed in areas with different industrial pollution sources. The concentrations of Cr, Fe, Mn, Ni and Zn were determined. The concentration of Fe in the same samples was determined using all of these methods, and very good agreement was obtained within statistical limits, which demonstrates that these analytical methods can be applied to a wide range of environmental samples with consistent results. (authors)

  11. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  12. Statistics and sampling in transuranic studies

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Gilbert, R.O.

    1980-01-01

    The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Objectives in sampling are regarded as being of great importance in making decisions as to sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories along with several problems that appear to be common to two or more such areas
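
The chapter's point about adequate sample size can be made concrete with the standard normal-approximation rule (an assumption here, not a formula quoted from the chapter): to estimate a mean to within relative error d at confidence level z, one needs roughly n = (z * CV / d)^2 samples, so a coefficient of variation of 100% drives n up sharply.

```python
import math

# Back-of-envelope sketch (normal-approximation sample-size rule; an
# assumption for illustration, not taken from the chapter itself).

def coefficient_of_variation(xs):
    """Sample CV: standard deviation divided by the mean."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)
    return math.sqrt(var) / mean

def required_n(cv, rel_error, z=1.96):
    """n ~ (z * CV / d)^2, rounded up."""
    return math.ceil((z * cv / rel_error) ** 2)

# With CV = 100% (typical of environmental transuranic data, per the chapter)
# and a target of +/-20% relative error at ~95% confidence:
print(required_n(1.0, 0.2))  # 97 samples
```

Halving the target relative error quadruples the required n, which is why the chapter stresses sampling efficiency rather than brute-force replication.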

  13. Analytic Hypoellipticity and the Treves Conjecture

    Directory of Open Access Journals (Sweden)

    Marco Mughetti

    2016-12-01

    We are concerned with the problem of analytic hypoellipticity; precisely, we focus on the real analytic regularity of the solutions of sums of squares with real analytic coefficients. The Treves conjecture states that an operator of this type is analytic hypoelliptic if and only if all the strata in the Poisson-Treves stratification are symplectic. We discuss a model operator, P (which first appeared and was studied in [3]), having a single symplectic stratum, and prove that it is not analytic hypoelliptic. This yields a counterexample to the sufficient part of the Treves conjecture; the necessary part remains an open problem.

  14. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  15. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    Science.gov (United States)

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.
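
The glucose-unit (GU) library search mentioned above rests on converting each glycan's retention time to GU by interpolating against a dextran-ladder calibration. A minimal linear-interpolation sketch; the ladder values and function name below are invented for illustration (real workflows often fit a higher-order curve):

```python
# Hypothetical sketch of glucose-unit (GU) assignment: a retention time is
# converted to GU by interpolating against a dextran-ladder calibration.
# The ladder values are invented for illustration only.

def to_glucose_units(rt, ladder):
    """ladder: sorted list of (retention_time, gu) pairs; linear interpolation."""
    for (t0, g0), (t1, g1) in zip(ladder, ladder[1:]):
        if t0 <= rt <= t1:
            return g0 + (g1 - g0) * (rt - t0) / (t1 - t0)
    raise ValueError("retention time outside calibration range")

ladder = [(5.0, 2.0), (8.0, 3.0), (12.0, 4.0), (17.0, 5.0)]  # invented values
print(to_glucose_units(10.0, ladder))  # 3.5
```

Because GU values are largely independent of day-to-day retention drift, a library of known structures indexed by GU can be searched automatically, which is what expedites the structural assignment step.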

  16. Computationally simple, analytic, closed form solution of the Coulomb self-interaction problem in Kohn Sham density functional theory

    International Nuclear Information System (INIS)

    Gonis, Antonios; Daene, Markus W.; Nicholson, Don M.; Stocks, George Malcolm

    2012-01-01

    We have developed and tested in terms of atomic calculations an exact, analytic and computationally simple procedure for determining the functional derivative of the exchange energy with respect to the density in the implementation of the Kohn-Sham formulation of density functional theory (KS-DFT), providing an analytic, closed-form solution of the self-interaction problem in KS-DFT. We demonstrate the efficacy of our method through ground-state calculations of the exchange potential and energy for atomic He and Be atoms, and comparisons with experiment and the results obtained within the optimized effective potential (OEP) method.

  17. An analytical discrete ordinates solution for a nodal model of a two-dimensional neutron transport problem

    International Nuclear Information System (INIS)

    Filho, J. F. P.; Barichello, L. B.

    2013-01-01

    In this work, an analytical discrete ordinates method is used to solve a nodal formulation of a neutron transport problem in x, y-geometry. The proposed approach leads to an important reduction in the order of the associated eigenvalue systems, when combined with the classical level symmetric quadrature scheme. Auxiliary equations are proposed, as usually required for nodal methods, to express the unknown fluxes at the boundary introduced as additional unknowns in the integrated equations. Numerical results, for the problem defined by a two-dimensional region with a spatially constant and isotropically emitting source, are presented and compared with those available in the literature. (authors)

  18. Analytical results of variance reduction characteristics of biased Monte Carlo for deep-penetration problems

    International Nuclear Information System (INIS)

    Murthy, K.P.N.; Indira, R.

    1986-01-01

    An analytical formulation is presented for calculating the mean and variance of transmission for a model deep-penetration problem. With this formulation, the variance reduction characteristics of two biased Monte Carlo schemes are studied. The first is the usual exponential biasing, for which it is shown that the optimal biasing parameter depends sensitively on the scattering properties of the shielding medium. The second is a scheme that couples exponential biasing to the recently proposed scattering-angle biasing. It is demonstrated that the coupled scheme performs better than exponential biasing alone.
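
For readers unfamiliar with exponential biasing, the idea can be shown on the simplest possible deep-penetration model: a purely absorbing slab, where stretching the sampled path lengths pushes more histories through the slab while a weight keeps the estimator unbiased. This sketch is illustrative only and is not the paper's formulation (which includes scattering):

```python
import math, random

# Illustrative sketch: estimate transmission through a purely absorbing slab
# of optical thickness tau, comparing analog sampling with exponentially
# biased path-length sampling (sampling rate sigma* < 1 stretches paths).

def transmit(tau, n, sigma_biased=None, seed=1):
    """Return (mean, sample variance) of the transmission estimator.

    Analog: sample s ~ Exp(1); score 1 if s > tau.
    Biased: sample s ~ Exp(sigma*); score the likelihood-ratio weight
            w(s) = exp(-s) / (sigma* * exp(-sigma* * s)) if s > tau.
    """
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        if sigma_biased is None:
            s = -math.log(rng.random())
            scores.append(1.0 if s > tau else 0.0)
        else:
            s = -math.log(rng.random()) / sigma_biased
            w = math.exp(-s) / (sigma_biased * math.exp(-sigma_biased * s))
            scores.append(w if s > tau else 0.0)
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)
    return mean, var

tau = 5.0                                          # exact answer: exp(-5)
analog = transmit(tau, 200_000)
biased = transmit(tau, 200_000, sigma_biased=0.5)  # stretch paths toward the exit
print(analog, biased)  # same mean within noise; biased variance is much smaller
```

Both estimators converge to exp(-tau); the biased one trades rare unit scores for frequent small weights, which is exactly the variance-versus-parameter trade-off the paper analyzes analytically.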

  19. Analytical Method to Estimate the Complex Permittivity of Oil Samples

    Directory of Open Access Journals (Sweden)

    Lijuan Su

    2018-03-01

    In this paper, an analytical method to estimate the complex dielectric constant of liquids is presented. The method is based on the measurement of the transmission coefficient of an embedded microstrip line loaded with a complementary split ring resonator (CSRR), which is etched in the ground plane. From this response, the dielectric constant and loss tangent of the liquid under test (LUT) can be extracted, provided that the CSRR is surrounded by the LUT and the liquid level extends beyond the region where the electromagnetic fields generated by the CSRR are present. For that purpose, a liquid container acting as a pool is added to the structure. The main advantage of this method, which is validated by measuring the complex dielectric constant of olive and castor oil, is that reference samples for calibration are not required.

  20. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

    A modern symbolic algebra computer program, MAPLE, is used to compute the well-known analytical discrete ordinates, or SN, solutions in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional slab-geometry SN methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution and serve as a complete tool capable of computing analytical SN solutions for mono-energetic, one-dimensional transport problems.

  1. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    Science.gov (United States)

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. The Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or a combination of methods, would yield more reliable soil asbestos data than the others. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods, or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding a preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.
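
The variability comparisons above rest on the relative standard deviation (RSD), i.e. the standard deviation expressed as a percentage of the mean. A small helper with invented replicate numbers (not the study's data) shows the metric:

```python
import math

# Illustrative RSD helper; the replicate values below are invented and are
# not results from the Sumas Mountain study.

def rsd_percent(xs):
    """Relative standard deviation: 100 * sample std dev / mean."""
    mean = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (len(xs) - 1))
    return 100.0 * sd / mean

fbas_tem = [0.12, 0.31, 0.08, 0.22]   # invented replicates, % asbestos
plm = [0.18, 0.20, 0.17, 0.19]        # invented replicates, % asbestos

print(rsd_percent(fbas_tem))  # high RSD: the more variable method
print(rsd_percent(plm))       # low RSD
```

A high RSD for FBAS/TEM, as reported in the abstract, means better detection sensitivity came at the cost of replicate-to-replicate scatter.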

  2. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites in the world where the environment is still affected by contamination from past uranium production. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of safety assessment procedures and the priority-action analyses needed for remediation planning. During recent decades, the analytical laboratories of many of the enterprises now responsible for establishing site-specific environmental monitoring programs have significantly improved their technical sampling and analytical capacities. However, a lack of experience in optimal site-specific sampling strategy planning, together with insufficient experience with the required analytical techniques (such as modern alpha-beta radiometers, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment), prevents these laboratories from efficiently developing and conducting the monitoring programs that form the basis for further safety assessment in decision-making procedures. This paper presents conclusions drawn from experience establishing monitoring programs in Ukraine and proposes some practical steps for optimizing sampling strategy planning and the analytical procedures to be applied at areas requiring safety assessment and justification of their potential remediation and safe management. (authors)

  3. Sample handling in surface sensitive chemical and biological sensing: a practical review of basic fluidics and analyte transport.

    Science.gov (United States)

    Orgovan, Norbert; Patko, Daniel; Hos, Csaba; Kurunczi, Sándor; Szabó, Bálint; Ramsden, Jeremy J; Horvath, Robert

    2014-09-01

    This paper gives an overview of the advantages and associated caveats of the most common sample handling methods in surface-sensitive chemical and biological sensing. We summarize the basic theoretical and practical considerations one faces when designing and assembling the fluidic part of the sensor devices. The influence of analyte size, the use of closed and flow-through cuvettes, the importance of flow rate, tubing length and diameter, bubble traps, pressure-driven pumping, cuvette dead volumes, and sample injection systems are all discussed. Typical application areas of particular arrangements are also highlighted, such as the monitoring of cellular adhesion, biomolecule adsorption-desorption and ligand-receptor affinity binding. Our work is a practical review in the sense that for every sample handling arrangement considered we present our own experimental data and critically review our experience with the given arrangement. In the experimental part we focus on sample handling in optical waveguide lightmode spectroscopy (OWLS) measurements, but the present study is equally applicable for other biosensing technologies in which an analyte in solution is captured at a surface and its presence is monitored. Explicit attention is given to features that are expected to play an increasingly decisive role in determining the reliability of (bio)chemical sensing measurements, such as analyte transport to the sensor surface; the distorting influence of dead volumes in the fluidic system; and the appropriate sample handling of cell suspensions (e.g. their quasi-simultaneous deposition). At the appropriate places, biological aspects closely related to fluidics (e.g. cellular mechanotransduction, competitive adsorption, blood flow in veins) are also discussed, particularly with regard to their models used in biosensing. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Sources of pre-analytical variations in yield of DNA extracted from blood samples: analysis of 50,000 DNA samples in EPIC.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

    Full Text Available The European Prospective Investigation into Cancer and nutrition (EPIC is a long-term, multi-centric prospective study in Europe investigating the relationships between cancer and nutrition. This study has served as a basis for a number of Genome-Wide Association Studies (GWAS and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples have been extracted using an automated DNA extraction platform. Here we have evaluated the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as center of subject recruitment, age, gender, body-mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were observed with degree of haemolysis and with center of subject recruitment. Age, gender, body-mass index, cancer case or control status and tobacco consumption also significantly impacted DNA yield. Feedback from laboratories which have analyzed DNA with different SNP genotyping technologies demonstrate that the vast majority of samples (approximately 88% performed adequately in different types of assays. To our knowledge this study is the largest to date to evaluate the sources of pre-analytical variations in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.

  5. Problem and pathological gambling in a sample of casino patrons.

    Science.gov (United States)

    Fong, Timothy W; Campos, Michael D; Brecht, Mary-Lynn; Davis, Alice; Marco, Adrienne; Pecanha, Viviane; Rosenthal, Richard J

    2011-03-01

    Relatively few studies have examined gambling problems among individuals in a casino setting. The current study sought to examine the prevalence of gambling problems among a sample of casino patrons and examine alcohol and tobacco use, health status, and quality of life by gambling problem status. To these ends, 176 casino patrons were recruited by going to a Southern California casino and requesting that they complete an anonymous survey. Results indicated the following lifetime rates for at-risk, problem, and pathological gambling: 29.2, 10.7, and 29.8%. Differences were found with regards to gambling behavior, and results indicated higher rates of smoking among individuals with gambling problems, but not higher rates of alcohol use. Self-rated quality of life was lower among pathological gamblers relative to non-problem gamblers, but did not differ from at-risk or problem gamblers. Although subject to some limitations, our data support the notion of higher frequency of gambling problems among casino patrons and may suggest the need for increased interventions for gambling problems on-site at casinos.

  6. Multi-frequency direct sampling method in inverse scattering problem

    Science.gov (United States)

    Kang, Sangwoo; Lambert, Marc; Park, Won-Kwang

    2017-10-01

    We consider the direct sampling method (DSM) for the two-dimensional inverse scattering problem. Although DSM is fast, stable, and effective, some phenomena remain unexplained by the existing results. We show that the imaging function of the direct sampling method can be expressed by a Bessel function of order zero. We also clarify the previously unexplained imaging phenomena and suggest multi-frequency DSM to overcome the limitations of traditional DSM. Our method is evaluated in simulation studies using both single and multiple frequencies.
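
The key structural fact above is that the single-frequency imaging function behaves like the Bessel function J0, whose oscillatory sidelobes produce the artifacts that multi-frequency averaging suppresses. For reference, J0 has the power series sum over m of (-1)^m (x/2)^(2m) / (m!)^2, evaluated here directly (illustration only, not the paper's code):

```python
import math

# Series evaluation of the Bessel function of order zero, J0(x), whose
# oscillatory decay governs the DSM imaging function's sidelobes.

def bessel_j0(x, terms=40):
    """J0(x) = sum_m (-1)^m (x/2)^(2m) / (m!)^2; accurate for moderate x."""
    total = 0.0
    for m in range(terms):
        total += (-1) ** m * (x / 2.0) ** (2 * m) / math.factorial(m) ** 2
    return total

print(bessel_j0(0.0))                # 1.0: peak at the scatterer location
print(bessel_j0(2.404825557695773))  # ~0.0: the first zero of J0
```

The main lobe at x = 0 is what localizes a scatterer; summing the indicator over several frequencies shifts the zeros and sidelobes so they average out, which is the rationale for the multi-frequency DSM.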

  7. Direct trace-elemental analysis of urine samples by laser ablation-inductively coupled plasma mass spectrometry after sample deposition on clinical filter papers.

    Science.gov (United States)

    Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín

    2012-10-16

    Collection of biological fluids on clinical filter papers offers important logistic advantages, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace-elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is the goal, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making it difficult to obtain reliable quantitative results. In this paper, a novel approach for urine collection is proposed that circumvents many of these problems. This methodology consists of the use of precut filter paper discs on which large amounts of sample can be retained in a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and it allows addition of an adequate internal standard at the clinical lab prior to analysis, making it suitable for a strategy based on unsupervised sample collection and subsequent analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit for purpose for clinical control analysis.

  8. The two-sample problem with induced dependent censorship.

    Science.gov (United States)

    Huang, Y

    1999-12-01

    Induced dependent censorship is a general phenomenon in health service evaluation studies in which a measure such as quality-adjusted survival time or lifetime medical cost is of interest. We investigate the two-sample problem and propose two classes of nonparametric tests. Based on consistent estimation of the survival function for each sample, the two classes of test statistics examine the cumulative weighted difference in hazard functions and in survival functions. We derive a unified asymptotic null distribution theory and inference procedure. The tests are applied to trial V of the International Breast Cancer Study Group and show that long duration chemotherapy significantly improves time without symptoms of disease and toxicity of treatment as compared with the short duration treatment. Simulation studies demonstrate that the proposed tests, with a wide range of weight choices, perform well under moderate sample sizes.
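
The building blocks of the proposed tests are consistent survival-function estimates for each sample and a cumulative weighted difference between them. As a plain illustration (invented toy data; this is the standard Kaplan-Meier estimator, not the paper's induced-dependent-censorship estimator, which must handle the dependence differently):

```python
# Illustrative sketch: a Kaplan-Meier survival curve per sample and a simple
# unit-weight cumulative difference in survival at shared event times, the
# kind of quantity the second class of tests examines.

def kaplan_meier(times, events):
    """times: sorted observation times; events: 1 = event, 0 = censored.
    Returns a list of (time, survival) pairs after each event."""
    s, out = 1.0, []
    at_risk = len(times)
    for t, d in zip(times, events):
        if d:
            s *= (at_risk - 1) / at_risk
            out.append((t, s))
        at_risk -= 1
    return out

def cumulative_difference(km1, km2):
    """Unit-weight cumulative difference in survival at shared event times."""
    s2 = dict(km2)
    return sum(s1 - s2[t] for t, s1 in km1 if t in s2)

km_a = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])  # one censored observation
km_b = kaplan_meier([1, 2, 4], [1, 1, 1])
print(km_a, km_b)
print(cumulative_difference(km_a, km_b))  # ~0.25
```

A test statistic standardizes such a cumulative difference by its estimated variance; the paper's contribution is a unified asymptotic null distribution for both the hazard-based and survival-based versions under induced dependent censorship.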

  9. Direct sampling methods for inverse elastic scattering problems

    Science.gov (United States)

    Ji, Xia; Liu, Xiaodong; Xi, Yingxia

    2018-03-01

    We consider the inverse elastic scattering of incident plane compressional and shear waves from the knowledge of the far field patterns. Specifically, three direct sampling methods for location and shape reconstruction are proposed using the different components of the far field patterns. Only inner products are involved in the computation, so the novel sampling methods are very simple and fast to implement. With the help of the factorization of the far field operator, we give a lower bound of the proposed indicator functionals for sampling points inside the scatterers. For sampling points outside the scatterers, we show that the indicator functionals decay like the Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functionals depend continuously on the far field patterns, which further implies that the novel sampling methods are extremely stable with respect to data error. For the case when the observation directions are restricted to a limited aperture, we first introduce some data retrieval techniques to obtain those data that cannot be measured directly and then use the proposed direct sampling methods for location and shape reconstruction. Finally, some numerical simulations in two dimensions are conducted with noisy data, and the results further verify the effectiveness and robustness of the proposed sampling methods, even for multiple multiscale cases and limited-aperture problems.

  10. Semi-blind identification of wideband MIMO channels via stochastic sampling

    OpenAIRE

    Andrieu, Christophe; Piechocki, Robert J.; McGeehan, Joe P.; Armour, Simon M.

    2003-01-01

    In this paper we address the problem of wide-band multiple-input multiple-output (MIMO) channel (multidimensional time-invariant FIR filter) identification using Markov chain Monte Carlo methods. Towards this end we develop a novel stochastic sampling technique that produces a sequence of multidimensional channel samples. The method is semi-blind in the sense that it uses a very short training sequence. In such a framework the problem is no longer analytically tractable; hence we resort to s...

  11. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample each, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined on eight runs, where plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis, which subsequently was used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14%, and 9-13% for PPP reagent. This variation can be reduced slightly by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG has a rather high inherent analytical variation, but one considerably lower than the between-subject variation when using PPPlow as reagent.
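    The two quantities studied here, analytical variation and internal-standard normalization, reduce to simple arithmetic. The sketch below uses entirely hypothetical triplicate ETP readings (the run labels and numbers are illustrative, not the study's data) to show how dividing by a per-run internal standard damps between-run variation:

```python
import statistics

# hypothetical triplicate ETP readings (nM*min) for one plasma sample
# on two runs, plus each run's internal-standard ETP (illustrative data)
plates = {
    "run1": {"sample": [1510.0, 1485.0, 1530.0], "internal_std": 1600.0},
    "run2": {"sample": [1380.0, 1350.0, 1405.0], "internal_std": 1450.0},
}

def cv_percent(values):
    """Coefficient of variation in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

raw_means = [statistics.mean(p["sample"]) for p in plates.values()]

# normalization: divide each run's sample mean by that run's internal
# standard, so a run-wide drift cancels in the ratio
normalized = [statistics.mean(p["sample"]) / p["internal_std"]
              for p in plates.values()]

cv_raw = cv_percent(raw_means)    # between-run CV, raw
cv_norm = cv_percent(normalized)  # between-run CV, normalized
```

In this toy case both runs drift in the same direction as their internal standard, so the normalized CV comes out far smaller than the raw one; real plasma data cancel less cleanly, matching the study's finding that normalization helps only modestly.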

  12. The Efficacy of Problem-based Learning in an Analytical Laboratory Course for Pre-service Chemistry Teachers

    Science.gov (United States)

    Yoon, Heojeong; Woo, Ae Ja; Treagust, David; Chandrasegaran, AL

    2014-01-01

    The efficacy of problem-based learning (PBL) in an analytical chemistry laboratory course was studied using a programme that was designed and implemented with 20 students in a treatment group over 10 weeks. Data from 26 students in a traditional analytical chemistry laboratory course were used for comparison. Differences in the creative thinking ability of students in both the treatment and control groups were evaluated before and at the end of the implementation of the programme, using the Torrance Tests of Creative Thinking. In addition, changes in students' self-regulated learning skills using the Self-Regulated Learning Interview Schedule (SRLIS) and their self-evaluation proficiency were evaluated. Analysis of covariance showed that the creative thinking ability of the treatment group had improved statistically significantly after the PBL course (p effect on creative thinking ability. The SRLIS test showed that students in the treatment group used self-regulated learning strategies more frequently than students in the comparison group. According to the results of the self-evaluation, students became more positive and confident in problem-solving and group work as the semester progressed. Overall, PBL was shown to be an effective pedagogical instructional strategy for enhancing chemistry students' creative thinking ability, self-regulated learning skills and self-evaluation.

  13. The application of isogeometric analysis to the neutron diffusion equation for a pincell problem with an analytic benchmark

    International Nuclear Information System (INIS)

    Hall, S.K.; Eaton, M.D.; Williams, M.M.R.

    2012-01-01

    Highlights: ► Isogeometric analysis used to obtain solutions to the neutron diffusion equation. ► Exact geometry captured for a circular fuel pin within a square moderator. ► Comparisons are made between the finite element method and isogeometric analysis. ► Error and observed order of convergence found using an analytic solution. -- Abstract: In this paper the neutron diffusion equation is solved using Isogeometric Analysis (IGA), which is an attempt to generalise Finite Element Analysis (FEA) to include exact geometries. In contrast to FEA, the basis functions are rational functions instead of polynomials. These rational functions, called non-uniform rational B-splines, are used to capture both the geometry and approximate the solution. The method of manufactured solutions is used to verify a MatLab implementation of IGA, which is then applied to a pincell problem. This is a circular uranium fuel pin within a square block of graphite moderator. A new method is used to compute an analytic solution to a simplified version of this problem, and is then used to observe the order of convergence of the numerical scheme. Comparisons are made against quadratic finite elements for the pincell problem, and it is found that the disadvantage factor computed using IGA is less accurate. This is due to a cancellation of errors in the FEA solution. A modified pincell problem with vacuum boundary conditions is then considered. IGA is shown to outperform FEA in this situation.
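    The claim that IGA captures the circular fuel pin exactly rests on a property of rational bases that is easy to verify numerically: a quadratic rational Bezier segment (the building block of a NURBS) with weights 1, sqrt(2)/2, 1 traces an exact quarter circle, which no polynomial basis can do. A small self-contained check (not taken from the paper's MatLab implementation):

```python
import math

# control points and weights of a quarter of the unit circle
P = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
w = [1.0, math.sqrt(2.0) / 2.0, 1.0]

def quarter_circle(t):
    """Quadratic rational Bezier: sum(B_i w_i P_i) / sum(B_i w_i)."""
    b = [(1 - t) ** 2, 2 * t * (1 - t), t ** 2]  # Bernstein basis
    den = sum(bi * wi for bi, wi in zip(b, w))
    x = sum(bi * wi * p[0] for bi, wi, p in zip(b, w, P)) / den
    y = sum(bi * wi * p[1] for bi, wi, p in zip(b, w, P)) / den
    return x, y

# every sampled point lies on the unit circle to machine precision
radii = [math.hypot(*quarter_circle(i / 50.0)) for i in range(51)]
max_err = max(abs(r - 1.0) for r in radii)
```

A standard finite element mesh instead approximates the pin boundary by polynomial arcs, and that geometric error feeds into quantities such as the disadvantage factor.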

  14. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even under-performs correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response change between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.
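    The midpoint correction mentioned above has a simple calculus interpretation: a first-order differential-operator estimate evaluates the response derivative at the unperturbed point, whereas evaluating it halfway to the perturbed point recovers second-order accuracy. A toy illustration with an invented quadratic response (standing in for a tally as a function of nuclide density, not an actual MCNP calculation):

```python
# toy response R(p) = p**2 standing in for tally-vs-density
def R(p):
    return p * p

def dR(p):  # analytic first derivative of the toy response
    return 2.0 * p

p0, dp = 1.0, 0.4           # invented base value and perturbation
exact = R(p0 + dp) - R(p0)  # true response change

first_order = dR(p0) * dp          # plain first-order estimate
midpoint = dR(p0 + dp / 2.0) * dp  # midpoint-corrected estimate
```

For a quadratic response the midpoint estimate is exact while the plain first-order one underpredicts, mirroring the accuracy improvement reported for the MCNP perturbation calculations.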

  15. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation of the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. The concern about precision and accuracy in such analyses, which have socio-economic bearing, has emphasized the use of certified reference materials and of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash, which poses problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic and lead may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources, using a combination of nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICP-AES, cold-vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb and Hg. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  16. On analytical fits for electron impact ionisation cross sections

    International Nuclear Information System (INIS)

    Godunov, A.L.; Ivanov, P.B.

    1999-01-01

    The problem of providing accurate recommended analytical fits for electron impact ionisation cross sections is discussed, and a number of approaches are considered on the sample case of neon and its ions. The previously known fits are reassessed using complete experimental and theoretical data, with preference given to experiment in order to avoid systematic shifts introduced by present calculation methods. The feasibility of the standard BELI formula is investigated in detail, and a number of other analytical expressions are suggested, approximating single-ionization cross sections over the whole energy range. The factors influencing the accuracy of the fits and the physical meaning of the parameters obtained are discussed. (orig.)

  17. On the Use of Importance Sampling in Particle Transport Problems

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, B

    1965-06-15

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm's type. In particular, Boltzmann's neutron transport equation is considered. For the solution of the latter equation, an importance sampling technique is derived from some simple transformations of the original transport equation into a similar equation. Examples of transformations are given which have been used with great success in practice.
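    The transformation idea can be seen in miniature on a one-dimensional integral with an exponential kernel of the kind that appears in transport problems. In the sketch below (an illustrative example, not Eriksson's transformation), drawing points from a density proportional to the integrand makes the Monte Carlo weight constant, so the variance collapses:

```python
import math
import random

random.seed(0)

# target: I = integral_0^1 e^(-5x) dx = (1 - e^-5)/5
I_exact = (1.0 - math.exp(-5.0)) / 5.0
N = 20000

# plain sampling: x ~ Uniform(0,1), estimator f(x)
plain = [math.exp(-5.0 * random.random()) for _ in range(N)]
est_plain = sum(plain) / N

# importance sampling: draw x from p(x) = 5 e^(-5x) / (1 - e^-5)
# by inverse-CDF, then weight each draw by f(x) / p(x)
d = 1.0 - math.exp(-5.0)
weighted = []
for _ in range(N):
    x = -math.log(1.0 - random.random() * d) / 5.0
    p = 5.0 * math.exp(-5.0 * x) / d
    weighted.append(math.exp(-5.0 * x) / p)  # constant weight = I_exact
est_is = sum(weighted) / N
```

The plain estimator carries statistical noise of order 1/sqrt(N), while the importance-sampled estimator is exact up to round-off here because the proposal matches the integrand perfectly; in transport calculations the match is only approximate, but the same mechanism drives the variance reduction.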

  18. On the Use of Importance Sampling in Particle Transport Problems

    International Nuclear Information System (INIS)

    Eriksson, B.

    1965-06-01

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm's type. In particular, Boltzmann's neutron transport equation is considered. For the solution of the latter equation, an importance sampling technique is derived from some simple transformations of the original transport equation into a similar equation. Examples of transformations are given which have been used with great success in practice.

  19. Analytical solution to the circularity problem in the discounted cash flow valuation framework

    Directory of Open Access Journals (Sweden)

    Felipe Mejía-Peláez

    2011-12-01

    Full Text Available In this paper we propose an analytical solution to the circularity problem between value and cost of capital. Our solution is derived starting from a central principle of finance that relates value today to value, cash flow, and the discount rate for the next period. We present a general formulation without circularity for the equity value (E), cost of levered equity (Ke), levered firm value (V), and the weighted average cost of capital (WACC). We furthermore compare the results obtained from these formulas with the results of the Adjusted Present Value approach (no circularity) and with the iterative solution of the circularity based upon the iteration feature of a spreadsheet, concluding that all methods yield exactly the same answer. The advantage of this solution is that it avoids problems such as the use of manual methods (i.e., the popular “Rolling WACC”) that ignore the circularity issue, setting a target leverage (usually constant) with the inconsistencies that result from it, the wrong use of book values, or attributing the discrepancies in values to rounding errors.
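    The circularity and its analytic escape can be shown in a few lines for the simplest case, a no-growth perpetuity with the tax shield discounted at the unlevered cost of capital (a Harris-Pringle-style assumption chosen for brevity; the paper's general formulation is broader, and all numbers below are invented):

```python
# invented inputs: free cash flow, debt, unlevered cost of capital,
# cost of debt, tax rate
FCF, D, Ku, Kd, T = 100.0, 300.0, 0.12, 0.06, 0.30

# analytic (circularity-free) value, identical to APV:
# V = Vu + VTS = FCF/Ku + T*Kd*D/Ku
V_analytic = (FCF + T * Kd * D) / Ku

# spreadsheet-style iteration of the circular system
# V = FCF / WACC(V),  with  WACC(V) = Ku - T*Kd*D/V
V = FCF / Ku  # start from the unlevered value
for _ in range(100):
    V = FCF / (Ku - T * Kd * D / V)
```

Both routes converge to the same firm value, which is the point: the iteration becomes unnecessary once a closed form is available.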

  20. Analytical results for 544 water samples collected in the Attean Quartz Monzonite in the vicinity of Jackman, Maine

    Science.gov (United States)

    Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.

    1983-01-01

    Water samples were collected in the vicinity of Jackman, Maine as a part of the study of the relationship of dissolved constituents in water to the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report as well as a table of the analytical results.

  1. ON SAMPLING BASED METHODS FOR THE DUBINS TRAVELING SALESMAN PROBLEM WITH NEIGHBORHOODS

    Directory of Open Access Journals (Sweden)

    Petr Váňa

    2015-12-01

    Full Text Available In this paper, we address the problem of path planning to visit a set of regions with a Dubins vehicle, which is also known as the Dubins Traveling Salesman Problem with Neighborhoods (DTSPN). We propose a modification of the existing sampling-based approach that determines an increasing number of samples per goal region and thus improves the solution quality if more computational time is available. The proposed modification of the sampling-based algorithm has been compared with existing approaches for the DTSPN, and results on the quality of the found solutions and the required computational time are presented in the paper.

  2. Problems of Aero-optics and Adaptive Optical Systems: Analytical Review

    Directory of Open Access Journals (Sweden)

    Yu. I. Shanin

    2017-01-01

    Full Text Available The analytical review gives the basic concepts of the aero-optics problem arising from radiation propagation in the region of the boundary layers of a laser installation's carrier aircraft. It estimates the radiation wave-front distortions during propagation in the near and far fields. It presents the main calculation approaches and methods used to solve the gas-dynamic and optical problems in propagating laser radiation, and conducts a detailed analysis of the flows, and of the optical aberrations they generate, introduced by the aircraft turret (a projection platform of the on-board laser). It considers the effect of various factors (shock wave, difference in wall and flow temperatures) on the flow pattern and the optical aberrations. It provides research data on aero-optics obtained in a flying laboratory directly in flight, briefly considers the experimental research methods, diagnostic equipment, and synthesis of results in studying the aero-optics problem, and discusses some methods for mitigating the aerodynamic effects on light propagation under flight conditions. Data are presented on passive, active, and hybrid means of acting on the flow in the boundary layers in order to reduce aberrations through improving the flow aerodynamics. The paper considers operation of adaptive optical systems under conditions of aero-optical distortions. It presents study results concerning the reduction of the aero-optics effect on the characteristics of radiation in the far field, and gives some research results regarding the effect of laser-beam jitter and of a time delay in the feedback signal transmission, which occur under application conditions, on the efficiency of the adaptive system. It provides data on adaptive correction of aero-optical wave fronts of radiation, and considers some application aspects of adaptive filtration in control systems of on-board adaptive optics as a way to improve the efficiency of adaptive optical systems. The project in mind is to use the obtained results

  3. Elusive Learning—Using Learning Analytics to Support Reflective Sensemaking of Ill-Structured Ethical Problems: A Learner-Managed Dashboard Solution

    Directory of Open Access Journals (Sweden)

    Yianna Vovides

    2016-06-01

    Full Text Available Since the turn of the 21st century, we have seen a surge of studies on the state of U.S. education addressing issues such as cost, graduation rates, retention, achievement, engagement, and curricular outcomes. There is an expectation that graduates should be able to enter the workplace equipped to take on complex and “messy” or ill-structured problems as part of their professional and everyday life. In the context of online learning, we have identified two key issues that are elusive (hard to capture and make visible): learning with ill-structured problems and the interaction of social and individual learning. We believe that the intersection between learning and analytics has the potential, in the long term, to minimize the elusiveness of deep learning. A proposed analytics model is described in this article that is meant to capture and also support further development of a learner’s reflective sensemaking.

  4. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, single shell tank (PST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware

  5. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use; dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  6. Analytical quality assurance in laboratories using tracers for biological and environmental studies

    International Nuclear Information System (INIS)

    Melaj, Mariana; Martin, Olga; Lopez, Silvia; Rojas de Tramontini, Susana

    1999-01-01

    This work describes the way we are organizing a quality assurance system to apply to analytical measurements of the 14N/15N ratio in biological and soil material. The 14N/15N ratio is measured with an optical emission spectrometer (NOI6PC), which distinguishes the differences in wavelength of the electromagnetic radiation emitted by N-28, N-29 and N-30. The major problem is 'cross contamination' of samples with different enrichments. The elements being considered to reach satisfactory analytical results are: 1) a proper working area; 2) homogeneous samples that represent the whole sampled system; 3) the use of reference materials: in each digestion, a known reference sample must be added; 4) adequate equipment operation; 5) standard operating procedures; 6) control charts and laboratory and equipment books: every operation using the equipment is registered in a book; 7) training of the operators. (author)

  7. The electron transport problem sampling by Monte Carlo individual collision technique

    International Nuclear Information System (INIS)

    Androsenko, P.A.; Belousov, V.I.

    2005-01-01

    The problem of electron transport is of great interest in many fields of modern science, and Monte Carlo sampling is used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which gives the researcher incontestable advantages over the condensed-history technique. For example, one does not need to supply the parameters required by the condensed-history technique, such as an upper limit for the electron energy, the resolution, the number of sub-steps, etc. The condensed-history technique may also lose some very important electron tracks because of the step parameters that limit particle movement and because of weaknesses in its algorithms, for example the energy-indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files were not pre-processed but taken directly from the electron information source. Four kinds of interaction were considered: elastic scattering, bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented in comparison with analogues; for example, the endovascular radiotherapy problem (P2) of QUADOS 2002 is presented in comparison with other techniques that are usually used. (authors)
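    The core sampling step of an individual-collision scheme, choosing which of the four interaction channels occurs next, is ordinary discrete sampling in proportion to the channel cross sections. The sketch below uses invented cross-section values (they are not ENDF-6 data) purely to show the mechanism:

```python
import random

random.seed(1)

# invented per-channel cross sections at one energy (arbitrary units);
# the four channels named above, NOT real ENDF-6 values
xs = {"elastic": 60.0, "bremsstrahlung": 2.0,
      "excitation": 25.0, "ionization": 13.0}

def sample_interaction():
    """Pick the next individual collision with probability
    sigma_i / sigma_total (discrete inverse-CDF sampling)."""
    r = random.random() * sum(xs.values())
    for name, sigma in xs.items():
        r -= sigma
        if r < 0.0:
            return name
    return name  # guard against floating-point round-off

# over many collisions, channel frequencies approach sigma_i/sigma_total
N = 100_000
counts = {name: 0 for name in xs}
for _ in range(N):
    counts[sample_interaction()] += 1
```

In a full code this draw is followed by sampling the energy loss and deflection from the chosen channel's differential data, one collision at a time, which is precisely the work the condensed-history approach replaces with aggregated steps.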

  8. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    Directory of Open Access Journals (Sweden)

    Richard J. Venedam

    2005-02-01

    Full Text Available The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  9. Privacy problems in the small sample selection

    Directory of Open Access Journals (Sweden)

    Loredana Cerbara

    2013-05-01

    Full Text Available The side of social research that uses small samples for the production of micro data today faces some operating difficulties due to the privacy law. The privacy code is an important and necessary law, because it guarantees Italian citizens' rights, as already happens in other countries of the world. However, it does not seem appropriate to limit once more the data-production possibilities of the national research centres: those possibilities are already compromised by insufficient funds, a problem becoming more and more frequent in the research field. It would therefore be necessary to include in the law the possibility of using telephone lists to select samples for activities directly of interest and importance to the citizen, such as data collection carried out on the basis of opinion polls by the research centres of the Italian CNR and some universities.

  10. Possibilities for decreasing detection limits of analytical methods for determination of transformation products of unsymmetrical dimethylhydrazine in environmental samples

    Directory of Open Access Journals (Sweden)

    Bulat Kenessov

    2015-12-01

    Full Text Available Most rockets of middle and heavy class launched from Kazakhstan, Russia, China and other countries still use highly toxic unsymmetrical dimethylhydrazine (UDMH) as a liquid propellant. Study of the migration, distribution and accumulation of UDMH transformation products in the environment, and human health impact assessment of space rocket activity, are currently complicated by the absence of analytical methods allowing detection of trace concentrations of these compounds in analyzed samples. This paper reviews methods and approaches which can be applied to the development of such methods. Detection limits at a part-per-trillion (ppt) level may be achieved using the most selective and sensitive methods, based on gas or liquid chromatography in combination with tandem or high-resolution mass spectrometry. In addition, 1000-fold concentration of samples or integrated sample preparation methods, e.g., dynamic headspace extraction, are required. Special attention during development and application of such methods must be paid to the purity of laboratory air, reagents, glassware and analytical instruments.

  11. A direct sampling method to an inverse medium scattering problem

    KAUST Repository

    Ito, Kazufumi; Jin, Bangti; Zou, Jun

    2012-01-01

    In this work we present a novel sampling method for time harmonic inverse medium scattering problems. It provides a simple tool to directly estimate the shape of the unknown scatterers (inhomogeneous media), and it is applicable even when

  12. Nanomaterials in consumer products: a challenging analytical problem

    Directory of Open Access Journals (Sweden)

    Catia eContado

    2015-08-01

    Full Text Available Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To evaluate correctly the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to get quantitative data about the NPs in such complex matrices.

  13. Nanomaterials in consumer products: a challenging analytical problem.

    Science.gov (United States)

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To evaluate correctly the benefits vs. the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics, possibly achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to get quantitative data about the NPs in such complex matrices.

  14. Analytic simulation of the Poincare surface of sections for the diamagnetic Kepler problem

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, H; Harada, A; Okazaki, Y [Kyoto Univ. (Japan). Dept. of Physics

    1984-11-11

    The Poincare surface-of-section analysis which the authors previously reported on the diamagnetic Kepler problem (classical hydrogen atom in a uniform magnetic field) in a transition region from regular to chaotic motions is simulated by an analytic means, by taking intersections of the energy integral and the approximate integral Λ of Solovev to obtain sections of the two separate regions of the motion that exist in the limit of a weak magnetic field (B → 0). The origin of the unique hyperbolic point and the separatrix around which the onset of chaos takes place are thus identified. The invariant tori arising near the full chaos are shown to be simulated by this method but with modified parameter values in the expression Λ.

  15. Analytic simulation of the Poincare surface of sections for the diamagnetic Kepler problem

    International Nuclear Information System (INIS)

    Hasegawa, H.; Harada, A.; Okazaki, Y.

    1984-01-01

    The Poincare surface-of-section analysis which the authors previously reported on the diamagnetic Kepler problem (classical hydrogen atom in a uniform magnetic field) in a transition region from regular to chaotic motions is simulated by an analytic means, by taking intersections of the energy integral and the approximate integral Λ of Solovev to obtain sections of the two separate regions of the motion that exist in the limit of a weak magnetic field (B → 0). The origin of the unique hyperbolic point and the separatrix around which the onset of chaos takes place are thus identified. The invariant tori arising near the full chaos are shown to be simulated by this method but with modified parameter values in the expression Λ. (author)

  16. Analytical Modeling of Transient Process In Terms of One-Dimensional Problem of Dynamics With Kinematic Action

    Directory of Open Access Journals (Sweden)

    Kravets Victor V.

    2016-05-01

    Full Text Available One-dimensional dynamic design is performed for a component characterized by an inertia coefficient, an elastic coefficient, and a coefficient of energy dispersion. The component is affected by external action in the form of time-independent and time-varying initial kinematic disturbances. A mathematical model of the component dynamics, as well as a new form of analytical representation of the transient in terms of the one-dimensional problem of kinematic action, is provided. The dynamic design of the component is carried out according to the theory of modal control.
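
    As a minimal numerical companion to the above (a sketch only: it assumes the component reduces to a standard underdamped single-degree-of-freedom model with inertia m, energy-dispersion coefficient c and elastic coefficient k, released from a time-independent initial kinematic disturbance; this is the textbook free response, not the authors' new analytical representation):

```python
import math

def transient_response(m, c, k, x0, v0, t):
    """Closed-form transient of an underdamped one-dimensional component
    (inertia m, energy-dispersion coefficient c, elastic coefficient k)
    released from an initial kinematic disturbance (x0, v0)."""
    wn = math.sqrt(k / m)                  # natural frequency
    zeta = c / (2.0 * math.sqrt(k * m))    # damping ratio (assumed < 1)
    wd = wn * math.sqrt(1.0 - zeta ** 2)   # damped frequency
    a = x0
    b = (v0 + zeta * wn * x0) / wd
    return math.exp(-zeta * wn * t) * (a * math.cos(wd * t) + b * math.sin(wd * t))

# Example: release from a unit displacement, at rest
x_start = transient_response(m=1.0, c=0.4, k=4.0, x0=1.0, v0=0.0, t=0.0)
```

    The exponential envelope exp(-ζωn·t) is what makes the process transient: the response decays toward zero for any admissible initial disturbance.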

  17. Marine sampling in Malaysia coastal area: the challenge, problems and solution

    International Nuclear Information System (INIS)

    Norfaizal Mohamed; Khairul Nizam Razali; Mohd Rafaie Mohd Murtadza; Muhammad Amin Abdul Ghani; Zaharudin Ahmad; Abdul Kadir Ishak

    2005-01-01

    Malaysia Marine Radioactivity Database Development Project is one of the five research contracts signed between MINT and AELB. Three marine sampling expeditions were carried out using the K.L. PAUS vessel owned by the Malaysian Fisheries Institute, Chendering, Terengganu. The first marine sampling expedition took place in the East Coast Peninsular Malaysia waters in August 2003, followed in February 2004 by the West Coast Peninsular Malaysia waters, and lastly the Sarawak-Sabah waters in July 2004. Many challenges and problems were faced when collecting sediment, water, biota and plankton samples during these marine sampling expeditions. (Author)

  18. Development of analytical techniques for water and environmental samples (2)

    Energy Technology Data Exchange (ETDEWEB)

    Eum, Chul Hun; Jeon, Chi Wan; Jung, Kang Sup; Song, Kyung Sun; Kim, Sang Yeon [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    The purpose of this study is to develop new analytical methods with good detection limits for toxic inorganic and organic compounds. The analyses of CN, organic acids and particulate materials in environmental samples were carried out using several methods, such as ion chromatography, SPE, SPME, GC/MS, GC/FID and SPLITT (split-flow thin cell fractionation), during the second year of this project. The advantages and disadvantages of several distillation methods (KS, JIS, EPA) for CN analysis in wastewater were investigated. As a result, we proposed a new distillation apparatus for CN analysis, which proved to be simpler and faster and to give better recovery than the conventional apparatus. An ion chromatograph/pulsed amperometric detector (IC/PAD) system was set up in place of colorimetry for CN detection, to overcome matrix interference. SPE (solid phase extraction) and SPME (solid phase micro extraction), as liquid-solid extraction techniques, were applied to the analysis of phenols in wastewater. Optimum experimental conditions and the factors influencing the analytical results were determined. From these results, it could be concluded that the C18 cartridge and the polystyrene-divinylbenzene disk in the SPE method, and the polyacrylate fiber in SPME, were proper solid phase adsorbents for phenol. Optimum conditions to analyze phenol derivatives simultaneously were established. Continuous SPLITT (split-flow thin cell) fractionation (CSF) is a new preparative separation technique that is useful for the fractionation of particulate and macromolecular materials. CSF is carried out in a thin ribbon-like channel equipped with two splitters, one at the inlet and one at the outlet of the channel. In this work, we set up a new CSF system and tested it using polystyrene latex standard particles. We then fractionated particles contained in air and in underground water, based on their sedimentation coefficients, using CSF. (author). 27 refs., 13 tabs., 31 figs.

  19. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    Science.gov (United States)

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by the lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results of the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was calculated from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring the Hb values with a single analytical method. The VOI was 8.22, 7.1 and 9.44 mm3 for probes that collected a target of 10 mg of feces, and 3.08 mm3 for one probe that targeted 2 mg of feces. The ratio between the recovered and target amounts ranged from 56% to 121% across devices. Different changes in the measured Hb values were observed when adding increasing amounts of feces to the commercial buffers. The amounts of collected material are related to the design of the probes. Three out of four manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfillment of the ISO 15189 requirements.
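
    To illustrate how a probe's VOI follows from its diameter and geometry, here is a minimal sketch; the cylindrical groove shape and the nominal feces density of 1 mg/mm3 are illustrative assumptions, not the geometries of the commercial devices studied:

```python
import math

def cylinder_voi_mm3(diameter_mm, depth_mm):
    """VOI of a hypothetical cylindrical sampling groove (assumed shape)."""
    radius = diameter_mm / 2.0
    return math.pi * radius ** 2 * depth_mm

def nominal_mass_mg(voi_mm3, density_mg_per_mm3=1.0):
    """Nominal collected mass, assuming a feces density close to water's."""
    return voi_mm3 * density_mg_per_mm3

# A 2 mm wide, 3 mm deep groove holds roughly 9.4 mm3, i.e. about 9.4 mg
voi = cylinder_voi_mm3(2.0, 3.0)
```

    Under such a density assumption, a target of 10 mg corresponds to a VOI near 10 mm3, which is the order of magnitude reported for the 10 mg probes above.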

  20. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colored from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. The electron transport problem sampling by Monte Carlo individual collision technique

    Energy Technology Data Exchange (ETDEWEB)

    Androsenko, P.A.; Belousov, V.I. [Obninsk State Technical Univ. of Nuclear Power Engineering, Kaluga region (Russian Federation)

    2005-07-01

    The problem of electron transport is of great interest in many fields of modern science. To solve this problem, Monte Carlo sampling has to be used. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which has incontestable advantages over the condensed history technique. For example, one does not need to specify the parameters required by the condensed history technique, such as the upper limit for electron energy, the resolution, the number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by weaknesses of its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files were not preprocessed but used directly as the source of electron data. Four kinds of interaction were considered: elastic scattering, bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented in comparison with analogs. For example, the endovascular radiotherapy problem (P2) of QUADOS2002 is presented in comparison with other techniques that are usually used. (authors)
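
    The core of the individual collision technique, picking the next interaction channel with probability proportional to its cross section, can be sketched as follows; the four channel names match the interactions listed above, but the cross-section values are illustrative placeholders, not ENDF-6 data:

```python
import random

def sample_interaction(cross_sections, rng=random.random):
    """Discrete inverse-CDF sampling: choose one interaction channel
    with probability proportional to its cross section."""
    total = sum(cross_sections.values())
    u = rng() * total
    running = 0.0
    for channel, sigma in cross_sections.items():
        running += sigma
        if u <= running:
            return channel
    return channel  # guard against floating-point round-off

# Illustrative (made-up) cross sections, in arbitrary units
channels = {"elastic": 5.0, "bremsstrahlung": 0.2,
            "excitation": 1.3, "ionization": 3.5}
```

    In a full individual-collision code, the distance to the next collision is also sampled, as s = -ln(u)/Σt from the total macroscopic cross section, before drawing the channel.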

  2. Analytical approaches for the characterization of nickel proteome.

    Science.gov (United States)

    Jiménez-Lamana, Javier; Szpunar, Joanna

    2017-08-16

    The use of nickel in modern industry and in consumer products implies some health problems for human beings. Nickel allergy and nickel carcinogenicity are well-known health effects related to human exposure to nickel, either during the production of nickel-containing products or by direct contact with the final item. In this context, the study of nickel toxicity and nickel carcinogenicity involves understanding their molecular mechanisms and hence characterizing the nickel-binding proteins in different biological samples. During the last 50 years, a broad range of analytical techniques, from the first chromatographic columns to the latest-generation mass spectrometers, have been used to characterize the nickel proteome. The aim of this review is to present a critical view of the different analytical approaches that have been applied for the purification, isolation, detection and identification of nickel-binding proteins. The different analytical techniques used are discussed from a critical point of view, highlighting their advantages and limitations.

  3. Chapter 12. Sampling and analytical methods

    International Nuclear Information System (INIS)

    Busenberg, E.; Plummer, L.N.; Cook, P.G.; Solomon, D.K.; Han, L.F.; Groening, M.; Oster, H.

    2006-01-01

    When water samples are taken for the analysis of CFCs, regardless of the sampling method used, contamination of samples by contact with atmospheric air (with its 'high' CFC concentrations) is a major concern. This is because groundwaters usually have lower CFC concentrations than waters which have been exposed to modern air. Some groundwaters might not contain CFCs at all and are therefore most sensitive to trace contamination by atmospheric air. Thus, extreme precautions are needed to obtain uncontaminated samples when groundwaters, particularly those with older ages, are sampled. It is recommended at the start of any CFC investigation that samples from a CFC-free source be collected and analysed as a check on the sampling equipment and methodology. The CFC-free source might be a deep monitoring well or, alternatively, CFC-free water carefully prepared in the laboratory. It is especially important that all tubing, pumps and connections that will be used in the sampling campaign be checked in this manner

  4. Analytic structure and power series expansion of the Jost function for the two-dimensional problem

    International Nuclear Information System (INIS)

    Rakityansky, S A; Elander, N

    2012-01-01

    For a two-dimensional quantum-mechanical problem, we obtain a generalized power series expansion of the S-matrix that can be done near an arbitrary point on the Riemann surface of the energy, similar to the standard effective-range expansion. In order to do this, we consider the Jost function and analytically factorize its momentum dependence that causes the Jost function to be a multi-valued function. The remaining single-valued function of the energy is then expanded in the power series near an arbitrary point in the complex energy plane. A systematic and accurate procedure has been developed for calculating the expansion coefficients. This makes it possible to obtain a semi-analytic expression for the Jost function (and therefore for the S-matrix) near an arbitrary point on the Riemann surface and use it, for example, to locate the spectral points (bound and resonant states) as the S-matrix poles. The method is applied to a model similar to those used in the theory of quantum dots. (paper)
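
    For context, the familiar three-dimensional s-wave effective-range expansion that this generalized expansion parallels can be written as:

```latex
k \cot \delta_0(k) = -\frac{1}{a} + \frac{1}{2} r_0 k^2 + O(k^4),
\qquad
S_0(k) = e^{2 i \delta_0(k)} = \frac{\cot \delta_0(k) + i}{\cot \delta_0(k) - i},
```

    with scattering length a and effective range r_0; the expansion developed in the paper plays the same role, but is valid near an arbitrary point on the Riemann surface of the energy and explicitly factors out the multi-valued momentum dependence of the two-dimensional Jost function.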

  5. Analytical reconstruction schemes for coarse-mesh spectral nodal solution of slab-geometry SN transport problems

    International Nuclear Information System (INIS)

    Barros, R. C.; Filho, H. A.; Platt, G. M.; Oliveira, F. B. S.; Militao, D. S.

    2009-01-01

    Coarse-mesh numerical methods are very efficient in the sense that they generate accurate results in short computational time, as the number of floating point operations generally decreases as a result of the reduced number of mesh points. On the other hand, they generate numerical solutions that do not give detailed information on the problem solution profile, as the grid points can be located considerably far from each other. In this paper we describe two analytical reconstruction schemes for the coarse-mesh solution generated by the spectral nodal method for the neutral particle discrete ordinates (SN) transport model in slab geometry. The first scheme is based on the analytical reconstruction of the coarse-mesh solution within each discretization cell of the spatial grid set up on the slab. The second scheme is based on the angular reconstruction of the discrete ordinates solution between two contiguous ordinates of the angular quadrature set used in the SN model. Numerical results are given to illustrate the accuracy of the two reconstruction schemes described in this paper. (authors)

  6. Analytical Chemistry Laboratory: Progress report for FY 1988

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  7. Analytical Chemistry Laboratory progress report for FY 1989

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1989-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  8. Analytical Chemistry Laboratory: Progress report for FY 1988

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  9. Solution of the spherically symmetric linear thermoviscoelastic problem in the inertia-free limit

    DEFF Research Database (Denmark)

    Christensen, Tage Emil; Dyre, J. C.

    2008-01-01

    paper, the thermoviscoelastic problem may be solved analytically in the inertia-free limit, i.e., the limit where the sample is much smaller than the wavelength of sound waves at the frequencies of interest. As for the one-dimensional thermoviscoelastic problem [Christensen et al., Phys. Rev. E 75...

  10. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process; it is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. These errors can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical, since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  11. Pain beliefs and problems in functioning among people with arthritis: a meta-analytic review.

    Science.gov (United States)

    Jia, Xiaojun; Jackson, Todd

    2016-10-01

    In this meta-analysis, we evaluated the overall strength of relations between beliefs about pain, health, or illness and problems in functioning (i.e., functional impairment, affective distress, pain severity) in osteoarthritis and rheumatoid arthritis samples, as well as moderators of these associations. In total, 111 samples (N = 17,365 patients) met the inclusion criteria. On average, highly significant, medium effect sizes were observed for associations between beliefs and problems in functioning, but heterogeneity was also inflated. Effect sizes were not affected by arthritis subtype, gender, or age. However, pain belief content emerged as a significant moderator, with larger effect sizes for studies in which personal incapacity or ineffectiveness in controlling pain was a content theme of the belief indices (i.e., pain catastrophizing, helplessness, self-efficacy) compared to those examining locus of control and fear/threat/harm beliefs. Furthermore, analyses of longitudinal study subsets supported the status of pain beliefs as risk factors for later problems in functioning in these groups.

  12. Parent-reported feeding and feeding problems in a sample of Dutch toddlers

    NARCIS (Netherlands)

    Moor, J.M.H. de; Didden, H.C.M.; Korzilius, H.P.L.M.

    2007-01-01

    Little is known about feeding behaviors and problems with feeding in toddlers. In the present questionnaire study, data were collected on the feeding behaviors and feeding problems in a relatively large (n = 422) sample of healthy Dutch toddlers (i.e. 18-36 months old) who lived at home with their parents.

  13. SPIDIA-DNA: An External Quality Assessment for the pre-analytical phase of blood samples used for DNA-based analyses

    Czech Academy of Sciences Publication Activity Database

    Malentacchi, F.; Pazzagli, M.; Simi, L.; Orlando, C.; Wyrich, R.; Hartmann, C.C.; Verderio, P.; Pizzamiglio, S.; Ciniselli, C.M.; Tichopád, Aleš; Kubista, Mikael; Gelmini, S.

    -, no. 424 (2013), pp. 274-286 ISSN 0009-8981 Institutional research plan: CEZ:AV0Z50520701 Keywords: Pre-analytical phase * DNA quality * Blood samples Subject RIV: EB - Genetics; Molecular Biology Impact factor: 2.764, year: 2013

  14. Analytical solutions to orthotropic variable thickness disk problems

    Directory of Open Access Journals (Sweden)

    Ahmet N. ERASLAN

    2016-02-01

    Full Text Available An analytical model is developed to estimate the mechanical response of nonisothermal, orthotropic, variable thickness disks under a variety of boundary conditions. Combining the basic mechanical equations of the disk geometry with the equations of the orthotropic material, the elastic equation of the disk is obtained. This equation is transformed into a standard hypergeometric differential equation by means of a suitable transformation. An analytical solution is then obtained in terms of hypergeometric functions. The boundary conditions used to complete the solutions simulate rotating annular disks with two free surfaces, stationary annular disks with pressurized inner and free outer surfaces, and with free inner and pressurized outer surfaces. The results of the solutions to each of these cases are presented in graphical form. It is observed that, for the three cases investigated, the elastic orthotropy parameter turns out to be an important parameter affecting the elastic behavior. Keywords: Orthotropic disk, Variable thickness, Thermoelasticity, Hypergeometric equation

  15. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Dry sample storage system for an analytical laboratory supporting plutonium processing

    International Nuclear Information System (INIS)

    Treibs, H.A.; Hartenstein, S.D.; Griebenow, B.L.; Wade, M.A.

    1990-01-01

    The Special Isotope Separation (SIS) plant is designed to remove undesirable isotopes from fuel-grade plutonium by the atomic vapor laser isotope separation (AVLIS) process. The AVLIS process involves evaporation of plutonium metal and passage of an intense laser beam through the plutonium vapor. The laser beam consists of several discrete wavelengths, tuned to the precise wavelengths required to ionize the undesired isotopes. These ions are attracted to charged plates, leaving the bulk of the plutonium vapor, enriched in the desired isotopes, to be collected on a cold plate. Major portions of the process consist of pyrochemical processes, including direct reduction of the plutonium oxide feed material with calcium metal, and aqueous processes for purification of plutonium in residues. The analytical laboratory for the plant is called the Material and Process Control Laboratory (MPCL), and provides for the analysis of solid and liquid process samples

  17. Semi-analytic techniques for calculating bubble wall profiles

    International Nuclear Information System (INIS)

    Akula, Sujeet; Balazs, Csaba; White, Graham A.

    2016-01-01

    We present semi-analytic techniques for finding bubble wall profiles during first order phase transitions with multiple scalar fields. Our method involves reducing the problem to an equation with a single field, finding an approximate analytic solution and perturbing around it. The perturbations can be written in a semi-analytic form. We assert that our technique lacks convergence problems and demonstrate the speed of convergence on an example potential. (orig.)

  18. Analytic solutions to a family of boundary-value problems for Ginsburg-Landau type equations

    Science.gov (United States)

    Vassilev, V. M.; Dantchev, D. M.; Djondjorov, P. A.

    2017-10-01

    We consider a two-parameter family of nonlinear ordinary differential equations describing the behavior of a critical thermodynamic system, e.g., a binary liquid mixture, of film geometry within the framework of the Ginzburg-Landau theory by means of the order-parameter. We focus on the case in which the confining surfaces are strongly adsorbing but prefer different components of the mixture, i.e., the order-parameter tends to infinity at one of the boundaries and to minus infinity at the other one. We assume that the boundaries of the system are positioned at a finite distance from each other and give analytic solutions to the corresponding boundary-value problems in terms of Weierstrass and Jacobi elliptic functions.

  19. A finite volume method for cylindrical heat conduction problems based on local analytical solution

    KAUST Repository

    Li, Wang

    2012-10-01

    A new finite volume method for cylindrical heat conduction problems based on local analytical solution is proposed in this paper with detailed derivation. The calculation results of this new method are compared with the traditional second-order finite volume method. The newly proposed method is more accurate than conventional ones, even though the discretized expression of this proposed method is slightly more complex than the second-order central finite volume method, making it cost more calculation time on the same grids. Numerical result shows that the total CPU time of the new method is significantly less than conventional methods for achieving the same level of accuracy. © 2012 Elsevier Ltd. All rights reserved.
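
    A minimal sketch of the kind of conventional cell-centered second-order finite-volume scheme that such a method is compared against, applied to steady radial conduction d/dr(r dT/dr) = 0 with fixed wall temperatures (the grid, boundary treatment and test values are illustrative assumptions, not taken from the paper):

```python
import math

def solve_radial_conduction(r1, r2, T1, T2, n):
    """Cell-centered second-order finite-volume solution of the steady
    cylindrical conduction equation d/dr(r dT/dr) = 0 on [r1, r2] with
    fixed wall temperatures T1, T2.  Returns cell-center radii and
    temperatures."""
    dr = (r2 - r1) / n
    rc = [r1 + (i + 0.5) * dr for i in range(n)]
    # Tridiagonal system a[i]*T[i-1] + b[i]*T[i] + c[i]*T[i+1] = d[i],
    # obtained by balancing the face fluxes r_f * dT/dr on each cell.
    a, b, c, d = [0.0] * n, [0.0] * n, [0.0] * n, [0.0] * n
    for i in range(n):
        rw = r1 + i * dr        # west face radius
        re = rw + dr            # east face radius
        if i == 0:              # half-cell conductance to the left wall
            aw, ae = 2.0 * rw / dr, re / dr
            b[i], c[i], d[i] = -(aw + ae), ae, -aw * T1
        elif i == n - 1:        # half-cell conductance to the right wall
            aw, ae = rw / dr, 2.0 * re / dr
            a[i], b[i], d[i] = aw, -(aw + ae), -ae * T2
        else:
            aw, ae = rw / dr, re / dr
            a[i], b[i], c[i] = aw, -(aw + ae), ae
    # Thomas algorithm for the tridiagonal system
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    T = [0.0] * n
    T[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        T[i] = (d[i] - c[i] * T[i + 1]) / b[i]
    return rc, T

def exact(r, r1, r2, T1, T2):
    """Exact logarithmic profile for steady radial conduction."""
    return T1 + (T2 - T1) * math.log(r / r1) / math.log(r2 / r1)
```

    Because the exact steady profile is logarithmic in r, this problem is a convenient accuracy check for any cylindrical discretization.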

  20. A finite volume method for cylindrical heat conduction problems based on local analytical solution

    KAUST Repository

    Li, Wang; Yu, Bo; Wang, Xinran; Wang, Peng; Sun, Shuyu

    2012-01-01

    A new finite volume method for cylindrical heat conduction problems based on local analytical solution is proposed in this paper with detailed derivation. The calculation results of this new method are compared with the traditional second-order finite volume method. The newly proposed method is more accurate than conventional ones, even though the discretized expression of this proposed method is slightly more complex than the second-order central finite volume method, making it cost more calculation time on the same grids. Numerical result shows that the total CPU time of the new method is significantly less than conventional methods for achieving the same level of accuracy. © 2012 Elsevier Ltd. All rights reserved.

  1. Problems of evaluating isotope analysis of concentrated salt solutions in potash mines

    International Nuclear Information System (INIS)

    Schmiedl, H.D.

    1980-01-01

    Three problems of the quantitative evaluation of analytical D and ¹⁸O isotope data of concentrated salt solutions are discussed: (1) consideration of the influence of admixtures of hydrated salts when determining meteoric or marine water fractions in a concentrated salt solution, (2) analytical accuracy and detection limits in determining meteoric water in salt solutions, and (3) processes of isotopic exchange with the reservoir rock and the sample matrix

  2. Development of analytical methods for the separation of plutonium, americium, curium and neptunium from environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Salminen, S.

    2009-07-01

    In this work, separation methods have been developed for the analysis of the anthropogenic transuranium elements plutonium, americium, curium and neptunium in environmental samples contaminated by global nuclear weapons testing and the Chernobyl accident. The analytical methods utilized in this study are based on extraction chromatography. Highly varying atmospheric plutonium isotope concentrations and activity ratios were found at both Kurchatov (Kazakhstan), near the former Semipalatinsk test site, and Sodankylae (Finland). The origin of the plutonium is almost impossible to identify at Kurchatov, since hundreds of nuclear tests were performed at the Semipalatinsk test site. In Sodankylae, the plutonium in the surface air originated from nuclear weapons testing, conducted mostly by the USSR and the USA before the sampling year 1963. The variation in americium, curium and neptunium concentrations was also great in peat samples collected in southern and central Finland in 1986, immediately after the Chernobyl accident. The main source of transuranium contamination in the peats was global nuclear test fallout, although there are wide regional differences in the fraction of the total activity that originated from Chernobyl for americium, curium and neptunium. The separation methods developed in this study yielded good chemical recoveries for the elements investigated and adequately pure fractions for radiometric activity determination. The extraction chromatographic methods were faster than older methods based on ion exchange chromatography. In addition, extraction chromatography is a more environmentally friendly separation method than ion exchange, because less acidic waste solution is produced during the analytical procedures. (orig.)

  3. Analytical solution for the problem of maximum exit velocity under Coulomb friction in gravity flow discharge chutes

    Energy Technology Data Exchange (ETDEWEB)

    Salinic, Slavisa [University of Kragujevac, Faculty of Mechanical Engineering, Kraljevo (RS)

    2010-10-15

    In this paper, an analytical solution for the problem of finding profiles of gravity flow discharge chutes required to achieve maximum exit velocity under Coulomb friction is obtained by application of variational calculus. The model of a particle which moves down a rough curve in a uniform gravitational field is used to obtain a solution of the problem for various boundary conditions. The projection sign of the normal reaction force of the rough curve onto the normal to the curve and the restriction requiring that the tangential acceleration be non-negative are introduced as the additional constraints in the form of inequalities. These inequalities are transformed into equalities by introducing new state variables. Although this is fundamentally a constrained variational problem, by further introducing a new functional with an expanded set of unknown functions, it is transformed into an unconstrained problem where broken extremals appear. The obtained equations of the chute profiles contain a certain number of unknown constants which are determined from a corresponding system of nonlinear algebraic equations. The obtained results are compared with the known results from the literature. (orig.)

  4. Projected regression method for solving Fredholm integral equations arising in the analytic continuation problem of quantum physics

    International Nuclear Information System (INIS)

    Arsenault, Louis-François; Millis, Andrew J; Neuberg, Richard; Hannah, Lauren A

    2017-01-01

    We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful inputs. Applying machine learning to this database generates a regression function of controlled complexity, which returns approximate solutions for previously unseen inputs; the approximate solutions are then projected onto the subspace of functions satisfying relevant constraints. Under standard error metrics the method performs as well or better than the Maximum Entropy method for low input noise and is substantially more robust to increased input noise. We suggest that the methodology will be similarly effective for other problems involving a formally ill-conditioned inversion of an integral operator, provided that the forward problem can be efficiently solved. (paper)
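The database-plus-regression idea described in this abstract can be sketched in a few lines of NumPy. This is an illustrative toy under stated assumptions, not the authors' implementation: the Gaussian smoothing kernel, the random-smooth training-set generator, and the ridge parameter `lam` are all invented for demonstration. A regression map from outputs back to inputs is fitted on forward-solved pairs, and predictions are then projected onto the constraint set (non-negativity and unit normalization).

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretized Fredholm integral of the first kind: g = K @ f.
# A Gaussian smoothing kernel stands in for the physical kernel (assumption).
x = np.linspace(0.0, 1.0, 60)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.02)
K /= K.sum(axis=1, keepdims=True)

def random_smooth(n):
    """Random smooth, normalized 'physically meaningful' inputs (toy model)."""
    f = rng.random((n, x.size))
    for _ in range(3):  # crude smoothing passes
        f = 0.5 * f + 0.25 * (np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1))
    return f / f.sum(axis=1, keepdims=True)

# Large database of forward-solved pairs: stable forward problem, noisy outputs.
F = random_smooth(2000)
G = F @ K.T + 1e-4 * rng.standard_normal((2000, x.size))

# Ridge regression g -> f; the database itself regularizes the
# ill-conditioned inversion of K.
lam = 1e-3
W = np.linalg.solve(G.T @ G + lam * np.eye(x.size), G.T @ F)

# Apply to an unseen input, then project onto the constraints.
f_true = random_smooth(1)[0]
g_obs = K @ f_true + 1e-4 * rng.standard_normal(x.size)
f_hat = np.clip(g_obs @ W, 0.0, None)
f_hat /= f_hat.sum()
```

The final projection step (clip, then renormalize) is the cheap stand-in for the subspace projection the abstract mentions.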

  5. Comparison of the acetyl bromide spectrophotometric method with other analytical lignin methods for determining lignin concentration in forage samples.

    Science.gov (United States)

    Fukushima, Romualdo S; Hatfield, Ronald D

    2004-06-16

    Present analytical methods to quantify lignin in herbaceous plants are not totally satisfactory. A spectrophotometric method, acetyl bromide soluble lignin (ABSL), has been employed to determine lignin concentration in a range of plant materials. In this work, lignin extracted with acidic dioxane was used to develop standard curves and to calculate the derived linear regression equation (slope equals absorptivity value or extinction coefficient) for determining the lignin concentration of respective cell wall samples. This procedure yielded lignin values that were different from those obtained with Klason lignin, acid detergent acid insoluble lignin, or permanganate lignin procedures. Correlations with in vitro dry matter or cell wall digestibility of samples were highest with data from the spectrophotometric technique. The ABSL method employing as standard lignin extracted with acidic dioxane has the potential to be employed as an analytical method to determine lignin concentration in a range of forage materials. It may be useful in developing a quick and easy method to predict in vitro digestibility on the basis of the total lignin content of a sample.

  6. Analytical Chemistry Laboratory progress report for FY 1991

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  7. Analytical solutions in the two-cavity coupling problem

    International Nuclear Information System (INIS)

    Ayzatsky, N.I.

    2000-01-01

    Analytical solutions of the precise equations that describe the rf-coupling of two cavities through a coaxial cylindrical hole are given for various limiting cases. For their derivation we have used a method of solving an infinite set of linear algebraic equations, based on its transformation into dual integral equations.

  8. Toward a mathematical theory of environmental monitoring: the infrequent sampling problem

    International Nuclear Information System (INIS)

    Pimentel, K.D.

    1975-06-01

    Optimal monitoring of pollutants in diffusive environmental media was studied in the context of the subproblems of optimal design and optimal management of environmental monitors, subject to bounds on the maximum allowable error in the estimate of the monitor state or output variables. Concise problem statements were made. Continuous-time finite-dimensional normal-mode models for distributed stochastic diffusive pollutant transport were developed. The resultant set of state equations was discretized in time for implementation in the Kalman filter in the problem of optimal state estimation. The main results of this thesis concern the special class of optimal monitoring problems called the infrequent sampling problem. Extensions were made to systems including pollutant scavenging and to systems with emission or radiation boundary conditions. (U.S.)
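The infrequent-sampling setting described here can be illustrated with a minimal scalar Kalman filter: the estimate variance grows between measurements and collapses at each sample, and the designer asks whether its peak stays under the allowed error bound. The model, noise levels and sampling gap below are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Scalar discrete-time pollutant model: x[k+1] = a*x[k] + w, w ~ N(0, q).
# Measurement y[k] = x[k] + v, v ~ N(0, r), taken only every `gap` steps.
a, q, r, gap = 0.95, 0.1, 0.5, 10

rng = np.random.default_rng(1)
x, x_hat, P = 1.0, 0.0, 1.0
P_hist = []

for k in range(200):
    # True system and time update (prediction)
    x = a * x + rng.normal(0.0, np.sqrt(q))
    x_hat = a * x_hat
    P = a * a * P + q            # estimate variance grows between samples

    if k % gap == 0:             # infrequent measurement update
        y = x + rng.normal(0.0, np.sqrt(r))
        Kk = P / (P + r)         # Kalman gain
        x_hat = x_hat + Kk * (y - x_hat)
        P = (1.0 - Kk) * P       # variance shrinks at each sample
    P_hist.append(P)

# The monitor-design question: does the peak variance between samples
# stay below the maximum allowable estimation error?
print(max(P_hist))
```

Lengthening `gap` lets the variance climb closer to its open-loop limit q/(1 - a**2), which is the trade-off at the heart of the infrequent sampling problem.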

  9. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    Science.gov (United States)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  10. SPIDIA-RNA: First external quality assessment for the pre-analytical phase of blood samples used for RNA based analyses

    Czech Academy of Sciences Publication Activity Database

    Pazzagli, M.; Malentacchi, F.; Simi, L.; Wyrich, R.; Guenther, K.; Hartmann, C.; Verderio, P.; Pizzamiglio, S.; Ciniselli, C.M.; Tichopád, Aleš; Kubista, Mikael; Gelmini, S.

    2013-01-01

    Roč. 59, č. 1 (2013), s. 20-31 ISSN 1046-2023 Institutional research plan: CEZ:AV0Z50520701 Keywords : Pre-analytical phase * RNA quality * Blood samples Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 3.221, year: 2013

  11. An algorithm for analytical solution of basic problems featuring elastostatic bodies with cavities and surface flaws

    Science.gov (United States)

    Penkov, V. B.; Levina, L. V.; Novikova, O. S.; Shulmin, A. S.

    2018-03-01

    Herein we propose a methodology for structuring a full parametric analytical solution to problems featuring elastostatic media based on state-of-the-art computing facilities that support computerized algebra. The methodology includes: direct and reverse application of P-Theorem; methods of accounting for physical properties of media; accounting for variable geometrical parameters of bodies, parameters of boundary states, independent parameters of volume forces, and remote stress factors. An efficient tool to address the task is the sustainable method of boundary states originally designed for the purposes of computerized algebra and based on the isomorphism of Hilbertian spaces of internal states and boundary states of bodies. We performed full parametric solutions of basic problems featuring a ball with a nonconcentric spherical cavity, a ball with a near-surface flaw, and an unlimited medium with two spherical cavities.

  12. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  13. Computing the zeros of analytic functions

    CERN Document Server

    Kravanja, Peter

    2000-01-01

    Computing all the zeros of an analytic function and their respective multiplicities, locating clusters of zeros of analytic functions, computing zeros and poles of meromorphic functions, and solving systems of analytic equations are problems in computational complex analysis that lead to a rich blend of mathematics and numerical analysis. This book treats these four problems in a unified way. It contains not only theoretical results (based on formal orthogonal polynomials or rational interpolation) but also numerical analysis and algorithmic aspects, implementation heuristics, and polished software (the package ZEAL) that is available via the CPC Program Library. Graduate students and researchers in numerical mathematics will find this book very readable.
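A classical building block behind such zero-finding methods is the argument principle: the number of zeros of an analytic function inside a contour equals a contour integral of f'/f divided by 2*pi*i. A minimal numerical sketch (not the ZEAL package itself) over a circular contour:

```python
import numpy as np

def count_zeros(f, df, c=0.0, R=2.0, n=4000):
    """Count zeros of analytic f inside the circle |z - c| = R via the
    argument principle: N = (1 / 2*pi*i) * integral of f'(z)/f(z) dz."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    z = c + R * np.exp(1j * t)
    dz = 1j * R * np.exp(1j * t) * (2.0 * np.pi / n)   # z'(t) dt
    integral = np.sum(df(z) / f(z) * dz)
    return int(round((integral / (2j * np.pi)).real))

# The three cube roots of unity all lie inside |z| = 2.
n_roots = count_zeros(lambda z: z**3 - 1, lambda z: 3 * z**2)  # counts 3
```

Because the integrand is smooth and periodic, the simple equispaced sum converges rapidly; f must be nonzero on the contour itself.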

  14. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.

  16. Analytics for managers with Excel

    CERN Document Server

    Bell, Peter C

    2013-01-01

    Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review…

  17. On numerical-analytic techniques for boundary value problems

    Czech Academy of Sciences Publication Activity Database

    Rontó, András; Rontó, M.; Shchobak, N.

    2012-01-01

    Roč. 12, č. 3 (2012), s. 5-10 ISSN 1335-8243 Institutional support: RVO:67985840 Keywords : numerical-analytic method * periodic successive approximations * Lyapunov-Schmidt method Subject RIV: BA - General Mathematics http://www.degruyter.com/view/j/aeei.2012.12.issue-3/v10198-012-0035-1/v10198-012-0035-1.xml?format=INT

  18. Blood venous sample collection: Recommendations overview and a checklist to improve quality.

    Science.gov (United States)

    Giavarina, Davide; Lippi, Giuseppe

    2017-07-01

    The extra-analytical phases of the total testing process have substantial impact on managed care, as well as an inherent high risk of vulnerability to errors which is often greater than that of the analytical phase. The collection of biological samples is a crucial preanalytical activity. Problems or errors occurring shortly before, or soon after, this preanalytical step may impair sample quality and characteristics, or else modify the final results of testing. The standardization of fasting requirements, rest, patient position and psychological state of the patient are therefore crucial for mitigating the impact of preanalytical variability. Moreover, the quality of materials used for collecting specimens, along with their compatibility, can guarantee sample quality and persistence of chemical and physical characteristics of the analytes over time, so safeguarding the reliability of testing. Appropriate techniques and sampling procedures are effective to prevent problems such as hemolysis, undue clotting in the blood tube, draw of insufficient sample volume and modification of analyte concentration. An accurate identification of both patient and blood samples is a key priority as for other healthcare activities. Good laboratory practice and appropriate training of operators, by specifically targeting collection of biological samples, blood in particular, may greatly improve this issue, thus lowering the risk of errors and their adverse clinical consequences. The implementation of a simple and rapid check-list, including verification of blood collection devices, patient preparation and sampling techniques, was found to be effective for enhancing sample quality and reducing some preanalytical errors associated with these procedures. The use of this tool, along with implementation of objective and standardized systems for detecting non-conformities related to unsuitable samples, can be helpful for standardizing preanalytical activities and improving the quality of

  19. Micro-Crater Laser Induced Breakdown Spectroscopy--an Analytical approach in metals samples

    Energy Technology Data Exchange (ETDEWEB)

    Piscitelli, Vincent [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela); Lawrence Berkeley National laboratory, Berkeley, US (United States); Gonzalez, Jhanis; Xianglei, Mao; Russo, Richard [Lawrence Berkeley National laboratory, Berkeley, US (United States); Fernandez, Alberto [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela)

    2008-04-15

    Laser ablation has been increasing in popularity as a technique for chemical analysis, owing to its great potential for the analysis of solid samples. As a contribution to the development of the technique, in this work we studied laser induced breakdown spectroscopy (LIBS) under micro-ablation conditions for future studies of coatings and micro-crater analysis. Craters between 2 and 7 micrometers in diameter were made using a nanosecond Nd:YAG laser at its fundamental emission of 1064 nm. To create these craters we used a long-working-distance objective lens with a numerical aperture of 0.45. The atomic emission as a function of laser energy, and its effect on crater size, was studied. We found that below 3 micrometers, although there was evidence of material removal through the formation of a crater, no atomic emission was detectable with our instruments. To try to understand this, curves of crater size versus plasma temperature were made, with the temperature obtained from Boltzmann plots of the copper emission lines in the visible region. In addition, calibration curves for copper and aluminum were made in two different matrices: a Cu/Zn alloy and a zinc matrix. The atomic lines Cu I (521.78 nm) and Al I (396.15 nm) were used. From the calibration curves, the analytical limit of detection and other analytical parameters were obtained.
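The Boltzmann-plot temperature determination mentioned in this abstract fits ln(I*lambda/(g*A)) against the upper-level energy E; under local thermodynamic equilibrium the slope is -1/(kB*T). A hedged sketch with synthetic line data (the energies, statistical weights, transition probabilities and wavelengths below are hypothetical values for illustration, not measured Cu lines):

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(intensity, wavelength_nm, g, A, E_upper_eV):
    """Excitation temperature from a Boltzmann plot:
    ln(I * lambda / (g * A)) = -E_upper / (kB * T) + const,
    so T = -1 / (kB * slope) from a linear fit."""
    y = np.log(intensity * wavelength_nm / (g * A))
    slope, _ = np.polyfit(E_upper_eV, y, 1)
    return -1.0 / (K_B_EV * slope)

# Synthetic check: intensities generated at T_true = 10000 K.
T_true = 10000.0
E = np.array([3.82, 5.10, 6.19, 7.74])       # upper-level energies, eV
g = np.array([4.0, 6.0, 4.0, 2.0])           # statistical weights
A = np.array([2.0e7, 6.0e7, 1.0e8, 3.0e7])   # transition probabilities, 1/s
lam = np.array([521.8, 510.6, 515.3, 324.7]) # wavelengths, nm
I = g * A / lam * np.exp(-E / (K_B_EV * T_true))

T_est = boltzmann_temperature(I, lam, g, A, E)  # recovers ~10000 K
```

On real spectra the scatter of the points about the fitted line, not just the slope, tells you how far the plasma is from equilibrium.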

  20. Mathematical solutions to problems in radiological protection involving air sampling and biokinetic modelling

    International Nuclear Information System (INIS)

    Birchall, A.

    1989-04-01

    Intakes of radionuclides are estimated with the personal air sampler (PAS) and by biological monitoring techniques: in the case of plutonium, there are problems with both methods. The statistical variation in activity collected when sampling radioactive aerosols with low number concentrations was investigated. It was shown that the PAS is barely adequate for monitoring plutonium at annual limit of intake (ALI) levels in typical workplace conditions. Two algorithms were developed, enabling non-recycling and recycling compartmental models to be solved. Their accuracy and speed were investigated, and methods of dealing with partitioning, continuous intake, and radioactive progeny were discussed. Analytical, rather than numerical, methods were used. These are faster, and thus ideally suited for implementation on microcomputers. The algorithms enable non-specialists to solve quickly and easily any first order compartmental model, including all the ICRP metabolic models. Non-recycling models with up to 50 compartments can be solved in seconds: recycling models take a little longer. A biokinetic model for plutonium in man following systemic uptake was developed. The proposed ICRP lung model (1989) was represented by a first order compartmental model. These two models were combined, and the recycling algorithm was used to calculate urinary and faecal excretion of plutonium following acute or chronic intake by inhalation. The results indicate much lower urinary excretion than predicted by ICRP Publication 54. (author)
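The analytical (rather than numerical) solution of a first-order compartmental model that this abstract describes can be sketched via eigendecomposition: for dx/dt = M x, the closed form is x(t) = V exp(D t) V⁻¹ x(0). The three-compartment chain and rate constants below are hypothetical, not the biokinetic model of the thesis.

```python
import numpy as np

def solve_compartments(M, x0, t):
    """Closed-form solution of dx/dt = M x at time t, valid when M is
    diagonalizable (e.g. a non-recycling chain with distinct rates)."""
    vals, V = np.linalg.eig(M)
    c = np.linalg.solve(V, x0)          # coordinates of x0 in the eigenbasis
    return (V * np.exp(vals * t)) @ c   # V @ diag(exp(vals*t)) @ c

# Hypothetical non-recycling chain 1 -> 2 -> 3 with rates k1, k2 (per day).
k1, k2 = 0.5, 0.1
M = np.array([[-k1, 0.0, 0.0],
              [ k1, -k2, 0.0],
              [0.0,  k2, 0.0]])
x0 = np.array([1.0, 0.0, 0.0])          # unit intake into compartment 1

x_t = solve_compartments(M, x0, 5.0)    # contents of each compartment at t = 5
```

Because the columns of M sum to zero, total activity is conserved; compartment 1 follows the familiar exp(-k1*t) and matches the Bateman solution.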

  1. Learning Analytics: Challenges and Limitations

    Science.gov (United States)

    Wilson, Anna; Watson, Cate; Thompson, Terrie Lynn; Drew, Valerie; Doyle, Sarah

    2017-01-01

    Learning analytic implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics--both data and algorithms--are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about…

  2. Analytical evaluation of BEA zeolite for the pre-concentration of polycyclic aromatic hydrocarbons and their subsequent chromatographic analysis in water samples.

    Science.gov (United States)

    Wilson, Walter B; Costa, Andréia A; Wang, Huiyong; Dias, José A; Dias, Sílvia C L; Campiglia, Andres D

    2012-07-06

    The analytical performance of BEA, a commercial zeolite, is evaluated for the pre-concentration of fifteen Environmental Protection Agency polycyclic aromatic hydrocarbons and their subsequent HPLC analysis in tap and lake water samples. The pre-concentration factors obtained with BEA have led to a method with excellent analytical figures of merit. One-milliliter aliquots were sufficient to obtain excellent precision of measurements at the parts-per-trillion concentration level, with relative standard deviations varying from 4.1% (dibenzo[a,h]anthracene) to 13.4% (pyrene). The limits of detection were excellent as well and varied between 1.1 (anthracene) and 49.9 ng L(-1) (indeno[1,2,3-cd]pyrene). The recovery values of all the studied compounds meet the criterion for regulated polycyclic aromatic hydrocarbons, which mandates relative standard deviations equal to or lower than 25%. The small volume of organic solvents (100 μL per sample) and amount of BEA (2 mg per sample) makes sample pre-concentration environmentally friendly and cost effective. The extraction procedure is well suited for numerous samples, as the small working volume (1 mL) facilitates the implementation of simultaneous sample extraction. These are attractive features when routine monitoring of numerous samples is contemplated. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  4. Pre-Analytical Considerations for Successful Next-Generation Sequencing (NGS): Challenges and Opportunities for Formalin-Fixed and Paraffin-Embedded Tumor Tissue (FFPE) Samples

    Directory of Open Access Journals (Sweden)

    Gladys Arreaza

    2016-09-01

    In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy, as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cells (CTCs), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of the isolation of analytes are critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request a higher DNA sample mass than what is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve the current practice in translational research.

  5. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1993-01-01

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of errors affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion on several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs

  6. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
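The variance bookkeeping behind these figures is the addition of independent components in quadrature, CV_T² = CV_A² + CV_P², so the pre-analytical component is obtained by subtracting the analytical variance from the total. A minimal sketch with round numbers in the reported range (the 50%/10% figures are illustrative, not taken from the study):

```python
import math

def preanalytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV from independent variance components:
    CV_pre = sqrt(CV_total^2 - CV_analytical^2)."""
    return math.sqrt(cv_total ** 2 - cv_analytical ** 2)

# A total CV of 50% with an analytical CV of 10% leaves a dominant
# pre-analytical component of about 49%.
cv_pre = preanalytical_cv(50.0, 10.0)
print(round(cv_pre, 1))  # → 49.0
```

This is why shrinking the analytical CV alone barely narrows the ±2CV_T interval when the pre-analytical component dominates.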

  7. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  8. Curriculum Innovation for Marketing Analytics

    Science.gov (United States)

    Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.

    2018-01-01

    College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…

  9. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book explains analytical chemistry in ten chapters, covering the development of analytical chemistry; the theory of error, with definitions and classification; sampling and sample treatment; gravimetry, including general gravimetric procedures in aqueous and non-aqueous solution; precipitation titration, including precipitation reactions and their types; complexometry and complex compounds; oxidation-reduction equilibria, electrode potentials and potentiometric titration; solvent extraction; chromatography; and basic operations for chemical experiments.

  10. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book explains analytical chemistry in ten chapters, covering the development of analytical chemistry; the theory of error, with definitions and classification; sampling and sample treatment; gravimetry, including general gravimetric procedures in aqueous and non-aqueous solution; precipitation titration, including precipitation reactions and their types; complexometry and complex compounds; oxidation-reduction equilibria, electrode potentials and potentiometric titration; solvent extraction; chromatography; and basic operations for chemical experiments.

  11. An analytical approach to estimate the number of small scatterers in 2D inverse scattering problems

    International Nuclear Information System (INIS)

    Fazli, Roohallah; Nakhkash, Mansor

    2012-01-01

    This paper presents an analytical method to estimate the location and number of actual small targets in 2D inverse scattering problems. This method is motivated from the exact maximum likelihood estimation of signal parameters in white Gaussian noise for the linear data model. In the first stage, the method uses the MUSIC algorithm to acquire all possible target locations and in the next stage, it employs an analytical formula that works as a spatial filter to determine which target locations are associated to the actual ones. The ability of the method is examined for both the Born and multiple scattering cases and for the cases of well-resolved and non-resolved targets. Many numerical simulations using both the coincident and non-coincident arrays demonstrate that the proposed method can detect the number of actual targets even in the case of very noisy data and when the targets are closely located. Using the experimental microwave data sets, we further show that this method is successful in specifying the number of small inclusions. (paper)
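    The paper's analytical spatial filter is not reproduced here, but the first stage it builds on can be sketched generically. In this hypothetical 2D setup (a linear sensor array, two point scatterers under the Born approximation, with wavelength and geometry chosen for illustration), the MUSIC pseudospectrum is large only near the true scatterer positions:

```python
import numpy as np

k = 2 * np.pi                                          # wavenumber (wavelength = 1, assumed)
sensors = np.stack([np.linspace(-5, 5, 21), np.zeros(21)], axis=1)
targets = np.array([[0.0, 4.0], [1.5, 6.0]])           # hypothetical point scatterers

def steering(p):
    """Free-space 2D Green's-function vector from point p to every sensor."""
    r = np.linalg.norm(sensors - p, axis=1)
    return np.exp(1j * k * r) / np.sqrt(r)

# Multistatic response matrix for Born point scatterers: K = sum_t a(t) a(t)^T.
K = sum(np.outer(steering(t), steering(t)) for t in targets)

# Noise subspace: singular vectors beyond the signal rank (number of targets assumed known).
U, s, _ = np.linalg.svd(K)
noise = U[:, len(targets):]

def pseudospectrum(p):
    a = steering(p)
    a = a / np.linalg.norm(a)
    # Peaks where the steering vector is orthogonal to the noise subspace.
    return 1.0 / np.linalg.norm(noise.conj().T @ a) ** 2

on_target = pseudospectrum(targets[0])
off_target = pseudospectrum(np.array([3.0, 2.0]))
print(f"on-target {on_target:.3g}, off-target {off_target:.3g}")
```

    MUSIC alone flags every location where the pseudospectrum peaks; the paper's contribution is the second-stage analytical filter that decides which of those peaks correspond to actual targets.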

  12. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.
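    As a minimal illustration of this kind of verification (not one of the MCNP6 suite's actual problems), consider the uncollided flux from an isotropic point source in an infinite, purely absorbing medium, phi(r) = exp(-Sigma*r) / (4*pi*r^2), which a trivial Monte Carlo sampler can be checked against:

```python
import math, random

# Analytic benchmark: uncollided point-source flux in an infinite absorber.
# The Monte Carlo estimate samples free-path lengths and scores transmissions past r.
random.seed(1)
sigma, r, n = 1.0, 2.0, 200_000      # macroscopic cross section, detector radius, histories

transmitted = sum(1 for _ in range(n) if random.expovariate(sigma) > r)
mc_flux = (transmitted / n) / (4 * math.pi * r * r)
analytic_flux = math.exp(-sigma * r) / (4 * math.pi * r * r)

print(f"MC {mc_flux:.4e} vs analytic {analytic_flux:.4e}")
```

    A production verification suite compares such estimates over many source types and geometries, with statistical tolerances tied to the number of histories.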

  13. Benchmarking the invariant embedding method against analytical solutions in model transport problems

    International Nuclear Information System (INIS)

    Malin, Wahlberg; Imre, Pazsit

    2005-01-01

    The purpose of this paper is to demonstrate the use of the invariant embedding method in a series of model transport problems, for which it is also possible to obtain an analytical solution. Due to the non-linear character of the embedding equations, their solution can only be obtained numerically. However, this can be done via a robust and effective iteration scheme. In return, the domain of applicability is far wider than the model problems investigated in this paper. The use of the invariant embedding method is demonstrated in three different areas. The first is the calculation of the energy spectrum of reflected (sputtered) particles from a multiplying medium, where the multiplication arises from recoil production. Both constant and energy dependent cross sections with a power law dependence were used in the calculations. The second application concerns the calculation of the path length distribution of reflected particles from a medium without multiplication. This is a relatively novel and unexpected application, since the embedding equations do not resolve the depth variable. The third application concerns the demonstration that solutions in an infinite medium and a half-space are interrelated through embedding-like integral equations, by the solution of which the reflected flux from a half-space can be reconstructed from solutions in an infinite medium or vice versa. In all cases the invariant embedding method proved to be robust, fast and monotonically converging to the exact solutions. (authors)

  14. Enzyme Biosensors for Biomedical Applications: Strategies for Safeguarding Analytical Performances in Biological Fluids

    Science.gov (United States)

    Rocchitta, Gaia; Spanu, Angela; Babudieri, Sergio; Latte, Gavinella; Madeddu, Giordano; Galleri, Grazia; Nuvoli, Susanna; Bagella, Paola; Demartis, Maria Ilaria; Fiore, Vito; Manetti, Roberto; Serra, Pier Andrea

    2016-01-01

    Enzyme-based chemical biosensors rely on biological recognition. In order to operate, the enzymes must be available to catalyze a specific biochemical reaction and be stable under the normal operating conditions of the biosensor. The design of biosensors is based on knowledge of the target analyte, as well as of the complexity of the matrix in which the analyte is to be quantified. This article reviews the problems resulting from the interaction of enzyme-based amperometric biosensors with complex biological matrices containing the target analyte(s). One of the most challenging disadvantages of amperometric enzyme-based biosensor detection is signal reduction caused by fouling agents and interference from chemicals present in the sample matrix. The article therefore examines the principles of operation of enzymatic biosensors, their analytical performance over time, and the strategies used to optimize that performance. Moreover, the composition of biological fluids and its interaction with biosensing are presented. PMID:27249001

  15. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace elemental concentrations of samples collected monthly at urban and rural sites were determined, and statistical calculations and factor analysis were then carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and a functional test was performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  16. Virtual sampling in variational processing of Monte Carlo simulation in a deep neutron penetration problem

    International Nuclear Information System (INIS)

    Allagi, Mabruk O.; Lewins, Jeffery D.

    1999-01-01

    In a further study of virtually processed Monte Carlo estimates in neutron transport, a shielding problem has been studied. The use of virtual sampling to estimate the importance function at a certain point in the phase space depends on the presence of neutrons from the real source at that point. But in deep penetration problems, not many neutrons will reach regions far away from the source. In order to overcome this problem, two suggestions are considered: (1) virtual sampling is used as far as the real neutrons can reach, then fictitious sampling is introduced for the remaining regions, distributed in all the regions, or (2) only one fictitious source is placed where the real neutrons almost terminate and then virtual sampling is used in the same way as for the real source. Variational processing is again found to improve the Monte Carlo estimates, being best when using one fictitious source in the far regions with virtual sampling (option 2). When fictitious sources are used to estimate the importances in regions far away from the source, some optimization has to be performed for the proportion of fictitious to real sources, weighted against accuracy and computational costs. It has been found in this study that the optimum number of cells to be treated by fictitious sampling is problem dependent, but as a rule of thumb, fictitious sampling should be employed in regions where the number of neutrons from the real source fall below a specified limit for good statistics
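    The variational processing and fictitious-source placement studied in the paper are not reproduced here, but the underlying difficulty can be shown with a generic sketch: analog sampling almost never scores in deep regions, while biased (weighted) sampling, the same idea that motivates fictitious sources, recovers the answer with good statistics. All numbers below are assumptions for illustration:

```python
import math, random

random.seed(7)
sigma, depth, n = 1.0, 10.0, 10_000      # deep slab: true transmission exp(-10) ~ 4.5e-5

# Analog: score 1 whenever a sampled free path exceeds the slab depth.
# With n = 10,000 histories, the expected number of scores is below one.
analog = sum(1 for _ in range(n) if random.expovariate(sigma) > depth) / n

# Biased: sample from a stretched exponential and weight each score by the
# likelihood ratio f(x)/g(x) of the true to the biased path-length density.
sigma_b = 0.1
total = 0.0
for _ in range(n):
    x = random.expovariate(sigma_b)
    if x > depth:
        total += (sigma / sigma_b) * math.exp(-(sigma - sigma_b) * x)
biased = total / n

print(f"exact {math.exp(-sigma * depth):.2e}  analog {analog:.2e}  biased {biased:.2e}")
```

    The trade-off the paper optimizes, how far to push fictitious sampling against its computational cost, appears here as the choice of the biased density.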

  17. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  18. Is a pre-analytical process for urinalysis required?

    Science.gov (United States)

    Petit, Morgane; Beaudeux, Jean-Louis; Majoux, Sandrine; Hennequin, Carole

    2017-10-01

    For reliable urinary measurement of calcium, phosphate and uric acid, a pre-analytical step of adding acid or base to urine samples at the laboratory is recommended in order to dissolve precipitated solutes. Several studies on different kinds of samples and analysers have previously shown that such pre-analytical treatment is unnecessary. The objective was to study the necessity of pre-analytical treatment of urine samples collected with the V-Monovette® (Sarstedt) system and measured on the Architect C16000 analyser (Abbott Diagnostics). Sixty urine samples from hospitalized patients were selected (n=30 for calcium and phosphate, and n=30 for uric acid). After acidification of urine samples for the measurement of calcium and phosphate, and alkalinisation for the measurement of uric acid, the differences between results before and after pre-analytical treatment were compared to the acceptable limits recommended by the French Society of Clinical Biology (SFBC). No difference in concentration between before and after pre-analytical treatment exceeded the SFBC acceptable limits for the measurement of calcium and uric acid. For phosphate, only one sample exceeded these limits, showing a result paradoxically lower after acidification. In conclusion, in agreement with previous studies, our results show that acidification or alkalinisation of urine samples, whether from 24-h collections or from single urinations, is not a pre-analytical necessity for the measurement of calcium, phosphate and uric acid.
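    The comparison performed in such a study can be sketched as follows, with hypothetical paired results and an assumed SFBC-style acceptable limit (the actual limits are analyte-specific):

```python
# Hypothetical paired urinary calcium results (mmol/L) before/after acidification,
# compared against an assumed acceptable-difference limit in percent.
pairs = [(2.10, 2.15), (3.40, 3.35), (1.05, 1.10), (4.80, 4.70)]
limit_pct = 10.0   # assumed limit; real SFBC limits depend on the analyte

def within_limit(before, after, limit=limit_pct):
    """True when the relative before/after difference stays inside the limit."""
    return abs(after - before) / before * 100 <= limit

exceeding = [p for p in pairs if not within_limit(*p)]
print(f"{len(exceeding)} of {len(pairs)} samples exceed the acceptable limit")
```

    If no pair exceeds the limit, the pre-analytical treatment can be judged unnecessary for that analyte, which is the study's conclusion for calcium and uric acid.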

  19. Applications of neutrons for laboratory and industrial activation analysis problems

    International Nuclear Information System (INIS)

    Szabo, Elek; Bakos, Laszlo

    1986-01-01

    This chapter presents some particular applications and case studies of neutrons in activation analysis for research and industrial development purposes. Reactor neutrons have been applied in Hungarian laboratories to semiconductor research, to the analysis of geological (lunar) samples, and to a special comparator measurement of samples. Some industrial applications of neutron generators and sealed sources to analytical problems are presented. Finally, prompt neutron activation analysis is outlined briefly. (R.P.)

  20. Analytical calculations by computer in physics and mathematics

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Tarasov, O.V.; Shirokov, D.V.

    1978-01-01

    A review of the present status of analytical calculations by computer is given. Some programming systems for analytical computations are considered: SCHOONSCHIP, CLAM, REDUCE-2, SYMBAL, CAMAL and AVTO-ANALITIK, which are implemented or will be implemented at JINR, and MACSYMA, one of the most developed systems. It is shown, on the basis of the mathematical operations realized in these systems, that they are appropriate for various problems of theoretical physics and mathematics, for example problems of quantum field theory, celestial mechanics, general relativity and so on. Some problems solved at JINR with programming systems for analytical computations are described. The review is intended for specialists in different fields of theoretical physics and mathematics.

  1. Causality and analyticity in optics

    International Nuclear Information System (INIS)

    Nussenzveig, H.M.

    In order to provide an overall picture of the broad range of optical phenomena that are directly linked with the concepts of causality and analyticity, the following topics are briefly reviewed, emphasizing recent developments: 1) Derivation of dispersion relations for the optical constants of general linear media from causality. Application to the theory of natural optical activity. 2) Derivation of sum rules for the optical constants from causality and from the short-time response function (asymptotic high-frequency behavior). Average spectral behavior of optical media. Applications. 3) Role of spectral conditions. Analytic properties of coherence functions in quantum optics. Reconstruction theorem. 4) Phase retrieval problems. 5) Inverse scattering problems. 6) Solution of nonlinear evolution equations in optics by inverse scattering methods. Application to self-induced transparency. Causality in nonlinear wave propagation. 7) Analytic continuation in frequency and angular momentum. Complex singularities. Resonances and natural-mode expansions. Regge poles. 8) Wigner's causal inequality. Time delay. Spatial displacements in total reflection. 9) Analyticity in diffraction theory. Complex angular momentum theory of Mie scattering. Diffraction as a barrier tunnelling effect. Complex trajectories in optics. (Author) [pt

  2. An analytical approach for a nodal formulation of a two-dimensional fixed-source neutron transport problem in heterogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Basso Barichello, Liliane; Dias da Cunha, Rudnei [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Inst. de Matematica; Becker Picoloto, Camila [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Tres, Anderson [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Matematica Aplicada

    2015-05-15

    A nodal formulation of a fixed-source two-dimensional neutron transport problem, in Cartesian geometry, defined in a heterogeneous medium, is solved by an analytical approach. Explicit expressions, in terms of the spatial variables, are derived for averaged fluxes in each region in which the domain is subdivided. The procedure is an extension of an analytical discrete ordinates method, the ADO method, for the solution of the two-dimensional homogeneous medium case. The scheme is developed from the discrete ordinates version of the two-dimensional transport equation along with the level symmetric quadrature scheme. As usual for nodal schemes, relations between the averaged fluxes and the unknown angular fluxes at the contours are introduced as auxiliary equations. Numerical results are in agreement with results available in the literature.

  3. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Science.gov (United States)

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood as the reference markers of bone turnover for fracture risk prediction and the monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily because of their poor within-subject and between-lab reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce the pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of available publications and on pragmatic considerations. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I for its greater sample stability. Sample collection conditions for PINP are less critical, as PINP has minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. The uncontrollable factors (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation. Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical variability.

  4. Analytical Chemistry Laboratory progress report for FY 1985

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  5. Data Analytics in CRM Processes: A Literature Review

    Directory of Open Access Journals (Sweden)

    Gončarovs Pāvels

    2017-12-01

    Nowadays, the data scarcity problem has been supplanted by the data deluge problem. Marketers and Customer Relationship Management (CRM) specialists have access to rich data on consumer behaviour. The current challenge is the effective utilisation of these data in CRM processes and the selection of appropriate data analytics techniques. Data analytics techniques help find hidden patterns in data. The present paper explores the characteristics of data analytics as an integrated tool in CRM for sales managers. The paper aims at analysing some of the different analytics methods and tools which can be used for the continuous improvement of CRM processes. A systematic literature review was conducted to achieve this goal. The results of the review highlight the most frequently considered CRM processes in the context of data analytics.

  6. Analytical Chemistry Laboratory progress report for FY 1985

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab

  7. Analytical and numerical study of validation test-cases for multi-physic problems: application to magneto-hydro-dynamic

    Directory of Open Access Journals (Sweden)

    D Cébron

    2016-04-01

    The present paper is concerned with the numerical simulation of Magneto-Hydro-Dynamic (MHD) problems with industrial tools. MHD received attention some twenty to thirty years ago as a possible alternative in propulsion applications; MHD-propelled ships were even designed for that purpose. However, such propulsion systems proved to be of low efficiency, and fundamental research in the area has progressively received much less attention over the past decades. Numerical simulation of MHD problems could, however, provide interesting solutions in the field of turbulent flow control. The development of recent efficient numerical techniques for multi-physic applications provides promising tools for the engineer for that purpose. In the present paper, some elementary test cases in laminar flow with magnetic forcing terms are analysed; the equations of the coupled problem are exposed, analytical solutions are derived in each case, and they are compared to numerical solutions obtained with a numerical tool for multi-physic applications. The present work can be seen as a validation of numerical tools (based on the finite element method) for academic as well as industrial application purposes.
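    One classic analytical MHD validation case of the kind the paper describes (though not necessarily one of its test cases) is laminar Hartmann channel flow, whose exact velocity profile u(y) = (cosh Ha - cosh(Ha*y)) / (cosh Ha - 1) satisfies u'' - Ha^2 u = const with u(+-1) = 0. A quick finite-difference check confirms the residual is constant across the channel:

```python
import math

Ha = 5.0  # Hartmann number (assumed for illustration)

def u(y):
    """Exact Hartmann velocity profile on the channel y in [-1, 1]."""
    return (math.cosh(Ha) - math.cosh(Ha * y)) / (math.cosh(Ha) - 1)

h = 1e-4  # finite-difference step

def residual(y):
    """u'' - Ha^2 u, which the exact profile makes constant in y."""
    upp = (u(y - h) - 2 * u(y) + u(y + h)) / h ** 2
    return upp - Ha ** 2 * u(y)

vals = [residual(y) for y in (-0.8, -0.3, 0.0, 0.4, 0.9)]
spread = max(vals) - min(vals)
print(f"residual ~ {vals[0]:.3f}, spread across the channel {spread:.2e}")
```

    A numerical MHD solver validated against such a case should reproduce the same constant forcing balance to within discretization error.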

  8. Boundary value problems of finite elasticity local theorems on existence, uniqueness, and analytic dependence on data

    CERN Document Server

    Valent, Tullio

    1988-01-01

    In this book I present, in a systematic form, some local theorems on existence, uniqueness, and analytic dependence on the load, which I have recently obtained for some types of boundary value problems of finite elasticity. Actually, these results concern an n-dimensional (n ≥ 1) formal generalization of three-dimensional elasticity. Such a generalization, besides being quite spontaneous, allows us to consider a great many interesting mathematical situations, and sometimes allows us to clarify certain aspects of the three-dimensional case. Part of the matter presented is unpublished; other arguments have been only partially published and in lesser generality. Note that I concentrate on simultaneous local existence and uniqueness; thus, I do not deal with the more general theory of existence. Moreover, I restrict my discussion to compressible elastic bodies and I do not treat unilateral problems. The clever use of the inverse function theorem in finite elasticity made by STOPPELLI [1954, 1957a, 1957b]...

  9. Optimization of an analytical electron microscope for x-ray microanalysis: instrumental problems

    International Nuclear Information System (INIS)

    Bentley, J.; Zaluzec, N.J.; Kenik, E.A.; Carpenter, R.W.

    1979-01-01

    The addition of an energy dispersive x-ray spectrometer to a modern transmission or scanning transmission electron microscope can provide a powerful tool in the characterization of materials. Unfortunately, this seemingly simple modification can lead to a host of instrumental problems with respect to the accuracy, validity, and quality of the recorded information. This tutorial reviews the complications which can arise in performing x-ray microanalysis in current analytical electron microscopes. The first topic treated in depth is fluorescence by uncollimated radiation. The source, distinguishing characteristics, effects on quantitative analysis, and schemes for elimination or minimization as applicable to TEM/STEMs, D-STEMs and HVEMs are discussed. The local specimen environment is considered in the second major section, where again detrimental effects on quantitative analysis and remedial procedures, particularly the use of low-background specimen holders, are highlighted. Finally, the detrimental aspects of specimen contamination, insofar as they affect x-ray microanalysis, are discussed. It is concluded that if the described preventive measures are implemented, reliable quantitative analysis is possible

  10. Problem Gambling in a Sample of Older Adult Casino Gamblers.

    Science.gov (United States)

    van der Maas, Mark; Mann, Robert E; McCready, John; Matheson, Flora I; Turner, Nigel E; Hamilton, Hayley A; Schrans, Tracy; Ialomiteanu, Anca

    2017-01-01

    As older adults continue to make up a greater proportion of the Canadian population, it becomes more important to understand the implications that their leisure activities have for their physical and mental health. Gambling, in particular, is a form of leisure that is becoming more widely available and has important implications for the mental health and financial well-being of older adults. This study examines a large sample (2103) of casino-going Ontarian adults over the age of 55 and identifies those features of their gambling participation that are associated with problem gambling. Logistic regression analysis is used to analyze the data. Focusing on types of gambling participated in and motivations for visiting the casino, this study finds that several forms of gambling and motivations to gamble are associated with greater risk of problem gambling. It also finds that some motivations are associated with lower risk of problem gambling. The findings of this study have implications related to gambling availability within an aging population.
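    The study's actual model and variables are not reproduced here; as a sketch of the kind of analysis described, the following fits a logistic regression on simulated data relating a hypothetical binary motivation indicator to problem-gambling status and reports an odds ratio:

```python
import math, random

# Simulated data (all rates assumed): 500 respondents with the motivation
# indicator (x = 1, ~30% problem gambling) and 500 without (x = 0, ~10%).
random.seed(0)
data = [(1, 1) if random.random() < 0.30 else (1, 0) for _ in range(500)] + \
       [(0, 1) if random.random() < 0.10 else (0, 0) for _ in range(500)]

# Fit intercept b0 and slope b1 by gradient ascent on the log-likelihood.
b0, b1, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    g0 = g1 = 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(b0 + b1 * x)))
        g0 += y - p
        g1 += (y - p) * x
    b0 += lr * g0 / len(data)
    b1 += lr * g1 / len(data)

print(f"odds ratio for the motivation indicator: {math.exp(b1):.2f}")
```

    An odds ratio above 1 marks a motivation associated with greater risk of problem gambling, and below 1 a motivation associated with lower risk, mirroring the two directions of association the study reports.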

  11. Matrix effects break the LC behavior rule for analytes in LC-MS/MS analysis of biological samples.

    Science.gov (United States)

    Fang, Nianbai; Yu, Shanggong; Ronis, Martin Jj; Badger, Thomas M

    2015-04-01

    High-performance liquid chromatography (HPLC) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) are generally accepted as the preferred techniques for detecting and quantitating analytes of interest in biological matrices, on the basis of the rule that one chemical compound yields one LC-peak with a reliable retention time (Rt.). However, in the current study, we found that under the same LC-MS conditions, the Rt. and shape of the LC-peaks of bile acids in urine samples from animals fed dissimilar diets differed significantly from one another. To verify this matrix effect, 17 authentic bile acid standards were dissolved in pure methanol or in methanol containing extracts of urine from pigs consuming either breast milk or infant formula and analyzed by LC-MS/MS. The matrix components in urine from piglets fed formula significantly reduced the LC-peak Rt. and areas of bile acids. This is the first characterization of this matrix effect on Rt. in the literature. Moreover, the matrix effect resulted in unexpected LC behavior: a single compound yielded two LC-peaks, breaking the rule of one LC-peak per compound. The three bile acid standards that exhibited this unconventional LC behavior were chenodeoxycholic acid, deoxycholic acid, and glycocholic acid. One possible explanation for this effect is that some matrix components may have loosely bonded to analytes, changing the time the analytes were retained on the chromatography column and interfering with their ionization in the MS ion source, thereby altering the peak areas.
    This study indicates that a comprehensive understanding of matrix effects is needed to improve the use of HPLC and LC-MS/MS techniques for qualitative and quantitative analyses of analytes in pharmacokinetics, proteomics/metabolomics, drug development, and sports drug testing, especially when LC-MS/MS data are analyzed by automation software in which identification of an analyte is based on its exact molecular weight and Rt.

  12. Analysis of water and soil from the wetlands of Upper Three Runs Creek. Volume 2A, Analytical data packages September--October 1991 sampling

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, L.A.; Rogers, V.A. [Westinghouse Savannah River Co., Aiken, SC (United States); Riordan, C.J. [Metcalf and Eddy, Inc. (United States); Eidson, G.W.; Herring, M.K. [Normandeau Associates, Inc. (United States)

    1992-08-01

    Shallow water and soils along Upper Three Runs Creek (UTRC) and associated wetlands between SRS Road F and Cato Road were sampled for nonradioactive and radioactive constituents. The sampling program is associated with risk evaluations being performed for various regulatory documents in these areas of the Savannah River Site (SRS). WSRC selected fifty sampling sites bordering the Mixed Waste Management Facility (MWMF), F- and H-Area Seepage Basins (FHSB), and the Sanitary Landfill (SL). The analytical results from this study provided information on the water and soil quality in UTRC and its associated wetlands. The analytical results from this investigation indicated that the primary constituents and radiological indicators detected in the shallow water and soils were tritium, gross alpha, radium 226, total radium and strontium 90. This investigation involved the collection of shallow water samples during the Fall of 1991 and the Spring of 1992 at fifty (50) sampling locations. Sampling was performed during these periods to incorporate high and low water table periods. Samples were collected from three sections along UTRC denoted as Phase I (MWMF), Phase II (FHSB) and Phase III (SL). One vibracored soil sample was also collected in each phase during the Fall of 1991. This document consists solely of the experimental data obtained from the sampling procedures.

  13. Analysis of water and soil from the wetlands of Upper Three Runs Creek. Volume 2B: Analytical data packages, January--February 1992 sampling

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, L.A.; Rogers, V.A. [Westinghouse Savannah River Co., Aiken, SC (United States); Riordan, C.J. [Metcalf and Eddy (United States); Eidson, G.W.; Herring, M.K. [Normandeau Associates, Inc., Aiken, SC (United States)

    1992-08-01

Shallow water and soils along Upper Three Runs Creek (UTRC) and associated wetlands between SRS Road F and Cato Road were sampled for nonradioactive and radioactive constituents. The sampling program is associated with risk evaluations being performed for various regulatory documents in these areas of the Savannah River Site (SRS). WSRC selected fifty sampling sites bordering the Mixed Waste Management Facility (MWMF), F- and H-Area Seepage Basins (FHSB), and the Sanitary Landfill (SL). The analytical results from this study provided information on the water and soil quality in UTRC and its associated wetlands. The analytical results from this investigation indicated that the primary constituents and radiological indicators detected in the shallow water and soils were tritium, gross alpha, radium-226, total radium and strontium-90. This investigation involved the collection of shallow water samples during the Fall of 1991 and the Spring of 1992 at fifty (50) sampling locations. Sampling was performed during these periods to incorporate high and low water table periods. Samples were collected from three sections along UTRC denoted as Phase I (MWMF), Phase II (FHSB) and Phase III (SL). One vibracored soil sample was also collected in each phase during the Fall of 1991. This document consists of experimental data obtained from the sampling procedures.

  14. Analytical Chemistry Laboratory. Progress report for FY 1996

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    1996-12-01

The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  15. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    Science.gov (United States)

    Pohorille, Andrew

    2006-01-01

The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described
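
The parallel tempering scheme mentioned in this abstract can be sketched in a few lines. This is a minimal illustration, not code from the paper: the double-well density, the temperature ladder, and the proposal width are all invented for the example.

```python
import math
import random

random.seed(0)

def U(x):
    # double-well potential with minima at x = +/-1 and a barrier at x = 0
    return (x * x - 1.0) ** 2

temps = [0.05, 0.3, 1.0, 3.0]   # illustrative temperature ladder
x = [1.0] * len(temps)          # one walker per temperature
cold = []                       # samples from the lowest-temperature chain

for _ in range(10000):
    # ordinary Metropolis move within each chain
    for i, T in enumerate(temps):
        prop = x[i] + random.uniform(-0.5, 0.5)
        if random.random() < math.exp(min(0.0, -(U(prop) - U(x[i])) / T)):
            x[i] = prop
    # occasional exchange between neighbouring temperatures (Metropolis criterion)
    i = random.randrange(len(temps) - 1)
    log_a = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (U(x[i]) - U(x[i + 1]))
    if random.random() < math.exp(min(0.0, log_a)):
        x[i], x[i + 1] = x[i + 1], x[i]
    cold.append(x[0])
```

Unaided, the cold chain would stay trapped in one well; the exchanges with hotter chains let it visit both modes, which is the point of the method.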

  16. Analytical Validation of a New Enzymatic and Automatable Method for d-Xylose Measurement in Human Urine Samples

    Directory of Open Access Journals (Sweden)

    Israel Sánchez-Moreno

    2017-01-01

Full Text Available Hypolactasia, or intestinal lactase deficiency, affects more than half of the world population. Currently, xylose quantification in urine after gaxilose oral administration for the noninvasive diagnosis of hypolactasia is performed with the hand-operated nonautomatable phloroglucinol reaction. This work demonstrates that a new enzymatic xylose quantification method, based on the activity of xylose dehydrogenase from Caulobacter crescentus, represents an excellent alternative to the manual phloroglucinol reaction. The new method is automatable and facilitates the use of the gaxilose test for hypolactasia diagnosis in clinical practice. The analytical validation of the new technique was performed in three different autoanalyzers, using buffer or urine samples spiked with different xylose concentrations. For the comparison between the phloroglucinol and the enzymatic assays, 224 urine samples from patients to whom the gaxilose test had been prescribed were assayed by both methods. A mean bias of −16.08 mg of xylose was observed when comparing the results obtained by both techniques. After adjusting the cut-off of the enzymatic method to 19.18 mg of xylose, the Kappa coefficient was found to be 0.9531, indicating an excellent level of agreement between the two analytical procedures. This new assay represents the first automatable enzymatic technique validated for xylose quantification in urine.

  17. Applications of Asymptotic Sampling on High Dimensional Structural Dynamic Problems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

The paper represents application of the asymptotic sampling on various structural models subjected to random excitations. A detailed study on the effect of different distributions of the so-called support points is performed. This study shows that the distribution of the support points has considerable influence on the accuracy of the results. Next, the method is applied on different cases of linear and nonlinear systems with a large number of random variables representing the dynamic excitation. The results show that asymptotic sampling is capable of providing good approximations of low failure probability events for very high dimensional reliability problems in structural dynamics.
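
The asymptotic sampling idea summarized in this abstract (estimate the reliability index with input standard deviations inflated by a factor f, then extrapolate back to the nominal scale f = 1) can be sketched as follows. The linear limit state, scale factors, and sample sizes are illustrative assumptions, and the extrapolation uses the commonly quoted model beta(f) = A*f + B/f:

```python
import math
import random
from statistics import NormalDist

random.seed(1)
inv_cdf = NormalDist().inv_cdf
n_dim, beta_true, n_samples = 20, 4.0, 20000

def g(u):
    # linear limit state in standard normal space; failure when g(u) < 0;
    # by construction the exact reliability index equals beta_true
    return beta_true - sum(u) / math.sqrt(len(u))

def beta_hat(f):
    # reliability index estimated with all standard deviations inflated by f
    fails = sum(
        g([random.gauss(0.0, f) for _ in range(n_dim)]) < 0
        for _ in range(n_samples)
    )
    return -inv_cdf(fails / n_samples)

fs = [2.0, 2.5, 3.0]
bs = [beta_hat(f) for f in fs]

# least-squares fit of beta(f) = A*f + B/f, then evaluate at f = 1
a11 = sum(f * f for f in fs)
a12 = float(len(fs))                 # sum of f * (1/f)
a22 = sum(1.0 / (f * f) for f in fs)
b1 = sum(f * b for f, b in zip(fs, bs))
b2 = sum(b / f for f, b in zip(fs, bs))
det = a11 * a22 - a12 * a12
A = (b1 * a22 - b2 * a12) / det
B = (a11 * b2 - a12 * b1) / det
beta_estimate = A + B                # extrapolated beta at the nominal scale
```

The inflated-sigma runs see failure probabilities large enough to estimate by crude Monte Carlo, while the extrapolation recovers the small nominal failure probability.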

  18. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting the required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually, plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments, and to prepare a validated RCRA-type data package.

  19. Analytical solution for Van der Pol-Duffing oscillators

    International Nuclear Information System (INIS)

    Kimiaeifar, A.; Saidi, A.R.; Bagheri, G.H.; Rahimpour, M.; Domairry, D.G.

    2009-01-01

In this paper, the single-well, double-well and double-hump Van der Pol-Duffing oscillators are studied. The governing equation is solved analytically, for the first time, using a new kind of analytic technique for nonlinear problems, the Homotopy Analysis Method (HAM). The present solution gives an expression that can be used over a wide range of time for the whole domain of the response. Comparisons of the obtained solutions with numerical results show that this method is effective and convenient, and a capable tool for solving this kind of nonlinear problem.
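
A numerical reference solution, against which an analytic approximation such as HAM could be checked, is easy to generate. One common form of the equation, x'' - mu(1 - x^2)x' + alpha*x + beta*x^3 = 0, is integrated below with classical RK4; the parameter values and initial condition are illustrative, not taken from the paper.

```python
# Van der Pol-Duffing oscillator, integrated with fourth-order Runge-Kutta.
mu, alpha, beta = 1.0, 1.0, 0.5          # illustrative parameters

def f(state):
    x, v = state
    return (v, mu * (1.0 - x * x) * v - alpha * x - beta * x ** 3)

def rk4(state, h):
    k1 = f(state)
    k2 = f((state[0] + h / 2 * k1[0], state[1] + h / 2 * k1[1]))
    k3 = f((state[0] + h / 2 * k2[0], state[1] + h / 2 * k2[1]))
    k4 = f((state[0] + h * k3[0], state[1] + h * k3[1]))
    return (state[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

h, state, xs = 0.01, (0.1, 0.0), []
for _ in range(20000):                   # integrate to t = 200
    state = rk4(state, h)
    xs.append(state[0])
```

Starting from a small disturbance, the self-excitation term grows the motion onto a bounded limit cycle, the qualitative behavior an analytic solution must reproduce.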

  20. A review of novel strategies of sample preparation for the determination of antibacterial residues in foodstuffs using liquid chromatography-based analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Marazuela, M.D., E-mail: marazuela@quim.ucm.es [Department of Analytical Chemistry, Faculty of Chemistry, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Bogialli, S [Department of Chemistry, University of Rome ' La Sapienza' , Piazza Aldo Moro, 5 00185 Rome (Italy)

    2009-07-10

The determination of trace residues and contaminants in food has been of growing concern over the past few years. Residual antibacterials in food constitute a risk to human health, especially because they can contribute to the transmission of antibiotic-resistant pathogenic bacteria through the food chain. Therefore, to ensure food safety, EU and USA regulatory agencies have established lists of forbidden or banned substances and tolerance levels for authorized veterinary drugs (e.g. antibacterials). In addition, the EU Commission Decision 2002/657/EC has set requirements for the performance of analytical methods for the determination of veterinary drug residues in food and feedstuffs. During the past years, the use of powerful mass spectrometric detectors in combination with innovative chromatographic technologies has solved many problems related to the sensitivity and selectivity of this type of analysis. However, sample preparation still remains the bottleneck step, mainly in terms of analysis time and sources of error. This review, covering research published between 2004 and 2008, provides an updated overview of recent trends in sample preparation for the determination of antibacterial residues in foods, with special emphasis on on-line, high-throughput, multi-class methods, and includes several applications in detail.

  1. Double-contained receiver tank 244-TX, grab samples, 244TX-97-1 through 244TX-97-3 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

This document is the final report for the double-contained receiver tank (DCRT) 244-TX grab samples. Three grab samples were collected from riser 8 on May 29, 1997. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The analytical results are presented in a table.

  2. Applying analytical ultracentrifugation to nanocrystal suspensions

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Jennifer A; Krueger, Karl M; Mayo, J T; Yavuz, Cafer T; Redden, Jacina J; Colvin, Vicki L, E-mail: colvin@rice.ed [Department of Chemistry, Rice University, 6100 Main Street, MS-60, Houston, TX 77005 (United States)

    2009-09-02

While applied frequently in physical biochemistry to the study of protein complexes, the quantitative use of analytical ultracentrifugation (AUC) for nanocrystal analysis is relatively rare. Its application in nanoscience is potentially very powerful as it provides a measure of nanocrystal density, size and structure directly in the solution phase. Towards that end, this paper examines the best practices for applying data collection and analysis methods for AUC, geared towards the study of biomolecules, to the unique problems of nanoparticle analysis. Using uniform nanocrystals of cadmium selenide, we compared several schemes for analyzing raw sedimentation data. Comparable values of the mean sedimentation coefficients (s-values) were found using several popular analytical approaches; however, the distribution in sample s-values is best captured using the van Holde-Weischet algorithm. Measured s-values could be reproducibly collected if sample temperature and concentration were controlled; under these circumstances, the variability for average sedimentation values was typically 5%. The full shape of the distribution in s-values, however, is not easily subjected to quantitative interpretation. Moreover, the selection of the appropriate sedimentation speed is crucial for AUC of nanocrystals as the density of inorganic nanocrystals is much larger than that of solvents. Quantitative analysis of sedimentation properties will allow for better agreement between experimental and theoretical models of nanocrystal solution behavior, as well as providing deeper insight into the hydrodynamic size and solution properties of nanomaterials.
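
As a rough orientation for why the dense inorganic core dominates sedimentation, the s-value of an idealized spherical particle follows from the Svedberg/Stokes relation s = d^2 (rho_p - rho_s) / (18 eta). The sketch below uses assumed values (a 5 nm bare CdSe sphere in water), not data from the paper:

```python
def sedimentation_coefficient(d, rho_p, rho_s, eta):
    """Sphere sedimentation coefficient s = d^2 (rho_p - rho_s) / (18 eta).

    All inputs in SI units (m, kg/m^3, Pa*s); result in seconds.
    """
    return d ** 2 * (rho_p - rho_s) / (18.0 * eta)

# assumed values: 5 nm CdSe sphere (~5810 kg/m^3) in water at 25 C
s = sedimentation_coefficient(5e-9, 5810.0, 998.0, 8.9e-4)
s_svedberg = s / 1e-13          # 1 Svedberg = 1e-13 s
```

Even such a small particle sediments at tens of Svedbergs, which is why rotor-speed selection is more delicate for nanocrystals than for typical proteins.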

  3. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

Full Text Available Analytical chemistry, like other areas of science, has experienced a big change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Now, nanotechnology is increasingly proving to be a powerful ally of analytical chemistry in achieving its objectives and simplifying analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly in sample preparation techniques (magnetic solid phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotube, graphene, polymer, octadecylsilane) and their automation, in microextraction techniques, in enantioseparation and in chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation and chemosensors. Also, some selected articles recently published (2010-2016) have been reviewed and discussed.

  4. Improving Creative Problem-Solving in a Sample of Third Culture Kids

    Science.gov (United States)

    Lee, Young Ju; Bain, Sherry K.; McCallum, R. Steve

    2007-01-01

    We investigated the effects of divergent thinking training (with explicit instruction) on problem-solving tasks in a sample of Third Culture Kids (Useem and Downie, 1976). We were specifically interested in whether the children's originality and fluency in responding increased following instruction, not only on classroom-based worksheets and the…

  5. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  6. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  7. Problems of accuracy and sources of error in trace analysis of elements

    International Nuclear Information System (INIS)

    Porat, Ze'ev.

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  8. Problems of accuracy and sources of error in trace analysis of elements

    Energy Technology Data Exchange (ETDEWEB)

Porat, Ze'ev

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  9. ANALYTICAL SYNTHESIS OF CHEMICAL REACTOR CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    Alexander Labutin

    2017-02-01

Full Text Available The problem of the analytical synthesis of a synergetic control system for a chemical reactor realizing a complex series-parallel exothermic reaction has been solved. The control laws are synthesized using the analytical design of aggregated regulators method. The synthesized nonlinear control system stabilizes the concentration of the target component at the reactor exit and also enables automatic transfer of the equipment to a new production mode.

  10. Multiple Solutions of Nonlinear Boundary Value Problems of Fractional Order: A New Analytic Iterative Technique

    Directory of Open Access Journals (Sweden)

    Omar Abu Arqub

    2014-01-01

Full Text Available The purpose of this paper is to present a new kind of analytical method, the so-called residual power series, to predict and represent the multiplicity of solutions to nonlinear boundary value problems of fractional order. The present method is capable of calculating all branches of solutions simultaneously, even if these multiple solutions are very close and thus rather difficult to distinguish even by numerical techniques. To verify the computational efficiency of the proposed technique, two nonlinear models are solved, one arising in mixed convection flows and the other in heat transfer, both of which admit multiple solutions. The results reveal that the method is very effective, straightforward, and powerful for formulating these multiple solutions.

  11. Analytic Bayesian solution of the two-stage poisson-type problem in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Frohner, F.H.

    1985-01-01

The basic purpose of probabilistic risk analysis is to make inferences about the probabilities of various postulated events, taking into account all relevant information such as prior knowledge and operating experience with the specific system under study, as well as experience with other similar systems. Estimation of the failure rate of a Poisson-type system leads to an especially simple Bayesian solution in closed form if the prior probability implied by the invariance properties of the problem is properly taken into account. This basic simplicity persists if a more realistic prior, representing order-of-magnitude knowledge of the rate parameter, is employed instead. Moreover, the more realistic prior allows direct incorporation of experience gained from other similar systems, without need to postulate a statistical model for an underlying ensemble. The analytic formalism is applied to actual nuclear reactor data.
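
The closed-form character of such solutions comes from gamma-Poisson conjugacy, which can be sketched in a few lines. The prior parameters and the observed data below are hypothetical, chosen only to illustrate the update:

```python
def posterior(a0, b0, k, t):
    """Gamma(a0, b0) prior on a Poisson rate; k failures in time t.

    Returns the posterior shape, rate, mean and variance in closed form:
    posterior is Gamma(a0 + k, b0 + t).
    """
    a, b = a0 + k, b0 + t
    return a, b, a / b, a / b ** 2

# vague Jeffreys-type prior Gamma(0.5, 0) and 2 failures in 1000 h (hypothetical)
a, b, mean, var = posterior(0.5, 0.0, 2, 1000.0)
```

Experience from similar systems can be folded in simply by choosing a0 and b0 as if those failures and operating hours had already been observed.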

  12. Symptoms and problems in a nationally representative sample of advanced cancer patients

    DEFF Research Database (Denmark)

    Johnsen, Anna Thit; Petersen, Morten Aagaard; Pedersen, Lise

    2009-01-01

Little is known about the need for palliative care among advanced cancer patients who are not in specialist palliative care. The purpose was to identify prevalence and predictors of symptoms and problems in a nationally representative sample of Danish advanced cancer patients. In total, 977 (60%) patients participated. The most frequent symptoms/problems were fatigue (57%; severe 22%) followed by reduced role function, insomnia and pain. Age, cancer stage, primary tumour, type of department, marital status and whether the patient had recently been hospitalized or not were associated with several symptoms and problems. This is probably the first nationally representative study of its kind. It shows that advanced cancer patients in Denmark have symptoms and problems that deserve attention and that some patient groups are especially at risk.

  13. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 1, Administrative

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, single shell tank (PST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware.

  14. SALE: Safeguards Analytical Laboratory Evaluation computer code

    International Nuclear Information System (INIS)

    Carroll, D.J.; Bush, W.J.; Dolan, C.A.

    1976-09-01

The Safeguards Analytical Laboratory Evaluation (SALE) program implements an industry-wide quality control and evaluation system aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated, and each participant is informed of the accuracy and precision of his results in a timely manner. The SALE computer code which produces the report is designed to facilitate rapid transmission of this information in order that meaningful quality control will be provided. Various statistical techniques comprise the output of the SALE computer code. Assuming an unbalanced nested design, an analysis of variance is performed in subroutine NEST, resulting in a test of significance for time and analyst effects. A trend test is performed in subroutine TREND. Microfilm plots are obtained from subroutine CUMPLT. Within-laboratory standard deviations are calculated in the main program or subroutine VAREST, and between-laboratory standard deviations are calculated in SBLV. Other statistical tests are also performed. Up to 1,500 pieces of data for each nuclear material sampled by 75 (or fewer) laboratories may be analyzed with this code. The input deck necessary to run the program is shown, and input parameters are discussed in detail. Printed output and microfilm plot output are described. Output from a typical SALE run is included as a sample problem.
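
The within- and between-laboratory standard deviations that SALE reports can be illustrated with a toy calculation; the replicate results below are invented for the example:

```python
from statistics import mean

# hypothetical replicate uranium-concentration results (wt%) from three labs
labs = {
    "lab_A": [87.1, 87.3, 87.2],
    "lab_B": [86.8, 86.9, 87.0],
    "lab_C": [87.5, 87.4, 87.6],
}

def within_sd(groups):
    """Pooled within-laboratory standard deviation."""
    num = 0.0
    dof = 0
    for g in groups:
        m = mean(g)
        num += sum((x - m) ** 2 for x in g)
        dof += len(g) - 1
    return (num / dof) ** 0.5

def between_sd(groups):
    """Standard deviation of the laboratory means."""
    means = [mean(g) for g in groups]
    grand = mean(means)
    return (sum((m - grand) ** 2 for m in means) / (len(means) - 1)) ** 0.5

sw = within_sd(labs.values())
sb = between_sd(labs.values())
```

Here the spread between laboratories (sb) is several times the replicate scatter within a laboratory (sw), the kind of signal such an interlaboratory program is designed to surface.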

  15. A new analytical application of nylon-induced room-temperature phosphorescence: Determination of thiabendazole in water samples

    Energy Technology Data Exchange (ETDEWEB)

    Correa, R.A. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario, Suipacha 531 (2000) Rosario (Argentina); Escandar, G.M. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario, Suipacha 531 (2000) Rosario (Argentina)]. E-mail: gescanda@fbioyf.unr.edu.ar

    2006-06-30

This paper discusses the first analytical determination of the widely used fungicide thiabendazole by nylon-induced phosphorimetry. Nylon was investigated as a novel solid matrix for inducing room-temperature phosphorescence of thiabendazole, which was enhanced under the effect of external heavy-atom salts. Among the investigated salts, lead(II) acetate was the most effective in yielding a high phosphorescence signal. An additional enhancement of the phosphorescence emission was attained when the measurements were carried out under a nitrogen atmosphere. There was only a moderate increase in the presence of cyclodextrins. The room-temperature phosphorescence lifetimes of the adsorbed thiabendazole were measured under different working conditions and, in all cases, two decaying components were detected. On the basis of the obtained results, a very simple and sensitive phosphorimetric method for the determination of thiabendazole was established. The analytical figures of merit obtained under the best experimental conditions were: linear calibration range from 0.031 to 0.26 µg ml⁻¹ (the lowest value corresponds to the quantitation limit), relative standard deviation of 2.4% (n = 5) at a level of 0.096 µg ml⁻¹, and limit of detection, calculated according to the 1995 IUPAC Recommendations, equal to 0.010 µg ml⁻¹ (0.03 ng/spot). The potential interference from common agrochemicals was also studied. The feasibility of determining thiabendazole in real samples was successfully evaluated through the analysis of spiked river, tap and mineral water samples.
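
Figures of merit of this kind are conventionally derived from the calibration slope and the blank noise (3-sigma and 10-sigma criteria for detection and quantitation, respectively). The slope and blank standard deviation in this sketch are hypothetical, chosen only to show the familiar roughly 3:10 ratio between LOD and LOQ:

```python
def detection_limits(slope, s_blank):
    """LOD and LOQ from a calibration slope and blank standard deviation."""
    lod = 3.0 * s_blank / slope     # limit of detection, 3-sigma criterion
    loq = 10.0 * s_blank / slope    # limit of quantitation, 10-sigma criterion
    return lod, loq

# hypothetical slope (signal units per ug/ml) and blank noise (signal units)
lod, loq = detection_limits(slope=120.0, s_blank=0.4)
```

With these assumed inputs, LOD is 0.010 and LOQ 0.033 ug/ml; note the abstract's own pair (0.010 and 0.031 ug/ml) shows the same proportion.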

  16. Solving multi-objective facility location problem using the fuzzy analytical hierarchy process and goal programming: a case study on infectious waste disposal centers

    Directory of Open Access Journals (Sweden)

    Narong Wichapa

Full Text Available The selection of a suitable location for infectious waste disposal is one of the major problems in waste management. Determining the location of infectious waste disposal centers is a difficult and complex process because it requires combining social and environmental factors that are hard to interpret, and cost factors that require the allocation of resources. Additionally, it depends on several regulations. Based on the actual conditions of a case study (forty hospitals and three candidate municipalities in the sub-Northeast region of Thailand), we considered multiple factors such as infrastructure, geological and social & environmental factors, calculating global priority weights using the fuzzy analytical hierarchy process (FAHP). After that, a new multi-objective facility location problem model which combines FAHP and goal programming (GP), namely the FAHP-GP model, was tested. The proposed model can lead to the selection of new suitable locations for infectious waste disposal by considering both total cost and final priority weight objectives. The novelty of the proposed model is the simultaneous combination of relevant factors that are difficult to interpret and cost factors that require the allocation of resources. Keywords: Multi-objective facility location problem, Fuzzy analytic hierarchy process, Infectious waste disposal centers
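
The priority-weight step of AHP-type methods can be sketched with the geometric-mean (row) approximation on a pairwise comparison matrix. The crisp matrix below is purely illustrative; a fuzzy AHP as used in the paper would replace the entries with fuzzy numbers and defuzzify before this step:

```python
import math

# hypothetical pairwise comparisons (Saaty 1-9 scale) of three site criteria:
# infrastructure, geological, social & environmental
A = [
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
]

# geometric mean of each row, normalized to unit sum, approximates the
# principal-eigenvector priority weights
gm = [math.prod(row) ** (1.0 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]
```

The resulting weights can then serve as coefficients of the priority-weight objective in a goal-programming location model.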

  17. Gauge field geometry from complex and harmonic analyticities

    International Nuclear Information System (INIS)

    Gal'perin, A.S.; Ivanov, E.A.; Ogievetsky, V.I.; Sokatchev, E.

    1987-01-01

The analyticity preservation principle is employed to demonstrate an impressive affinity between field theories with intrinsic analytic structure and superfield gauge theories. The defining constraints of the former theories are interpreted as the integrability conditions for the existence of appropriate analytic subspaces and are solved by passing to the basis with manifest analyticity. We prefer to work within the analytic basis. This allows one, e.g., to replace the nonlinear splitting problem of the twistor approach by solving a linear equation.

  18. State of the art and trends of the analytical chemistry of traces of mercury

    International Nuclear Information System (INIS)

    Nitschke, L.; Scholz, F.; Henrion, G.

    1989-01-01

    The main developments in the field of mercury analysis over the last decade are reviewed. A short account of the history is given. New analytical methods and preconcentration techniques are considered with special reference to spectrophotometry, atomic absorption spectroscopy, neutron activation analysis, voltammetry, chromatography, and catalytic analysis. Problems of sampling and applications to environmental analysis are included. 148 refs. (author)

  19. Some analytical aspects about determination of Sr89 and Sr90 in environmental samples

    International Nuclear Information System (INIS)

    Gasco, C.; Alvarez Garcia, A.

    1988-01-01

    Some problems in the determination of Sr-89 and Sr-90 in environmental samples have been studied. The main difficulties are due to the wide range of concentrations of their components and the content of chemically and radiochemically interfering elements. The behaviour of strontium on ion exchange resin has been characterized by experiments in various media: aqueous media with variable calcium concentration and matrix. The differences in solubility of the alkaline-earth nitrates and carbonates in nitric acid have been analyzed. The chemical recovery in environmental samples has been determined. (Author)

  20. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, and dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU: ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity
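The contrast between closed-form moment propagation and scenario sampling can be illustrated on a toy linear dose model. This is not the APM pencil-beam algorithm itself; the dose-influence matrix and uncertainty model below are invented for the sketch:

```python
# Toy illustration: for a linear dose model d = D @ x with Gaussian input
# x ~ N(mu, Sigma), the first two moments of the dose follow in closed form,
# while random sampling only approximates them at extra cost. D, mu, and
# Sigma are hypothetical, not clinical data.
import numpy as np

rng = np.random.default_rng(0)

D = rng.random((4, 3))                 # hypothetical dose-influence matrix
mu = np.array([1.0, 2.0, 1.5])         # mean input (e.g. spot weights)
Sigma = np.diag([0.04, 0.09, 0.01])    # input (range/setup) covariance

# Closed-form "analytical" propagation of expectation and covariance.
d_mean = D @ mu
d_cov = D @ Sigma @ D.T

# Sampling-based reference, analogous to the 5000-sample benchmark.
x = rng.multivariate_normal(mu, Sigma, size=5000)
d_samples = x @ D.T

print(np.round(d_mean, 3))                   # exact dose expectation
print(np.round(d_samples.mean(axis=0), 3))   # Monte Carlo estimate of it
```

The nonlinear pencil-beam dose model in APM requires considerably more machinery, but the principle of propagating moments instead of scenarios is the same.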

  1. The antisocial family tree: family histories of behavior problems in antisocial personality in the United States.

    Science.gov (United States)

    Vaughn, Michael G; Salas-Wright, Christopher P; DeLisi, Matt; Qian, Zhengmin

    2015-05-01

    Multiple avenues of research (e.g., criminal careers, intergenerational family transmission, and epidemiological studies) have indicated a concentration of antisocial traits and behaviors that cluster among families and within individuals in a population. The current study draws on each of these perspectives in exploring the intergenerational contours of antisocial personality disorder across multiple generations of a large-scale epidemiological sample. The analytic sample of persons meeting criteria for antisocial personality disorder (N = 1,226) was derived from waves I and II of the National Epidemiologic Survey on Alcohol and Related Conditions. Path analytic, latent class, and multinomial models were executed to describe and elucidate family histories among persons diagnosed with antisocial personality disorder. Three classes of an antisocial family tree were found: minimal family history of problem behaviors (70.3% of the sample), characterized by higher socioeconomic functioning; parental and progeny behavior problems (9.4% of the sample), characterized by criminal behaviors, psychopathology, and substance use disorders; and multigenerational history of problem behaviors (20.3% of the sample), characterized by alcoholism, psychopathology, and versatile criminal offending. These findings add a typology to intergenerational studies of antisocial behavior that can assist in identifying etiological and treatment factors among those for whom crime runs in the family.

  2. Direct analysis of biological samples by total reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Lue M, Marco P.; Hernandez-Caraballo, Edwin A.

    2004-01-01

    The technique of total reflection X-ray fluorescence (TXRF) is well suited to the direct analysis of biological samples due to its low matrix interferences and simultaneous multi-element nature. Nevertheless, biological organic samples are frequently analysed after digestion procedures. The direct determination of analytes requires shorter analysis times, has low reagent consumption, and simplifies the whole analysis process. On the other hand, biological/clinical samples are often available only in minimal amounts, and routine studies require the analysis of a large number of samples. To overcome the difficulties associated with the analysis of organic samples, particularly solid ones, different procedures of sample preparation and calibration for approaching direct analysis have been evaluated: (1) slurry sampling, (2) Compton peak standardization, (3) in situ microwave digestion, (4) in situ chemical modification and (5) direct analysis with internal standardization. Examples of analytical methods developed by our research group are discussed. Some of them have not been previously published, illustrating alternative strategies for coping with various problems that may be encountered in direct analysis by total reflection X-ray fluorescence spectrometry

  3. Analytical Model of the Nonlinear Dynamics of Cantilever Tip-Sample Surface Interactions for Various Acoustic-Atomic Force Microscopies

    Science.gov (United States)

    Cantrell, John H., Jr.; Cantrell, Sean A.

    2008-01-01

    A comprehensive analytical model of the interaction of the cantilever tip of the atomic force microscope (AFM) with the sample surface is developed that accounts for the nonlinearity of the tip-surface interaction force. The interaction is modeled as a nonlinear spring coupled at opposite ends to linear springs representing cantilever and sample surface oscillators. The model leads to a pair of coupled nonlinear differential equations that are solved analytically using a standard iteration procedure. Solutions are obtained for the phase and amplitude signals generated by various acoustic-atomic force microscope (A-AFM) techniques including force modulation microscopy, atomic force acoustic microscopy, ultrasonic force microscopy, heterodyne force microscopy, resonant difference-frequency atomic force ultrasonic microscopy (RDF-AFUM), and the commonly used intermittent contact mode (TappingMode) generally available on AFMs. The solutions are used to obtain a quantitative measure of image contrast resulting from variations in the Young modulus of the sample for the amplitude and phase images generated by the A-AFM techniques. Application of the model to RDF-AFUM and intermittent soft contact phase images of LaRC-cp2 polyimide polymer is discussed. The model predicts variations in the Young modulus of the material of 24 percent from the RDF-AFUM image and 18 percent from the intermittent soft contact image. Both predictions are in good agreement with the literature value of 21 percent obtained from independent, macroscopic measurements of sheet polymer material.

  4. Problems of the heat transfer during the irradiation of solids

    International Nuclear Information System (INIS)

    Jahn, G.

    1981-03-01

    This report deals with thermal problems during the irradiation of solids. Analytical and constructive solutions are outlined through several examples. Two cases are considered: 1) the samples and the equipment warm up during irradiation and must therefore be cooled, which yields a negative heat flux direction; 2) the samples must be held at a suitable temperature above room temperature and must therefore be heated, which yields a positive heat flux direction. (BHO)

  5. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure, since it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. The basis for greening sample preparation and analytical methods is the elimination of sample treatment steps, together with a reduction in the amount of sample required, strong reductions in the consumption of hazardous reagents and energy, the avoidance of large amounts of organic solvents, and maximized safety for operators and the environment. In the last decade, the development and use of greener, sustainable microextraction techniques has emerged as an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid phase microextraction, stir bar sorptive extraction, hollow-fiber liquid phase microextraction, dispersive liquid-liquid microextraction, etc.) are presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques, which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. [Pre-analytical stability before centrifugation of 7 biochemical analytes in whole blood].

    Science.gov (United States)

    Perrier-Cornet, Andreas; Moineau, Marie-Pierre; Narbonne, Valérie; Plee-Gautier, Emmanuelle; Le Saos, Fabienne; Carre, Jean-Luc

    2015-01-01

    The pre-analytical stability of 7 biochemical parameters (parathyroid hormone (PTH), vitamins A, C, E and D, 1,25-dihydroxyvitamin D and insulin) at +4°C was studied on whole blood samples before centrifugation. The impact of freezing at -20°C was also analyzed for PTH and vitamin D. The assay results for whole blood samples from 9 healthy adults, kept for different lengths of time between sampling and analysis, were compared using a Student t test. The 7 analytes investigated remained stable for up to 4 hours at +4°C in whole blood. This study showed that it is possible to accept uncentrifuged whole blood specimens kept at +4°C before analysis. PTH is affected by freezing whereas vitamin D is not.

  7. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Naval Sea Systems Command, Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses); Schedule Analytics (Jennifer ...). The research comprised the following high-level steps: identify and review primary data sources ... However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  8. Big Data Analytics with Datalog Queries on Spark.

    Science.gov (United States)

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
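The recursion support discussed above rests on fixpoint evaluation of Datalog rules. A minimal sketch of the standard semi-naive strategy, shown here for the classic transitive-closure program `tc(X,Y) <- edge(X,Y). tc(X,Y) <- tc(X,Z), edge(Z,Y).` (the edge facts are invented for the example; this is not BigDatalog's own distributed implementation):

```python
# Semi-naive evaluation of transitive closure: each round joins only the
# facts derived in the previous round against edge/2, so old facts are
# never rederived. BigDatalog applies the same idea on top of Spark.
edges = {(1, 2), (2, 3), (3, 4)}

tc = set(edges)        # all derived tc facts so far
delta = set(edges)     # facts new in the last round
while delta:
    new = {(x, w) for (x, y) in delta for (z, w) in edges if y == z} - tc
    tc |= new
    delta = new

print(sorted(tc))
# → [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

The loop terminates because each round either adds new facts to the finite set of possible pairs or produces an empty delta.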

  9. Electrophoretic extraction of low molecular weight cationic analytes from sodium dodecyl sulfate containing sample matrices for their direct electrospray ionization mass spectrometry.

    Science.gov (United States)

    Kinde, Tristan F; Lopez, Thomas D; Dutta, Debashis

    2015-03-03

    While the use of sodium dodecyl sulfate (SDS) in separation buffers allows efficient analysis of complex mixtures, its presence in the sample matrix is known to severely interfere with the mass-spectrometric characterization of analyte molecules. In this article, we report a microfluidic device that addresses this analytical challenge by enabling inline electrospray ionization mass spectrometry (ESI-MS) of low molecular weight cationic samples prepared in SDS containing matrices. The functionality of this device relies on the continuous extraction of analyte molecules into an SDS-free solvent stream based on the free-flow zone electrophoresis (FFZE) technique prior to their ESI-MS analysis. The reported extraction was accomplished in our current work in a glass channel with microelectrodes fabricated along its sidewalls to realize the desired electric field. Our experiments show that a key challenge to successfully operating such a device is to suppress the electroosmotically driven fluid circulations generated in its extraction channel that otherwise tend to vigorously mix the liquid streams flowing through this duct. A new coating medium, N-(2-triethoxysilylpropyl) formamide, recently demonstrated by our laboratory to nearly eliminate electroosmotic flow in glass microchannels was employed to address this issue. Applying this surface modifier, we were able to efficiently extract two different peptides, human angiotensin I and MRFA, individually from an SDS containing matrix using the FFZE method and detect them at concentrations down to 3.7 and 6.3 μg/mL, respectively, in samples containing as much as 10 mM SDS. Notice that in addition to greatly reducing the amount of SDS entering the MS instrument, the reported approach allows rapid solvent exchange for facilitating efficient analyte ionization desired in ESI-MS analysis.

  10. Hot sample archiving. Revision 3

    International Nuclear Information System (INIS)

    McVey, C.B.

    1995-01-01

    This Engineering Study revision evaluated alternatives for retaining tank waste characterization analytical samples for the time period recommended by the Tank Waste Remediation Systems Program: storing 40 ml segment samples for approximately 18 months (6 months past the approval date of the Tank Characterization Report) and then compositing the core segment material in 125 ml containers for a period of five years. The study considers storage at the 222-S facility. It was determined that the critical storage problem is in the hot cell area. The 40 ml sample container holds approximately 3 times the amount of material required for a complete laboratory re-analysis. The final result is that 222-S can meet the sample archive storage requirements. At a 100% capture rate the capacity of the hot cell area is exceeded, but quick, inexpensive options are available to meet the requirements

  11. Pre-analytical and analytical validations and clinical applications of a miniaturized, simple and cost-effective solid phase extraction combined with LC-MS/MS for the simultaneous determination of catecholamines and metanephrines in spot urine samples.

    Science.gov (United States)

    Li, Xiaoguang Sunny; Li, Shu; Kellermann, Gottfried

    2016-10-01

    It remains a challenge to simultaneously quantify catecholamines and metanephrines in a simple, sensitive and cost-effective manner due to pre-analytical and analytical constraints. Herein, we describe such a method consisting of a miniaturized sample preparation and selective LC-MS/MS detection by the use of second morning spot urine samples. Ten microliters of second morning urine sample were subjected to solid phase extraction on an Oasis HLB microplate upon complexation with phenylboronic acid. The analytes were well-resolved on a Luna PFP column followed by tandem mass spectrometric detection. Full validation and suitability of spot urine sampling and biological variation were investigated. The extraction recovery and matrix effect are 74.1-97.3% and 84.1-119.0%, respectively. The linearity range is 2.5-500, 0.5-500, 2.5-1250, 2.5-1250 and 0.5-1250ng/mL for norepinephrine, epinephrine, dopamine, normetanephrine and metanephrine, respectively. The intra- and inter-assay imprecisions are ≤9.4% for spiked quality control samples, and the respective recoveries are 97.2-112.5% and 95.9-104.0%. The Deming regression slope is 0.90-1.08, and the mean Bland-Altman percentage difference is from -3.29 to 11.85 between a published and proposed method (n=50). A correlation observed for the spot and 24h urine collections is significant (n=20, p<0.0001, r: 0.84-0.95, slope: 0.61-0.98). No statistical differences are found in day-to-day biological variability (n=20). Reference intervals are established for an apparently healthy population (n=88). The developed method, being practical, sensitive, reliable and cost-effective, is expected to set a new stage for routine testing, basic research and clinical applications. Copyright © 2016 Elsevier B.V. All rights reserved.
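The method-comparison statistics reported above (Deming regression slope, mean Bland-Altman percentage difference) can be sketched in a few lines; the paired measurements below are synthetic, not data from the study, and an error-variance ratio of 1 is assumed for the Deming fit:

```python
# Hedged sketch: Deming regression (lambda = 1) and the mean Bland-Altman
# percentage difference for two methods measuring the same analyte.
# The x/y values are invented illustrative concentrations (ng/mL).
import math

x = [10.0, 25.0, 40.0, 55.0, 80.0, 120.0]   # published/reference method
y = [11.0, 24.0, 42.0, 53.0, 83.0, 118.0]   # proposed method
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)

# Closed-form Deming slope for equal error variances in both methods.
slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
intercept = my - slope * mx

# Bland-Altman percentage difference relative to the pairwise mean.
pct_diff = [100 * (yi - xi) / ((xi + yi) / 2) for xi, yi in zip(x, y)]
mean_pct = sum(pct_diff) / n
print(round(slope, 3), round(intercept, 2), round(mean_pct, 2))
```

A slope near 1, an intercept near 0, and a small mean percentage difference are the usual indications of method agreement.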

  12. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. 
Currently, the availability of reference measurement procedures is still highly limited, but LC

  13. The analytical solution of the problem of a shock focusing in a gas for one-dimensional case

    Science.gov (United States)

    Shestakovskaya, E. S.; Magazov, F. G.

    2018-03-01

    The analytical solution of the problem of an imploding shock wave in a vessel with an impermeable wall is constructed for the cases of planar, cylindrical and spherical symmetry. A negative velocity is set at the vessel boundary. The velocity of the cold ideal gas is initially zero. At the initial time the shock begins to propagate from the boundary toward the center of symmetry. The boundary moves according to a particular law that conforms to the motion of the shock: in Euler variables it moves, but in Lagrangian variables its trajectory is a vertical line. Equations that determine the structure of the gas flow between the shock front and the boundary as a function of time and the Lagrangian coordinate, as well as the dependence of the entropy on the shock wave velocity, are obtained. Self-similar coefficients and corresponding critical values of the self-similar coordinates were found for a wide range of the adiabatic index. The problem is solved in Lagrangian coordinates.

  14. Quality assurance in the pre-analytical phase of human urine samples by (1)H NMR spectroscopy.

    Science.gov (United States)

    Budde, Kathrin; Gök, Ömer-Necmi; Pietzner, Maik; Meisinger, Christine; Leitzmann, Michael; Nauck, Matthias; Köttgen, Anna; Friedrich, Nele

    2016-01-01

    Metabolomic approaches investigate changes in metabolite profiles, which may reflect changes in metabolic pathways and provide information correlated with a specific biological process or pathophysiology. High-resolution (1)H NMR spectroscopy is used to identify metabolites in biofluids and tissue samples qualitatively and quantitatively. This pre-analytical study evaluated the effects of storage time and temperature on (1)H NMR spectra from human urine in two settings: first, to evaluate short-term effects, such as those arising from an acute delay in sample handling, and second, to evaluate the effect of prolonged storage of up to one month in order to find markers of sample mishandling. A number of statistical procedures were used to assess the differences between samples stored under different conditions, including Projection to Latent Structure Discriminant Analysis (PLS-DA), non-parametric testing, and mixed effect linear regression analysis. The results indicate that human urine samples can be stored at 10 °C for 24 h or at -80 °C for 1 month, as no relevant changes in (1)H NMR fingerprints were observed over these time periods and temperature conditions. However, some metabolites, most likely of microbial origin, showed alterations during prolonged storage, but without facilitating classification. In conclusion, the presented protocol for urine sample handling and semi-automatic metabolite quantification is suitable for large-scale epidemiological studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Carotenoid determination in recent marine sediments - practical problems during sample preparation and HPLC analysis

    Directory of Open Access Journals (Sweden)

    Magdalena Krajewska

    2017-05-01

    Full Text Available An analytical procedure for the analysis of carotenoids in marine sediments rich in organic matter has been developed. Analysis of these compounds is difficult; the application of methods used by other authors required optimization for the samples studied here. The analytical procedure involved multiple ultrasound-assisted extraction with acetone followed by liquid-liquid extraction (acetone extract:benzene:water - 15:1:10 v/v/v) and HPLC analysis. The influence of column temperature on pigment separation and the quantification method were investigated; a temperature of 5 °C was selected for the Lichrospher 100 RP-18e column. The pigments in the sediment extract were quantified using a method based on HPLC analysis (at 450 nm) and spectrophotometric measurements (at 450 nm), and extinction coefficients were determined for standard solutions at this wavelength. It is very important to use the value of the extinction coefficient appropriate to the wavelength at which the detection of carotenoids was carried out.

  16. The association between childhood maltreatment and gambling problems in a community sample of adult men and women.

    Science.gov (United States)

    Hodgins, David C; Schopflocher, Don P; el-Guebaly, Nady; Casey, David M; Smith, Garry J; Williams, Robert J; Wood, Robert T

    2010-09-01

    The association between childhood maltreatment and gambling problems was examined in a community sample of men and women (N = 1,372). As hypothesized, individuals with gambling problems reported greater childhood maltreatment than individuals without gambling problems. Childhood maltreatment predicted severity of gambling problems and frequency of gambling even when other individual and social factors were controlled including symptoms of alcohol and other drug use disorders, family environment, psychological distress, and symptoms of antisocial disorder. In contrast to findings in treatment-seeking samples, women with gambling problems did not report greater maltreatment than men with gambling problems. These results underscore the need for both increased prevention of childhood maltreatment and increased sensitivity towards trauma issues in gambling treatment programs for men and women.

  17. The problem of sampling families rather than populations: Relatedness among individuals in samples of juvenile brown trout Salmo trutta L

    DEFF Research Database (Denmark)

    Hansen, Michael Møller; Eg Nielsen, Einar; Mensberg, Karen-Lise Dons

    1997-01-01

    In species exhibiting a nonrandom distribution of closely related individuals, sampling of a few families may lead to biased estimates of allele frequencies in populations. This problem was studied in two brown trout populations, based on analysis of mtDNA and microsatellites. In both samples mtDNA haplotype frequencies differed significantly between age classes, and in one sample 17 out of 18 individuals less than 1 year of age shared one particular mtDNA haplotype. Estimates of relatedness showed that these individuals most likely represented only three full-sib families. Older trout exhibiting...

  18. Discretization of convection-diffusion equations with finite-difference scheme derived from simplified analytical solutions

    International Nuclear Information System (INIS)

    Kriventsev, Vladimir

    2000-09-01

    Most thermal hydraulic processes in nuclear engineering can be described by general convection-diffusion equations that can often be simulated numerically with the finite-difference method (FDM). An effective scheme for the finite-difference discretization of such equations is presented in this report. The derivation of this scheme is based on analytical solutions of a simplified one-dimensional equation written for every control volume of the finite-difference mesh. These analytical solutions are constructed using linearized representations of both the diffusion coefficient and the source term. As a result, the Efficient Finite-Differencing (EFD) scheme makes it possible to significantly improve the accuracy of the numerical method even on mesh systems with fewer grid nodes, which in turn speeds up numerical simulation. EFD has been carefully verified on a series of sample problems for which either analytical or very precise numerical solutions can be found. EFD has been compared with other popular FDM schemes, including novel, accurate (as well as sophisticated) methods. Among the methods compared were the well-known central difference scheme, the upwind scheme, and the exponential differencing and hybrid schemes of Spalding. Newly developed finite-difference schemes, such as the quadratic upstream (QUICK) scheme of Leonard, the locally analytic differencing (LOAD) scheme of Wong and Raithby, the flux-spline scheme proposed by Varejago and Patankar, and the latest LENS discretization of Sakai, were also compared. Detailed results of this comparison are given in this report. These tests have shown the high efficiency of the EFD scheme. For most of the sample problems considered, EFD demonstrated a numerical error that was orders of magnitude lower than that of other discretization methods; in other words, EFD predicted the numerical solution with the same numerical error but using far fewer grid nodes. In this report, the detailed

  19. Quality assurance procedures for the analysis of TRU waste samples

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Giaquinto, J.M.; Robinson, L.

    1995-01-01

    The Waste Isolation Pilot Plant (WIPP) project was undertaken in response to the growing need for a national repository for transuranic (TRU) waste. Guidelines for WIPP specify that any waste item to be interred must be fully characterized and analyzed to determine the presence of chemical compounds designated hazardous and of certain toxic elements. The Transuranic Waste Characterization Program (TWCP) was launched to develop analysis and quality guidelines, certify laboratories, and oversee the actual waste characterizations at the laboratories. ORNL is participating in the waste characterization phase and brings to bear a variety of analytical techniques, including ICP-AES, cold vapor atomic absorption, and instrumental neutron activation analysis (INAA), to collectively determine arsenic, cadmium, barium, chromium, mercury, selenium, silver, and other elements. All of the analytical techniques involved participate in a cooperative effort to meet the project objectives. One important component of any good quality assurance program is determining when an alternate method is more suitable for a given analytical problem. By bringing a whole arsenal of analytical techniques to bear on common objectives, few analytical problems prove insurmountable. INAA and ICP-AES form a powerful pair when functioning in this cooperative manner. This paper provides details of the quality assurance protocols, typical results from quality control samples for both INAA and ICP-AES, and the method cooperation schemes used

  20. Analytical and Numerical Studies of Several Fluid Mechanical Problems

    Science.gov (United States)

    Kong, D. L.

    2014-03-01

    In this thesis, three parts, each with several chapters, are devoted respectively to hydrostatic, viscous, and inertial fluid theories and applications. The topics involved include planetary and biological fluid systems and high-performance computing technology. In the hydrostatics part, the classical Maclaurin spheroid theory is generalized, for the first time, to a more realistic multi-layer model, establishing the geometries of both the outer surface and the interfaces. As one of its astrophysical applications, the theory explicitly predicts the physical shapes of the surface and core-mantle boundary for layered terrestrial planets, which enables the study of certain gravity problems and direct numerical simulations of dynamo flows in rotating planetary cores. As another application of the figure theory, the zonal flow in the deep atmosphere of Jupiter is investigated for a better understanding of the Jovian gravity field. An upper bound on the gravity field distortions, especially in the higher-order zonal gravitational coefficients, induced by deep zonal winds is first estimated. The oblate spheroidal shape of an undistorted Jupiter resulting from its fast solid-body rotation is fully taken into account, which marks the most significant improvement over previous approximation-based Jovian wind theories. High-viscosity flows, for example Stokes flows, occur in many processes involving low-speed motion in fluids; microorganism swimming is a typical case. A fully three-dimensional analytic solution of the incompressible Stokes equation is derived in the exterior domain of an arbitrarily translating and rotating prolate spheroid, which models a large family of microorganisms such as cocci bacteria. The solution is then applied to the magnetotactic bacteria swimming problem, and good consistency has been found between theoretical predictions and laboratory observations of the movement patterns of such bacteria under magnetic fields. In the analysis of dynamics of planetary

  1. Forensic analysis of high explosives residues in post-blast water samples employing solid phase extraction for analyte pre-concentration

    International Nuclear Information System (INIS)

    Umi Kalsom Ahmad; Rajendran, Sumathy; Ling, Lee Woan

    2008-01-01

    Nitroaromatic, nitramine and nitrate ester compounds form a major group of high-order explosives, better known as military explosives. Octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN) and 2,4-dinitrotoluene (2,4-DNT) are secondary high explosives classified among the most commonly used explosive components. There is an increasing demand for pre-concentration of these compounds in water samples, as the limited sensitivity of instrumental analytical methods for these high-explosive residues is the main drawback in trace-level forensic applications. Hence, a simple cartridge solid phase extraction (SPE) procedure was optimized as the off-line extraction and pre-concentration method to enhance the detection limits of high-explosive residues determined by micellar electrokinetic chromatography (MEKC) and gas chromatography with electron-capture detection (GC-ECD). The SPE cartridges utilized LiChrolut EN as the adsorbent. By employing pre-concentration using SPE, the detection limits of the target analytes in water samples were lowered by more than 1000 times with good recovery (>87%) for the MEKC method, and by 120 times with more than 2% recovery for the GC-ECD method. To test the feasibility of the developed method on real cases, post-blast water samples were analyzed. The post-blast water samples, collected from the Baling Bom training range, Ulu Kinta, Perak, contained RDX and PETN in the ranges 0.05 - 0.17 ppm and 0.0124 - 0.0390 ppm, respectively. (author)
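
The enrichment achievable with cartridge SPE follows directly from the volume ratio and the recovery: the pre-concentration factor is roughly (sample volume / eluate volume) × fractional recovery, and the method detection limit improves by the same factor. A minimal sketch; the volumes, recovery, and instrument LOD below are illustrative assumptions, not values from the study:

```python
def preconcentration_factor(v_sample_ml, v_eluate_ml, recovery):
    """Enrichment factor of an SPE step: volume ratio scaled by fractional recovery."""
    return (v_sample_ml / v_eluate_ml) * recovery

def improved_lod(instrument_lod_ppm, factor):
    """Method detection limit after pre-concentration by the given factor."""
    return instrument_lod_ppm / factor

# Hypothetical numbers: 500 mL of water eluted into 0.4 mL with 90% recovery.
f = preconcentration_factor(500.0, 0.4, 0.90)   # 1125-fold enrichment
lod = improved_lod(0.5, f)                      # 0.5 ppm instrument LOD -> ~0.44 ppb
```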

  2. A comparative examination of sample treatment procedures for ICAP-AES analysis of biological tissue

    Science.gov (United States)

    De Boer, J. L. M.; Maessen, F. J. M. J.

    The objective of this study was to contribute to the evaluation of existing sample preparation procedures for ICAP-AES analysis of biological material. Performance characteristics were established for current digestion procedures comprising extraction, solubilization, pressure digestion, and wet and dry ashing methods. Apart from accuracy and precision, a number of criteria of special interest for analytical practice were applied. SRM bovine liver served as the test sample, in which six elements were determined simultaneously. Results showed that every procedure has its advantages and defects. Hence, standard digestion procedures can be recommended unambiguously only when the specific analytical problem is taken into account.

  3. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The simulation of nuclear waste migration phenomena is governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay and chemical interaction. For some special problems (depending on the boundary conditions, and when the domain is considered infinite or semi-infinite) an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to diffusive-convective linear problems to obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of migration of radioactive waste in saturated flow through porous media. A test case considering the ²⁴¹Am radionuclide is presented. (author)
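
A classical closed-form benchmark of the kind a GITT solution can be verified against is the 1D advection-dispersion-decay equation on a semi-infinite domain with a constant-concentration inlet (first-type boundary). A sketch with illustrative parameter values, not the paper's ²⁴¹Am case:

```python
from math import erfc, exp, sqrt

def concentration(x, t, c0=1.0, v=0.5, D=1.0, lam=0.01):
    """C(x,t) for dC/dt = D d2C/dx2 - v dC/dx - lam*C on x >= 0,
    with C(0,t) = c0, C(inf,t) = 0, C(x,0) = 0 (illustrative parameters)."""
    u = sqrt(v * v + 4.0 * lam * D)          # decay-modified velocity
    a = 2.0 * sqrt(D * t)
    term1 = exp(x * (v - u) / (2.0 * D)) * erfc((x - u * t) / a)
    term2 = exp(x * (v + u) / (2.0 * D)) * erfc((x + u * t) / a)
    return 0.5 * c0 * (term1 + term2)
```

The inlet boundary condition is recovered exactly at x = 0, and the profile decays monotonically downstream, which makes the formula a convenient sanity check for numerical or integral-transform solutions.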

  4. Optimizing the triple-axis spectrometer PANDA at the MLZ for small samples and complex sample environment conditions

    Science.gov (United States)

    Utschick, C.; Skoulatos, M.; Schneidewind, A.; Böni, P.

    2016-11-01

    The cold-neutron triple-axis spectrometer PANDA at the neutron source FRM II has been serving an international user community studying condensed matter physics problems. We report on a new setup, improving the signal-to-noise ratio for small samples and pressure cell setups. Analytical and numerical Monte Carlo methods are used for the optimization of elliptic and parabolic focusing guides. They are placed between the monochromator and sample positions, and the flux at the sample is compared to the one achieved by standard monochromator focusing techniques. A 25 times smaller spot size is achieved, associated with a factor of 2 increased intensity, within the same divergence limits of ±2°. This optional neutron focusing guide shall establish a top-class spectrometer for studying novel exotic properties of matter in combination with more stringent sample environment conditions such as extreme pressures associated with small sample sizes.

  5. Unit Stratified Sampling as a Tool for Approximation of Stochastic Optimization Problems

    Czech Academy of Sciences Publication Activity Database

    Šmíd, Martin

    2012-01-01

    Roč. 19, č. 30 (2012), s. 153-169 ISSN 1212-074X R&D Projects: GA ČR GAP402/11/0150; GA ČR GAP402/10/0956; GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Stochastic programming * approximation * stratified sampling Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/smid-unit stratified sampling as a tool for approximation of stochastic optimization problems.pdf
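
The record above gives only bibliographic metadata, but the core idea, stratified sampling as a variance-reduction tool for approximating expectations in stochastic programs, can be sketched in a few lines: partition [0, 1) into equal strata and draw one uniform point per stratum, which lowers the estimator's variance relative to plain Monte Carlo at the same budget. The integrand below is an illustrative stand-in for a stochastic-program objective:

```python
import random
from statistics import pstdev

def plain_mc(f, n, rng):
    """Plain Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

def stratified_mc(f, n, rng):
    """One uniform draw inside each of n equal-width strata of [0, 1)."""
    return sum(f((i + rng.random()) / n) for i in range(n)) / n

rng = random.Random(0)
f = lambda u: u * u                  # E[f(U)] = 1/3
plain = [plain_mc(f, 100, rng) for _ in range(200)]
strat = [stratified_mc(f, 100, rng) for _ in range(200)]
```

With a smooth integrand the stratified estimator's standard deviation shrinks like n^-1.5 instead of the plain n^-0.5, which is why the repetition spread of `strat` is far tighter than that of `plain`.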

  6. Lecture Notes and Exercises for Course 21240 (Basic Analytical Chemistry)

    DEFF Research Database (Denmark)

    1999-01-01

    The publication contains notes dealing with difficult topics in analytical chemistry (cfr. Course Descriptions, DTU), relevant exercises, and final examination problems from recent years.

  7. Lecture Notes and Exercises for Course 21240 (Basic Analytical Chemistry)

    DEFF Research Database (Denmark)

    1998-01-01

    The publication contains notes dealing with difficult topics in analytical chemistry (cfr. Course Descriptions, DTU), relevant exercises, and final examination problems from recent years.

  8. Microfluidic devices for sample clean-up and screening of biological samples

    NARCIS (Netherlands)

    Tetala, K.K.R.

    2009-01-01

    Analytical chemistry plays an important role in the separation and identification of analytes from raw samples (e.g. plant extracts, blood), but the whole analytical process is tedious, difficult to automate and time consuming. To overcome these drawbacks, the concept of μTAS (miniaturized total

  9. ISCO Grab Sample Ion Chromatography Analytical Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — ISCO grab samples were collected from river, wastewater treatment plant discharge, and public drinking water intakes. Samples were analyzed for major ions (ppb)...

  10. Robustness to non-normality of various tests for the one-sample location problem

    Directory of Open Access Journals (Sweden)

    Michelle K. McDougall

    2004-01-01

    This paper studies the effect of the normal-distribution assumption on the power and size of the sign test, Wilcoxon's signed-rank test and the t-test when used in one-sample location problems. Power functions for these tests under various skewness and kurtosis conditions are produced for several sample sizes from data simulated using the g-and-k distribution of MacGillivray and Cannon [5].
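
The g-and-k distribution has no closed-form density but a simple quantile function, so simulating from it (and estimating test size or power empirically) takes only a few lines. A sketch under the conventional c = 0.8; the sample size, critical value, and replication count are illustrative choices, not the paper's design:

```python
import math, random

def gk_sample(rng, n, A=0.0, B=1.0, g=0.0, k=0.0, c=0.8):
    """Draw n variates from the g-and-k distribution by applying its quantile
    function to standard-normal z (g: skewness, k: kurtosis; g=k=0 -> N(A, B^2))."""
    out = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        out.append(A + B * (1.0 + c * math.tanh(g * z / 2.0)) * (1.0 + z * z) ** k * z)
    return out

def t_rejects(sample, mu0=0.0, crit=2.045):
    """Two-sided one-sample t-test decision; crit ~ t(0.975, df=29), so n=30 only."""
    n = len(sample)
    m = sum(sample) / n
    s2 = sum((x - m) ** 2 for x in sample) / (n - 1)
    return abs((m - mu0) / math.sqrt(s2 / n)) > crit

# Empirical size of the t-test when the data really are normal (g = k = 0).
rng = random.Random(1)
size = sum(t_rejects(gk_sample(rng, 30)) for _ in range(2000)) / 2000
```

Setting g or k away from zero then gives skewed or heavy-tailed samples for power and robustness comparisons against the sign and signed-rank tests.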

  11. Three-dimensional transport theory: An analytical solution of an internal beam searchlight problem-I

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2009-01-01

    We describe a number of methods for obtaining analytical solutions and numerical results for three-dimensional one-speed neutron transport problems in a half-space containing a variety of source shapes which emit neutrons mono-directionally. For example, we consider an off-centre point source, a ring source and a disk source, or any combination of these, and calculate the surface scalar flux as a function of the radial and angular co-ordinates. Fourier transforms in the transverse directions are used and a Laplace transform in the axial direction. This enables the Wiener-Hopf method to be employed, followed by an inverse Fourier-Hankel transform. Some additional transformations are introduced which enable the inverse Hankel transforms involving Bessel functions to be evaluated numerically more efficiently. A hybrid diffusion theory method is also described which is shown to be a useful guide to the general behaviour of the solutions of the transport equation.

  12. Co-occurring substance-related and behavioral addiction problems: A person-centered, lay epidemiology approach

    Science.gov (United States)

    Konkolÿ Thege, Barna; Hodgins, David C.; Wild, T. Cameron

    2016-01-01

    Background and aims The aims of this study were (a) to describe the prevalence of single versus multiple addiction problems in a large representative sample and (b) to identify distinct subgroups of people experiencing substance-related and behavioral addiction problems. Methods A random sample of 6,000 respondents from Alberta, Canada, completed survey items assessing self-attributed problems experienced in the past year with four substances (alcohol, tobacco, marijuana, and cocaine) and six behaviors (gambling, eating, shopping, sex, video gaming, and work). Hierarchical cluster analyses were used to classify patterns of co-occurring addiction problems on an analytic subsample of 2,728 respondents (1,696 women and 1,032 men; M age = 45.1 years, SD age = 13.5 years) who reported problems with one or more of the addictive behaviors in the previous year. Results In the total sample, 49.2% of the respondents reported zero, 29.8% reported one, 13.1% reported two, and 7.9% reported three or more addiction problems in the previous year. Cluster-analytic results suggested a 7-group solution. Members of most clusters were characterized by multiple addiction problems; the average number of past year addictive behaviors in cluster members ranged between 1 (Cluster II: excessive eating only) and 2.5 (Cluster VII: excessive video game playing with the frequent co-occurrence of smoking, excessive eating and work). Discussion and conclusions Our findings replicate previous results indicating that about half of the adult population struggles with at least one excessive behavior in a given year; however, our analyses revealed a higher number of co-occurring addiction clusters than typically found in previous studies. PMID:27829288
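
The hierarchical clustering step can be illustrated with a tiny average-linkage agglomeration over binary problem indicators using Hamming distance. This is a generic sketch of the technique, not the authors' exact analysis; the respondent rows below are invented:

```python
def hamming(a, b):
    """Number of positions at which two binary indicator vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def avg_linkage(clusters, data, ci, cj):
    """Average pairwise Hamming distance between two clusters."""
    return sum(hamming(data[i], data[j]) for i in clusters[ci] for j in clusters[cj]) / (
        len(clusters[ci]) * len(clusters[cj]))

def agglomerate(data, k):
    """Merge the two closest clusters (average linkage) until k clusters remain."""
    clusters = [[i] for i in range(len(data))]
    while len(clusters) > k:
        i, j = min(((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
                   key=lambda ab: avg_linkage(clusters, data, ab[0], ab[1]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Each invented row: past-year problem indicators, e.g. (alcohol, tobacco, gambling, eating).
respondents = [(1, 1, 0, 0), (1, 1, 0, 0), (1, 0, 0, 0),
               (0, 0, 1, 1), (0, 0, 1, 1), (0, 0, 0, 1)]
groups = agglomerate(respondents, 2)
```

On this toy data the procedure recovers a substance-dominated group (rows 0-2) and a behavior-dominated group (rows 3-5), the same kind of co-occurrence profile the study's cluster solution describes at scale.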

  13. Co-occurring substance-related and behavioral addiction problems: A person-centered, lay epidemiology approach.

    Science.gov (United States)

    Konkolÿ Thege, Barna; Hodgins, David C; Wild, T Cameron

    2016-12-01

    Background and aims The aims of this study were (a) to describe the prevalence of single versus multiple addiction problems in a large representative sample and (b) to identify distinct subgroups of people experiencing substance-related and behavioral addiction problems. Methods A random sample of 6,000 respondents from Alberta, Canada, completed survey items assessing self-attributed problems experienced in the past year with four substances (alcohol, tobacco, marijuana, and cocaine) and six behaviors (gambling, eating, shopping, sex, video gaming, and work). Hierarchical cluster analyses were used to classify patterns of co-occurring addiction problems on an analytic subsample of 2,728 respondents (1,696 women and 1,032 men; M age = 45.1 years, SD age = 13.5 years) who reported problems with one or more of the addictive behaviors in the previous year. Results In the total sample, 49.2% of the respondents reported zero, 29.8% reported one, 13.1% reported two, and 7.9% reported three or more addiction problems in the previous year. Cluster-analytic results suggested a 7-group solution. Members of most clusters were characterized by multiple addiction problems; the average number of past year addictive behaviors in cluster members ranged between 1 (Cluster II: excessive eating only) and 2.5 (Cluster VII: excessive video game playing with the frequent co-occurrence of smoking, excessive eating and work). Discussion and conclusions Our findings replicate previous results indicating that about half of the adult population struggles with at least one excessive behavior in a given year; however, our analyses revealed a higher number of co-occurring addiction clusters than typically found in previous studies.

  14. Statistical surrogate model based sampling criterion for stochastic global optimization of problems with constraints

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Su Gil; Jang, Jun Yong; Kim, Ji Hoon; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Min Uk [Romax Technology Ltd., Seoul (Korea, Republic of); Choi, Jong Su; Hong, Sup [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-04-15

    Sequential surrogate model-based global optimization algorithms, such as super-EGO, have been developed to increase the efficiency of commonly used global optimization techniques while ensuring the accuracy of the optimization. However, earlier algorithms have drawbacks: three phases in the optimization loop and empirical parameters. We propose a united sampling criterion that simplifies the algorithm and achieves the global optimum of problems with constraints without any empirical parameters. The criterion selects points located in the feasible region with high model uncertainty, as well as points along the constraint boundary at the lowest objective value. The mean squared error determines which criterion is more dominant between the infill sampling criterion and the boundary sampling criterion. The method also guarantees the accuracy of the surrogate model, because the sample points are not concentrated within extremely small regions as in super-EGO. The performance of the proposed method, including the solvability of a problem, convergence properties, and efficiency, is validated through nonlinear numerical examples with disconnected feasible regions.
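
For context, the classical infill criterion behind EGO-type algorithms is expected improvement, computed from the surrogate's posterior mean μ and standard deviation σ at a candidate point; the paper's united criterion extends this idea with a boundary term for constraints. A sketch of plain EI only, not the authors' exact criterion:

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, f_min):
    """EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z) with z = (f_min - mu) / sigma,
    for minimization under a Gaussian surrogate posterior N(mu, sigma^2)."""
    if sigma <= 0.0:                      # no posterior uncertainty at this point
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))      # standard normal CDF
    phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)    # standard normal PDF
    return (f_min - mu) * Phi + sigma * phi
```

Points with high σ (unexplored regions) or low μ (promising predictions) both score well, which is the exploration-exploitation trade-off any infill sampling criterion has to balance.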

  15. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine lower or upper bound values in statistical approaches when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method, entitled 'Two-Step Tail Area Sampling', is developed; it assumes a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling at points separated by large intervals; second, sampling at points separated by small intervals around check points determined in the first step. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than Monte Carlo when the number of calculations is the same for both methods. The new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of pressurized light-water nuclear reactors.
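
The coarse-then-fine idea can be illustrated on a discrete distribution whose pmf is cheap to evaluate pointwise: a first pass with a large stride brackets the point where the cumulative probability crosses the target, and a second pass with unit stride pins it down, touching far fewer points than a full scan. This is an illustrative analogue of the procedure, not the paper's exact algorithm:

```python
from math import comb

def binom_cdf_at(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summed directly from the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def two_step_quantile(n, p, level, stride=10):
    """Smallest k with P(X <= k) >= level: coarse scan, then unit-step refinement."""
    k = 0
    while binom_cdf_at(k, n, p) < level:          # step 1: large intervals
        k += stride
    for j in range(max(k - stride, 0), k + 1):    # step 2: small intervals near check point
        if binom_cdf_at(j, n, p) >= level:
            return j
    return k
```

Because the CDF is monotone, the answer is guaranteed to lie in the last coarse bracket, so the refinement pass is exact while the total number of CDF evaluations stays far below a full scan.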

  16. Analytical calculations of neutron slowing down and transport in the constant-cross-section problem

    International Nuclear Information System (INIS)

    Cacuci, D.G.

    1978-01-01

    Some aspects of the problem of neutron slowing down and transport in an infinite medium consisting of a single nuclide that scatters elastically and isotropically and has energy-independent cross sections were investigated. The method of singular eigenfunctions was applied to the Boltzmann equation governing the Laplace transform (with respect to the lethargy variable) of the neutron flux. A new sufficient condition for the convergence of the coefficients of the expansion of the scattering kernel in Legendre polynomials was rigorously derived for this energy-dependent problem. Formulas were obtained for the lethargy-dependent spatial moments of the scalar flux that are valid for medium to large lethargies. In deriving these formulas, use was made of the well-known connection between the spatial moments of the Laplace-transformed scalar flux and the moments of the flux in the ''eigenvalue space.'' The calculations were greatly aided by the construction of a closed general expression for these ''eigenvalue space'' moments. Extensive use was also made of the methods of combinatorial analysis and of computer evaluation, via FORMAC, of complicated sequences of manipulations. For the case of no absorption it was possible to obtain for materials of any atomic weight explicit corrections to the age-theory formulas for the spatial moments M_2n(u) of the scalar flux that are valid through terms of the order of u^-5. The evaluation of the coefficients of the powers of n, as explicit functions of the nuclear mass, is one of the end products of this investigation. In addition, an exact expression for the second spatial moment, M_2(u), valid for arbitrary (constant) absorption, was derived. It is now possible to calculate analytically and rigorously the ''age'' for the constant-cross-section problem for arbitrary (constant) absorption and nuclear mass. 5 figures, 1 table

  17. Analytical calculations of neutron slowing down and transport in the constant-cross-section problem

    International Nuclear Information System (INIS)

    Cacuci, D.G.

    1978-04-01

    Aspects of the problem of neutron slowing down and transport in an infinite medium consisting of a single nuclide that scatters elastically and isotropically and has energy-independent cross sections were investigated. The method of singular eigenfunctions was applied to the Boltzmann Equation governing the Laplace transform (with respect to the lethargy variable) of the neutron flux. A new sufficient condition for the convergence of the coefficients of the expansion of the scattering kernel in Legendre polynomials was rigorously derived for this energy-dependent problem. Formulas were obtained for the lethargy-dependent spatial moments of the scalar flux that are valid for medium to large lethargies. Use was made of the well-known connection between the spatial moments of the Laplace-transformed scalar flux and the moments of the flux in the ''eigenvalue space.'' The calculations were aided by the construction of a closed general expression for these ''eigenvalue space'' moments. Extensive use was also made of the methods of combinatorial analysis and of computer evaluation of complicated sequences of manipulations. For the case of no absorption it was possible to obtain for materials of any atomic weight explicit corrections to the age-theory formulas for the spatial moments M_2n(u) of the scalar flux that are valid through terms of the order of u^-5. The evaluation of the coefficients of the powers of n, as explicit functions of the nuclear mass, represents one of the end products of this investigation. In addition, an exact expression for the second spatial moment, M_2(u), valid for arbitrary (constant) absorption, was derived. It is now possible to calculate analytically and rigorously the ''age'' for the constant-cross-section problem for arbitrary (constant) absorption and nuclear mass. 5 figures, 1 table

  18. Analytical calculations of neutron slowing down and transport in the constant-cross-section problem

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D.G.

    1978-04-01

    Aspects of the problem of neutron slowing down and transport in an infinite medium consisting of a single nuclide that scatters elastically and isotropically and has energy-independent cross sections were investigated. The method of singular eigenfunctions was applied to the Boltzmann Equation governing the Laplace transform (with respect to the lethargy variable) of the neutron flux. A new sufficient condition for the convergence of the coefficients of the expansion of the scattering kernel in Legendre polynomials was rigorously derived for this energy-dependent problem. Formulas were obtained for the lethargy-dependent spatial moments of the scalar flux that are valid for medium to large lethargies. Use was made of the well-known connection between the spatial moments of the Laplace-transformed scalar flux and the moments of the flux in the ''eigenvalue space.'' The calculations were aided by the construction of a closed general expression for these ''eigenvalue space'' moments. Extensive use was also made of the methods of combinatorial analysis and of computer evaluation of complicated sequences of manipulations. For the case of no absorption it was possible to obtain for materials of any atomic weight explicit corrections to the age-theory formulas for the spatial moments M_2n(u) of the scalar flux that are valid through terms of the order of u^-5. The evaluation of the coefficients of the powers of n, as explicit functions of the nuclear mass, represents one of the end products of this investigation. In addition, an exact expression for the second spatial moment, M_2(u), valid for arbitrary (constant) absorption, was derived. It is now possible to calculate analytically and rigorously the ''age'' for the constant-cross-section problem for arbitrary (constant) absorption and nuclear mass. 5 figures, 1 table.

  19. Self-recognition of mental health problems in a rural Australian sample.

    Science.gov (United States)

    Handley, Tonelle E; Lewin, Terry J; Perkins, David; Kelly, Brian

    2018-04-19

    Although mental health literacy has increased in recent years, mental illness is often under-recognised. Little research has been conducted on mental illness in rural areas, where under-recognition can be most prominent due to factors such as greater stigma and stoicism. The aim of this study is to create a profile of those who are most and least likely to self-identify mental health problems among rural residents with moderate-to-high psychological distress. Secondary analysis of a longitudinal postal survey. Rural and remote New South Wales, Australia. Four hundred and seventy-two community residents. Participants completed the K10 Psychological Distress Scale, as well as the question 'In the past 12 months have you experienced any mental health problems?' The characteristics of those who reported moderate/high distress scores were explored by comparing those who did and did not report a recent mental health problem. Of the 472 participants, 319 (68%) with moderate/high distress reported a mental health problem. Reporting a mental health problem was more common among those with recent adverse life events or who perceived more stress from life events, and less common among those who attributed their symptoms to a physical cause. Among a rural sample with moderate/high distress, one-third did not report a mental health problem. Results suggest a threshold effect, whereby mental health problems are more likely to be acknowledged in the context of additional life events. Ongoing public health campaigns are necessary to ensure that symptoms of mental illness are recognised in the multiple forms that they take. © 2018 National Rural Health Alliance Ltd.

  20. An analytical approximation for resonance integral

    International Nuclear Information System (INIS)

    Magalhaes, C.G. de; Martinez, A.S.

    1985-01-01

    A method is developed which allows an analytical solution for the resonance integral to be obtained. The problem formulation is entirely theoretical and based on physical concepts of a general character. The analytical expression for the integral does not involve any empirical correlation or parameter. Results of the approximation are compared with reference values for each individual resonance and for the sum of all resonances. (M.C.K.) [pt]
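
The quantity being approximated is the resonance integral I = ∫ σ(E) dE/E; for a single narrow Breit-Wigner resonance this is close to π σ₀ Γ / (2 E₀), which a direct numerical quadrature confirms. The parameter values below are illustrative (roughly the energy and width of the first ²³⁸U resonance, with σ₀ normalized to 1), not values from the paper:

```python
from math import pi

def resonance_integral(sigma0, E0, Gamma, half_width=60.0, steps=200_000):
    """Trapezoidal quadrature of I = integral sigma(E) dE/E for a single
    Breit-Wigner resonance sigma(E) = sigma0*(Gamma/2)^2 / ((E-E0)^2 + (Gamma/2)^2)."""
    lo, hi = E0 - half_width * Gamma, E0 + half_width * Gamma
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        E = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0      # trapezoid endpoint weights
        sigma = sigma0 * (Gamma / 2.0) ** 2 / ((E - E0) ** 2 + (Gamma / 2.0) ** 2)
        total += w * sigma / E * h
    return total

num = resonance_integral(1.0, 6.67, 0.027)       # E0 in eV, Gamma in eV
narrow = pi * 1.0 * 0.027 / (2.0 * 6.67)         # narrow-resonance estimate
```

The small residual difference comes from the truncated Lorentzian tails and the slow 1/E variation across the resonance, the kind of correction an analytical treatment of the resonance integral has to capture.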

  1. Investigation of potential analytical methods for redox control of the vitrification process

    International Nuclear Information System (INIS)

    Goldman, D.S.

    1985-11-01

    An investigation was conducted to evaluate several analytical techniques for measuring ferrous/ferric ratios in simulated and radioactive nuclear waste glasses for eventual redox control of the vitrification process. Redox control will minimize the melt foaming that occurs under highly oxidizing conditions and the metal precipitation that occurs under highly reducing conditions. The analytical method selected must respond rapidly to production problems with minimal complexity and analyst involvement. The wet-chemistry, Moessbauer spectroscopy, glass color analysis, and ion chromatography techniques were explored, with particular emphasis placed on the Moessbauer technique. In general, all of these methods can be used for nonradioactive samples. The Moessbauer method can readily analyze glasses containing uranium and thorium. A shielded container was designed and built to analyze fully radioactive glasses with the Moessbauer spectrometer in a hot cell environment. However, analyses conducted on radioactive waste glasses containing ⁹⁰Sr and ¹³⁷Cs were unsuccessful, presumably due to background radiation problems caused by the samples. The color of glass powder can be used to analyze the ferrous/ferric ratio for low-chromium glasses, but this method may not be as precise as the others. Ion chromatography was only tested on nonradioactive glasses, but this technique appears to have the required precision owing to its analysis of both Fe²⁺ and Fe³⁺ and its anticipated adaptability to radioactive samples. This development would be similar to procedures already in use for shielded inductively coupled plasma emission (ICP) spectrometry. Development of the ion chromatography method is therefore recommended; conventional wet-chemistry is recommended as a backup procedure.

  2. Role of analytical chemistry in environmental monitoring

    International Nuclear Information System (INIS)

    Kayasth, S.; Swain, K.

    2004-01-01

    Basic aspects of pollution and the role of analytical chemistry in environmental monitoring are highlighted and exemplified, with emphasis on trace elements. Sources and pathways of natural and especially man-made polluting substances, as well as their physico-chemical characteristics, are given. Attention is paid to adequate sampling in various compartments of the environment, comprising both the lithosphere and the biosphere. Trace analysis is dealt with using a variety of analytical techniques, including criteria for the choice of suitable techniques, as well as aspects of analytical quality assurance and control. Finally, some data on trace element levels in soil and water samples from India are presented. (author)

  3. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has been participating for a decade in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in special journals, and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project ''Development of physico-chemical analytical methods''. In most cases, the techniques are intended for non-nuclear applications. Some can find wider application, especially in analyses of environmental samples. Others have been developed for specific cases of sample analysis or require special instrumentation (a mass spectrometer), which partly restricts their applicability to other institutions. (author)

  4. Perspectives on making big data analytics work for oncology.

    Science.gov (United States)

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5V hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
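
The p ≫ n regime mentioned above is typically handled with regularization. A minimal coordinate-descent lasso in pure Python (an illustrative sketch, not an oncology pipeline; the data and penalty are invented) shows how a sparse linear model remains estimable when features outnumber samples:

```python
import random

def soft(x, t):
    """Soft-thresholding operator S(x, t)."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def lasso_cd(X, y, lam, sweeps=50):
    """Coordinate descent for (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n, p = len(X), len(X[0])
    z = [sum(X[i][j] ** 2 for i in range(n)) / n for j in range(p)]
    w = [0.0] * p
    r = list(y)                              # residuals; w starts at zero
    for _ in range(sweeps):
        for j in range(p):
            rho = sum(X[i][j] * r[i] for i in range(n)) / n + z[j] * w[j]
            wj = soft(rho, lam) / z[j]       # one-dimensional lasso update
            d = wj - w[j]
            if d != 0.0:
                for i in range(n):
                    r[i] -= X[i][j] * d
                w[j] = wj
    return w

# Invented p >> n data: 60 features, 40 samples, only two true effects.
rng = random.Random(0)
n, p = 40, 60
X = [[rng.gauss(0.0, 1.0) for _ in range(p)] for _ in range(n)]
y = [3.0 * X[i][0] - 2.0 * X[i][1] + rng.gauss(0.0, 0.1) for i in range(n)]
w = lasso_cd(X, y, lam=0.2)
```

Ordinary least squares is not even identifiable here (p > n), yet the L1 penalty drives the 58 noise coefficients to (near) zero while keeping the two genuine effects, which is why sparse methods dominate p-omics settings like the one described above.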

  5. Random walks in the quarter plane algebraic methods, boundary value problems, applications to queueing systems and analytic combinatorics

    CERN Document Server

    Fayolle, Guy; Malyshev, Vadim

    2017-01-01

    This monograph aims to promote original mathematical methods to determine the invariant measure of two-dimensional random walks in domains with boundaries. Such processes arise in numerous applications and are of interest in several areas of mathematical research, such as Stochastic Networks, Analytic Combinatorics, and Quantum Physics. This second edition consists of two parts. Part I is a revised upgrade of the first edition (1999), with additional recent results on the group of a random walk. The theoretical approach given therein has been developed by the authors since the early 1970s. By using Complex Function Theory, Boundary Value Problems, Riemann Surfaces, and Galois Theory, completely new methods are proposed for solving functional equations of two complex variables, which can also be applied to characterize the Transient Behavior of the walks, as well as to find explicit solutions to the one-dimensional Quantum Three-Body Problem, or to tackle a new class of Integrable Systems. Part II borrows spec...

  6. Non-erotic thoughts, attentional focus, and sexual problems in a community sample.

    Science.gov (United States)

    Nelson, Andrea L; Purdon, Christine

    2011-04-01

    According to Barlow's model of sexual dysfunction, anxiety in sexual situations leads to attentional focus on sexual performance at the expense of erotic cues, which compromises sexual arousal. This negative experience will enhance anxiety in future sexual situations, and non-erotic thoughts (NETs) relevant to performance will receive attentional priority. Previous research with student samples (Purdon & Holdaway, 2006; Purdon & Watson, 2010) has found that people experience many types of NETs in addition to performance-relevant thoughts, and that, consistent with Barlow's model, the frequency of and anxiety evoked by these thoughts is positively associated with sexual problems. Extending this previous work, the current study found that, in a community sample of women (N = 81) and men (N = 72) in long-term relationships, women were more likely to report body image concerns and external consequences of the sexual activity, while men were more likely to report performance-related concerns. Equally likely among men and women were thoughts about emotional consequences of the sexual activity. Regardless of thought content, experiencing more frequent NETs was associated with more sexual problems in both women and men. Moreover, as per Barlow's model, greater negative affect in anticipation of and during sexual activity predicted greater frequency of NETs and greater anxiety in response to NETs was associated with greater difficulty dismissing the thoughts. However, greater difficulty in refocusing on erotic thoughts during sexual activity uniquely predicted more sexual problems above the frequency and dismissability of NETs. Together, these data support the cognitive interference mechanism implicated by Barlow's causal model of sexual dysfunction and have implications for the treatment of sexual problems.

  7. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients.

    Science.gov (United States)

    Ramírez, Juan Carlos; Cura, Carolina Inés; da Cruz Moreira, Otacilio; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Marcos da Matta Guedes, Paulo; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Maria da Cunha Galvão, Lúcia; Jácome da Câmara, Antonia Cláudia; Espinoza, Bertha; Alarcón de Noya, Belkisyole; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G

    2015-09-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  8. Comparison of the analytical methods used to determine natural and artificial radionuclides from environmental samples by gamma, alpha and beta spectrometry

    DEFF Research Database (Denmark)

    Pöllänen, Roy; Virtanen, Sinikka; Kämäräinen, Meerit

    In CAMNAR, an extensive interlaboratory exercise on the analytical methods used to determine several radionuclides present in environmental samples was organized. Activity concentrations of different natural radionuclides, such as Rn-222, Pb-210, Po-210, K-40, Ra-226, Ra-228 and isotopes of uranium, in addition to artificial Cs-137 and Am-241, were analysed in lake sediment samples and drinking water. The measurement techniques were gamma-ray spectrometry, alpha spectrometry, liquid scintillation counting and inductively coupled plasma mass spectrometry. Twenty-six laboratories from nine...

  9. Analytic solutions of hydrodynamics equations

    International Nuclear Information System (INIS)

    Coggeshall, S.V.

    1991-01-01

    Many similarity solutions have been found for the equations of one-dimensional (1-D) hydrodynamics. These special combinations of variables allow the partial differential equations to be reduced to ordinary differential equations, which must then be solved to determine the physical solutions. Usually, these reduced ordinary differential equations are solved numerically. In some cases it is possible to solve these reduced equations analytically to obtain explicit solutions. In this work a collection of analytic solutions of the 1-D hydrodynamics equations is presented. These can be used for a variety of purposes, including (i) as numerical benchmark problems, (ii) as a basis for analytic models, and (iii) as a source of insight into more complicated solutions.

  10. Analytical applications of ICP-FTS

    International Nuclear Information System (INIS)

    Faires, L.M.; Palmer, B.A.; Cunningham, P.T.

    1986-01-01

    The Analytical Chemistry Group of the Chemistry Division at Los Alamos National Laboratory has been investigating the analytical utility of the inductively coupled plasma (ICP) - Fourier transform spectrometer (FTS) combination. While a new state-of-the-art FTS facility is under construction at Los Alamos, preliminary data has been obtained on the one-meter FTS at the National Solar Observatory at Kitt Peak, Arizona. This paper presents an update of the Los Alamos FTS facility, which is expected to be completed in 1986, and presents data showing the analytical potential of an ICP-FTS system. Some of the potential problems of the multiplex disadvantage are discussed, and the advantages of the high resolution obtainable with the FTS are illustrated

  11. System effects in sample self-stacking CZE: Single analyte peak splitting of salt-containing samples

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Gebauer, Petr; Boček, Petr

    2009-01-01

    Roč. 30, č. 5 (2009), s. 866-874 ISSN 0173-0835 R&D Projects: GA ČR GA203/08/1536; GA AV ČR IAA400310609; GA AV ČR IAA400310703 Institutional research plan: CEZ:AV0Z40310501 Keywords : CZE * peak splitting * self-stacking Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.077, year: 2009

  12. Analytics for Customer Engagement

    NARCIS (Netherlands)

    Bijmolt, Tammo H. A.; Leeflang, Peter S. H.; Block, Frank; Eisenbeiss, Maik; Hardie, Bruce G. S.; Lemmens, Aurelie; Saffert, Peter

    In this article, we discuss the state of the art of models for customer engagement and the problems that are inherent to calibrating and implementing these models. The authors first provide an overview of the data available for customer analytics and discuss recent developments. Next, the authors

  13. Analytical Chemistry Laboratory, progress report for FY 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  14. ANALYTICAL TECHNIQUES FOR THE DETERMINATION OF MELOXICAM IN PHARMACEUTICAL FORMULATIONS AND BIOLOGICAL SAMPLES

    Directory of Open Access Journals (Sweden)

    Aisha Noreen

    2016-06-01

    Meloxicam (MX) belongs to the family of oxicams, the most important group of non-steroidal anti-inflammatory drugs (NSAIDs), and is widely used for its analgesic and antipyretic activities. It inhibits both COX-I and COX-II enzymes with less gastric and local tissue irritation. A number of analytical techniques have been used for the determination of MX in pharmaceutical formulations as well as in biological fluids. These techniques include titrimetry, spectrometry, chromatography, flow injection spectrometry, fluorescence spectrometry, capillary zone electrophoresis and electrochemical techniques. Many of these techniques have also been used for the simultaneous determination of MX with other compounds. A comprehensive review of these analytical techniques is presented, which could be useful for analytical chemists and quality control pharmacists.

  15. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of Middle Pomerania in the northern part of Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance search. Generally, the described methodology can be applied for fast fractionation or screening of the

  16. Analytic theory of curvature effects for wave problems with general boundary conditions

    DEFF Research Database (Denmark)

    Willatzen, Morten; Gravesen, Jens; Voon, L. C. Lew Yan

    2010-01-01

    A formalism based on a combination of differential geometry and perturbation theory is used to obtain analytic expressions for confined eigenmode changes due to general curvature effects. In cases of circular-shaped and helix-shaped structures, where alternative analytic solutions can be found......, the perturbative solution is shown to yield the same result. The present technique allows the generalization of earlier results to arbitrary boundary conditions. The power of the method is illustrated using examples based on Maxwell’s and Schrödinger’s equations for applications in photonics and nanoelectronics....

  17. Trace element studies in bioenvironmental samples using 3-MeV protons

    International Nuclear Information System (INIS)

    Walter, R.L.; Willis, R.D.; Gutknecht, W.F.

    1974-01-01

    Trace metal compositions of a wide range of biological, environmental, medical and clinical samples were investigated using proton-induced x-ray emission analysis (PIXEA). The x-rays were detected with a Si(Li) detector and spectra from over 3000 irradiations have been recorded on magnetic tape. The χ2 fitting code TRACE developed at our laboratory was used in a semi-automatic mode to extract abundances of elements from S to Cd. Various methods of overcoming analytical problems and specimen preparation difficulties are reported. Results from some samples for typical studies are illustrated along with the reasons for interest in the sample types

  18. Analytical and Numerical Studies of Sloshing in Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Solaas, F

    1996-12-31

    For oil cargo ship tanks and liquid natural gas carriers, the dimensions of the tanks are often such that the highest resonant sloshing periods and the ship motions are in the same period range, which may cause violent resonant sloshing of the liquid. In this doctoral thesis, linear and non-linear analytical potential theory solutions of the sloshing problem are studied for a two-dimensional rectangular tank and a vertical circular cylindrical tank, using perturbation technique for the non-linear case. The tank is forced to oscillate harmonically with small amplitudes of sway with frequency in the vicinity of the lowest natural frequency of the fluid inside the tank. The method is extended to other tank shapes using a combined analytical and numerical method. A boundary element numerical method is used to determine the eigenfunctions and eigenvalues of the problem. These are used in the non-linear analytical free surface conditions, and the velocity potential and free surface elevation for each boundary value problem in the perturbation scheme are determined by the boundary element method. Both the analytical method and the combined analytical and numerical method are restricted to tanks with vertical walls in the free surface. The suitability of a commercial programme, FLOW-3D, to estimate sloshing is studied. It solves the Navier-Stokes equations by the finite difference method. The free surface as function of time is traced using the fractional volume of fluid method. 59 refs., 54 figs., 37 tabs.

  20. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Science.gov (United States)

    Najat, Dereen

    2017-01-01

    Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors were recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani
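The "proportional Z test" used in this study to compare error frequencies between labs is, in its standard form, a two-proportion z-test on a pooled estimate. A minimal sketch with hypothetical counts (not the study's data):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))      # standard error under H0
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 45 hemolyzed samples out of 500 in one lab vs 20 out of 500 in another
z, p = two_proportion_z(45, 500, 20, 500)
print(f"z = {z:.2f}, p = {p:.4f}")
```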

  2. Multiple analyte adduct formation in liquid chromatography-tandem mass spectrometry - Advantages and limitations in the analysis of biologically-related samples.

    Science.gov (United States)

    Dziadosz, Marek

    2018-05-01

    Multiple analyte adduct formation was examined and discussed in the context of reproducible signal detection in liquid chromatography-tandem mass spectrometry applied in the analysis of biologically-related samples. Appropriate infusion solutions were prepared in H2O/methanol (3/97, v/v) with 1 mM sodium acetate and 10 mM acetic acid. An API 4000 QTrap tandem mass spectrometer was used for experiments performed in the negative scan mode (-Q1 MS) and the negative enhanced product ion mode (-EPI). γ‑Hydroxybutyrate and its deuterated form were used as model compounds to highlight both the complexity of adduct formation in popular mobile phases used and the effective signal compensation by the application of isotope-labelled analytes as internal standards. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber is then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications
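The additivity argument behind biomimetic extraction is usually formalized as concentration addition: a mixture's narcosis-type toxicity is approximated by summing toxic units, TU_i = C_i / EC50_i, over its constituents. A generic sketch of that bookkeeping (not the authors' code; compound list, concentrations and EC50 values are all hypothetical):

```python
# Concentration-addition sketch: narcosis-type toxicity of a mixture is
# approximated by summing toxic units TU_i = C_i / EC50_i.
mixture = {
    # hypothetical aqueous concentrations and EC50s, both in mg/L
    "naphthalene":  {"conc": 0.8, "ec50": 2.1},
    "phenanthrene": {"conc": 0.3, "ec50": 0.7},
    "toluene":      {"conc": 4.0, "ec50": 12.0},
}

toxic_units = sum(c["conc"] / c["ec50"] for c in mixture.values())
print(f"total toxic units: {toxic_units:.2f}")
# a toxic-unit sum >= 1 suggests the mixture exceeds the effect
# threshold under the additivity assumption
print("exceeds threshold" if toxic_units >= 1.0 else "below threshold")
```

The SPME measurement in the abstract effectively performs this summation physically: the fiber accumulates a total molar amount weighted by each compound's lipid partitioning.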

  4. Open mathematical problems regarding non-Newtonian fluids

    International Nuclear Information System (INIS)

    Wilson, Helen J

    2012-01-01

    We present three open problems in the mathematical modelling of the flow of non-Newtonian fluids. The first problem is rather long standing: a discontinuity in the dependence of the rise velocity of a gas bubble on its volume. This is very well characterized experimentally but not, so far, fully reproduced either numerically or analytically. The other two are both instabilities. The first is observed experimentally but never predicted analytically or numerically. In the second instability, numerical studies reproduce the experimental observations but there is as yet no analytical or semi-analytical prediction of the linear instability which must be present. (invited article)

  5. Expressing analytical performance from multi-sample evaluation in laboratory EQA.

    Science.gov (United States)

    Thelen, Marc H M; Jansen, Rob T P; Weykamp, Cas W; Steigstra, Herman; Meijer, Ron; Cobbaert, Christa M

    2017-08-28

    To provide its participants with an external quality assessment system (EQAS) that can be used to check trueness, the Dutch EQAS organizer, Organization for Quality Assessment of Laboratory Diagnostics (SKML), has innovated its general chemistry scheme over the last decade by introducing fresh frozen commutable samples whose values were assigned by Joint Committee for Traceability in Laboratory Medicine (JCTLM)-listed reference laboratories using reference methods where possible. Here we present some important innovations in our feedback reports that allow participants to judge whether their trueness and imprecision meet predefined analytical performance specifications. Sigma metrics are used to calculate performance indicators named 'sigma values'. Tolerance intervals are based on both Total Error allowable (TEa) according to biological variation data and state of the art (SA) in line with the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Milan consensus. The existing SKML feedback reports, which express trueness as the agreement between the regression line through the results of the last 12 months and the values obtained from reference laboratories and calculate imprecision from the residuals of the regression line, are now enriched with sigma values calculated from the degree to which the combination of trueness and imprecision is within tolerance limits. This information, together with its reduction to a simple two-point scoring system, is also represented graphically in addition to the existing difference plot. By adding sigma metrics-based performance evaluation in relation to both TEa and SA tolerance intervals to its EQAS schemes, SKML provides its participants with a powerful and actionable check on accuracy.
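Sigma values of this kind are conventionally computed as (TEa − |bias|) / CV, i.e. how many analytical standard deviations fit between the observed bias and the tolerance limit, with all quantities in percent. A minimal sketch with hypothetical figures (not SKML's implementation):

```python
def sigma_value(tea_pct, bias_pct, cv_pct):
    """Sigma metric: analytical SDs that fit inside the tolerance limit."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# hypothetical example: an analyte with TEa = 2.4% (biological variation),
# observed bias 0.5% against the reference value, imprecision CV = 0.6%
sigma = sigma_value(tea_pct=2.4, bias_pct=0.5, cv_pct=0.6)
print(f"sigma = {sigma:.2f}")
```

By convention, sigma ≥ 6 is considered world-class performance and sigma < 3 unacceptable, which is what makes the metric suitable for reduction to a simple scoring system.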

  6. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess the laboratory performances, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.

  7. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1997-01-01

    Sample projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Sample projections are categorized by radiation level, protocol, sample matrix and program. Analysis requirements are also presented

  8. Analytic Cognitive Style Predicts Religious and Paranormal Belief

    Science.gov (United States)

    Pennycook, Gordon; Cheyne, James Allan; Seli, Paul; Koehler, Derek J.; Fugelsang, Jonathan A.

    2012-01-01

    An analytic cognitive style denotes a propensity to set aside highly salient intuitions when engaging in problem solving. We assess the hypothesis that an analytic cognitive style is associated with a history of questioning, altering, and rejecting (i.e., unbelieving) supernatural claims, both religious and paranormal. In two studies, we examined…

  9. Sample problems for the novice user of the AMPX-II system

    International Nuclear Information System (INIS)

    Ford, W.E. III; Roussin, R.W.; Petrie, L.M.; Diggs, B.R.; Comolander, H.E.

    1979-01-01

    Contents of the IBM version of the AMPX system distributed by the Radiation Shielding Information Center (AMPX-II) are described. Sample problems which demonstrate the procedure for implementing AMPX-II modules to generate point cross sections; generate multigroup neutron, photon production, and photon interaction cross sections for various transport codes; collapse multigroup cross sections; check, edit, and punch multigroup cross sections; and execute a one-dimensional discrete ordinates transport calculation are detailed. 25 figures, 9 tables

  10. Beta/gamma test problems for ITS

    International Nuclear Information System (INIS)

    Mei, G.T.

    1993-01-01

    The Integrated Tiger Series of Coupled Electron/Photon Monte Carlo Transport Codes (ITS 3.0, PC Version) was used at Oak Ridge National Laboratory (ORNL) to compare with and extend the experimental findings of the beta/gamma response of selected health physics instruments. In order to assure that ITS gives correct results, several beta/gamma problems have been tested. ITS was used to simulate these problems numerically, and results for each were compared to the problem's experimental or analytical results. ITS successfully predicted the experimental or analytical results of all tested problems within the statistical uncertainty inherent in the Monte Carlo method
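The validation pattern described here — comparing a Monte Carlo estimate against an analytical benchmark within statistical uncertainty — can be sketched on a toy problem (an illustrative sketch only, unrelated to ITS itself; the attenuation coefficient and slab thickness are invented):

```python
import math
import random

def mc_transmission(mu, thickness, n_histories, seed=0):
    """Monte Carlo estimate of uncollided photon transmission through
    a slab: sample an exponential free path and count photons whose
    first collision lies beyond the slab."""
    rng = random.Random(seed)
    transmitted = sum(
        1 for _ in range(n_histories)
        if -math.log(1.0 - rng.random()) / mu > thickness
    )
    p = transmitted / n_histories
    sigma = math.sqrt(p * (1.0 - p) / n_histories)  # binomial std. error
    return p, sigma

mu, x = 0.2, 5.0                 # invented values (1/cm, cm)
analytic = math.exp(-mu * x)     # Beer-Lambert benchmark
estimate, sigma = mc_transmission(mu, x, 200_000)
print(f"MC: {estimate:.4f} +/- {sigma:.4f}  analytic: {analytic:.4f}")
```

Agreement between the estimate and the closed-form result to within a few standard errors mirrors the acceptance criterion quoted in the abstract.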

  11. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.
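For readers unfamiliar with how MI inferences are pooled, Rubin's combining rules — the standard background for any MI procedure, not the article's synthetic extension — can be sketched as follows (the estimates and variances below are invented):

```python
import statistics as st

def rubin_combine(estimates, variances):
    """Pool point estimates and variances from m imputed datasets
    using Rubin's rules: total variance is the within-imputation
    variance plus an inflated between-imputation component."""
    m = len(estimates)
    q_bar = st.mean(estimates)            # pooled point estimate
    w = st.mean(variances)                # within-imputation variance
    b = st.variance(estimates)            # between-imputation variance
    t = w + (1 + 1 / m) * b               # total variance
    df = (m - 1) * (1 + w / ((1 + 1 / m) * b)) ** 2  # Rubin's df
    return q_bar, t, df

q, t, df = rubin_combine([25.1, 24.8, 25.4], [0.40, 0.38, 0.42])
print(q, t, df)
```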

  12. Paraxial light distribution in the focal region of a lens: a comparison of several analytical solutions and a numerical result

    Science.gov (United States)

    Wu, Yang; Kelly, Damien P.

    2014-12-01

    The distribution of the complex field in the focal region of a lens is a classical optical diffraction problem. Today, it remains of significant theoretical importance for understanding the properties of imaging systems. In the paraxial regime, it is possible to find analytical solutions in the neighborhood of the focus, when a plane wave is incident on a focusing lens whose finite extent is limited by a circular aperture. For example, in Born and Wolf's treatment of this problem, two different, but mathematically equivalent analytical solutions, are presented that describe the 3D field distribution using infinite sums of ? and ? type Lommel functions. An alternative solution expresses the distribution in terms of Zernike polynomials, and was presented by Nijboer in 1947. More recently, Cao derived an alternative analytical solution by expanding the Fresnel kernel using a Taylor series expansion. In practical calculations, however, only a finite number of terms from these infinite series expansions is actually used to calculate the distribution in the focal region. In this manuscript, we compare and contrast each of these different solutions to a numerically calculated result, paying particular attention to how quickly each solution converges for a range of different spatial locations behind the focusing lens. We also examine the time taken to calculate each of the analytical solutions. The numerical solution is calculated in a polar coordinate system and is semi-analytic. The integration over the angle is solved analytically, while the radial coordinate is sampled with a sampling interval of ? and then numerically integrated. This produces an infinite set of replicas in the diffraction plane, that are located in circular rings centered at the optical axis and each with radii given by ?, where ? is the replica order. These circular replicas are shown to be fundamentally different from the replicas that arise in a Cartesian coordinate system.

  13. Future analytical provision - Relocation of Sellafield Ltd Analytical Services Laboratory

    International Nuclear Information System (INIS)

    Newell, B.

    2015-01-01

    Sellafield Ltd Analytical Services provides an essential view of environmental, safety, process and high-hazard risk reduction performance by analysis of samples. It is the largest and most complex analytical services laboratory in Europe, with 150 laboratories (55 operational) and 350 staff (including 180 analysts). The Sellafield Ltd Analytical Services Main Laboratory is in need of replacement, due to the age of the facility and changes to work streams. The relocation is an opportunity to: design and commission bespoke MA (Medium-Active) cells; modify the HA (High-Active) cell design to facilitate an in-cell laboratory; develop non-destructive techniques; and provide an open, light building for better worker morale. The option chosen was to move the activities to the NNL Central Laboratory (NNLCL), which is based at Sellafield and is the UK's flagship nuclear research and development facility. This poster gives a time schedule

  14. A direct sampling method to an inverse medium scattering problem

    KAUST Repository

    Ito, Kazufumi

    2012-01-10

    In this work we present a novel sampling method for time harmonic inverse medium scattering problems. It provides a simple tool to directly estimate the shape of the unknown scatterers (inhomogeneous media), and it is applicable even when the measured data are only available for one or two incident directions. A mathematical derivation is provided for its validation. Two- and three-dimensional numerical simulations are presented, which show that the method is accurate even with a few sets of scattered field data, computationally efficient, and very robust with respect to noises in the data. © 2012 IOP Publishing Ltd.

  15. Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.

    Science.gov (United States)

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek, whenever possible, to explore their implications for more general problems that appear in research with small samples but concern all areas of prevention research. This special section includes two parts. The first aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.

  16. Delivering business analytics practical guidelines for best practice

    CERN Document Server

    Stubbs, Evan

    2013-01-01

    AVOID THE MISTAKES THAT OTHERS MAKE - LEARN WHAT LEADS TO BEST PRACTICE AND KICKSTART SUCCESS This groundbreaking resource provides comprehensive coverage across all aspects of business analytics, presenting proven management guidelines to drive sustainable differentiation. Through a rich set of case studies, author Evan Stubbs reviews solutions and examples to over twenty common problems spanning managing analytics assets and information, leveraging technology, nurturing skills, and defining processes. Delivering Business Analytics also outlines the Data Scientist's Code, fifteen principle

  17. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  18. Human-machine analytics for closed-loop sense-making in time-dominant cyber defense problems

    Science.gov (United States)

    Henry, Matthew H.

    2017-05-01

    Many defense problems are time-dominant: attacks progress at speeds that outpace human-centric systems designed for monitoring and response. Despite this shortcoming, these well-honed and ostensibly reliable systems pervade most domains, including cyberspace. The argument that often prevails when considering the automation of defense is that while technological systems are suitable for simple, well-defined tasks, only humans possess sufficiently nuanced understanding of problems to act appropriately under complicated circumstances. While this perspective is founded in verifiable truths, it does not account for a middle ground in which human-managed technological capabilities extend well into the territory of complex reasoning, thereby automating more nuanced sense-making and dramatically increasing the speed at which it can be applied. Snort and platforms like it enable humans to build, refine, and deploy sense-making tools for network defense. Shortcomings of these platforms include a reliance on rule-based logic, which confounds analyst knowledge of how bad actors behave with the means by which bad behaviors can be detected, and a lack of feedback-informed automation of sensor deployment. We propose an approach in which human-specified computational models hypothesize bad behaviors independent of indicators and then allocate sensors to estimate and forecast the state of an intrusion. State estimates and forecasts inform the proactive deployment of additional sensors and detection logic, thereby closing the sense-making loop. All the while, humans are on the loop, rather than in it, permitting nuanced management of fast-acting automated measurement, detection, and inference engines. This paper motivates and conceptualizes analytics to facilitate this human-machine partnership.

  19. The Students Decision Making in Solving Discount Problem

    Science.gov (United States)

    Abdillah; Nusantara, Toto; Subanji; Susanto, Hery; Abadyo

    2016-01-01

    This research reviews students' processes of intuitive, analytical, and interactive decision making. The research was done using a discount problem specially created to explore students' intuitive, analytical, and interactive reasoning. In solving discount problems, the researcher explored students' decisions in determining their attitude which…

  20. Determination of 93Zr, 107Pd and 135Cs in zircaloy hulls analytical development on inactive samples

    International Nuclear Information System (INIS)

    Excoffier, E.; Bienvenu, Ph.; Combes, C.; Pontremoli, S.; Delteil, N.; Ferrini, R.

    2000-01-01

    A study involving the participation of three laboratories of the Direction of the Fuel Cycle has been undertaken within the framework of a common interest program existing between COGEMA and the CEA. Its purpose is to develop analytical methods for the determination of long-lived radionuclides in zircaloy hulls coming from spent fuel reprocessing operations. Acting as a complement to work carried out at the DRRV in ATALANTE concerning zircaloy dissolution and direct analysis of hull solutions, a study is now being conducted at the DESD/SCCD/LARC in Cadarache on three of these radionuclides, namely: zirconium 93, palladium 107 and caesium 135. These are three radioisotopes with very long half-lives (∼10⁶ y) that decay mainly by emission of β particles. The analytical technique chosen for the final measurement is inductively coupled plasma mass spectrometry (ICP/MS). Prior to the measurement, chemical separation processes are used to extract the radionuclides from the matrix and separate them from interfering elements and β emitters. The method, developed initially on inactive solutions, is being validated on irradiated samples coming from the UP2/800 - UP3 reprocessing plants. (authors)

  1. An asymptotic analytical solution to the problem of two moving boundaries with fractional diffusion in one-dimensional drug release devices

    International Nuclear Information System (INIS)

    Yin Chen; Xu Mingyu

    2009-01-01

    We set up a one-dimensional mathematical model, with a Caputo fractional operator, of a drug released from a polymeric matrix that can be dissolved into a solvent. A problem with two moving boundaries in fractional anomalous diffusion (in time) of order α ∈ (0, 1], under the assumption that the dissolving boundary dissolves slowly, is presented in this paper. The two-parameter regular perturbation technique and Fourier and Laplace transform methods are used. A dimensionless asymptotic analytical solution is given in terms of the Wright function
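For orientation, the Caputo fractional operator of order α ∈ (0, 1] referred to above has the standard definition (a textbook formula, not one quoted from the paper):

```latex
{}^{C}\!D_t^{\alpha} f(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_0^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau,
  \qquad 0 < \alpha \le 1,
```

which reduces to the ordinary first derivative as α → 1, recovering classical diffusion.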

  2. Local entropy as a measure for sampling solutions in constraint satisfaction problems

    International Nuclear Information System (INIS)

    Baldassi, Carlo; Ingrosso, Alessandro; Lucibello, Carlo; Saglietti, Luca; Zecchina, Riccardo

    2016-01-01

    We introduce a novel entropy-driven Monte Carlo (EdMC) strategy to efficiently sample solutions of random constraint satisfaction problems (CSPs). First, we extend a recent result that, using a large-deviation analysis, shows that the geometry of the space of solutions of the binary perceptron learning problem (a prototypical CSP) contains regions of very high density of solutions. Despite being sub-dominant, these regions can be found by optimizing a local entropy measure. Building on these results, we construct a fast solver that relies exclusively on a local entropy estimate, and can be applied to general CSPs. We describe its performance not only for the perceptron learning problem but also for the random K-satisfiability problem (another prototypical CSP with a radically different structure), and show numerically that a simple zero-temperature Metropolis search in the smooth local entropy landscape can reach sub-dominant clusters of optimal solutions in a small number of steps, while standard Simulated Annealing either requires extremely long cooling procedures or just fails. We also discuss how the EdMC can heuristically be made even more efficient for the cases we studied. (paper: disordered systems, classical and quantum)
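At toy scale, the local entropy measure and a zero-temperature search over it can be illustrated by brute force (an invented miniature instance; the actual EdMC solver estimates the local entropy analytically rather than by enumerating solutions):

```python
import itertools
import random

rng = random.Random(1)
n_vars = 12

# Random 3-SAT instance: each clause is three (variable, sign) literals.
clauses = [
    [(v, rng.choice((1, -1))) for v in rng.sample(range(n_vars), 3)]
    for _ in range(30)
]

def satisfies(assign, clauses):
    """A clause is satisfied when at least one literal matches."""
    return all(any(assign[v] == s for v, s in cl) for cl in clauses)

# Brute-force the full solution set (feasible only at this toy size).
solutions = [a for a in itertools.product((1, -1), repeat=n_vars)
             if satisfies(a, clauses)]

def local_entropy(assign, d=2):
    """Number of solutions within Hamming distance d of `assign` --
    a brute-force stand-in for the local entropy measure."""
    return sum(1 for s in solutions
               if sum(a != b for a, b in zip(assign, s)) <= d)

# Zero-temperature Metropolis on the local entropy landscape:
# flip a random variable, keep the flip only if the local entropy
# does not decrease.
state = [rng.choice((1, -1)) for _ in range(n_vars)]
start = current = local_entropy(state)
for _ in range(300):
    i = rng.randrange(n_vars)
    state[i] = -state[i]
    proposal = local_entropy(state)
    if proposal >= current:
        current = proposal
    else:
        state[i] = -state[i]  # reject the flip
print("local entropy:", start, "->", current,
      "| total solutions:", len(solutions))
```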

  3. Stationary and related stochastic processes sample function properties and their applications

    CERN Document Server

    Cramér, Harald

    2004-01-01

    This graduate-level text offers a comprehensive account of the general theory of stationary processes, with special emphasis on the properties of sample functions. Assuming a familiarity with the basic features of modern probability theory, the text develops the foundations of the general theory of stochastic processes, examines processes with a continuous-time parameter, and applies the general theory to procedures key to the study of stationary processes. Additional topics include analytic properties of the sample functions and the problem of time distribution of the intersections between a

  4. Reconstruction of binary geological images using analytical edge and object models

    Science.gov (United States)

    Abdollahifard, Mohammad J.; Ahmadi, Sadegh

    2016-04-01

    Reconstruction of fields using partial measurements is of vital importance in different applications in geosciences. Solving such an ill-posed problem requires a well-chosen model. In recent years, training images (TI) are widely employed as strong prior models for solving these problems. However, in the absence of enough evidence it is difficult to find an adequate TI which is capable of describing the field behavior properly. In this paper a very simple and general model is introduced which is applicable to a fairly wide range of binary images without any modifications. The model is motivated by the fact that nearly all binary images are composed of simple linear edges in micro-scale. The analytic essence of this model allows us to formulate the template matching problem as a convex optimization problem having efficient and fast solutions. The model has the potential to incorporate the qualitative and quantitative information provided by geologists. The image reconstruction problem is also formulated as an optimization problem and solved using an iterative greedy approach. The proposed method is capable of recovering the image unknown values with accuracies about 90% given samples representing as few as 2% of the original image.

  5. Analytical solutions of one-dimensional advection–diffusion

    Indian Academy of Sciences (India)

    Analytical solutions are obtained for the one-dimensional advection–diffusion equation with variable coefficients in a longitudinal finite, initially solute-free domain, for two dispersion problems. In the first one, temporally dependent solute dispersion along uniform flow in a homogeneous domain is studied. In the second problem the ...
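For orientation, the problem class in question is governed by the one-dimensional advection–diffusion equation with variable coefficients, which in conservative form reads (a generic statement reconstructed for context, not quoted from the paper):

```latex
\frac{\partial C}{\partial t}
  = \frac{\partial}{\partial x}\!\left( D(x,t)\, \frac{\partial C}{\partial x} \right)
  - \frac{\partial}{\partial x}\bigl( u(x,t)\, C \bigr),
\qquad 0 \le x \le L,\; t > 0,
```

where C is the solute concentration, D the dispersion coefficient, and u the flow velocity; the two dispersion problems differ in how D and u depend on x and t.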

  6. A theoretical study on a convergence problem of nodal methods

    Energy Technology Data Exchange (ETDEWEB)

    Shaohong, Z.; Ziyong, L. [Shanghai Jiao Tong Univ., 1954 Hua Shan Road, Shanghai, 200030 (China); Chao, Y. A. [Westinghouse Electric Company, P. O. Box 355, Pittsburgh, PA 15230-0355 (United States)

    2006-07-01

    The effectiveness of modern nodal methods is largely due to their use of information from the analytical flux solution inside a homogeneous node. As a result, the nodal coupling coefficients depend explicitly or implicitly on the evolving eigenvalue of a problem during its solution iteration process. This poses an inherently non-linear matrix eigenvalue iteration problem. This paper points out analytically that, whenever the half wave length of an evolving node-interior analytic solution becomes smaller than the size of that node, this non-linear iteration problem can become inherently unstable and theoretically can always be non-convergent or converge to higher order harmonics. This phenomenon is confirmed, demonstrated and analyzed via the simplest 1-D problem solved by the simplest analytic nodal method, the Analytic Coarse Mesh Finite Difference (ACMFD, [1]) method. (authors)

  7. Analytical Solution of General Bagley-Torvik Equation

    OpenAIRE

    William Labecca; Osvaldo Guimarães; José Roberto C. Piqueira

    2015-01-01

    The Bagley-Torvik equation appears in viscoelasticity problems where fractional derivatives seem to play an important role concerning empirical data. There are several works treating this equation using numerical methods and analytic formulations. However, the analytical solutions presented in the literature consider particular cases of boundary and initial conditions, with the inhomogeneous term often expressed in polynomial form. Here, by using Laplace transform methodology, the general inhomoge...

  8. Writing analytic element programs in Python.

    Science.gov (United States)

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
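The superposition-of-elements design the article describes can be sketched as follows (class names and parameter values are invented for illustration; the published code handles many more element types and boundary conditions):

```python
import cmath

class Element:
    """Base class: each analytic element contributes a complex
    potential omega(z); the model is a simple superposition."""
    def omega(self, z):
        raise NotImplementedError

class Well(Element):
    def __init__(self, zw, Q):
        self.zw, self.Q = zw, Q   # location (complex plane), discharge
    def omega(self, z):
        return self.Q / (2 * cmath.pi) * cmath.log(z - self.zw)

class UniformFlow(Element):
    def __init__(self, Qx):
        self.Qx = Qx              # uniform discharge in the x-direction
    def omega(self, z):
        return -self.Qx * z

class Model:
    """New element types plug in without touching existing code:
    anything with an omega(z) method participates in the sum."""
    def __init__(self, elements):
        self.elements = elements
    def potential(self, z):
        return sum(e.omega(z) for e in self.elements).real

m = Model([UniformFlow(1.0), Well(0j, 100.0)])
print(m.potential(10 + 5j))
```

Because the solution is a pure superposition, adding an element never requires changes to the `Model` class, which is the extensibility property the abstract emphasizes.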

  9. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition we need dedicated status lines for assessing the validity of the input to our black box and the output for subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)

  10. Analysis of IFR samples at ANL-E

    International Nuclear Information System (INIS)

    Bowers, D.L.; Sabau, C.S.

    1993-01-01

    The Analytical Chemistry Laboratory analyzes a variety of samples submitted by the different research groups within IFR. This talk describes the analytical work on samples generated by the Plutonium Electrorefiner, Large Scale Electrorefiner and Waste Treatment Studies. The majority of these samples contain transuranics and necessitate facilities that safely contain these radioisotopes. Details such as sample receiving, dissolution techniques, chemical separations, instrumentation used, and reporting of results are discussed. The importance of interactions between customer and analytical personnel is also demonstrated

  11. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Full Text Available Background. The increase of molecular tests performed on DNA extracted from various biological materials should not be carried out without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of the internal control (IC) in standardizing the pre-analytical phase and the role of the cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: IC, added to each extraction mix; the human gene HPRT1 (CC) with RT-PCR to quantify sample cellularity; the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac per PCR. CS and TP under 300,000 cells/sample showed a significant decrease of UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.

  12. So you've got your sample in solution. What next? (W7)

    International Nuclear Information System (INIS)

    Brindle, I.P.

    2002-01-01

    Full text: Several factors must be considered after the sample is prepared for analysis. These factors include memory, oxidation state of the analyte, presence of interfering elements, etc. Some elements are 'sticky' and exhibit prodigious memory effects. For elements like mercury, gold and boron, memory effects make it difficult to determine elemental concentrations in samples that vary widely in concentration. When selenium is determined by hydride generation, the selenium cannot be in the VI oxidation state, since borohydride will not reduce this oxidation state. Different treatments must be used. The treatment of organometallics may require, in addition, the presence of reagents to improve the yield of derivatized species that are to be determined. Interfering elements must sometimes be masked or removed before determination of the analyte can proceed. In this presentation, these various issues will be discussed. Solutions to some of the problems, from the analytical chemistry laboratories at Brock University, will be presented. In addition, options for the simultaneous determination of elements by vapor generation and nebulization will be discussed, based on recent work in the Brock laboratories. (author)

  13. Efficient alignment-free DNA barcode analytics.

    Science.gov (United States)

    Kuksa, Pavel; Pavlovic, Vladimir

    2009-11-10

    In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations which model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectra) of barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows for accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci distinguishing barcodes of different sample groups. The new alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species, with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running time improvements over the state-of-the-art methods. Our results show that newly developed alignment-free methods for DNA barcoding can efficiently and with high accuracy identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding.
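The fixed-length spectrum idea can be sketched with k-mer counts and cosine similarity (a generic illustration; the sequences below are invented and the authors' actual methods are more elaborate):

```python
from collections import Counter
import math

def spectrum(seq, k=4):
    """Fixed-length k-mer count representation of a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(a, b):
    """Cosine similarity between two sparse k-mer spectra."""
    dot = sum(a[f] * b[f] for f in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented example fragments -- not real barcodes.
s1 = "ACGTACGTGGTACCACGT" * 3
s2 = "ACGTACGTGGAACCACGT" * 3   # one substitution per repeat unit
s3 = "TTTTGGGGCCCCAAAATT" * 3   # compositionally unrelated
sim_close = cosine(spectrum(s1), spectrum(s2))
sim_far = cosine(spectrum(s1), spectrum(s3))
print(sim_close, sim_far)
```

No alignment is ever computed: similar sequences simply share most of their k-mer counts, which is what makes the representation fast.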

  14. Road Transportable Analytical Laboratory system

    International Nuclear Information System (INIS)

    Finger, S.M.; Keith, V.F.; Spertzel, R.O.; De Avila, J.C.; O'Donnell, M.; Vann, R.L.

    1993-09-01

    This developmental effort clearly shows that a Road Transportable Analytical Laboratory System is a worthwhile and achievable goal. The RTAL is designed to fully analyze (radioanalytes, and organic and inorganic chemical analytes) 20 samples per day at the highest levels of quality assurance and quality control. It dramatically reduces the turnaround time for environmental sample analysis from 45 days (at a central commercial laboratory) to 1 day. At the same time each RTAL system will save the DOE over $12 million per year in sample analysis costs compared to the costs at a central commercial laboratory. If RTAL systems were used at the eight largest DOE facilities (at Hanford, Savannah River, Fernald, Oak Ridge, Idaho, Rocky Flats, Los Alamos, and the Nevada Test Site), the annual savings would be $96,589,000. The DOE's internal study of sample analysis needs projects 130,000 environmental samples requiring analysis in FY 1994, clearly supporting the need for the RTAL system. The cost and time savings achievable with the RTAL system will accelerate and improve the efficiency of cleanup and remediation operations throughout the DOE complex

  15. On the Formal Integrability Problem for Planar Differential Systems

    Directory of Open Access Journals (Sweden)

    Antonio Algaba

    2013-01-01

    Full Text Available We study the analytic integrability problem through the formal integrability problem and we show its connection, in some cases, with the existence of invariant analytic (sometimes algebraic) curves. From the results obtained, we consider some families of analytic differential systems in ℂ², and by imposing formal integrability we find resonant centers, obviating the computation of some necessary conditions.

  16. The Effectiveness of Transactional Behavior Analytic Group Therapy on the Prevention of Relapse among Detoxified People

    OpenAIRE

    S Mousa Kafi; Rahim Mollazadeh Esfanaji; Morteza Nori; Ertaj Salehi

    2009-01-01

    Introduction: The phenomenon of relapse among detoxified people is an important therapeutic problem for substance abusers. The aim of this research was to study the effectiveness of transactional behavior analytic group therapy in preventing relapse among detoxified people. Method: The research design was quasi-experimental with a control group. Using convenience sampling of detoxified people referred to government centers for methadone maintenance therapy, 24 subjects were selected and divided into t...

  17. Solid sample atomic absorption spectroscopy in a chemical contaminant monitoring pilot project

    Energy Technology Data Exchange (ETDEWEB)

    Klein, J.; Schmidt, H.; Dirscherl, C.; Muntau, H.

    1987-09-01

    The Institute for Technology and Hygiene of Food of Animal Origin is developing a practicable system for monitoring the distribution of toxic substances in the environment, using dairy cows as bioindicators. A pilot project has been established to solve basic problems such as sampling strategy, sample preparation, analysis and data handling. In the preliminary stage of this study the new technique of SS-AAS turned out to be a useful tool. In order to test the overall analytical reliability of the data obtained, all analytical procedures applied to the different matrices are controlled by the use of reference materials of similar matrix composition. Results of studies on the distribution of cadmium and lead are reported; the representativity of small sample amounts of cortical tissue (50-60 mg and 1-2 mg dry mass) has additionally been investigated. Direct analysis of wet tissue aliquots (5-10 mg) was not feasible. A possible method of sample preparation of wet tissue is presented which yields reliable results within 10 min of operation time.

  18. An Entropic Estimator for Linear Inverse Problems

    Directory of Open Access Journals (Sweden)

    Amos Golan

    2012-05-01

    Full Text Available In this paper we examine an Information-Theoretic method for solving noisy linear inverse estimation problems which encompasses under a single framework a whole class of estimation methods. Under this framework, prior information about the unknown parameters (when such information exists) and constraints on the parameters can be incorporated in the statement of the problem. The method builds on the basics of the maximum entropy principle and consists of transforming the original problem into the estimation of a probability density on an appropriate space naturally associated with the statement of the problem. This estimation method is generic in the sense that it provides a framework for analyzing non-normal models, it is easy to implement, and it is suitable for all types of inverse problems, such as small and/or ill-conditioned, noisy-data problems. First-order approximations, large sample properties and convergence in distribution are developed as well. Analytical examples, and statistics for model comparisons and evaluations that are inherent to this method, are discussed and complemented with explicit examples.
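
    The maximum-entropy core of such estimators can be illustrated on Jaynes' classic loaded-die problem (a standalone sketch of the principle, not the paper's estimator): given only a mean constraint, the maximum-entropy distribution takes the exponential form p_i ∝ exp(λi), and the dual problem reduces to a one-dimensional root find for λ.

```python
import math

def maxent_die(target_mean=4.5, faces=(1, 2, 3, 4, 5, 6)):
    # The entropy-maximizing p subject to sum(p) = 1 and sum(i * p_i) = mean
    # has the form p_i proportional to exp(lam * i); find lam by bisection
    # on the monotone dual condition mean(lam) = target_mean.
    def mean(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]
```

    With target mean 4.5 the recovered distribution tilts monotonically toward the high faces, exactly as the maximum entropy principle predicts.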

  19. International Congress on Analytical Chemistry. Abstracts. V. 1

    International Nuclear Information System (INIS)

    1997-01-01

    The collection of materials of the international congress on analytical chemistry held in Moscow in June 1997. The main directions of investigation in such areas of analytical chemistry as quantitative and qualitative analysis, microanalysis, sample preparation and preconcentration, analytical reagents, chromatography and related techniques, flow analysis, electroanalytical and kinetic methods, and sensors are elucidated

  20. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems
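
    The nested-sampling core that the HNS algorithm builds on can be sketched in a few lines. The following is an illustrative toy (our construction, not the authors' code): it estimates the Bayesian evidence for a one-dimensional Gaussian likelihood under a uniform prior, with a simple random-walk move standing in for the paper's HMC constrained step.

```python
import math
import random

random.seed(42)

LO, HI = -5.0, 5.0                       # uniform prior support

def log_like(theta):
    return -0.5 * theta * theta          # unnormalized Gaussian log-likelihood

def logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def sample_above(live, l_min, steps=25):
    # Random-walk stand-in for the HMC constrained step: explore the prior
    # subject to the hard likelihood constraint log L > l_min.
    spread = max(live) - min(live)
    theta = random.choice([t for t in live if log_like(t) > l_min] or live)
    for _ in range(steps):
        prop = theta + random.gauss(0.0, 0.5 * spread)
        if LO <= prop <= HI and log_like(prop) > l_min:
            theta = prop
    return theta

def nested_sampling(n_live=200, n_iter=1500):
    live = [random.uniform(LO, HI) for _ in range(n_live)]
    log_z, x_prev = -math.inf, 1.0
    for i in range(1, n_iter + 1):
        worst = min(live, key=log_like)
        x_i = math.exp(-i / n_live)      # deterministic prior-volume shrinkage
        log_z = logaddexp(log_z, log_like(worst) + math.log(x_prev - x_i))
        live[live.index(worst)] = sample_above(live, log_like(worst))
        x_prev = x_i
    for t in live:                       # contribution of remaining live points
        log_z = logaddexp(log_z, log_like(t) + math.log(x_prev / n_live))
    return log_z
```

    For this toy problem the true evidence is Z = (1/10) ∫ exp(-θ²/2) dθ = √(2π)/10 ≈ 0.251, and the sketch recovers it to within the expected O(1/√n_live) scatter.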

  1. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    Energy Technology Data Exchange (ETDEWEB)

    Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

  2. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems. © 2013 Elsevier Inc.

  3. Efficient sample preparation from complex biological samples using a sliding lid for immobilized droplet extractions.

    Science.gov (United States)

    Casavant, Benjamin P; Guckenberger, David J; Beebe, David J; Berry, Scott M

    2014-07-01

    Sample preparation is a major bottleneck in many biological processes. Paramagnetic particles (PMPs) are a ubiquitous method for isolating analytes of interest from biological samples and are used for their ability to thoroughly sample a solution and be easily collected with a magnet. There are three main methods by which PMPs are used for sample preparation: (1) removal of fluid from the analyte-bound PMPs, (2) removal of analyte-bound PMPs from the solution, and (3) removal of the substrate (with immobilized analyte-bound PMPs). In this paper, we explore the third and least studied method for PMP-based sample preparation using a platform termed Sliding Lid for Immobilized Droplet Extractions (SLIDE). SLIDE leverages principles of surface tension and patterned hydrophobicity to create a simple-to-operate platform for sample isolation (cells, DNA, RNA, protein) and preparation (cell staining) without the need for time-intensive wash steps, use of immiscible fluids, or precise pinning geometries. Compared to other standard isolation protocols using PMPs, SLIDE is able to perform rapid sample preparation with low (0.6%) carryover of contaminants from the original sample. The natural recirculation occurring within the pinned droplets of SLIDE make possible the performance of multistep cell staining protocols within the SLIDE by simply resting the lid over the various sample droplets. SLIDE demonstrates a simple easy to use platform for sample preparation on a range of complex biological samples.

  4. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    Science.gov (United States)

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L(-1)) compatible with toxicity threshold values. The optimized (considering the nature of extraction phases, sampling volumes, and solvent of elution) solid phase extraction (SPE) was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces a high variability during the SPE step, therefore requiring the use of C(13)-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25 % (k = 2) at limit of quantification level). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.

  5. THE CREATION OF ANALYTICAL PREDICTION CONCEPT IN MULTILEVEL ORGANIZATION SYSTEM MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. M. Pisareva

    2013-01-01

    Full Text Available This article deals with problems of analytical and forecasting methodology in the management of integrated business systems. A theoretical framework for the concept of analytical prediction in multilevel organization systems is presented.

  6. The use of atomic absorption spectroscopy to measure arsenic, selenium, molybdenum, and vanadium in water and soil samples from uranium mill tailings sites

    International Nuclear Information System (INIS)

    Hollenbach, M.H.

    1988-01-01

    The Technical Measurements Center (TMC) was established to support the environmental measurement needs of the various DOE remedial action programs. A laboratory intercomparison study conducted by the TMC, using soil and water samples from sites contaminated by uranium mill tailings, indicated large discrepancies in analytical results reported by participating laboratories for arsenic, selenium, molybdenum, and vanadium. The present study was undertaken to investigate the most commonly used analytical techniques for measuring these four elements, to identify those permitting routine and reliable quantification, and to assess the problems and successes reported by analysts. Based on a survey of the technical literature, the analytical technique of atomic absorption spectroscopy was selected for detailed study. The application of flame atomic absorption, graphite furnace atomic absorption, and hydride generation atomic absorption to the analysis of tailings-contaminated samples is discussed. Additionally, laboratory sample preparation methods for atomic absorption spectroscopy are presented. The conclusion of this report is that atomic absorption can be used effectively for the determination of arsenic, selenium, molybdenum, and vanadium in water and soil samples if the analyst understands the measurement process and is aware of potential problems. The problem of accurate quantification of arsenic, selenium, molybdenum, and vanadium in water and soil contaminated by waste products from uranium milling operations affects all DOE remedial action programs [Surplus Facilities Management Program (SFMP), Formerly Utilized Site Remedial Action Program (FUSRAP), and Uranium Mill Tailings Remedial Action Program (UMTRAP)], since all include sites where uranium was processed. 96 refs., 9 figs

  7. Sampling general N-body interactions with auxiliary fields

    Science.gov (United States)

    Körber, C.; Berkowitz, E.; Luu, T.

    2017-09-01

    We present a general auxiliary field transformation which generates effective interactions containing all possible N-body contact terms. The strength of the induced terms can analytically be described in terms of general coefficients associated with the transformation and thus are controllable. This transformation provides a novel way for sampling 3- and 4-body (and higher) contact interactions non-perturbatively in lattice quantum Monte Carlo simulations. As a proof of principle, we show that our method reproduces the exact solution for a two-site quantum mechanical problem.
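
    The identity underlying such transformations is the Hubbard-Stratonovich trick: a quadratic (two-body) term exp(a n²/2) equals a Gaussian average of exp(√a φn) over an auxiliary field φ. The following numerical check of the one-field case is our generic illustration; the paper generalizes this idea to arbitrary N-body contact terms.

```python
import math

def hs_average(n, a, num=20001, span=12.0):
    # Numerically evaluate E_phi[ exp(sqrt(a) * phi * n) ] for phi ~ N(0, 1)
    # on a fine uniform grid; by the Hubbard-Stratonovich identity this
    # should equal exp(a * n**2 / 2).
    h = 2.0 * span / (num - 1)
    total = 0.0
    for i in range(num):
        phi = -span + i * h
        weight = math.exp(-0.5 * phi * phi) / math.sqrt(2.0 * math.pi)
        total += weight * math.exp(math.sqrt(a) * phi * n) * h
    return total
```

    The check works for any occupation number n, which is what allows a quadratic interaction to be traded for a field coupled only linearly to the density.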

  8. Analytical-numerical solution of a nonlinear integrodifferential equation in econometrics

    Science.gov (United States)

    Kakhktsyan, V. M.; Khachatryan, A. Kh.

    2013-07-01

    A mixed problem for a nonlinear integrodifferential equation arising in econometrics is considered. An analytical-numerical method is proposed for solving the problem. Some numerical results are presented.

  9. Semi-analytic solution to planar Helmholtz equation

    Directory of Open Access Journals (Sweden)

    Tukač M.

    2013-06-01

    Full Text Available The acoustic solution of interior domains is of great interest, and solving acoustic pressure fields faster with lower computational requirements is in demand. A novel solution technique based on the analytic solution to the Helmholtz equation in a rectangular domain is presented. This semi-analytic solution is compared with the finite element method, which is taken as the reference. Results show that the presented method is as precise as the finite element method. As the semi-analytic method doesn't require spatial discretization, it can be used for small and very large acoustic problems at the same computational cost.
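
    The analytic rectangular-domain solution that such a method builds on can be sketched as a cosine-mode expansion (an illustrative sketch assuming rigid-wall boundaries on the unit square, not the authors' exact formulation): each mode of the source decouples, so the pressure field is a plain weighted sum with no spatial discretization.

```python
import math

def source(coeffs, x, y):
    # Evaluate f(x, y) = sum c_nm cos(n pi x) cos(m pi y).
    return sum(c * math.cos(n * math.pi * x) * math.cos(m * math.pi * y)
               for (n, m), c in coeffs.items())

def helmholtz_modal(coeffs, k2, x, y):
    # Semi-analytic solution of (Laplacian + k^2) p = f on the unit square
    # with rigid-wall (Neumann) boundaries; each cosine mode decouples:
    # p_nm = c_nm / (k2 - k_nm^2), with k_nm^2 = pi^2 (n^2 + m^2).
    p = 0.0
    for (n, m), c in coeffs.items():
        k_nm2 = math.pi ** 2 * (n * n + m * m)
        p += (c / (k2 - k_nm2)) * math.cos(n * math.pi * x) * math.cos(m * math.pi * y)
    return p
```

    The solution can be verified pointwise: a finite-difference Laplacian applied to the modal sum reproduces the source term to discretization accuracy.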

  10. Enhanced spot preparation for liquid extractive sampling and analysis

    Science.gov (United States)

    Van Berkel, Gary J.; King, Richard C.

    2015-09-22

    A method for performing surface sampling of an analyte, includes the step of placing the analyte on a stage with a material in molar excess to the analyte, such that analyte-analyte interactions are prevented and the analyte can be solubilized for further analysis. The material can be a matrix material that is mixed with the analyte. The material can be provided on a sample support. The analyte can then be contacted with a solvent to extract the analyte for further processing, such as by electrospray mass spectrometry.

  11. Sustained impact of inattention and hyperactivity-impulsivity on peer problems: mediating roles of prosocial skills and conduct problems in a community sample of children.

    Science.gov (United States)

    Andrade, Brendan F; Tannock, Rosemary

    2014-06-01

    This prospective 2-year longitudinal study tested whether inattentive and hyperactive/impulsive symptom dimensions predicted future peer problems, when accounting for concurrent conduct problems and prosocial skills. A community sample of 492 children (49 % female) who ranged in age from 6 to 10 years (M = 8.6, SD = .93) was recruited. Teacher reports of children's inattention, and hyperactivity/impulsivity symptoms, conduct problems, prosocial skills and peer problems were collected in two consecutive school years. Elevated inattention and hyperactivity/impulsivity in Year-1 predicted greater peer problems in Year-2. Conduct problems in the first and second years of the study were associated with more peer problems, and explained a portion of the relationship between inattention and hyperactivity/impulsivity with peer problems. However, prosocial skills were associated with fewer peer problems in children with elevated inattention and hyperactivity/impulsivity. Inattention and hyperactivity/impulsivity have negative effects on children's peer functioning after 1-year, but concurrent conduct problems and prosocial skills have important and opposing impacts on these associations.

  12. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  13. Flow cytometry for feline lymphoma: a retrospective study regarding pre-analytical factors possibly affecting the quality of samples.

    Science.gov (United States)

    Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano

    2018-06-01

    Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. 
This study provides the basis for the spread of this minimally invasive, fast and cost-effective technique in feline medicine.

  14. Analytical applications for delayed neutrons

    International Nuclear Information System (INIS)

    Eccleston, G.W.

    1983-01-01

    Analytical formulations that describe the time dependence of neutron populations in nuclear materials contain delayed-neutron dependent terms. These terms are important because the delayed neutrons, even though their yields in fission are small, permit control of the fission chain reaction process. Analytical applications that use delayed neutrons range from simple problems that can be solved with the point reactor kinetics equations to complex problems that can only be solved with large codes that couple fluid calculations with the neutron dynamics. Reactor safety codes, such as SIMMER, model transients of the entire reactor core using coupled space-time neutronics and comprehensive thermal-fluid dynamics. Nondestructive delayed-neutron assay instruments are designed and modeled using a three-dimensional continuous-energy Monte Carlo code. Calculations on high-burnup spent fuels and other materials that contain a mix of uranium and plutonium isotopes require accurate and complete information on the delayed-neutron periods, yields, and energy spectra. A continuing need exists for delayed-neutron parameters for all the fissioning isotopes
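
    As a concrete instance of the simplest case, the point reactor kinetics equations with one delayed-neutron group, dn/dt = ((ρ - β)/Λ)n + λC and dC/dt = (β/Λ)n - λC, can be integrated directly. The sketch below uses typical, assumed thermal-reactor parameter values, not figures from any particular reactor.

```python
def point_kinetics(rho, t_end, dt=1e-5, beta=0.0065, lam=0.08, Lambda=1e-4):
    # One delayed-neutron group: n is the neutron density, c the precursor
    # density. Parameters (beta, lam, Lambda) are assumed typical values;
    # forward-Euler integration with a step well below the prompt time scale.
    n = 1.0
    c = beta * n / (Lambda * lam)        # precursor equilibrium at t = 0
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n, c = n + dn * dt, c + dc * dt
    return n
```

    At zero reactivity the equilibrium persists, while a small positive step ρ < β produces the familiar prompt jump followed by slow growth on the delayed-neutron time scale, which is exactly why the delayed terms make the chain reaction controllable.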

  15. Indirect boundary element method for three dimensional problems. Analytical solution for contribution to wave field by triangular element; Sanjigen kansetsu kyokai yosoho. Sankakukei yoso no kiyo no kaisekikai

    Energy Technology Data Exchange (ETDEWEB)

    Yokoi, T [Building Research Institute, Tokyo (Japan); Sanchez-Sesma, F [Universidad National Autonoma de Mexico, (Mexico). Institute de Ingenieria

    1997-05-27

    Formulation is introduced for discretizing a boundary integral equation into an indirect boundary element method for the solution of 3-dimensional topographic problems. Yokoi and Takenaka proposed, for problems of topographic response to seismic motion in a 2-dimensional in-plane field, a boundary integral equation built on a reference solution that can be obtained analytically (the wave field in a half-space elastic body with a flat free surface). This formulation effectively suppresses the non-physical waves that emerge in computed results as a consequence of truncating the discretized ground surface. They applied the proposed boundary integral equation, discretized into the indirect boundary element method, to several examples and demonstrated its validity. In this report, the equation is extended to deal with 3-dimensional topographic problems. A problem of a P-wave vertically incident on a flat free surface is solved by both the conventional and the proposed boundary integral equations, and the solutions are compared with each other. It is found that the new method, unlike the conventional one, removes non-physical waves from the analytical result. 4 figs.

  16. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker , D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  17. Analytic thinking reduces belief in conspiracy theories.

    Science.gov (United States)

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causational role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Flow cytometry for feline lymphoma: a retrospective study about pre-analytical factors possibly affecting the quality of samples

    Directory of Open Access Journals (Sweden)

    Serena Bernardi

    2017-05-01

    Full Text Available Introduction Flow cytometry (FC) is an increasingly requested technique on which veterinary oncologists rely for an accurate, fast, minimally invasive diagnosis of lymphoma or leukemia. FC has been studied and applied with great results in canine oncology, whereas in feline oncology experience with the technique is still limited. This is mainly due to a supposed discomfort in sampling, because of the high prevalence of intra-abdominal lymphomas. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods 97 consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, features of the lesion, features of the sampling procedure and the experience of the veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of their being finally processed for FC. Results None of the investigated variables significantly influenced the quality of the submitted samples except the needle size, with 21G needles providing the highest cellularity (Table 1). Notably, sample quality did not vary between peripheral and intra-abdominal lesions. Sample cellularity alone influenced the likelihood of being processed. About half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions FC can be safely applied to cases of suspected feline lymphomas, even for intra-abdominal lesions. A 21G needle should be preferred for sampling. This study provides the basis for the spread of this minimally invasive, fast and cost-effective technique in feline medicine.

  19. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities, and furthermore of data acquisition quantities, are constantly pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) and others are capable of delivering high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or another singular aspect of the sample. There is a pressing need for digital image processing methods that enable the analytical scientist, routinely confronted with such amounts of data, to gain rapid insight into the composition of the sample examined, to filter the relevant data and to integrate the information of numerous separate multispectral images into a complete picture. Sophisticated image processing methods like classification and fusion provide possible solution approaches to this challenge. Classification is a treatment by multivariate statistical means in order to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined together to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. Therefore the overall aim of this thesis is to evaluate the possibilities of both techniques regarding the task of analytical image processing and to find solutions for the integration and condensation of multispectral analytical image data in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)

  20. Problematic Technology Use in a clinical sample of children and adolescents. Personality and behavioral problems associated.

    Science.gov (United States)

    Alonso, Cristina; Romero, Estrella

    2017-03-01

    In parallel to the rapid growth of access to new technologies (NT) there has been an increase in the problematic use of the same, especially among children and adolescents. Although research in this field is increasing, the studies have mainly been developed in the community, and the characteristics associated with the problematic use of NT are unknown in samples that require clinical care. Therefore, the aim of this study is to analyze the relationship between problematic use of video games (UPV) and Internet (UPI) and personality traits and behavior problems in a clinical sample of children and adolescents. The sample consists of 88 patients who were examined in the clinical psychology consultation in the Mental Health Unit for Children and Adolescents of the University Hospital of Santiago de Compostela. Data were obtained from self-reports and rating scales filled out by parents. 31.8% of the participants present UPI and 18.2%, UPV. The children and adolescents with UPNT have lower levels of Openness to experience, Conscientiousness and Agreeableness and higher levels of Emotional instability, global Impulsivity and Externalizing behavior problems, as well as Attention and Thought problems. UPNT is a problem that emerges as an important issue in clinical care for children and adolescents, so its study in child and youth care units is needed. Understanding the psychopathological profile of children and adolescents with UPNT will allow for the development of differential and more specific interventions.

  1. Critical Discourse Analysis. The Elaboration of a Problem Oriented Discourse Analytic Approach After Foucault

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-05-01

    Full Text Available Abstract: The German discourse researcher Siegfried JÄGER from Duisburg was the first to publish a German-language book about the methodology of discourse analysis after FOUCAULT. JÄGER integrates in his work the discourse analytic work of Jürgen LINK as well as the interdisciplinary discussion carried on in the discourse analytic journal "kultuRRevolution" (Journal for Applied Discourse Analysis). JÄGER and his co-workers have been associated with the Duisburger Institute for Language Research and Social Research (DISS, see http://www.diss-duisburg.de/) for 20 years, developing discourse theory and the methodology of discourse analysis. The interview was conducted via e-mail. It depicts the discourse analytic approach of JÄGER and his co-workers following the works of FOUCAULT and LINK. The interview reconstructs JÄGER's vita and his academic career. Further topics of the interview are the agenda of JÄGER's discourse studies, methodological considerations, the (problematic) relationship between FOUCAULDian discourse analysis and (discourse) linguistics, styles and organization of research, and questions concerning applied discourse analytic research as a form of critical intervention. URN: urn:nbn:de:0114-fqs0603219

  2. Spherical cavity-expansion forcing function in PRONTO 3D for application to penetration problems

    Energy Technology Data Exchange (ETDEWEB)

    Warren, T.L.; Tabbara, M.R.

    1997-05-01

    In certain penetration events the primary mode of deformation of the target can be approximated by known analytical expressions. In the context of an analysis code, this approximation eliminates the need for modeling the target as well as the need for a contact algorithm. This technique substantially reduces execution time. In this spirit, a forcing function which is derived from a spherical-cavity expansion analysis has been implemented in PRONTO 3D. This implementation is capable of computing the structural and component responses of a projectile due to three-dimensional penetration events. Sample problems demonstrate good agreement with experimental and analytical results.
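    As a rough illustration of how a cavity-expansion forcing function closes the problem without a target mesh or contact algorithm, the sketch below decelerates a rigid projectile under the common two-term radial stress relation sigma_r = A + B*rho*v**2. The coefficients, geometry and material values are illustrative placeholders, not the PRONTO 3D implementation.

```python
def cavity_expansion_decel(m, area, A, B, rho, v0, dt=1e-6):
    """Explicit time integration of a rigid projectile slowed by a
    spherical-cavity-expansion normal stress sigma_r = A + B*rho*v**2.
    A, B are material-fit constants (illustrative values only);
    m: projectile mass (kg), area: presented nose area (m^2),
    rho: target density (kg/m^3), v0: impact speed (m/s).
    Returns the penetration depth (m)."""
    v, x = v0, 0.0
    while v > 1.0:                       # stop when essentially at rest
        sigma = A + B * rho * v * v      # resisting normal stress (Pa)
        a = sigma * area / m             # deceleration (m/s^2)
        v -= a * dt
        x += v * dt
    return x
```

For these inputs the result can be cross-checked against the closed form x = m/(2*B*rho*area) * ln(1 + B*rho*v0**2/A), which is one way such forcing functions are verified against analytical results.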

  3. Analytical chemistry of nuclear materials

    International Nuclear Information System (INIS)

    1963-01-01

    The last two decades have witnessed an enormous development in chemical analysis. The rapid progress of nuclear energy, of solid-state physics and of other fields of modern industry has extended the concept of purity to limits previously unthought of, and to reach the new dimensions of these extreme demands, entirely new techniques have been invented and applied and old ones have been refined. Recognizing these facts, the International Atomic Energy Agency convened a Panel on Analytical Chemistry of Nuclear Materials to discuss the general problems facing the analytical chemist engaged in nuclear energy development, particularly in newly developing centres and countries, to analyse the present situation and to advise as to the directions in which research and development appear to be most necessary. The Panel also discussed the analytical programme of the Agency's laboratory at Seibersdorf, where the Agency has already started a programme of international comparison of analytical methods which may lead to the establishment of international standards for many materials of interest. Refs and tabs

  4. PIXE and its applications to biological samples

    International Nuclear Information System (INIS)

    Aldape, F.; Flores, M.J.

    1996-01-01

    Throughout this century, industrialized society has seriously affected the ecology by introducing huge amounts of pollutants into the atmosphere as well as marine and soil environments. On the other hand, it is known that these pollutants, in excess of certain levels of concentration, not only put at risk the life of living beings but may also cause the extinction of some species. It is therefore of basic importance to substantially increase quantitative determinations of trace element concentrations in biological specimens in order to assess the effects of pollutants. It is in this field that PIXE plays a key role, where its unique analytical properties are decisive. Moreover, since the importance of this research has been recognized in many countries, many scientists have been encouraged to continue or initiate new research programmes aimed at solving the worldwide pollution problem. This document presents an overview of those papers reporting the application of PIXE analysis to biological samples during this last decade of the 20th century and recounts the number of PIXE laboratories dedicating their efforts to uncovering the biological effects of pollutants introduced into living beings. Sample preparation methods, different kinds of samples under study and the use of complementary analytical techniques are also illustrated. (author). 108 refs

  5. A ring test of in vitro neutral detergent fiber digestibility: analytical variability and sample ranking.

    Science.gov (United States)

    Hall, M B; Mertens, D R

    2012-04-01

    In vitro neutral detergent fiber (NDF) digestibility (NDFD) is an empirical measurement of fiber fermentability by rumen microbes. Variation is inherent in all assays and may be increased as multiple steps or differing procedures are used to assess an empirical measure. The main objective of this study was to evaluate variability within and among laboratories of 30-h NDFD values analyzed in repeated runs. Subsamples of alfalfa (n=4), corn forage (n=5), and grass (n=5) ground to pass a 6-mm screen passed a test for homogeneity. The 14 samples were sent to 10 laboratories on 3 occasions over 12 mo. Laboratories ground the samples and ran 1 to 3 replicates of each sample within fermentation run and analyzed 2 or 3 sets of samples. Laboratories used 1 of 2 NDFD procedures: 8 labs used procedures related to the 1970 Goering and Van Soest (GVS) procedure using fermentation vessels or filter bags, and 2 used a procedure with preincubated inoculum (PInc). Means and standard deviations (SD) of sample replicates within run within laboratory (lab) were evaluated with a statistical model that included lab, run within lab, sample, and lab × sample interaction as factors. All factors affected mean values for 30-h NDFD. The lab × sample effect suggests against a simple lab bias in mean values. The SD ranged from 0.49 to 3.37% NDFD and were influenced by lab and run within lab. The GVS procedure gave greater NDFD values than PInc, with an average difference across all samples of 17% NDFD. Because of the differences between GVS and PInc, we recommend using results in contexts appropriate to each procedure. The 95% probability limits for within-lab repeatability and among-lab reproducibility for GVS mean values were 10.2 and 13.4%, respectively. These percentages describe the span of the range around the mean into which 95% of analytical results for a sample fall for values generated within a lab and among labs. This degree of precision was supported in that the average maximum
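    For context on the quoted 95% limits, the conventional ISO 5725-style relation between a repeatability or reproducibility standard deviation and its 95% limit is 1.96 * sqrt(2) * s, roughly 2.77 * s. The sketch below applies that relation, assuming the paper's quoted limits follow this convention (an assumption; the abstract does not state the formula used).

```python
import math

def limit_95(sd):
    """95% repeatability/reproducibility limit for a given standard
    deviation sd, per the ISO 5725 convention: 1.96 * sqrt(2) * sd."""
    return 1.96 * math.sqrt(2) * sd

# Back out the implied standard deviations from the quoted GVS limits
# (10.2% NDFD within-lab, 13.4% NDFD among-lab), assuming the convention.
sr = 10.2 / (1.96 * math.sqrt(2))   # implied repeatability sd, % NDFD
sR = 13.4 / (1.96 * math.sqrt(2))   # implied reproducibility sd, % NDFD
```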

  6. Quantification of process induced disorder in milled samples using different analytical techniques

    DEFF Research Database (Denmark)

    Zimper, Ulrike; Aaltonen, Jaakko; McGoverin, Cushla M.

    2012-01-01

    The aim of this study was to compare three different analytical methods to detect and quantify the amount of crystalline disorder/amorphousness in two milled model drugs. X-ray powder diffraction (XRPD), differential scanning calorimetry (DSC) and Raman spectroscopy were used as analytical methods, and indomethacin and simvastatin were chosen as the model compounds. These compounds partly converted from crystalline to disordered forms by milling. Partial least squares regression (PLS) was used to create calibration models for the XRPD and Raman data, which were subsequently used to quantify the milling-induced crystalline disorder/amorphousness under different process conditions. In the DSC measurements the change in heat capacity at the glass transition was used for quantification. Differently prepared amorphous indomethacin standards (prepared by either melt quench cooling or cryo milling) were compared...

  7. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  8. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
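    One normalization method that such reviews typically cover is probabilistic quotient normalization (PQN), which corrects for sample-to-sample dilution by scaling each sample by the median ratio of its features to a reference profile. The sketch below is a generic textbook version in plain Python with illustrative values, not the specific methods this review evaluates.

```python
def pqn_normalize(samples, reference=None):
    """Probabilistic quotient normalization of a list of samples, each a
    list of metabolite intensities. If no reference profile is given, the
    per-feature median across samples (upper middle for even counts) is
    used. Each sample is divided by its median quotient to the reference,
    interpreted as that sample's dilution factor."""
    n = len(samples[0])
    if reference is None:
        reference = [sorted(s[i] for s in samples)[len(samples) // 2]
                     for i in range(n)]
    normalized = []
    for s in samples:
        quotients = sorted(s[i] / reference[i]
                           for i in range(n) if reference[i] > 0)
        factor = quotients[len(quotients) // 2]  # median dilution factor
        normalized.append([x / factor for x in s])
    return normalized
```

For example, a sample that is an exact two-fold dilution of another collapses onto the same profile after normalization, which is the behavior quantitative comparisons rely on.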

  9. The direct effects of inattention and hyperactivity/impulsivity on peer problems and mediating roles of prosocial and conduct problem behaviors in a community sample of children.

    Science.gov (United States)

    Andrade, Brendan F; Tannock, Rosemary

    2013-11-01

    This study tested whether children's symptoms of inattention and hyperactivity/impulsivity were associated with peer problems and whether these associations were mediated by conduct problems and prosocial behaviors. A community sample of 500 children, including 245 boys and 255 girls, who ranged in age from 6 to 9 years (M = 7.6, SD = 0.91) was recruited. Teachers' report of children's inattention, hyperactivity/impulsivity, conduct problems, prosocial behaviors, and peer problems was collected. Symptoms of inattention and hyperactivity/impulsivity were significantly positively associated with peer problems. Conduct problems were associated with more peer problems and prosocial behaviors with fewer peer problems. Conduct problems and prosocial behaviors partially mediated the association between hyperactivity/impulsivity and peer problems and fully mediated the inattention-peer problems association. Findings show that prosocial behaviors and conduct problems are important variables that account for some of the negative impact of symptoms of inattention and hyperactivity/impulsivity on peer functioning.

  10. Contribution to analytical solution of neutron slowing down problem in homogeneous and heterogeneous media

    International Nuclear Information System (INIS)

    Stefanovic, D.B.

    1970-12-01

    The objective of this work is to describe a new analytical solution of the neutron slowing down equation for infinite monoatomic media with arbitrary energy dependence of the cross section. The solution is obtained by introducing Green slowing down functions instead of starting from the slowing down equations directly. The previously used methods for calculation of fission neutron spectra in the reactor cell were numerical. The proposed analytical method was used for calculating the space-energy distribution of fast neutrons and the number of neutron reactions in a thermal reactor cell. The role of the analytical method in solving neutron slowing down in reactor physics is to enable understanding of the slowing down process and neutron transport. The obtained results could be used as standards for testing the accuracy of approximate and practical methods
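    A standard slowing-down quantity that this kind of analysis builds on is the mean lethargy gain per elastic collision, xi. The sketch below uses it to estimate the average number of collisions needed to slow a fission neutron to thermal energy in a monoatomic moderator; this is the textbook result, not the Green-function method of the paper, and the energies are illustrative defaults.

```python
import math

def collisions_to_thermalize(A, E0=2.0e6, Eth=0.025):
    """Average number of elastic collisions to slow a neutron from E0 to
    Eth (both in eV) in a monoatomic moderator of mass number A, using
    the mean lethargy gain per collision:
        xi = 1 + alpha*ln(alpha)/(1 - alpha),  alpha = ((A-1)/(A+1))**2
    (xi = 1 exactly for hydrogen, A = 1)."""
    if A == 1:
        xi = 1.0
    else:
        alpha = ((A - 1) / (A + 1)) ** 2
        xi = 1 + alpha * math.log(alpha) / (1 - alpha)
    return math.log(E0 / Eth) / xi
```

For hydrogen this gives the familiar value of about 18 collisions from 2 MeV to 0.025 eV, versus roughly 115 for carbon, which is why light moderators thermalize neutrons in far fewer collisions.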

  11. Solution of direct kinematic problem for Stewart-Gough platform with the use of analytical equation of plane

    Directory of Open Access Journals (Sweden)

    A. L. Lapikov

    2014-01-01

    Full Text Available The paper concerns the solution of the direct kinematic problem for the Stewart-Gough platform of type 6-3. The article presents a detailed analysis of methods for solving the direct kinematic problem for platform mechanisms based on parallel structures. The complexity of the solution is assessed for parallel-kinematics mechanisms in comparison with classic manipulators characterized by an open kinematic chain. A method for solving this problem is suggested. It consists in establishing the functional dependence of the Cartesian coordinates and orientation of the moving platform centre on the values of the generalized coordinates of the manipulator, which for platform manipulators are the lengths of the extensible arms connecting the foundation and the moving platform. The method is constructed so that the solution of the direct kinematic problem reduces to solving the analytical equation of the plane in which the moving platform lies. The equation of the required plane is built from three points, which in this case are the attachment points of the moving platform joints. To determine the joint coordinate values, a system of nine nonlinear equations is generated; all nine equations have the same type of nonlinearity, and the physical meaning of each is the Euclidean distance between points of the manipulator. The location and orientation of the moving platform are represented as a homogeneous transformation matrix whose translation and rotation components can be defined through the required plane. The obtained theoretical results are intended for use in a decision support system during complex research on multi-sectional manipulators of parallel kinematics, to describe the geometrically similar 3D-prototype of the
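    The first step the method relies on, writing the analytical equation of the plane through the three joint attachment points, can be sketched directly. The coordinates in the example are illustrative, not those of a specific manipulator.

```python
def plane_from_points(p1, p2, p3):
    """Coefficients (a, b, c, d) of the plane a*x + b*y + c*z + d = 0
    through three non-collinear points, e.g. the three joint attachment
    points of a 6-3 moving platform. The normal (a, b, c) is the cross
    product of two in-plane edge vectors."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    a = u[1] * v[2] - u[2] * v[1]      # normal = u x v
    b = u[2] * v[0] - u[0] * v[2]
    c = u[0] * v[1] - u[1] * v[0]
    d = -(a * p1[0] + b * p1[1] + c * p1[2])
    return a, b, c, d
```

Once the nine distance equations fix the three attachment points, this plane (and an in-plane frame built on it) yields the rotation and translation blocks of the homogeneous transformation matrix described in the abstract.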

  12. Practical reporting times for environmental samples

    International Nuclear Information System (INIS)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 °C. The American Society for Testing Materials (ASTM) uses a more technical definition: the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions
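    For the exponential-decay case mentioned at the end, the ASTM-style holding time has a simple closed form: the day on which c(t) = c0*exp(-k*t) first crosses the lower 99% confidence bound on the day-zero concentration. A minimal sketch, with an assumed fitted first-order decay rate k (the values are illustrative, not from the study's data):

```python
import math

def holding_time(c0, lower99, k):
    """Day on which an exponentially decaying analyte concentration
    c(t) = c0 * exp(-k * t) falls to the lower 99% confidence bound on
    the day-zero concentration: solve c0*exp(-k*t) = lower99 for t.
    k is a fitted per-day decay rate (assumed here)."""
    return math.log(c0 / lower99) / k
```

For instance, with c0 = 100, a lower bound of 90, and k = 0.01 per day, the holding time is about 10.5 days.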

  13. Practical reporting times for environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 °C. The American Society for Testing Materials (ASTM) uses a more technical definition: the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions.

  14. Soil sampling intercomparison exercise by selected laboratories of the ALMERA Network

    International Nuclear Information System (INIS)

    2009-01-01

    The IAEA's Seibersdorf Laboratories in Austria have the programmatic responsibility to provide assistance to Member State laboratories in maintaining and improving the reliability of analytical measurement results, both in radionuclide and trace element determinations. This is accomplished through the provision of reference materials of terrestrial origin, validated analytical procedures, training in the implementation of internal quality control, and through the evaluation of measurement performance by the organization of worldwide and regional interlaboratory comparison exercises. The IAEA is mandated to support global radionuclide measurement systems related to accidental or intentional releases of radioactivity in the environment. To fulfil this obligation and ensure a reliable, worldwide, rapid and consistent response, the IAEA coordinates an international network of analytical laboratories for the measurement of environmental radioactivity (ALMERA). The network was established by the IAEA in 1995 and makes available to Member States a world-wide network of analytical laboratories capable of providing reliable and timely analysis of environmental samples in the event of an accidental or intentional release of radioactivity. A primary requirement for the ALMERA members is participation in the IAEA interlaboratory comparison exercises, which are specifically organized for ALMERA on a regular basis. These exercises are designed to monitor and demonstrate the performance and analytical capabilities of the network members, and to identify gaps and problem areas where further development is needed. In this framework, the IAEA organized a soil sampling intercomparison exercise (IAEA/SIE/01) for selected laboratories of the ALMERA network. The main objective of this exercise was to compare soil sampling procedures used by different participating laboratories. 
The performance evaluation results of the interlaboratory comparison exercises performed in the framework of

  15. Density meter algorithm and system for estimating sampling/mixing uncertainty

    International Nuclear Information System (INIS)

    Shine, E.P.

    1986-01-01

    The Laboratories Department at the Savannah River Plant (SRP) has installed a six-place density meter with an automatic sampling device. This paper describes the statistical software developed to analyze the density of uranyl nitrate solutions using this automated system. The purpose of this software is twofold: to estimate the sampling/mixing and measurement uncertainties in the process and to provide a measurement control program for the density meter. Non-uniformities in density are analyzed both analytically and graphically. The mean density and its limit of error are estimated. Quality control standards are analyzed concurrently with process samples and used to control the density meter measurement error. The analyses are corrected for concentration due to evaporation of samples waiting to be analyzed. The results of this program have been successful in identifying sampling/mixing problems and controlling the quality of analyses

  17. PENGARUH MODEL PROBLEM BASED LEARNING TERHADAP KEMAMPUAN BERPIKIR ANALITIS DAN KETERAMPILAN PROSES SAINS KIMIA PESERTA DIDIK

    Directory of Open Access Journals (Sweden)

    Eli Rohaeti

    2017-10-01

    Full Text Available This research aimed to investigate the effect of the PBL model on students' analytical thinking abilities and science process skills in the topic of reaction rate. This was quasi-experimental research using a posttest-only control design. The sample consisted of two classes with a total of 61 students: an experiment class taught with the Problem Based Learning model and a control class taught with the Direct Instruction model. The instruments of this research were an observation sheet for measuring science process skills and an integrated assessment instrument covering two indicators, analytical thinking abilities and science process skills. The results of this study show that the PBL model can increase students' analytical thinking abilities and science process skills: the mean posttest scores for analytical thinking abilities and science process skills in the experiment class were better than in the control class. The statistical tests using ANCOVA show a significance of 0.000 < 0.05 at the 5% significance level, so the use of the PBL model has an effect on students' analytical thinking abilities and science process skills.

  18. Analytical study of doubly excited ridge states

    International Nuclear Information System (INIS)

    Wong, H.Y.

    1988-01-01

    Two different non-separable problems are explored and analyzed. Non-perturbative methods need to be used to handle them, as the competing forces involved in these problems are equally strong and do not yield to a perturbative analysis. The first is the study of doubly excited ridge states of atoms, in which two electrons are comparably excited. An analytical wavefunction for such states is introduced and is used to solve the two-electron Hamiltonian variationally in the pair coordinates called hyperspherical coordinates. The correlation between the electrons is built analytically into the structure of the wavefunction. Sequences of ridge states out to very high excitation are computed and are organized as Rydberg series converging to the double ionization limit. Numerical results for such states in He and H⁻ are compared with other theoretical calculations where available. The second problem is the analysis of the photodetachment of negative ions in an electric field via frame transformation theory. The presence of the electric field requires a transformation from spherical to cylindrical symmetry for the outgoing photoelectron. This gives an oscillatory modulating factor as the effect of the electric field on cross-sections. All of this work is derived analytically in a general form applicable to the photodetachment of any negative ion. The expressions are applied to H⁻ and S⁻ for illustration.

  19. Stability of purgeable VOCs in water samples during pre-analytical holding. Part 2: Analyses by an EPA regional laboratory

    Energy Technology Data Exchange (ETDEWEB)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.L. [Oak Ridge National Lab., TN (United States); Bottrell, D.W. [Dept. of Energy, Germantown, MD (United States)

    1997-03-01

    This study was undertaken to examine the hypothesis that prevalent and priority purgeable VOCs in properly preserved water samples are stable for at least 28 days. For the purposes of this study, VOCs were considered functionally stable if concentrations measured after 28 days did not change by more than 10% from the initial values. An extensive stability experiment was performed on freshly-collected surface water spiked with a suite of 44 purgeable VOCs. The spiked water was then distributed into multiple 40-mL VOC vials with 0.010-in Teflon-lined silicone septum caps prefilled with 250 mg of NaHSO₄ (resulting pH of the water approximately 2). The samples were sent to a commercial [Analytical Resources, Inc. (ARI)] and EPA (Region IV) laboratory where they were stored at 4 °C. On 1, 8, 15, 22, 29, 36, and 71 days after sample preparation, analysts from ARI took 4 replicate samples out of storage and analyzed these samples for purgeable VOCs following EPA/SW846 8260A. A similar analysis schedule was followed by analysts at the EPA laboratory. This document contains the results from the EPA analyses; the ARI results are described in a separate report.

  20. Analytic Approach to Resolving Parking Problems in Downtown Zagreb

    Directory of Open Access Journals (Sweden)

    Adolf Malić

    2005-01-01

    Full Text Available The parking issue is one of the major problems in Zagreb, and in that respect Zagreb does not differ from other similar or bigger European cities. The problem the city is facing is presented in the paper. It is complex and can be solved gradually, using operative and planning measures, by applying assessments of influential parameters on the basis of which the appropriate parking-garage spaces would be selected. Besides, all the knowledge gained from the experience of similar European cities should be used in resolving the stationary traffic problem. Introduction of fast public urban transport would provide passengers with improved services (particularly regarding travelling time), introducing a modern traffic system that would reduce the travelling time to below 30 minutes for the farthest relations. Further progress in reducing parking problems in the downtown as well as the broader Zagreb area would not be possible without implementing this approach.