WorldWideScience

Sample records for sample designs analytical

  1. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

    BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability … in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, initial concentrations … of analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used …
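    The normalization step described in METHODS can be sketched as follows. The analyte values and the 10% limit below are hypothetical illustrations, not the study's internal acceptance criteria.

```python
# Sketch of the post-analytical stability check described above: each
# re-analysis (t = 2..10 h singletons) is normalized to the mean of the
# duplicate t=0 measurements and compared against a bias limit.
# All numbers here are hypothetical, not the study's data or criteria.

def normalized_deviation(c_t, c_0):
    """Percent deviation of a re-analysis result from the t=0 baseline."""
    return (c_t - c_0) / c_0 * 100.0

def stable(t0_duplicates, reanalyses, limit_pct):
    """True if every normalized re-analysis stays within +/- limit_pct."""
    c0 = sum(t0_duplicates) / len(t0_duplicates)
    return all(abs(normalized_deviation(c, c0)) <= limit_pct for c in reanalyses)

# Example: an analyte measured in duplicate at t=0, then re-analyzed over 10 h
print(stable([140.0, 141.0], [139.5, 140.2, 138.9, 137.8, 136.0], limit_pct=10.0))  # → True
```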

  2. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and correlates it with product quality by design and process analytical technology (PAT).

  3. Green sample preparation for liquid chromatography and capillary electrophoresis of anionic and cationic analytes.

    Science.gov (United States)

    Wuethrich, Alain; Haddad, Paul R; Quirino, Joselito P

    2015-04-21

    A sample preparation device for the simultaneous enrichment and separation of cationic and anionic analytes was designed and implemented in an eight-channel configuration. The device is based on the use of an electric field to transfer the analytes from a large volume of sample into small volumes of electrolyte that were suspended in two glass micropipettes using a conductive hydrogel. This simple, economical, fast, and green (no organic solvent required) sample preparation scheme was evaluated using cationic and anionic herbicides as test analytes in water. The analytical figures of merit and ecological aspects were evaluated against the state-of-the-art sample preparation technique, solid-phase extraction. A drastic reduction in both sample preparation time (94% faster) and resources (99% less consumables used) was observed. Finally, the technique, in combination with high-performance liquid chromatography and capillary electrophoresis, was applied to the analysis of quaternary ammonium and phenoxypropionic acid herbicides in fortified river water as well as drinking water (at levels relevant to Australian guidelines). The presented sustainable sample preparation approach could easily be applied to other charged analytes or adopted by other laboratories.

  4. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent it. To determine whether analytical bias is present when comparing methods, commutable samples, i.e. samples that behave like the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also assessed analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases small enough not to prevent harmonisation of reference intervals. Evidence-based approaches, including the determination of analytical bias using commutable material, are necessary when seeking to harmonise reference intervals.
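    A between-method bias check of the kind described above can be sketched numerically. The method names, values, and the 3% allowable-bias limit below are hypothetical illustrations, not the AACB study's data or its actual acceptance criteria.

```python
# Minimal sketch of a between-method bias assessment on a commutable sample:
# percent bias of each method's mean from the all-method consensus target,
# compared against an allowable bias limit. All numbers are hypothetical.

def percent_bias(method_mean, target):
    """Percent bias of one method's mean relative to the target value."""
    return (method_mean - target) / target * 100.0

def harmonisable(method_means, limit_pct):
    """Map each method to True if its |bias| vs. the consensus is within limit_pct."""
    target = sum(method_means.values()) / len(method_means)
    return {m: abs(percent_bias(x, target)) <= limit_pct
            for m, x in method_means.items()}

# Three hypothetical methods measuring the same commutable sample
result = harmonisable({"A": 4.9, "B": 5.0, "C": 5.1}, limit_pct=3.0)
print(result)  # each method within 3% of the consensus target
```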

  5. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1998-01-01

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory-scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  6. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory-scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  7. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas … Authors: Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller (Lincoln Laboratory). … systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.

  8. Analytical Chemistry Division's sample transaction system

    International Nuclear Information System (INIS)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters, each covering one aspect of the Analytical Chemistry Division's computer processing.

  9. Lagoa Real project: description and evaluation of the sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

    This report describes the sample preparation system for drill core from the Lagoa Real project, aimed at obtaining a representative fraction of the drill core material. The combined sampling and analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  10. Analytical models for low-power rectenna design

    NARCIS (Netherlands)

    Akkermans, J.A.G.; Beurden, van M.C.; Doodeman, G.J.N.; Visser, H.J.

    2005-01-01

    The design of a low-cost rectenna for low-power applications is presented. The rectenna is designed with the use of analytical models and closed-form analytical expressions. This allows for a fast design of the rectenna system. To acquire a small-area rectenna, a layered design is proposed.

  11. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Full Text Available Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for trace analyte detection, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate, in combination with highly water-absorbent materials. Analyte concentration as well as sample pretreatment, including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system, which requires no equipment.
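    The 0.53 μM limit above is the paper's empirical result. As background, the conventional 3σ criterion for a colorimetric detection limit (LOD = 3 × SD of the blank / calibration slope) can be sketched as follows; the calibration points and blank readings are synthetic placeholders, not the paper's data.

```python
# Conventional 3-sigma detection-limit estimate for a colorimetric assay:
# LOD = 3 * sd(blank signals) / slope of the calibration line.
# All numbers below are synthetic placeholders.
from statistics import stdev

def calibration_slope(concs, signals):
    """Least-squares slope through the calibration points."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(signals) / n
    num = sum((x - mx) * (y - my) for x, y in zip(concs, signals))
    den = sum((x - mx) ** 2 for x in concs)
    return num / den

def lod_3sigma(blank_signals, concs, signals):
    """Detection limit in concentration units via the 3-sigma criterion."""
    return 3 * stdev(blank_signals) / calibration_slope(concs, signals)

# Synthetic calibration (signal = 0.01 per unit concentration) and blanks
print(lod_3sigma([0.000, 0.001, 0.002],
                 [0, 1, 2, 3, 4],
                 [0.00, 0.01, 0.02, 0.03, 0.04]))  # → 0.3
```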

  12. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk and particle analysis, as well as screening, of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper traces the technical development in environmental sample analysis at JAEA and reviews recent trends of research and development in this field. (author)

  13. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 μg C/g dry weight), and appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCBs) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document.

  14. 40 CFR 141.22 - Turbidity sampling and analytical requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Turbidity sampling and analytical... § 141.22 Turbidity sampling and analytical requirements. The requirements in this section apply to... the water distribution system at least once per day, for the purposes of making turbidity measurements...

  15. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  16. Hanford analytical sample projections FY 1996 - FY 2001. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1997-07-02

    This document summarizes the biannual Hanford sample projections for fiscal years 1997-2001. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems, Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. For this revision, details on laboratory-scale technology (development), sample management, and data management activities were also requested. This information will be used by the Hanford Analytical Services program and the Sample Management Working Group to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  17. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  18. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    Science.gov (United States)

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. On this basis, a hybrid-fluid-mode paper capillary device is proposed that enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With these merits, multiplex analysis of ions, proteins, and microbes has been realized on this platform, paving the way to higher-level analysis on μPADs.

  19. Contemporary sample stacking in analytical electrophoresis

    Czech Academy of Sciences Publication Activity Database

    Šlampová, Andrea; Malá, Zdeňka; Pantůčková, Pavla; Gebauer, Petr; Boček, Petr

    2013-01-01

    Roč. 34, č. 1 (2013), s. 3-18 ISSN 0173-0835 R&D Projects: GA ČR GAP206/10/1219 Institutional support: RVO:68081715 Keywords : biological samples * stacking * trace analysis * zone electrophoresis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.161, year: 2013

  20. Contemporary sample stacking in analytical electrophoresis

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Šlampová, Andrea; Křivánková, Ludmila; Gebauer, Petr; Boček, Petr

    2015-01-01

    Roč. 36, č. 1 (2015), s. 15-35 ISSN 0173-0835 R&D Projects: GA ČR(CZ) GA13-05762S Institutional support: RVO:68081715 Keywords : biological samples * stacking * trace analysis * zone electrophoresis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.482, year: 2015

  1. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed.

  2. Analytical laboratory and mobile sampling platform

    International Nuclear Information System (INIS)

    Stetzenbach, K.; Smiecinski, A.

    1996-01-01

    This is the final report for the Analytical Laboratory and Mobile Sampling Platform project. This report contains only major findings and conclusions resulting from this project. Detailed reports of all activities performed for this project were provided to the Project Office every quarter since the beginning of the project. This report contains water chemistry data for samples collected in the Nevada section of Death Valley National Park (Triangle Area Springs), Nevada Test Site springs, Pahranagat Valley springs, Nevada Test Site wells, Spring Mountain springs and Crater Flat and Amargosa Valley wells

  3. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

    Because toxic solvents and other hazardous materials are commonly used in analytical methods, characterization procedures result in significant and costly amounts of waste. We are developing alternative analytical methods in the radiological and organic areas to reduce the volume or form of the hazardous waste produced during sample analysis. For the radiological area, we have examined high-pressure, closed-vessel microwave digestion as a way to minimize waste from sample preparation operations. Heated solutions of strong mineral acids can be avoided for sample digestion by using the microwave approach. Because reactivity increases with pressure, we examined the use of less hazardous solvents to leach selected contaminants from soil for subsequent analysis. We demonstrated the feasibility of this approach by extracting plutonium from a NIST reference material using citric and tartaric acids with microwave digestion. Analytical results were comparable to traditional digestion methods, while hazardous waste was reduced by a factor of ten. We also evaluated the suitability of other natural acids, determined the extraction performance on a wider variety of soil types, and examined the extraction efficiency of other contaminants. For the organic area, we examined ways to minimize the wastes associated with the determination of polychlorinated biphenyls (PCBs) in environmental samples. Conventional methods for analyzing semivolatile organic compounds are labor intensive and require copious amounts of hazardous solvents. For soil and sediment samples, we have a method to analyze PCBs that is based on microscale extraction using benign solvents (e.g., water or hexane). The extraction is performed at elevated temperatures in stainless steel cells containing the sample and solvent. Gas chromatography-mass spectrometry (GC/MS) was used to quantitate the analytes in the isolated extract. More recently, we developed a method utilizing solid-phase microextraction (SPME) for natural …

  4. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  5. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single shell and double shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively-coupled plasma-atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from different sample-prep treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed

  6. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  7. Identification of clinical biomarkers for pre-analytical quality control of blood samples.

    Science.gov (United States)

    Kang, Hyun Ju; Jeon, Soon Young; Park, Jae-Sun; Yun, Ji Young; Kil, Han Na; Hong, Won Kyung; Lee, Mee-Hee; Kim, Jun-Woo; Jeon, Jae-Pil; Han, Bok Ghee

    2013-04-01

    Pre-analytical conditions are key factors in maintaining the high quality of biospecimens. They are necessary for accurate reproducibility of experiments in the field of biomarker discovery as well as for achieving optimal specificity of laboratory tests for clinical diagnosis. In research at the National Biobank of Korea, we evaluated the impact of pre-analytical conditions on the stability of biobanked blood samples by measuring biochemical analytes commonly used in clinical laboratory tests. We measured 10 routine laboratory analytes in serum and plasma samples from healthy donors (n = 50) with a chemistry autoanalyzer (Hitachi 7600-110). The analyte measurements were made at different time points based on delay of blood fractionation, freezing delay of fractionated serum and plasma samples, and at different cycles (0, 1, 3, 6, 9) of freeze-thawing. Statistically significant changes from the reference sample mean were determined using repeated-measures ANOVA and the significant change limit (SCL). The serum levels of GGT and LDH changed significantly depending on both the time interval between blood collection and fractionation and the time interval between fractionation and freezing of serum and plasma samples. The glucose level was sensitive only to the elapsed time between blood collection and centrifugation for blood fractionation. Based on these findings, a simple formula (glucose decrease by 1.387 mg/dL per hour) was derived to estimate the length of time delay after blood collection. In addition, AST, BUN, GGT, and LDH showed sensitive responses to repeated freeze-thaw cycles of serum and plasma samples. These results suggest that GGT and LDH measurements can be used as quality control markers for certain pre-analytical conditions (e.g., delayed processing or repeated freeze-thawing) of blood samples which are either directly used in laboratory tests or stored for future research in the biobank.
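    The abstract's rule of thumb (glucose falling by about 1.387 mg/dL per hour of processing delay) can be turned around to back-estimate the delay, assuming the collection-time glucose value is known or can be assumed. The numeric example below is illustrative only.

```python
# Back-estimating the blood-processing delay from the observed glucose drop,
# using the decay rate reported in the abstract (1.387 mg/dL per hour).
# The collection-time value here is an assumed illustration.

GLUCOSE_DECAY_MG_DL_PER_H = 1.387  # from the abstract above

def estimated_delay_hours(glucose_at_collection, glucose_measured):
    """Hours between blood collection and fractionation implied by the drop."""
    return (glucose_at_collection - glucose_measured) / GLUCOSE_DECAY_MG_DL_PER_H

# A sample drawn at 100 mg/dL that measures 95.839 mg/dL implies ~3 h delay
print(round(estimated_delay_hours(100.0, 95.839), 2))  # → 3.0
```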

  8. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    Science.gov (United States)

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine representative modes of ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased detection sensitivity. We believe that these advances will promote routine use of ambient mass spectrometry. Graphical abstract: Scheme of sampling strategies for ambient mass spectrometry.

  9. Analytical software design : introduction and industrial experience report

    NARCIS (Netherlands)

    Osaiweran, A.A.H.; Boosten, M.; Mousavi, M.R.

    2010-01-01

    Analytical Software Design (ASD) is a design approach that combines formal and empirical methods for developing mathematically verified software systems. Unlike in conventional design methods, the design phase is extended with more formal techniques, so that flaws are detected earlier, thereby reducing …

  10. A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS

    NARCIS (Netherlands)

    Lou, X.; Waal, de B.F.M.; Milroy, L.G.; Dongen, van J.L.J.

    2015-01-01

    In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte …

  11. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  12. OPTIMAL METHOD FOR PREPARATION OF SILICATE ROCK SAMPLES FOR ANALYTICAL PURPOSES

    Directory of Open Access Journals (Sweden)

    Maja Vrkljan

    2004-12-01

    The purpose of this study was to determine an optimal dissolution method for silicate rock samples for further analytical purposes. An analytical FAAS method for determining cobalt, chromium, copper, nickel, lead and zinc content in a gabbro sample and in the geochemical standard AGV-1 was applied for verification. Dissolution in mixtures of various inorganic acids was tested, as well as the Na2CO3 fusion technique. The results obtained by the different methods were compared, and dissolution in a mixture of HNO3 + HF is recommended as optimal.

  13. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    , sample extraction, and analytical methods to be used in the INL-2 study. For each of the five test events, the specified floor of the INL building will be contaminated with BG using a point-release device located in the room specified in the experimental design. Then quality control (QC), reference material coupon (RMC), judgmental, and probabilistic samples will be collected according to the sampling plan for each test event. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples were selected with a random aspect and in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated. For possibly contaminated areas, the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using the CJR approach. The numbers of samples were chosen to support making X%/Y% clearance statements with X = 95% or 99% and Y = 96% or 97%. The experimental and sampling design also provides for making X%/Y% clearance statements using only probabilistic samples. For each test event, the numbers of characterization and clearance samples were selected within limits based on operational considerations while still maintaining high confidence for detection and clearance aspects. The sampling design for all five test events contains 2085 samples, with 1142 after contamination and 943 after decontamination. These numbers include QC, RMC, judgmental, and probabilistic samples. The experimental and sampling design specified in this report provides a good statistical foundation for achieving the objectives of the INL-2 study.
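
    The 95% detection-confidence sizing described above can be illustrated with a simple binomial model. This is an illustrative sketch only, not the CJR approach or the actual design calculation used in the INL-2 study:

    ```python
    import math

    def samples_for_detection(confidence: float, contaminated_fraction: float) -> int:
        """Smallest n such that P(at least one sample hits contamination) >= confidence,
        assuming independent random sampling of a surface on which a fraction
        `contaminated_fraction` of locations is contaminated."""
        if not (0 < confidence < 1 and 0 < contaminated_fraction < 1):
            raise ValueError("both arguments must be in (0, 1)")
        # P(miss every time) = (1 - p)^n  =>  n >= ln(1 - C) / ln(1 - p)
        return math.ceil(math.log(1 - confidence) / math.log(1 - contaminated_fraction))

    # 95% confidence of hitting a contaminated area covering 5% of the floor
    print(samples_for_detection(0.95, 0.05))  # -> 59
    ```

    Raising either the confidence target or shrinking the contaminated fraction to be detected drives the required sample count up quickly, which is why the study's clearance statements fix both X% and Y% explicitly.
    
    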

  14. Learning design guided learning analytics in MOOCs

    NARCIS (Netherlands)

    Brouns, Francis; Firssova, Olga

    2016-01-01

    Poster presentation for our paper Brouns, F., & Firssova, O. (2016, October).The role of learning design and learning analytics in MOOCs. Paper presented at 9th EDEN Research Workshop, Oldenburg, Germany.

  15. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2017-02-16

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
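
    The MEC-PSO coupling described above can be sketched generically: an analytical objective is evaluated inside a particle swarm loop instead of an FEA solver. The objective below is a toy stand-in over three design variables, not the paper's magnetic equivalent circuit model:

    ```python
    import random

    def pso_minimize(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer over box bounds [(lo, hi), ...]."""
        dim = len(bounds)
        pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    # clamp each coordinate back into its box bound
                    pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Toy surrogate for a torque-density objective (NOT the paper's MEC model):
    # maximize x0*x1/(1 + x2^2) over pole length, magnet length, rotor thickness.
    objective = lambda x: -(x[0] * x[1] / (1.0 + x[2] ** 2))
    best, val = pso_minimize(objective, [(1, 10), (1, 5), (0.5, 3)])
    print(best, -val)
    ```

    Because each evaluation is a cheap closed-form expression rather than a field solution, thousands of candidate designs can be screened in seconds, which is the source of the reported speed-up over FEA-in-the-loop optimization.
    
    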

  16. Development of a standard data base for FBR core nuclear design (XIII). Analysis of small sample reactivity experiments at ZPPR-9

    International Nuclear Information System (INIS)

    Sato, Wakaei; Fukushima, Manabu; Ishikawa, Makoto

    2000-09-01

    A comprehensive study to evaluate and accumulate the abundant results of fast reactor physics is now in progress at the O-arai Engineering Center to improve analytical methods and prediction accuracy of nuclear design for large fast breeder cores such as future commercial FBRs. The present report summarizes the analytical results of sample reactivity experiments at the ZPPR-9 core, which had not yet been evaluated with the latest analytical method. The intention of the work is to extend and further generalize the standard data base for FBR core nuclear design. The analysis of the sample reactivity experiments (samples: PU-30, U-6, DU-6, SS-1 and B-1) at the ZPPR-9 core in the JUPITER series, using the latest nuclear data library JENDL-3.2 and the analytical method established by the JUPITER analysis, can be summarized as follows. The region-averaged final C/E values generally agreed with unity within 5% in the inner core region. However, the C/E values of every sample showed a radial space-dependency increasing from center to core edge; the discrepancy for B-1 was the largest, at 10%. Next, the influence of the present analytical results for the ZPPR-9 sample reactivity on the cross-section adjustment was evaluated. The reference case was the unified cross-section set ADJ98 based on the recent JUPITER analysis. In conclusion, the present analytical results have sufficient physical consistency with other JUPITER data, and qualify as part of the standard data base for FBR nuclear design. (author)

  17. Building smart cities analytics, ICT, and design thinking

    CERN Document Server

    Stimmel, Carol L

    2015-01-01

    The term "smart city" defines the new urban environment, one that is designed for performance through information and communication technologies. Given that the majority of people across the world will live in urban environments within the next few decades, it's not surprising that massive effort and investment is being placed into efforts to develop strategies and plans for achieving "smart" urban growth. Building Smart Cities: Analytics, ICT, and Design Thinking explains the technology and a methodology known as design thinking for building smart cities. Information and communications technologies form the backbone of smart cities. A comprehensive and robust data analytics program enables the right choices to be made in building these cities. Design thinking helps to create smart cities that are both livable and able to evolve. This book examines all of these components in the context of smart city development and shows how to use them in an integrated manner. Using the principles of design thinking to refr...

  18. Analytical study on the determination of boron in environmental water samples

    International Nuclear Information System (INIS)

    Lopez, F.J.; Gimenez, E.; Hernandez, F.

    1993-01-01

    An analytical study on the determination of boron in environmental water samples was carried out. The curcumin and carmine standard methods were compared with the most recent Azomethine-H method in order to evaluate their analytical characteristics and feasibility for the analysis of boron in water samples. Analyses of synthetic water, ground water, sea water and waste water samples were carried out and a statistical evaluation of the results was made. The Azomethine-H method was found to be the most sensitive (detection limit 0.02 mg l−1) and selective (no interference of commonly occurring ions in water was observed), showing also the best precision (relative standard deviation lower than 4%). Moreover, it gave good results for all types of samples analyzed. The accuracy of this method was tested by the addition of known amounts of standard solutions to different types of water samples. The slopes of standard additions and direct calibration graphs were similar and recoveries of added boron ranged from 99 to 107%. (orig.)
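
    The standard-additions accuracy check lends itself to a short worked example. This sketch, with synthetic numbers rather than the paper's data, recovers the sample concentration from the x-intercept of the standard-additions line:

    ```python
    def std_additions_conc(added, signal):
        """Least-squares line through (added concentration, signal) points;
        the sample concentration is the magnitude of the x-intercept."""
        n = len(added)
        mx = sum(added) / n
        my = sum(signal) / n
        sxx = sum((x - mx) ** 2 for x in added)
        sxy = sum((x - mx) * (y - my) for x, y in zip(added, signal))
        slope = sxy / sxx
        intercept = my - slope * mx
        return intercept / slope  # concentration, in the same units as `added`

    # Synthetic data: true boron concentration 0.50 mg/L, response 2.0 units per mg/L
    added = [0.0, 0.5, 1.0, 1.5, 2.0]
    signal = [2.0 * (0.50 + a) for a in added]  # 1.0, 2.0, 3.0, 4.0, 5.0
    print(std_additions_conc(added, signal))  # -> 0.5
    ```

    Comparing this slope against the slope of a direct calibration graph, as done in the study, is what reveals matrix effects: similar slopes mean the matrix does not perturb the response.
    
    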

  19. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    Science.gov (United States)

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    In modern analytical chemistry researchers pursue novel materials to meet analytical challenges such as improvements in sensitivity, selectivity, and detection limit. Metal-organic frameworks (MOFs) are an emerging class of microporous materials, and their unusual properties such as high surface area, good thermal stability, uniform structured nanoscale cavities, and the availability of in-pore functionality and outer-surface modification are attractive for diverse analytical applications. This Account summarizes our research on the analytical applications of MOFs ranging from sampling to chromatographic separation. MOFs have been either directly used or engineered to meet the demands of various analytical applications. Bulk MOFs with microsized crystals are convenient sorbents for direct application to in-field sampling and solid-phase extraction. Quartz tubes packed with MOF-5 have shown excellent stability, adsorption efficiency, and reproducibility for in-field sampling and trapping of atmospheric formaldehyde. The 2D copper(II) isonicotinate packed microcolumn has demonstrated large enhancement factors and good shape- and size-selectivity when applied to on-line solid-phase extraction of polycyclic aromatic hydrocarbons in water samples. We have explored the molecular sieving effect of MOFs for the efficient enrichment of peptides with simultaneous exclusion of proteins from biological fluids. These results show promise for the future of MOFs in peptidomics research. Moreover, nanosized MOFs and engineered thin films of MOFs are promising materials as novel coatings for solid-phase microextraction. We have developed an in situ hydrothermal growth approach to fabricate thin films of MOF-199 on etched stainless steel wire for solid-phase microextraction of volatile benzene homologues with large enhancement factors and wide linearity. Their high thermal stability and easy-to-engineer nanocrystals make MOFs attractive as new stationary phases to fabricate MOF

  20. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina B. de [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Farmacia; Oliveira, Bras H. de, E-mail: bho@ufpr.br [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Quimica

    2013-01-15

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using central composite design (CCD). The analytical method was then fully validated, and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected in a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min−1 and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)
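
    The central composite design (CCD) mentioned above has a simple coded structure that can be generated programmatically. The sketch below builds a generic rotatable CCD in coded units; it illustrates the design type, not the authors' specific run matrix:

    ```python
    from itertools import product

    def central_composite_design(k, alpha=None, n_center=1):
        """Coded design matrix of a CCD for k factors: 2^k factorial corners,
        2k axial (star) points at +/-alpha, plus replicated center points."""
        if alpha is None:
            alpha = (2 ** k) ** 0.25  # rotatability criterion alpha = (2^k)^(1/4)
        corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
        axial = []
        for d in range(k):
            for s in (-alpha, alpha):
                pt = [0.0] * k
                pt[d] = s
                axial.append(pt)
        center = [[0.0] * k for _ in range(n_center)]
        return corners + axial + center

    design = central_composite_design(2, n_center=3)
    print(len(design))  # 4 corners + 4 axial + 3 center = 11 runs
    ```

    The axial points are what let a CCD fit the quadratic terms needed to locate an optimum, which a plain two-level factorial design cannot do.
    
    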

  2. Designing a Marketing Analytics Course for the Digital Age

    Science.gov (United States)

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  3. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology and application of analytics and breaks its implementation down to repeatable and manageable steps, making it possible for widespread adoption across all functions of an organization. Implementing Analytics simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. A technology agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions rather than the technology behi

  4. Determination of 237Np in environmental and nuclear samples: A review of the analytical method

    International Nuclear Information System (INIS)

    Thakur, P.; Mulholland, G.P.

    2012-01-01

    A number of analytical methods have been developed and used for the determination of neptunium in environmental and nuclear fuel samples using alpha spectrometry, ICP–MS, and other analytical techniques. This review summarizes and discusses the development of radiochemical procedures for the separation of neptunium (Np) since the beginning of the nuclear industry, followed by a more detailed discussion of recent trends in the separation of neptunium. This article also highlights the progress in analytical methods and issues associated with the determination of neptunium in environmental samples. - Highlights: ► Determination of Np in environmental and nuclear samples is reviewed. ► Various analytical methods used for the determination of Np are listed. ► Progress and issues associated with the determination of Np are discussed.

  5. Pre-analytical sample quality: metabolite ratios as an intrinsic marker for prolonged room temperature exposure of serum samples.

    Directory of Open Access Journals (Sweden)

    Gabriele Anton

    Advances in the "omics" field bring about the need for a high number of good quality samples. Many omics studies take advantage of biobanked samples to meet this need. Most laboratory errors occur in the pre-analytical phase. Therefore, evidence-based standard operating procedures for the pre-analytical phase, as well as markers to distinguish between 'good' and 'bad' quality samples taking into account the desired downstream analysis, are urgently needed. We studied concentration changes of metabolites in serum samples due to pre-storage handling conditions as well as due to repeated freeze-thaw cycles. We collected fasting serum samples and subjected aliquots to up to four freeze-thaw cycles and to pre-storage handling delays of 12, 24 and 36 hours at room temperature (RT) and on wet and dry ice. For each treated aliquot, we quantified 127 metabolites through a targeted metabolomics approach. We found a clear signature of degradation in samples kept at RT. Storage on wet ice led to less pronounced concentration changes. 24 metabolites showed significant concentration changes at RT. In 22 of these, changes were already visible after only 12 hours of storage delay. Especially pronounced were increases in lysophosphatidylcholines and decreases in phosphatidylcholines. We showed that the ratio between the concentrations of these molecule classes could serve as a measure to distinguish between 'good' and 'bad' quality samples in our study. In contrast, we found quite stable metabolite concentrations during up to four freeze-thaw cycles. We concluded that pre-analytical RT handling of serum samples should be strictly avoided and serum samples should always be handled on wet ice or in cooling devices after centrifugation. Moreover, serum samples should be frozen at or below -80°C as soon as possible after centrifugation.
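
    The proposed lysophosphatidylcholine/phosphatidylcholine ratio marker can be sketched as a simple quality gate. The cutoff below is hypothetical and purely illustrative; a real threshold would have to be derived from validated reference data such as this study's:

    ```python
    def lpc_pc_ratio(lpc_concs, pc_concs):
        """Ratio of summed lysophosphatidylcholine to summed phosphatidylcholine
        concentrations; a rising ratio flags room-temperature degradation."""
        return sum(lpc_concs) / sum(pc_concs)

    # Hypothetical cutoff for illustration only.
    CUTOFF = 0.25

    def sample_quality(lpc_concs, pc_concs, cutoff=CUTOFF):
        return "suspect" if lpc_pc_ratio(lpc_concs, pc_concs) > cutoff else "acceptable"

    print(sample_quality([10, 12], [100, 120]))  # ratio 0.1  -> acceptable
    print(sample_quality([30, 36], [90, 110]))   # ratio 0.33 -> suspect
    ```

    A ratio-based marker has the practical advantage that it is intrinsic to the sample: no handling metadata is needed to retrospectively screen biobanked sera.
    
    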

  6. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results at complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurement could be obtained in continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  7. Analytical simulation of RBS spectra of nanowire samples

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, Nuno P., E-mail: nunoni@ctn.ist.utl.pt [Centro de Ciências e Tecnologias Nucleares, Instituto Superior Técnico, Universidade de Lisboa, E.N. 10 ao km 139,7, 2695-066 Bobadela LRS (Portugal); García Núñez, C. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Redondo-Cubero, A. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Centro de Micro-Análisis de Materiales, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Shen, G.; Kung, P. [Department of Electrical and Computer Engineering, The University of Alabama, AL 35487 (United States); Pau, J.L. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain)

    2016-03-15

    Almost all, if not all, general purpose codes for analysis of Ion Beam Analysis data have been originally developed to handle laterally homogeneous samples only. This is the case of RUMP, NDF, SIMNRA, and even of the Monte Carlo code Corteo. General-purpose codes usually include only limited support for lateral inhomogeneity. In this work, we show analytical simulations of samples that consist of a layer of parallel oriented nanowires on a substrate, using a model implemented in NDF. We apply the code to real samples, made of vertical ZnO nanowires on a sapphire substrate. Two configurations of the nanowires were studied: 40 nm diameter, 4.1 μm height, 3.5% surface coverage; and 55 nm diameter, 1.1 μm height, 42% surface coverage. We discuss the accuracy and limits of applicability of the analysis.
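
    A zeroth-order way to see why such nanowire layers defeat laterally homogeneous codes is the laterally averaged material thickness the beam traverses at normal incidence: wire height times surface coverage. This toy average is not NDF's actual treatment of lateral inhomogeneity, which models the wire geometry explicitly:

    ```python
    def nanowire_effective_thickness(height_nm, coverage):
        """Laterally averaged material thickness (nm) traversed at normal incidence
        by a beam crossing a layer of vertical nanowires: height x coverage."""
        return height_nm * coverage

    # The two ZnO nanowire configurations from the abstract
    print(nanowire_effective_thickness(4100, 0.035))  # sparse, tall wires
    print(nanowire_effective_thickness(1100, 0.42))   # dense, short wires
    ```

    The average alone cannot reproduce the measured spectra, because ions that miss every wire lose no energy before the substrate while ions travelling down a wire lose much more; capturing that path-length distribution is precisely what the analytical nanowire model adds.
    
    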

  8. Design of analytical instrumentation with D-T sealed neutron generators

    International Nuclear Information System (INIS)

    Qiao Yahua; Wu Jizong; Zheng Weiming; Liu Quanwei; Zhang Min

    2008-01-01

    An analytical instrument based on neutron activation with a D-T sealed neutron generator is described. A 14 MeV D-T sealed neutron tube with a neutron yield of 10⁹ n·s⁻¹ is used as the source. The optimal structure of the moderator and shield was determined by Monte Carlo computation, and the instrument configuration is presented. The instrument is built around the SMY-DT50.8-2.1 sealed neutron tube and its high-voltage power supply system, centered on the sealed neutron generator. The moderator consists of 6 cm of Pb and 20 cm of polythene; Pb, polythene and 10 cm of boron-PE serve as the shield. The sample box is located 9 cm from the source, and the measurement system comprises an HPGe detector and the sample transfer system. After moderation and shielding, the thermal neutron fluence rate at the sample position is 0.93 × 10⁶ n·s⁻¹·cm⁻², which meets the design requirement, and the dose levels in the laboratory and surroundings meet the safety standard. (authors)
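
    As a rough plausibility check on the quoted fluence rate, an unmoderated isotropic point source already gives the right order of magnitude at 9 cm. This inverse-square estimate ignores moderation, scattering and absorption entirely, so the close agreement with the reported thermal value is only indicative:

    ```python
    import math

    def isotropic_flux(yield_n_per_s, distance_cm):
        """Unscattered flux (n per s per cm^2) from an isotropic point source,
        spread over a sphere of radius `distance_cm`."""
        return yield_n_per_s / (4.0 * math.pi * distance_cm ** 2)

    phi = isotropic_flux(1e9, 9.0)
    print(f"{phi:.2e}")  # ~9.8e5, same order as the reported 0.93e6 n s^-1 cm^-2
    ```

    A full Monte Carlo calculation, as used in the design, is still needed to get the thermal fraction and the dose rates outside the shield.
    
    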

  9. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to their analysis by means of chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place increasing emphasis on the use of green analytical techniques. This review will discuss the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO3) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented

  11. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethylsiloxane) (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers followed by plasma sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effect of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device was characterized. Using the optimized design and an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate the potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.

  12. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    We here present a code for performing analytic continuation of fermionic Green’s functions and self-energies, as well as bosonic susceptibilities, on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVIDIA. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  13. Design of homogeneous trench-assisted multi-core fibers based on analytical model

    DEFF Research Database (Denmark)

    Ye, Feihong; Tu, Jiajing; Saitoh, Kunimasa

    2016-01-01

    We present a design method of homogeneous trench-assisted multicore fibers (TA-MCFs) based on an analytical model utilizing an analytical expression for the mode coupling coefficient between two adjacent cores. The analytical model can also be used for crosstalk (XT) properties analysis, such as ...

  14. 40 CFR 90.421 - Dilute gaseous exhaust sampling and analytical system description.

    Science.gov (United States)

    2010-07-01

    ... gas mixture temperature, measured at a point immediately ahead of the critical flow venturi, must be... analytical system description. (a) General. The exhaust gas sampling system described in this section is... requirements are as follows: (1) This sampling system requires the use of a Positive Displacement Pump—Constant...

  15. Dry sample storage system for an analytical laboratory supporting plutonium processing

    International Nuclear Information System (INIS)

    Treibs, H.A.; Hartenstein, S.D.; Griebenow, B.L.; Wade, M.A.

    1990-01-01

    The Special Isotope Separation (SIS) plant is designed to provide removal of undesirable isotopes in fuel grade plutonium by the atomic vapor laser isotope separation (AVLIS) process. The AVLIS process involves evaporation of plutonium metal, and passage of an intense beam of light from a laser through the plutonium vapor. The laser beam consists of several discrete wavelengths, tuned to the precise wavelength required to ionize the undesired isotopes. These ions are attracted to charged plates, leaving the bulk of the plutonium vapor enriched in the desired isotopes to be collected on a cold plate. Major portions of the process consist of pyrochemical processes, including direct reduction of the plutonium oxide feed material with calcium metal, and aqueous processes for purification of plutonium in residues. The analytical laboratory for the plant is called the Material and Process Control Laboratory (MPCL), and provides for the analysis of solid and liquid process samples

  16. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing/eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practice, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  17. Interpolation and sampling in spaces of analytic functions

    CERN Document Server

    Seip, Kristian

    2004-01-01

    The book is about understanding the geometry of interpolating and sampling sequences in classical spaces of analytic functions. The subject can be viewed as arising from three classical topics: Nevanlinna-Pick interpolation, Carleson's interpolation theorem for H^\\infty, and the sampling theorem, also known as the Whittaker-Kotelnikov-Shannon theorem. The book aims at clarifying how certain basic properties of the space at hand are reflected in the geometry of interpolating and sampling sequences. Key words for the geometric descriptions are Carleson measures, Beurling densities, the Nyquist rate, and the Helson-Szegő condition. The book is based on six lectures given by the author at the University of Michigan. This is reflected in the exposition, which is a blend of informal explanations with technical details. The book is essentially self-contained. There is an underlying assumption that the reader has a basic knowledge of complex and functional analysis. Beyond that, the reader should have some familiari...

  18. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such a workflow: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists who collaboratively design and compose data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  19. Design of a Clean Room for Quality Control of an Environmental Sampling in KINAC

    International Nuclear Information System (INIS)

    Yoon, Jongho; Ahn, Gil Hoon; Seo, Hana; Han, Kitek; Park, Il Jin

    2014-01-01

    The objective of environmental sampling and analysis for safeguards is to characterize the nuclear materials handled and the activities conducted at specific locations. The KINAC is responsible for the conclusions drawn from the analytical results provided by the analytical laboratories. To assure itself of the continuity of the quality of the analytical results provided by the laboratories, the KINAC will implement a quality control (QC) programme. One element of the QC programme is the preparation of QC samples. The establishment of a clean room is needed to handle QC samples because of the stringent control of contamination required. The KINAC designed a clean facility with a cleanliness of ISO Class 6, the Clean Room for Estimation and Assay of trace Nuclear materials (CREAN), to meet the conflicting requirements of a clean room and of the handling of nuclear materials according to Korean law. The clean room is expected to acquire a radiation safety license under these conditions this year, with continuing improvements to follow. The construction of the CREAN facility will be completed by the middle of 2015. In terms of the QC programme, the establishment of a clean room is essential; it will not only support the quality control system for the national environmental sampling programme but will also be applied to environmental sample analysis techniques for nuclear forensics

  20. Means of introducing an analyte into liquid sampling atmospheric pressure glow discharge

    Science.gov (United States)

    Marcus, R. Kenneth; Quarles, Jr., Charles Derrick; Russo, Richard E.; Koppenaal, David W.; Barinaga, Charles J.; Carado, Anthony J.

    2017-01-03

    A liquid sampling, atmospheric pressure, glow discharge (LS-APGD) device as well as systems that incorporate the device and methods for using the device and systems are described. The LS-APGD includes a hollow capillary for delivering an electrolyte solution to a glow discharge space. The device also includes a counter electrode in the form of a second hollow capillary that can deliver the analyte into the glow discharge space. A voltage across the electrolyte solution and the counter electrode creates the microplasma within the glow discharge space that interacts with the analyte to move it to a higher energy state (vaporization, excitation, and/or ionization of the analyte).

  1. Sample handling in surface sensitive chemical and biological sensing: a practical review of basic fluidics and analyte transport.

    Science.gov (United States)

    Orgovan, Norbert; Patko, Daniel; Hos, Csaba; Kurunczi, Sándor; Szabó, Bálint; Ramsden, Jeremy J; Horvath, Robert

    2014-09-01

    This paper gives an overview of the advantages and associated caveats of the most common sample handling methods in surface-sensitive chemical and biological sensing. We summarize the basic theoretical and practical considerations one faces when designing and assembling the fluidic part of the sensor devices. The influence of analyte size, the use of closed and flow-through cuvettes, the importance of flow rate, tubing length and diameter, bubble traps, pressure-driven pumping, cuvette dead volumes, and sample injection systems are all discussed. Typical application areas of particular arrangements are also highlighted, such as the monitoring of cellular adhesion, biomolecule adsorption-desorption and ligand-receptor affinity binding. Our work is a practical review in the sense that for every sample handling arrangement considered we present our own experimental data and critically review our experience with the given arrangement. In the experimental part we focus on sample handling in optical waveguide lightmode spectroscopy (OWLS) measurements, but the present study is equally applicable for other biosensing technologies in which an analyte in solution is captured at a surface and its presence is monitored. Explicit attention is given to features that are expected to play an increasingly decisive role in determining the reliability of (bio)chemical sensing measurements, such as analyte transport to the sensor surface; the distorting influence of dead volumes in the fluidic system; and the appropriate sample handling of cell suspensions (e.g. their quasi-simultaneous deposition). At the appropriate places, biological aspects closely related to fluidics (e.g. cellular mechanotransduction, competitive adsorption, blood flow in veins) are also discussed, particularly with regard to their models used in biosensing. Copyright © 2014 Elsevier B.V. All rights reserved.
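The analyte-transport considerations stressed above can be made concrete with the classical Lévêque estimate for the diffusion-limited flux to a sensor patch on the wall of a shallow rectangular flow cell. The formula is the standard convection-diffusion result for the thin-depletion-layer regime; the channel dimensions, diffusivity, and concentration below are illustrative assumptions, not values from the paper:

```python
import math

def wall_shear_rate(q_m3s, h_m, w_m):
    """Wall shear rate gamma = 6Q/(h^2 w) in a shallow channel (h << w)."""
    return 6.0 * q_m3s / (h_m ** 2 * w_m)

def leveque_flux(c_mol_m3, d_m2s, gamma_s, x_m):
    """Diffusion-limited analyte flux (mol m^-2 s^-1) at distance x from the
    inlet, Leveque regime: j(x) ~= 0.538 * c * D^(2/3) * (gamma / x)^(1/3)."""
    return 0.538 * c_mol_m3 * d_m2s ** (2.0 / 3.0) * (gamma_s / x_m) ** (1.0 / 3.0)

# Illustrative case: 1 uM analyte with D ~ 5e-11 m^2/s in a 100 um x 2 mm
# channel at 10 uL/min, sensor patch 1 mm downstream of the inlet.
q = 10e-9 / 60.0
gamma = wall_shear_rate(q, 100e-6, 2e-3)
j = leveque_flux(1e-6, 5e-11, gamma, 1e-3)
```

The cube-root dependence on flow rate is the practically important point: an eight-fold increase in pump rate only doubles the capture flux, which is one reason flow-rate choices matter less than tubing dead volumes in many of the arrangements discussed above.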

  2. Analytical model for Stirling cycle machine design

    Energy Technology Data Exchange (ETDEWEB)

    Formosa, F. [Laboratoire SYMME, Universite de Savoie, BP 80439, 74944 Annecy le Vieux Cedex (France); Despesse, G. [Laboratoire Capteurs Actionneurs et Recuperation d' Energie, CEA-LETI-MINATEC, Grenoble (France)

    2010-10-15

    In order to study further the promising free piston Stirling engine architecture, there is a need for an analytical thermodynamic model which can be used in a dynamical analysis for preliminary design. To obtain more realistic values, the model has to take into account the heat losses and irreversibilities of the engine. An analytical model has been developed which encompasses the critical flaws of the regenerator and, furthermore, the heat exchanger effectivenesses. This model has been validated using the whole range of experimental data available from the General Motors GPU-3 Stirling engine prototype. The effects of the technological and operating parameters on Stirling engine performance have been investigated. In addition to the regenerator influence, the effect of the cooler effectiveness is underlined. (author)
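The loss terms of the model above are not reproduced in the abstract; as a baseline, the classical isothermal Schmidt-type cycle it extends can be integrated numerically. The sketch below is that idealized baseline only (no regenerator or heat-exchanger losses), and every parameter value is illustrative rather than taken from the GPU-3 engine:

```python
import math

def schmidt_work_per_cycle(p_mean, v_swe, v_swc, v_dead, t_hot, t_cold, t_reg,
                           phase, n=20000):
    """Indicated work per cycle (J) of an isothermal Schmidt-type Stirling model.

    Sinusoidal volume variations; the expansion (hot) space leads the
    compression space by `phase` radians. Ideal gas, no losses.
    """
    dth = 2.0 * math.pi / n
    # S(theta) = Ve/Th + Vc/Tk + Vd/Tr, so that p = m*R / S.
    s_vals = []
    for i in range(n):
        th = i * dth
        ve = 0.5 * v_swe * (1.0 + math.cos(th))
        vc = 0.5 * v_swc * (1.0 + math.cos(th - phase))
        s_vals.append(ve / t_hot + vc / t_cold + v_dead / t_reg)
    # Choose the gas inventory m*R so the cycle-averaged pressure is p_mean.
    m_r = p_mean * sum(s_vals) / n
    work = 0.0
    for i in range(n):
        th = i * dth
        p = m_r / s_vals[i]
        dv = 0.5 * (-v_swe * math.sin(th) - v_swc * math.sin(th - phase)) * dth
        work += p * dv
    return work

# Illustrative parameters: 5 MPa mean pressure, 120 cm^3 swept volumes,
# 900 K / 300 K spaces, 90 degree phase angle.
w = schmidt_work_per_cycle(5e6, 120e-6, 120e-6, 120e-6, 900.0, 300.0, 600.0,
                           math.pi / 2)
```

A quick sanity check on such a model is that the indicated work vanishes when both spaces are at the same temperature, since the pressure then depends only on the total volume.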

  3. [Patient identification errors and biological samples in the analytical process: Is it possible to improve patient safety?].

    Science.gov (United States)

    Cuadrado-Cenzual, M A; García Briñón, M; de Gracia Hills, Y; González Estecha, M; Collado Yurrita, L; de Pedro Moro, J A; Fernández Pérez, C; Arroyo Fernández, M

    2015-01-01

    Errors in the identification of patients and biological samples are among the problems carrying the highest risk of causing an adverse event for the patient. The aims were to detect and analyse the causes of patient identification errors in analytical requests (PIEAR) from emergency departments, and to develop improvement strategies. A process and protocol were designed to be followed by all professionals involved in requesting and performing laboratory tests. Evaluation and monitoring indicators of PIEAR were determined before and after the implementation of these improvement measures (years 2010-2014). A total of 316 PIEAR were detected among 483,254 emergency service requests during the study period, representing a mean of 6.80/10,000 requests. Patient identification failure was the most frequent type in all the six-monthly periods assessed, with a statistically significant difference, and the improvement measures reduced these errors. However, we must continue working with this strategy, promoting a culture of safety among all the professionals involved, and trying to achieve the goal that 100% of requests and samples are properly identified. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  4. Rapid Gamma Screening of Shipments of Analytical Samples to Meet DOT Regulations

    International Nuclear Information System (INIS)

    Wojtaszek, P.A.; Remington, D.L.; Ideker-Mulligan, V.

    2006-01-01

    The accelerated closure program at Rocky Flats required the capacity to ship up to 1000 analytical samples per week to off-site commercial laboratories, and to conduct such shipment within 24 hours of sample collection. During a period of near peak activity in the closure project, a regulatory change significantly increased the level of radionuclide data required for shipment of each package. In order to meet these dual challenges, a centralized and streamlined sample management program was developed which channeled analytical samples through a single, high-throughput radiological screening facility. This trailerized facility utilized high purity germanium (HPGe) gamma spectrometers to conduct screening measurements of entire packages of samples at once, greatly increasing throughput compared to previous methods. The In Situ Object Counting System (ISOCS) was employed to calibrate the HPGe systems to accommodate the widely varied sample matrices and packing configurations encountered. Optimum modeling and configuration parameters were determined. Accuracy of the measurements of grouped sample jars was confirmed with blind samples in multiple configurations. Levels of radionuclides not observable by gamma spectroscopy were calculated utilizing a spreadsheet program that can accommodate isotopic ratios for large numbers of different waste streams based upon acceptable knowledge. This program integrated all radionuclide data and output all information required for shipment, including the shipping class of the package. (authors)
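The spreadsheet logic described above — scaling radionuclides not observable by gamma spectroscopy from a measured gamma emitter through acceptable-knowledge isotopic ratios — can be sketched as follows. The radionuclide names are real, but every ratio below is a made-up placeholder, not a value from the Rocky Flats waste streams or from DOT regulations:

```python
# Hypothetical acceptable-knowledge (AK) activity ratios for one waste stream,
# expressed relative to the gamma-measurable Am-241 activity. Placeholder values.
AK_RATIOS = {"Pu-239": 4.0, "Pu-240": 1.0, "Pu-241": 30.0}

def infer_inventory(am241_bq, ratios=AK_RATIOS):
    """Scale gamma-unobservable radionuclides from the measured Am-241
    activity (Bq) using acceptable-knowledge ratios for the waste stream."""
    inventory = {"Am-241": am241_bq}
    inventory.update({nuc: am241_bq * r for nuc, r in ratios.items()})
    return inventory

def total_activity(inventory):
    """Total package activity (Bq), one input to the shipping classification."""
    return sum(inventory.values())

# 2 kBq of Am-241 measured by the HPGe screening system.
inv = infer_inventory(2.0e3)
```

In the actual program, per-stream ratio tables and the package classification rules would replace these placeholders.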

  5. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keller, J.; Wallen, R.; Errichello, R.; Halse, C.; Lambert, S.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  6. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    OpenAIRE

    Magdalena BORYS; Monika CZWÓRNÓG; Tomasz RATAJCZYK

    2016-01-01

    The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analy...

  7. Future analytical provision - Relocation of Sellafield Ltd Analytical Services Laboratory

    International Nuclear Information System (INIS)

    Newell, B.

    2015-01-01

    Sellafield Ltd Analytical Services provides an essential view of environmental, safety, process and high hazard risk reduction performance through the analysis of samples. It is the largest and most complex analytical services laboratory in Europe, with 150 laboratories (55 operational) and 350 staff (including 180 analysts). The Sellafield Ltd Analytical Services Main Laboratory is in need of replacement, owing to the age of the facility and changes to work streams. This relocation is an opportunity to -) design and commission bespoke MA (Medium-Active) cells, -) modify the HA (High-Active) cell design to facilitate an in-cell laboratory, -) develop non-destructive techniques, -) provide an open, light building for better worker morale. The option chosen was to move the activities to the NNL Central Laboratory (NNLCL), which is based at Sellafield and is the UK's flagship nuclear research and development facility. This poster gives a time schedule

  8. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1993-01-01

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and by other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of errors affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion of several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs

  9. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  10. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    Science.gov (United States)

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

    Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ∼1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to the LFA analytical performance through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  11. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends consideration of the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  12. Analytic continuation of quantum Monte Carlo data. Stochastic sampling method

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Khaldoon; Koch, Erik [Institute for Advanced Simulation, Forschungszentrum Juelich, 52425 Juelich (Germany)

    2016-07-01

    We apply Bayesian inference to the analytic continuation of quantum Monte Carlo (QMC) data from the imaginary axis to the real axis. Demanding a proper functional Bayesian formulation of any analytic continuation method leads naturally to the stochastic sampling method (StochS) as the Bayesian method with the simplest prior, while it excludes the maximum entropy method and Tikhonov regularization. We present a new efficient algorithm for performing StochS that reduces computational times by orders of magnitude in comparison to earlier StochS methods. We apply the new algorithm to a wide variety of typical test cases: spectral functions and susceptibilities from DMFT and lattice QMC calculations. Results show that StochS performs well and is able to resolve sharp features in the spectrum.
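The efficient algorithm of the abstract is not spelled out there; the sketch below is only a toy Metropolis version of stochastic sampling on a synthetic problem, to illustrate the basic idea of sampling spectra weighted by exp(-χ²/2Θ). The kernel, grids, temperature, and move scheme are all illustrative choices, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic test problem: G(tau) = \int K(tau, w) A(w) dw + noise,
# with a fermionic imaginary-time kernel.
beta = 10.0
tau = np.linspace(0.0, beta, 64, endpoint=False)
omega = np.linspace(-5.0, 5.0, 81)
dw = omega[1] - omega[0]
K = np.exp(-tau[:, None] * omega[None, :]) / (1.0 + np.exp(-beta * omega[None, :]))

a_true = np.exp(-((omega - 1.0) ** 2) / 0.5)       # Gaussian peak at w = 1
a_true /= a_true.sum() * dw                        # normalized spectrum
sigma = 1.0e-3
g_data = K @ (a_true * dw) + sigma * rng.standard_normal(tau.size)

def chi2(a):
    return float(np.sum(((K @ (a * dw) - g_data) / sigma) ** 2))

a = np.full(omega.size, 1.0 / (omega.size * dw))   # flat, normalized start
theta = 1.0                                        # fictitious sampling temperature
c_init = c_cur = c_best = chi2(a)
for _ in range(20000):
    i, j = rng.integers(0, omega.size, size=2)
    if i == j:
        continue
    delta = a[i] * rng.uniform(0.0, 0.5)           # move weight from bin i to j
    trial = a.copy()
    trial[i] -= delta
    trial[j] += delta
    c_new = chi2(trial)
    # Metropolis acceptance with weight exp(-chi2 / (2 * theta)).
    if c_new <= c_cur or rng.uniform() < np.exp(-(c_new - c_cur) / (2.0 * theta)):
        a, c_cur = trial, c_new
        c_best = min(c_best, c_cur)
```

The weight-transfer move conserves the spectral sum rule by construction; the production algorithm of the abstract differs precisely in how such sampling is made efficient.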

  13. Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging

    Science.gov (United States)

    Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee

    2017-08-01

    Corn seed tissue sections were prepared by the tape support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape support and thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape support method does not cause imaging distortion because of the absence of heat, which can cause migration of the analytes on the sample. By applying the tape support method, the corn seed tissue was prepared without structural damage, and MSI with accurate spatial information of analytes was successfully performed.

  14. Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging.

    Science.gov (United States)

    Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee

    2017-08-01

    Corn seed tissue sections were prepared by the tape support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape support and thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape support method does not cause imaging distortion because of the absence of heat, which can cause migration of the analytes on the sample. By applying the tape support method, the corn seed tissue was prepared without structural damage, and MSI with accurate spatial information of analytes was successfully performed.

  15. Recent bibliography on analytical and sampling problems of a PWR primary coolant

    International Nuclear Information System (INIS)

    Illy, H.

    1980-07-01

    An extensive bibliography on the problems of analysis and sampling of the primary cooling water of PWRs is presented. The aim was to collect the analytical methods for dissolved gases. Sampling and preparation are also taken into account. Literature from the last 8-10 years is included. The bibliography is arranged in alphabetical order by topic. The most important topics are as follows: boric acid, gas analysis, hydrogen isotopes, iodine, noble gases, radiation monitoring, sampling and preparation, water chemistry. (R.J.)

  16. MoonDB — A Data System for Analytical Data of Lunar Samples

    Science.gov (United States)

    Lehnert, K.; Ji, P.; Cai, M.; Evans, C.; Zeigler, R.

    2018-04-01

    MoonDB is a data system that makes analytical data from the Apollo lunar sample collection and lunar meteorites accessible by synthesizing published and unpublished datasets in a relational database with an online search interface.
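A relational design of the kind described can be illustrated in miniature. The two-table schema below is invented for illustration (MoonDB's actual schema is not given in the record); 15555 is a real Apollo 15 sample number, but the analysis value is a placeholder:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE sample (
    sample_id TEXT PRIMARY KEY,   -- e.g. an Apollo sample number
    mission   TEXT,
    lithology TEXT
);
CREATE TABLE analysis (
    analysis_id INTEGER PRIMARY KEY,
    sample_id   TEXT REFERENCES sample(sample_id),
    analyte     TEXT,             -- element, oxide, or isotope ratio
    value       REAL,
    unit        TEXT,
    reference   TEXT              -- published source of the dataset
);
""")
cur.execute("INSERT INTO sample VALUES ('15555', 'Apollo 15', 'mare basalt')")
# Placeholder value, for illustration only.
cur.execute("INSERT INTO analysis VALUES (1, '15555', 'TiO2', 2.26, 'wt%', 'example-ref')")

# A search-interface query: all analyses of Apollo 15 samples.
rows = cur.execute("""
    SELECT s.sample_id, a.analyte, a.value, a.unit
    FROM sample s JOIN analysis a ON a.sample_id = s.sample_id
    WHERE s.mission = 'Apollo 15'
""").fetchall()
```

Keeping the published reference on each analysis row is what allows synthesized published and unpublished datasets to remain traceable to their sources.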

  17. Towards analytical mix design for large-stone asphalt mixes.

    CSIR Research Space (South Africa)

    Rust, FC

    1992-08-01

    Full Text Available This paper addresses the development of an analytically based design procedure for large-aggregate asphalt and its application in thirteen trial sections. The physical and engineering properties of the various materials are discussed and related...

  18. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    Science.gov (United States)

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by the lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was calculated from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring Hb values with a single analytical method. The VOI was 8.22, 7.1 and 9.44 mm³ for probes that collected a target of 10 mg of feces, and 3.08 mm³ for one probe that targeted 2 mg of feces. The ratio between the recovered and target amounts ranged from 56% to 121% across devices. Different changes in the measured Hb values were observed when increasing amounts of feces were added to the commercial buffers. The amount of collected material is related to the design of the probe. Three out of four manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfillment of the ISO 15189 requirements.
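The VOI-from-geometry calculation described above amounts to simple solid geometry. The sketch below models a probe as a sampled cylinder; the dimensions are illustrative values chosen to land near the 8.22 mm³ VOI reported above (not the actual probe drawings), and the ~1 mg/mm³ fecal density is an assumption:

```python
import math

def cylindrical_voi(diameter_mm, length_mm):
    """Volume of interest (mm^3) of a probe modelled as a sampled cylinder."""
    return math.pi * (diameter_mm / 2.0) ** 2 * length_mm

def collected_mass_mg(voi_mm3, density_mg_mm3=1.0):
    """Collected feces mass, assuming the VOI fills completely; the density
    of ~1 mg/mm^3 (roughly that of water) is an assumption, not a measurement."""
    return voi_mm3 * density_mg_mm3

# Illustrative probe dimensions: 1.2 mm diameter, 7.27 mm sampled length.
voi = cylindrical_voi(1.2, 7.27)
mass = collected_mass_mg(voi)
```

The gap between such a nominal mass and the gravimetrically recovered amounts (56% to 121% of target in the study) is exactly the preanalytical variability the authors argue a standard probe would reduce.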

  19. Coupling Numerical Methods and Analytical Models for Ducted Turbines to Evaluate Designs

    Directory of Open Access Journals (Sweden)

    Bradford Knight

    2018-04-01

    Full Text Available Hydrokinetic turbines extract energy from currents in oceans, rivers, and streams. Ducts can be used to accelerate the flow across the turbine to improve performance. The objective of this work is to couple an analytical model with a Reynolds averaged Navier–Stokes (RANS computational fluid dynamics (CFD solver to evaluate designs. An analytical model is derived for ducted turbines. A steady-state moving reference frame solver is used to analyze both the freestream and ducted turbine. A sliding mesh solver is examined for the freestream turbine. An efficient duct is introduced to accelerate the flow at the turbine. Since the turbine is optimized for operation in the freestream and not within the duct, there is a decrease in efficiency due to duct-turbine interaction. Despite the decrease in efficiency, the power extracted by the turbine is increased. The analytical model under-predicts the flow rejection from the duct that is predicted by CFD since the CFD predicts separation but the analytical model does not. Once the mass flow rate is corrected, the model can be used as a design tool to evaluate how the turbine-duct pair reduces mass flow efficiency. To better understand this phenomenon, the turbine is also analyzed within a tube with the analytical model and CFD. The analytical model shows that the duct’s mass flow efficiency reduces as a function of loading, showing that the system will be more efficient when lightly loaded. Using the conclusions of the analytical model, a more efficient ducted turbine system is designed. The turbine is pitched more heavily and the twist profile is adapted to the radial throat velocity profile.
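The analytical model in the article is derived for the ducted case; its freestream baseline is classical 1D momentum (actuator disk) theory, which the short sketch below reproduces. The grid search and its bounds are implementation choices, not taken from the paper:

```python
import numpy as np

def power_coefficient(a):
    """1D momentum theory: Cp = 4a(1-a)^2 for axial induction factor a."""
    return 4.0 * a * (1.0 - a) ** 2

# Scan physically meaningful induction factors; the maximum is the Betz limit.
a_grid = np.linspace(0.0, 0.5, 100001)
cp_grid = power_coefficient(a_grid)
a_opt = float(a_grid[np.argmax(cp_grid)])
cp_max = float(cp_grid.max())
```

The maximum Cp = 16/27 at a = 1/3 is the reference against which a ducted system is judged: the duct raises the mass flow (and hence extracted power) through the rotor, but, as the abstract notes, its mass flow efficiency falls as disk loading grows, so the lightly loaded regime is where the duct pays off.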

  20. Analytical and sampling problems in primary coolant circuits of PWR-type reactors

    International Nuclear Information System (INIS)

    Illy, H.

    1980-10-01

    Details of recent analytical methods on the analysis and sampling of a PWR primary coolant are given in the order as follows: sampling and preparation; analysis of the gases dissolved in the water; monitoring of radiating substances; checking of boric acid concentration which controls the reactivity. The bibliography of this work and directions for its use are published in a separate report: KFKI-80-48 (1980). (author)

  1. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem, and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit changeout as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  2. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Magdalena BORYS

    2016-12-01

    Full Text Available The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analysis of participants’ behaviour during eye-tracking sessions allowed improvements of the prototype.

  3. An analytical examination of distortions in power spectra due to sampling errors

    International Nuclear Information System (INIS)

    Njau, E.C.

    1982-06-01

    Distortions introduced into spectral energy densities of sinusoid signals as well as those of more complex signals through different forms of errors in signal sampling are developed and shown analytically. The approach we have adopted in doing this involves, firstly, developing for each type of signal and for the corresponding form of sampling errors an analytical expression that gives the faulty digitization process involved in terms of the features of the particular signal. Secondly, we take advantage of a method described elsewhere [IC/82/44] to relate, as much as possible, the true spectral energy density of the signal and the corresponding spectral energy density of the faulty digitization process. Thirdly, we then develop expressions which reveal the distortions that are formed in the directly computed spectral energy density of the digitized signal. It is evident from the formulations developed herein that the types of sampling errors taken into consideration may create false peaks and other distortions that are of non-negligible concern in computed power spectra. (author)
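One class of sampling error analyzed above, random timing jitter, is easy to demonstrate numerically: jitter converts part of a sinusoid's spectral line into a broadband distortion floor. The record length, tone frequency, and jitter level below are illustrative choices for the demonstration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n, f0 = 1024, 50                               # samples per record, cycles per record
k = np.arange(n)
# Timing error with standard deviation of 10% of a sample period.
jitter = (0.1 / n) * rng.standard_normal(n)

x_ideal = np.sin(2.0 * np.pi * f0 * k / n)     # exactly sampled sinusoid
x_jitter = np.sin(2.0 * np.pi * f0 * (k / n + jitter))

def leakage_energy(x, signal_bin):
    """One-sided spectral energy outside DC and the signal bin, plus peak bin."""
    e = np.abs(np.fft.rfft(x)) ** 2
    leak = float(e.sum() - e[signal_bin] - e[0])
    peak = int(np.argmax(e[1:]) + 1)
    return leak, peak

leak_ideal, peak_ideal = leakage_energy(x_ideal, f0)
leak_jitter, peak_jitter = leakage_energy(x_jitter, f0)
```

With an integer number of cycles per record, the ideally sampled tone has essentially zero energy outside its own bin; the jittered record keeps its peak at the correct bin but gains a distortion floor spread across the spectrum, which is the kind of non-negligible artifact in computed power spectra that the analysis above formulates analytically.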

  4. Sampling analytical tests and destructive tests for quality assurance

    International Nuclear Information System (INIS)

    Saas, A.; Pasquini, S.; Jouan, A.; Angelis, de; Hreen Taywood, H.; Odoj, R.

    1990-01-01

    In the context of the third programme of the European Communities on the monitoring of radioactive waste, various methods have been developed for performing sampling and measuring tests on encapsulated waste of low and medium level activity, on the one hand, and of high level activity, on the other. The purpose was to provide better quality assurance for products to be stored on an interim or long-term basis. Various sampling approaches are proposed, such as: - sampling of raw waste before conditioning and determination of a representative aliquot, - sampling of encapsulated waste at process output, - sampling of core specimens subjected to measurement before and after cutting. Equipment suitable for these sampling procedures has been developed and, in the case of core samples, a comparison of techniques has been made. The results are described for the various analytical tests carried out on the samples, such as: - mechanical tests, - radiation resistance, - fire resistance, - lixiviation, - determination of free water, - biodegradation, - water resistance, - chemical and radiochemical analysis. Whenever possible, these tests were compared with non-destructive tests on full-scale packages, and some correlations are given. This work has made it possible to improve and clarify sample optimization, with refined sampling techniques and methodologies, and to draw up characterization procedures. It also provided an occasion for a first collaboration between the laboratories responsible for these studies, which will be furthered in the scope of the 1990-1994 programme

  5. Current status of JAERI program on development of ultra-trace-analytical technology for safeguards environmental samples

    International Nuclear Information System (INIS)

    Adachi, T.; Usuda, S.; Watanabe, K.

    2001-01-01

    JAERI is, therefore, studying the selective recovery of uranium particles from the swipe samples; otherwise, alternative swipe materials with less uranium blank would be preferable. The analytical technology for individual particles in the environmental samples is an important issue to develop. Work continues with total reflection X-ray fluorescence spectrometry (TXRF) for screening, electron-probe microanalysis (EPMA) for the elemental composition and morphology of each particle, and secondary ion mass spectrometry (SIMS) for isotopic ratio measurement. A special mount made of glassy carbon was designed so that the mount could be used in common among the three apparatuses. The detection limit for uranium in particle screening by TXRF was 0.4 ng. By combining TXRF, EPMA and SIMS, the throughput for analysis of 1 μm uranium particles was one swipe per day, which is to be increased by improving the particle-mapping technique. The R and D work will be moved to the CLEAR facility, the construction of which was completed in December 2000. The facility includes clean rooms (215 m²) with cleanliness class 100 for chemical sample treatment and rooms (480 m²) with classes 1,000 and 10,000 for analytical operation, sample storage, etc. Facility performance tests carried out prior to the installation of analytical equipment gave satisfactory results, e.g., class 10 on the working surfaces of clean hoods and benches. Full operation of the facility is scheduled for June 2001. The first phase of the program continues until March 2003. During this period, fundamental technology for ultra-trace analysis of uranium and plutonium will be established with sufficient sensitivity and accuracy for environmental sample analysis.
JAERI will contribute to the strengthened safeguards system of IAEA by joining the community of Network Analytical Laboratories as a first member from the Asian area, as well as contribute to

  6. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    Directory of Open Access Journals (Sweden)

    Zoraida Sosa-Ferrera

    2013-01-01

    Endocrine-disrupting compounds (EDCs) can mimic natural hormones and produce adverse effects on endocrine function by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and by their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass spectrometry-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented.

  7. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency for fortified sediments varied from 65.1 to 105.6% for n-alkanes and from 59.7 to 97.8% for PAH, in the ranges C16-C32 and fluoranthene-benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  8. Analytical Study on Thermal and Mechanical Design of Printed Circuit Heat Exchanger

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Su-Jong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung-Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    Analytical methodologies for the thermal design, mechanical design and cost estimation of printed circuit heat exchangers are presented in this study. Three flow arrangements are taken into account: parallel flow, counterflow and crossflow. For each flow arrangement, an analytical solution for the temperature profile of the heat exchanger is introduced. The size and cost of printed circuit heat exchangers for advanced small modular reactors, which employ various coolants such as sodium, molten salts, helium, and water, are also presented.
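
    The analytical temperature-profile solutions for the different flow arrangements lead to the standard effectiveness-NTU relations. The sketch below uses the textbook closed forms for the parallel-flow and counterflow arrangements (these are general relations, not necessarily the exact formulation used in the report):

```python
import math

def effectiveness(ntu: float, cr: float, arrangement: str) -> float:
    """Heat-exchanger effectiveness; ntu = UA/C_min, cr = C_min/C_max."""
    if arrangement == "parallel":
        return (1.0 - math.exp(-ntu * (1.0 + cr))) / (1.0 + cr)
    if arrangement == "counter":
        if abs(cr - 1.0) < 1e-12:          # balanced-flow limit of the formula
            return ntu / (1.0 + ntu)
        e = math.exp(-ntu * (1.0 - cr))
        return (1.0 - e) / (1.0 - cr * e)
    raise ValueError(f"unknown arrangement: {arrangement}")

# Same duty point, two arrangements
eps_parallel = effectiveness(2.0, 0.8, "parallel")   # ~0.54
eps_counter = effectiveness(2.0, 0.8, "counter")     # ~0.71
```

    For the same NTU and capacity-rate ratio the counterflow arrangement always achieves the higher effectiveness, which is one reason it is commonly preferred for compact printed circuit heat exchangers.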

  9. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    International Nuclear Information System (INIS)

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-01-01

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants

  10. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, and can easily accommodate different resins, samples, and reagent types. Once programmed, precise and accurate user-defined volumes and flow rates are applied to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blank and carryover, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for the isotopic systems listed above. This illustrates the potential of the prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  11. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
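
    One classical way to quantify whether a 100 mg test portion can represent a lot is Gy's fundamental sampling error approximation, in which the relative variance of the test portion scales with the cube of the top particle size and inversely with test-portion mass. This formula is a general rule of thumb from sampling theory, not something stated in the paper, and the constant and particle size below are purely illustrative:

```python
# Gy's fundamental sampling error (simplified): var_rel ~ C * d**3 / m_sample
C_MATERIAL = 0.5    # sampling constant in g/cm^3 (illustrative value)
d_cm = 0.1          # top particle diameter in cm (1 mm, illustrative)
m_sample_g = 0.1    # 100 mg test portion

var_rel = C_MATERIAL * d_cm ** 3 / m_sample_g
rsd_percent = 100.0 * var_rel ** 0.5   # relative standard deviation of the test portion
```

    Here rsd_percent is about 7%, which illustrates why comminution (reducing d) must accompany any reduction in test-portion mass if results are to remain representative.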

  12. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma-mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g. N+, ArO+), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, with a genetic algorithm for objective feature selection. Analyte elemental signals and non-analyte signals were compared for diagnosing matrix effects, and both were found suitable for estimating matrix effects. Individual-analyte matrix effect estimation was compared with overall matrix effect prediction, and the models used to diagnose overall matrix effects were more accurate than the individual-analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way and were able to diagnose matrix effects with at least an 80% success rate.
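
    The regression step of such a diagnosis model can be sketched as ordinary least squares on synthetic data. The predictors here are random stand-ins for the background-species intensities, and the genetic-algorithm feature selection used in the paper is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 40 samples x 5 "non-analyte" background-signal intensities
X = rng.normal(size=(40, 5))
true_coef = np.array([0.8, -0.5, 0.0, 1.2, 0.3])

# Response: relative matrix effect (signal suppression/enhancement) plus noise
y = X @ true_coef + 0.01 * rng.normal(size=40)

# Multiple linear regression by ordinary least squares (intercept column appended)
A = np.c_[X, np.ones(len(X))]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

    A fitted model of this form can then flag a new sample as matrix-affected whenever the predicted effect exceeds a chosen threshold, which is the kind of decision-making test reported in the abstract.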

  13. Analytical artefacts in the speciation of arsenic in clinical samples

    International Nuclear Information System (INIS)

    Slejkovec, Zdenka; Falnoga, Ingrid; Goessler, Walter; Elteren, Johannes T. van; Raml, Reingard; Podgornik, Helena; Cernelc, Peter

    2008-01-01

    Urine and blood samples of cancer patients treated with high doses of arsenic trioxide were analysed for arsenic species using HPLC-HGAFS and, in some cases, HPLC-ICPMS. Total arsenic was determined with either flow injection-HGAFS in urine or radiochemical neutron activation analysis in blood fractions (serum/plasma, blood cells). The total arsenic concentrations (during prolonged, daily/weekly arsenic trioxide therapy) were in the μg mL⁻¹ range for urine and in the ng g⁻¹ range for blood fractions. The main arsenic species found were As(III), MA and DMA in urine, and As(V), MA and DMA in blood. With proper sample preparation and storage of urine (no preservation agents; storage in liquid nitrogen) no analytical artefacts were observed, and the absence of significant amounts of alleged trivalent metabolites was proven. In blood samples, by contrast, a certain amount of arsenic can be lost in the speciation procedure, which was especially noticeable for the blood cells, although plasma/serum also gave rise to some disappearance of arsenic. The latter losses may be attributed to precipitation of As(III)-containing proteins/peptides during the methanol/water extraction procedure, whereas the former losses were due to loss of specific As(III)-complexing proteins/peptides (e.g. cysteine, metallothionein, reduced GSH, ferritin) on the column (Hamilton PRP-X100) during the separation procedure. Contemporary analytical protocols cannot completely avoid artefacts arising between the sampling and detection stages, so it is recommended to interpret results with care, particularly regarding metabolic and pharmacokinetic interpretations, and always to compare the sum of species with the total arsenic concentration determined independently.

  14. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  15. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  16. Time and temperature dependent analytical stability of dry-collected Evalyn HPV self-sampling brush for cervical cancer screening

    DEFF Research Database (Denmark)

    Ejegod, Ditte Møller; Pedersen, Helle; Alzua, Garazi Peña

    2018-01-01

    As a new initiative, HPV self-sampling to non-attenders using the dry Evalyn self-sampling brush is offered in the Capital Region of Denmark. The use of a dry brush is largely uncharted territory in terms of analytical stability. In this study we aim to provide evidence on the analytical quality...

  17. Hanford analytical services quality assurance plan. Revision 1

    International Nuclear Information System (INIS)

    1995-02-01

    This document, the Hanford Analytical Services Quality Assurance Plan (HASQAP), is issued by the U.S. Department of Energy, Richland Operations Office (RL). The HASQAP establishes quality requirements in response to U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance (10 CFR 830.120, "Quality Assurance Requirements"). The HASQAP is designed to meet the needs of the RL for controlling the quality of analytical chemistry services provided by laboratory operations. The HASQAP is issued through the Analytical Services Branch of the Waste Management Division. The Analytical Services Branch is designated by the RL as having responsibility for oversight management of laboratory operations under the Waste Management Division. The laboratories conduct sample analyses under several regulatory statutes, such as the Clean Air Act and the Clean Water Act. Sample analysis in support of the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) is a major role of the laboratory operations.

  18. Abstracts book of 4. Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements

    International Nuclear Information System (INIS)

    1995-01-01

    The 4th Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements was held in Poznan on 27-28 April 1995. New versions of analytical methods for the quantitative determination of trace elements in biological, environmental and geological materials were presented. A number of special sample-preparation techniques that enable the best precision of analytical results were also shown and discussed.

  19. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    Science.gov (United States)

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs), which can affect the hormonal systems of humans and wildlife even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods that allow their determination in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. SOLUTION OF SIGNAL UNCERTAINTY PROBLEM AT ANALYTICAL DESIGN OF CONSECUTIVE COMPENSATOR IN PIEZO ACTUATOR CONTROL

    Directory of Open Access Journals (Sweden)

    S.V. Bystrov

    2016-05-01

    Subject of Research. We present research results for the signal uncertainty problem that naturally arises for developers of servomechanisms, including the analytical design of serial compensators delivering the required quality indexes for servomechanisms. Method. The problem was solved with the use of the Besekerskiy engineering approach, formulated in 1958. This makes it possible to reduce the requirements on the input signal composition of servomechanisms to only two quantitative characteristics: maximum speed and maximum acceleration. Information about the input signal's maximum speed and acceleration allows introducing an equivalent harmonic input signal with calculated amplitude and frequency. In combination with the requirement on maximum tracking error, the amplitude and frequency of the equivalent harmonic input make it possible to estimate analytically the amplitude characteristic of the system with respect to error, and then to convert it to the amplitude characteristic of the open-loop system transfer function. While the Besekerskiy approach was previously used mainly with the apparatus of logarithmic characteristics, we use it for the analytical synthesis of consecutive compensators. Main Results. The proposed technique is used to create analytical representations of the "input–output" and "error–output" polynomial dynamic models of the designed system. In turn, the desired model of the designed system in the "error–output" form of analytical representation of transfer functions is the basis for the design of a consecutive compensator that delivers the desired placement of state-matrix eigenvalues and, consequently, the necessary set of dynamic indexes for the designed system. The given procedure of consecutive compensator analytical design on the basis of the Besekerskiy engineering approach under conditions of signal uncertainty is illustrated by an example. Practical Relevance. The obtained theoretical results are
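
    The equivalent-harmonic construction can be made concrete: a sinusoid x(t) = A·sin(ωt) that reproduces both the maximum speed (Aω = v_max) and the maximum acceleration (Aω² = a_max) of the input has ω = a_max/v_max and A = v_max²/a_max. A hypothetical numerical sketch (all values illustrative, not from the paper):

```python
# Equivalent harmonic input from the two signal characteristics
v_max = 0.5     # maximum input speed (e.g. rad/s)
a_max = 2.0     # maximum input acceleration (e.g. rad/s^2)
e_max = 0.001   # allowed tracking-error amplitude (rad)

omega_eq = a_max / v_max       # from A*w = v_max and A*w**2 = a_max
amp_eq = v_max ** 2 / a_max

# For a unity-feedback loop with large open-loop gain, the error amplitude is
# approximately A / |W(j*omega)|, so the open-loop gain at omega_eq must satisfy:
gain_required = amp_eq / e_max
```

    The resulting gain bound at the equivalent frequency is the constraint that the consecutive compensator is then designed to satisfy.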

  1. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    Science.gov (United States)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which monitors the Rayleigh scattering from the beads through real-time image processing. A significant difference in the velocity of the beads was observed in the presence of as little as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, which can be useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, using polystyrene beads functionalized with antibodies against the target biomarkers.

  2. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  3. Review of Analytes of Concern and Sample Methods for Closure of DOE High Level Waste Storage Tanks

    International Nuclear Information System (INIS)

    Thomas, T.R.

    2002-01-01

    Sampling residual waste after tank cleaning and analysis for analytes of concern to support closure and cleaning targets of large underground tanks used for storage of legacy high level radioactive waste (HLW) at Department of Energy (DOE) sites has been underway since about 1995. The DOE Tanks Focus Area (TFA) has been working with DOE tank sites to develop new sampling plans, and sampling methods for assessment of residual waste inventories. This paper discusses regulatory analytes of concern, sampling plans, and sampling methods that support closure and cleaning target activities for large storage tanks at the Hanford Site, the Savannah River Site (SRS), the Idaho National Engineering and Environmental Laboratory (INEEL), and the West Valley Demonstration Project (WVDP)

  4. Analytical Design of Evolvable Software for High-Assurance Computing

    Science.gov (United States)

    2001-02-14

    [Only garbled extraction fragments of this record survive. The recoverable technical content is an external system-size metric of the form S_ext = Σ_{i=1}^{N} ( Σ_{j=1}^{A_i} w_ij + Σ_{k=1}^{M_i} w_ik ), a chapter on the analytical partition of components, and an appendix describing a benchmark design for microwave oven software.]

  5. Characterization of carbon nanotubes and analytical methods for their determination in environmental and biological samples: A review

    Energy Technology Data Exchange (ETDEWEB)

    Herrero-Latorre, C., E-mail: carlos.herrero@usc.es; Álvarez-Méndez, J.; Barciela-García, J.; García-Martín, S.; Peña-Crecente, R.M.

    2015-01-01

    Highlights: • Analytical techniques for characterization of CNTs: classification, description and examples. • Determination methods for CNTs in biological and environmental samples. • Future trends and perspectives for characterization and determination of CNTs. - Abstract: In the present paper, a critical overview of the most commonly used techniques for the characterization and the determination of carbon nanotubes (CNTs) is given on the basis of 170 references (2000–2014). The analytical techniques used for CNT characterization (including microscopic and diffraction, spectroscopic, thermal and separation techniques) are classified, described, and illustrated with applied examples. Furthermore, the performance of sampling procedures as well as the available methods for the determination of CNTs in real biological and environmental samples are reviewed and discussed according to their analytical characteristics. In addition, future trends and perspectives in this field of work are critically presented.

  6. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996, serving to verify declared activities and to detect undeclared activities. The environmental sampling program has brought a new series of analytical challenges and driven a need for advances in verification technology. Environmental swipe samples often contain extremely low concentrations of analyte (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method for determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass separated and analyzed using magnetic sector instruments because of their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material actually detected to material present) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing the ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, thereby increasing the sensitivity of environmental sampling.
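
    The practical weight of the 0.2% ionization efficiency can be illustrated with simple Poisson counting statistics. This is a back-of-envelope sketch: only the 0.2% figure comes from the abstract, while the precision target and the choice of isotope mass are illustrative assumptions:

```python
AVOGADRO = 6.022e23
M_U238 = 238.0          # g/mol, taking 238U as representative

efficiency = 0.002      # ions detected per atom loaded (the 0.2% from the abstract)
target_rsd = 0.001      # desired counting precision: 0.1% relative standard deviation

# Poisson counting statistics: RSD = 1/sqrt(N_ions)
ions_needed = 1.0 / target_rsd ** 2           # ~1e6 detected ions
atoms_needed = ions_needed / efficiency       # ~5e8 atoms loaded on the filament
mass_needed_g = atoms_needed * M_U238 / AVOGADRO   # ~0.2 pg of uranium

# A tenfold gain in ionization efficiency cuts the required mass tenfold
mass_better_g = ions_needed / (10.0 * efficiency) * M_U238 / AVOGADRO
```

    Even sub-picogram loads suffice in principle, but any efficiency gain translates directly into either better counting precision or a smaller required sample, which is why cavity ion sources are attractive for safeguards work.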

  7. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples.

    Science.gov (United States)

    Artigues, Margalida; Abellà, Jordi; Colominas, Sergi

    2017-11-14

    Amperometric biosensors based on glucose oxidase (GOx) combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx in a chitosan hydrogel onto highly ordered titanium dioxide nanotube arrays (TiO₂NTAs) was evaluated. The GOx-Chitosan/TiO₂NTAs biosensor showed a sensitivity of 5.46 μA·mM⁻¹ with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The results demonstrated sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95-105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid or citric acid were observed. The storage stability was also examined: after 30 days, the GOx-Chitosan/TiO₂NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC values; in the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated.
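
    The reported sensitivity and linear range define a simple linear calibration. A hypothetical sketch of converting a measured current into a glucose concentration (the zero intercept is an assumption made here for illustration, not a value from the paper):

```python
SENSITIVITY_UA_PER_MM = 5.46   # slope reported for the GOx-Chitosan/TiO2NTAs biosensor
LINEAR_RANGE_MM = (0.3, 1.5)   # validated linear working range in mM

def glucose_mm(current_ua: float) -> float:
    """Convert biosensor current (uA) to glucose (mM), assuming a zero intercept."""
    conc = current_ua / SENSITIVITY_UA_PER_MM
    lo, hi = LINEAR_RANGE_MM
    if not lo <= conc <= hi:
        raise ValueError(f"{conc:.2f} mM is outside the validated range; dilute the sample")
    return conc

c = glucose_mm(5.46)   # a 5.46 uA reading corresponds to 1.0 mM glucose
```

    Samples such as soft drinks, whose glucose content exceeds the upper limit of the linear range, would be diluted into the 0.3-1.5 mM window before measurement.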

  8. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    Science.gov (United States)

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are among the most important issues. Many articles are dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, this literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Application of analytical target cascading method in multidisciplinary design optimization of ship conceptual design

    Directory of Open Access Journals (Sweden)

    WANG Jian

    2017-10-01

    Full Text Available [Objectives] Ship conceptual design requires the coordination of many different disciplines for comprehensive optimization, which presents a complicated system design problem affecting several fields of technology. However, the development of overall ship design is relatively slow compared with other subjects. [Methods] The decomposition and coordination strategy of ship design is presented, and on this basis the analytical target cascading (ATC) method is applied to the multidisciplinary design optimization of the conceptual design phase of ships. A tank ship example covering the five disciplines of buoyancy and stability, rapidity, maneuverability, capacity and economy is established to illustrate the analysis process in the present study. [Results] The results demonstrate the stability, convergence and validity of the ATC method in dealing with the complex coupling effect occurring in ship conceptual design. [Conclusions] The proposed method provides an effective basis for optimization of ship conceptual design.

  10. Product design planning with the analytic hierarchy process in inter-organizational networks

    NARCIS (Netherlands)

    Hummel, J. Marjan; van Rossum, Wouter; Verkerke, Gijsbertus Jacob; Rakhorst, Gerhard

    2002-01-01

    In the second half of inter-organizational product development, the new product is likely to face significant design changes. Our study focused on the adequacy of the analytic hierarchy process (AHP) to support the collaborative partners to steer and align the accompanying design activities. It

  11. Product design planning with the analytic hierarchy process in inter-organizational networks

    NARCIS (Netherlands)

    Hummel, J.M.; van Rossum, Wouter; Verkerke, Bart; Rakhorst, G

    2002-01-01

    In the second half of inter-organizational product development, the new product is likely to face significant design changes. Our study focused on the adequacy of the analytic hierarchy process (AHP) to support the collaborative partners to steer and align the accompanying design activities. It

  12. Parameter sampling capabilities of sequential and simultaneous data assimilation: I. Analytical comparison

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess the parameter sampling capabilities of some Bayesian, ensemble-based, joint state-parameter (JS) estimation methods. The forward model is assumed to be non-chaotic and have nonlinear components, and the emphasis is on results obtained for the parameters in the state-parameter vector. A variety of approximate sampling methods exist, and a number of numerical comparisons between such methods have been performed. Often, more than one of the defining characteristics vary from one method to another, so it can be difficult to point out which characteristic of the more successful method in such a comparison was decisive. In this study, we single out one defining characteristic for comparison: whether data are assimilated sequentially or simultaneously. The current paper is concerned with analytical investigations into this issue. We carefully select one sequential and one simultaneous JS method for the comparison. We also design a corresponding pair of pure parameter estimation methods, and we show how the JS methods and the parameter estimation methods are pairwise related. It is shown that the sequential and the simultaneous parameter estimation methods are equivalent for one particular combination of observations with different degrees of nonlinearity. Strong indications are presented for why one may expect the sequential parameter estimation method to outperform the simultaneous parameter estimation method for all other combinations of observations. Finally, the conditions for when similar relations can be expected to hold between the corresponding JS methods are discussed. A companion paper, part II (Fossum and Mannseth 2014 Inverse Problems 30 114003), is concerned with statistical analysis of results from a range of numerical experiments involving sequential and simultaneous JS estimation, where the design of the numerical investigation is motivated by our findings in the current paper. (paper)
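
    A toy scalar illustration (not the paper's JS methods, and with made-up numbers): for a linear-Gaussian problem, assimilating two direct observations of a parameter sequentially via two Kalman updates gives exactly the same posterior as one simultaneous precision-weighted update. Nonlinearity is what breaks this equivalence, which is the regime the paper analyzes:

```python
# Toy linear-Gaussian example: sequential vs simultaneous assimilation of two
# scalar observations of the same parameter coincide exactly.
def kalman_update(m, P, y, R):
    """One scalar Kalman update of mean m and variance P with obs y (var R)."""
    K = P / (P + R)
    return m + K * (y - m), (1 - K) * P

m0, P0 = 0.0, 4.0                        # prior mean and variance (made up)
(y1, R1), (y2, R2) = (1.2, 1.0), (0.8, 2.0)

# Sequential: assimilate one observation at a time.
m, P = kalman_update(m0, P0, y1, R1)
m_seq, P_seq = kalman_update(m, P, y2, R2)

# Simultaneous: combine all precisions (inverse variances) at once.
P_sim = 1.0 / (1.0 / P0 + 1.0 / R1 + 1.0 / R2)
m_sim = P_sim * (m0 / P0 + y1 / R1 + y2 / R2)

print(abs(m_seq - m_sim) < 1e-12, abs(P_seq - P_sim) < 1e-12)
```

With a nonlinear observation operator the two routes generally diverge, motivating the paper's analytical comparison.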

  13. Analytical procedures for determining Pb and Sr isotopic compositions in water samples by ID-TIMS

    Directory of Open Access Journals (Sweden)

    Veridiana Martins

    2008-01-01

    Full Text Available Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.

  14. A review of electroanalytical determinations of some important elements (Zn, Se, As) in environmental samples

    International Nuclear Information System (INIS)

    Lichiang; James, B.D.; Magee, R.J.

    1991-01-01

    This review covers electroanalytical methods reported in the literature for the determination of zinc, cadmium, selenium and arsenic in environmental and biological samples. A comprehensive survey of the electroanalytical techniques used for the determination of these four important elements (zinc, cadmium, selenium and arsenic) is reported herein, with 322 references up to 1990. (Orig./A.B.)

  15. Optimal design of nuclear mechanical dampers with analytical hierarchy process

    International Nuclear Information System (INIS)

    Zou Yuehua; Wen Bo; Xu Hongxiang; Qin Yonglie

    2000-01-01

    An optimal design, using the analytical hierarchy process, of the nuclear mechanical dampers manufactured at the authors' university is described. By using a fuzzy judgement matrix, consistency was satisfied automatically, without the need for a consistency test. The results obtained by this method have been put into production practice.

  16. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples does not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.
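
    The decontamination factor quoted above is the standard ratio of feed activity to decontaminated-product activity. The sketch below computes per-batch DFs and their mean; the activity values are hypothetical placeholders, not measurements from the report:

```python
# Sketch (hypothetical activities): the Cs-137 decontamination factor is the
# ratio of feed activity to activity remaining in the decontaminated salt
# solution; the report quotes an average DF of 25700 across microbatches.
def decon_factor(feed_cs137: float, dss_cs137: float) -> float:
    """DF = Cs-137 activity in feed / Cs-137 activity in decontaminated product."""
    return feed_cs137 / dss_cs137

# Hypothetical microbatch measurements (same arbitrary activity units):
feeds = [1.0e5, 9.8e4, 1.1e5]
products = [4.0, 3.5, 4.5]
dfs = [decon_factor(f, p) for f, p in zip(feeds, products)]
mean_df = sum(dfs) / len(dfs)
print(f"per-batch DFs: {[round(d) for d in dfs]}, mean DF = {mean_df:.0f}")
```

The large RSD quoted (107%) reflects how strongly DF varies batch to batch even when all batches meet the removal target.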

  17. Recent bibliography on analytical and sampling problems of a PWR primary coolant Suppl. 4

    International Nuclear Information System (INIS)

    Illy, H.

    1986-09-01

    The 4th supplement of a bibliographical series comprising the analytical and sampling problems of the primary coolant of PWR type reactors covers the literature from 1985 up to July 1986 (220 items). References are listed according to the following topics: boric acid; chloride, chlorine; general; hydrogen isotopes; iodine; iodide; noble gases; oxygen; other elements; radiation monitoring; reactor safety; sampling; water chemistry. (V.N.)

  18. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples

    Directory of Open Access Journals (Sweden)

    Margalida Artigues

    2017-11-01

    Full Text Available Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (Chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO2NTAs) has been evaluated. The GOx–Chitosan/TiO2NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results demonstrated sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95–105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were observed. In addition, the storage stability was examined: after 30 days, the GOx–Chitosan/TiO2NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst scenario, a deviation smaller than 10% was obtained among the 20 samples evaluated.

  19. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples

    Science.gov (United States)

    2017-01-01

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (Chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO2NTAs) has been evaluated. The GOx–Chitosan/TiO2NTAs biosensor showed a sensitivity of 5.46 μA·mM−1 with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results demonstrated sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95–105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid and citric acid were observed. In addition, the storage stability was examined: after 30 days, the GOx–Chitosan/TiO2NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC value. In the worst scenario, a deviation smaller than 10% was obtained among the 20 samples evaluated. PMID:29135931

  20. Intercalibration of analytical methods on marine environmental samples

    International Nuclear Information System (INIS)

    1988-06-01

    The pollution of the seas by various chemical substances is nowadays one of the principal concerns of mankind. The International Atomic Energy Agency has organized in past years several intercomparison exercises in the framework of its Analytical Quality Control Service. The present intercomparison had a double aim: first, to give laboratories participating in this intercomparison an opportunity to check their analytical performance; second, to produce, on the basis of the results of this intercomparison, a reference material made of fish tissue which would be accurately certified with respect to many trace elements. Such a material could be used by analytical chemists to check the validity of new analytical procedures. In total, 53 laboratories from 29 countries reported results (585 laboratory means for 48 elements). 5 refs, 52 tabs

  1. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    International Nuclear Information System (INIS)

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs

  2. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs.

  3. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
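
    A minimal sketch of the 2^k design for the three varied factors named in the abstract (decay time, counting time, sample-detector distance), with main effects computed from hypothetical responses; the response values are illustrative, not the study's data:

```python
# Sketch: 2^3 full factorial design at coded levels -1/+1, plus main-effect
# estimation (mean response at +1 minus mean response at -1 per factor).
from itertools import product

factors = ["decay_time", "counting_time", "distance"]
runs = list(product((-1, +1), repeat=len(factors)))  # 2^3 = 8 coded runs

# Hypothetical mass-fraction responses for the 8 runs (illustration only):
y = [10.2, 10.8, 9.9, 10.5, 10.1, 10.9, 9.8, 10.6]

def main_effect(col: int) -> float:
    """Mean response at the factor's +1 level minus mean at its -1 level."""
    hi = [yi for run, yi in zip(runs, y) if run[col] == +1]
    lo = [yi for run, yi in zip(runs, y) if run[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for i, name in enumerate(factors):
    print(f"{name}: main effect = {main_effect(i):+.3f}")
```

Interaction effects are computed the same way, using the product of two factors' coded levels as the contrast column.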

  4. Voltammetric technique, a panacea for analytical examination of environmental samples

    International Nuclear Information System (INIS)

    Zahir, E.; Mohiuddin, S.; Naqvi, I.I.

    2012-01-01

    Voltammetric methods for trace metal analysis in environmental samples of marine origin, such as mangrove, sediments and shrimps, are generally recommended. Three different electroanalytical techniques, i.e. polarography, anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (ADSV), have been used. Cd²⁺, Pb²⁺, Cu²⁺ and Mn²⁺ were determined through ASV, Cr⁶⁺ was analyzed by ADSV, and Fe²⁺, Zn²⁺, Ni²⁺ and Co²⁺ were determined through polarography. Of these, the pairs Fe²⁺/Zn²⁺ and Ni²⁺/Co²⁺ were determined in two separate runs, while Cd²⁺, Pb²⁺ and Cu²⁺ were analyzed in a single run of ASV. The sensitivity and speciation capabilities of voltammetric methods have been exploited. Analysis conditions were optimized, including selection of supporting electrolyte, pH, working electrodes, sweep rate etc. Stripping voltammetry was adopted for analysis at ultra trace levels. Statistical parameters for analytical method development, such as selectivity factor, interference, repeatability (0.0065-0.130 µg/g), reproducibility (0.08125-1.625 µg/g), detection limits (0.032-5.06 µg/g), limits of quantification (0.081-12.652 µg/g) and sensitivities (5.636-2.15 nA·mL·µg⁻¹), were also determined. The percentage recoveries were found to be between 95-105% using certified reference materials. Real samples from the complex marine environment of the Karachi coastline were also analyzed. The standard addition method was employed wherever a matrix effect was evidenced. (author)

  5. A review of analytical techniques for the determination of carbon-14 in environmental samples

    International Nuclear Information System (INIS)

    Milton, G.M.; Brown, R.M.

    1993-11-01

    This report contains a brief summary of analytical techniques commonly used for the determination of radiocarbon in a variety of environmental samples. Details of the applicable procedures developed and tested in the Environmental Research Branch at Chalk River Laboratories are appended

  6. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  7. Analytic transfer maps for Lie algebraic design codes

    International Nuclear Information System (INIS)

    van Zeijts, J.; Neri, F.; Dragt, A.J.

    1990-01-01

    Lie algebraic methods provide a powerful tool for modeling particle transport through Hamiltonian systems. Briefly summarized, Lie algebraic design codes work as follows: first, the time-t flow generated by a Hamiltonian system is represented by a Lie algebraic map acting on the initial conditions. Maps are generated for each element in the lattice or beamline under study. Next, all these maps are concatenated into a one-turn or one-pass map that represents the complete dynamics of the system. Finally, the resulting map is analyzed and design decisions are made based on the linear and nonlinear entries in the map. The authors give a short description of how to find Lie algebraic transfer maps in analytic form, for inclusion in accelerator design codes. As an example they find the transfer map, through third order, for the combined-function quadrupole magnet, and use such magnets to correct detrimental third-order aberrations in a spot-forming system.
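
    The generate-then-concatenate workflow can be illustrated at first order only, where the Lie map of each element reduces to a transfer matrix in (x, x') and concatenation is matrix multiplication in beamline order. This toy sketch uses the standard thin-lens quadrupole and drift matrices; the third-order terms that the paper actually computes are omitted:

```python
# Toy first-order illustration of map concatenation: per-element transfer
# matrices in (x, x'), multiplied together into a one-pass map.
def matmul(a, b):
    """2x2 matrix product, plain nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def drift(L):      # field-free drift of length L (meters)
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):  # thin-lens quadrupole, focal length f (focusing for f > 0)
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# One-pass map of drift -> quad -> drift; the rightmost matrix acts first:
m = matmul(drift(1.0), matmul(thin_quad(2.0), drift(1.0)))
print(m)
```

The determinant of the concatenated matrix stays 1, the first-order trace of the symplecticity that the full Lie algebraic treatment preserves at all orders.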

  8. Fabrication of paper-based analytical devices optimized by central composite design.

    Science.gov (United States)

    Hamedpour, Vahid; Leardi, Riccardo; Suzuki, Koji; Citterio, Daniel

    2018-04-30

    In this work, an application of a design of experiments approach for the optimization of an isoniazid assay on a single-area inkjet-printed paper-based analytical device (PAD) is described. For this purpose, a central composite design was used for evaluation of the effect of device geometry and amount of assay reagents on the efficiency of the proposed device. The factors of interest were printed length, width, and sampling volume as factors related to device geometry, and amounts of the assay reagents polyvinyl alcohol (PVA), NH4OH, and AgNO3. Deposition of the assay reagents was performed by a thermal inkjet printer. The colorimetric assay mechanism of this device is based on the chemical interaction of isoniazid, ammonium hydroxide, and PVA with silver ions to induce the formation of yellow silver nanoparticles (AgNPs). The in situ-formed AgNPs can be easily detected by the naked eye or with a simple flat-bed scanner. Under optimal conditions, the calibration curve was linear in the isoniazid concentration range 0.03-10 mmol L-1 with a relative standard deviation of 3.4% (n = 5 for determination of 1.0 mmol L-1). Finally, the application of the proposed device for isoniazid determination in pharmaceutical preparations produced satisfactory results.
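
    As an illustration of the design layout behind such an optimization, the sketch below generates the coded points of a face-centered central composite design. The point generation is generic textbook CCD; the choice of three factors and three center replicates here is an assumption for brevity (the study varied six factors):

```python
# Sketch: face-centered central composite design (CCD) points in coded units:
# 2^k factorial corners, 2k axial (star) points at alpha = 1, plus replicated
# center points.
from itertools import product

def ccd_points(k: int, n_center: int = 3):
    cube = list(product((-1.0, 1.0), repeat=k))   # 2^k factorial corners
    axial = []
    for i in range(k):                            # 2k axial points
        for a in (-1.0, 1.0):
            p = [0.0] * k
            p[i] = a
            axial.append(tuple(p))
    center = [(0.0,) * k] * n_center              # replicated center points
    return cube + axial + center

pts = ccd_points(3)
print(len(pts))  # 8 corners + 6 axial + 3 center = 17 runs
```

Each coded point is then mapped to physical factor settings (e.g. printed length or AgNO3 amount) by scaling around the chosen center values, and a quadratic response surface is fitted to the results.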

  9. Road Transportable Analytical Laboratory system

    International Nuclear Information System (INIS)

    Finger, S.M.; Keith, V.F.; Spertzel, R.O.; De Avila, J.C.; O'Donnell, M.; Vann, R.L.

    1993-09-01

    This developmental effort clearly shows that a Road Transportable Analytical Laboratory System is a worthwhile and achievable goal. The RTAL is designed to fully analyze (radioanalytes, and organic and inorganic chemical analytes) 20 samples per day at the highest levels of quality assurance and quality control. It dramatically reduces the turnaround time for environmental sample analysis from 45 days (at a central commercial laboratory) to 1 day. At the same time each RTAL system will save the DOE over $12 million per year in sample analysis costs compared to the costs at a central commercial laboratory. If RTAL systems were used at the eight largest DOE facilities (at Hanford, Savannah River, Fernald, Oak Ridge, Idaho, Rocky Flats, Los Alamos, and the Nevada Test Site), the annual savings would be $96,589,000. The DOE's internal study of sample analysis needs projects 130,000 environmental samples requiring analysis in FY 1994, clearly supporting the need for the RTAL system. The cost and time savings achievable with the RTAL system will accelerate and improve the efficiency of cleanup and remediation operations throughout the DOE complex

  10. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming the validation of analytical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: rpetroni@ipen.br, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation at the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)

  11. New high temperature plasmas and sample introduction systems for analytical atomic emission and mass spectrometry

    International Nuclear Information System (INIS)

    Montaser, A.

    1993-01-01

    In this research, new high-temperature plasmas and new sample introduction systems are explored for rapid elemental and isotopic analysis of gases, solutions, and solids using mass spectrometry and atomic emission spectrometry. During the period January 1993--December 1993, emphasis was placed on (a) analytical investigations of atmospheric-pressure helium inductively coupled plasma (He ICP) that are suitable for atomization, excitation, and ionization of elements possessing high excitation and ionization energies; (b) simulation and computer modeling of plasma sources to predict their structure and fundamental and analytical properties without incurring the enormous cost of experimental studies; (c) spectroscopic imaging and diagnostic studies of high-temperature plasmas; (d) fundamental studies of He ICP discharges and argon-nitrogen plasma by high-resolution Fourier transform spectrometry; and (e) fundamental and analytical investigation of new, low-cost devices as sample introduction systems for atomic spectrometry and examination of new diagnostic techniques for probing aerosols. Only the most important achievements are included in this report to illustrate progress and obstacles. Detailed descriptions of the authors' investigations are outlined in the reprints and preprints that accompany this report. The technical progress expected next year is briefly described at the end of this report

  12. Analytical methodologies for the determination of benzodiazepines in biological samples.

    Science.gov (United States)

    Persona, Karolina; Madej, Katarzyna; Knihnicki, Paweł; Piekoszewski, Wojciech

    2015-09-10

    Benzodiazepine drugs are among the most important and most widely used medicaments. They demonstrate such therapeutic properties as anxiolytic, sedative, somnifacient, anticonvulsant, diastolic and muscle relaxant effects. However, despite the fact that benzodiazepines possess a high therapeutic index and are considered to be relatively safe, their use can be dangerous when: (1) co-administered with alcohol; (2) co-administered with other medicaments like sedatives, antidepressants, neuroleptics or morphine-like substances; (3) driving under their influence; (4) used non-therapeutically as drugs of abuse or in drug-facilitated crimes. For these reasons benzodiazepines are still studied and determined in a variety of biological materials. In this article, sample preparation techniques which have been applied in the analysis of benzodiazepine drugs in biological samples are reviewed and presented. The next part of the article is focused on a review of the analytical methods which have been employed for pharmacological, toxicological or forensic study of this group of drugs in biological matrices. The review is preceded by a description of the physicochemical properties of the selected benzodiazepines and of two sedative-hypnotic drugs that very often coexist in the same analyzed samples. Copyright © 2015. Published by Elsevier B.V.

  13. Direct drive TFPM wind generator analytical design optimised for minimum active mass usage

    DEFF Research Database (Denmark)

    Nica, Florin Valentin Traian; Leban, Krisztina Monika; Ritchie, Ewen

    2013-01-01

    The paper focuses on the transverse flux permanent magnet (TFPM) generator as a solution for offshore direct drive wind turbines. A complex design algorithm is presented. Two topologies (U core and C core) of the TFPM were considered. The analytical design is optimised using a combination of genetic

  14. A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.

    Science.gov (United States)

    Feo, M L; Eljarrat, E; Barceló, D

    2010-04-09

    A simple, efficient and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L(-1), with satisfactory RSD values and correlation coefficients ≥0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied for the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L(-1). Copyright 2010 Elsevier B.V. All rights reserved.
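
    At full recovery, the enrichment (preconcentration) factor in a liquid-phase microextraction like UAEE is simply the ratio of aqueous sample volume to extractant volume. The 100 µL extractant volume below is an assumption chosen to reproduce the factor of 200 quoted for a 20 mL aliquot; the abstract does not state the chloroform volume:

```python
# Sketch: enrichment factor for solvent microextraction, in microliters to
# keep the arithmetic exact. EF = recovery * V_sample / V_extract.
def enrichment_factor(v_sample_uL: float, v_extract_uL: float,
                      recovery: float = 1.0) -> float:
    return recovery * v_sample_uL / v_extract_uL

print(enrichment_factor(20_000, 100))        # 200.0 at full recovery
print(enrichment_factor(20_000, 100, 0.45))  # at the study's worst-case 45% recovery
```

Dividing an instrument detection limit by the enrichment factor is what pushes method detection limits down to the sub-ng L(-1) levels reported.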

  15. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  16. Marine anthropogenic radiotracers in the Southern Hemisphere: New sampling and analytical strategies

    Science.gov (United States)

    Levy, I.; Povinec, P. P.; Aoyama, M.; Hirose, K.; Sanchez-Cabeza, J. A.; Comanducci, J.-F.; Gastaud, J.; Eriksson, M.; Hamajima, Y.; Kim, C. S.; Komura, K.; Osvath, I.; Roos, P.; Yim, S. A.

    2011-04-01

    In 2003-2004 the Japan Agency for Marine-Earth Science and Technology conducted the Blue Earth Global Expedition (BEAGLE2003) around the Southern Hemisphere oceans, a rare opportunity to collect many seawater samples for anthropogenic radionuclide studies. We describe here sampling and analytical methodologies based on radiochemical separations of Cs and Pu from seawater, as well as radiometric and mass spectrometry measurements. Several laboratories took part in the radionuclide analyses using different techniques. Intercomparison exercises and analyses of certified reference materials showed reasonable agreement between the participating laboratories. The data obtained on the distribution of 137Cs and plutonium isotopes in seawater represent the most comprehensive results available for the Southern Hemisphere oceans.

  17. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection techniques. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with that of other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods served to cross-check the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca content found in food using EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
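The method agreement reported above rests on the Pearson correlation between paired measurements from two techniques. A minimal stdlib-only sketch (the data values are illustrative, not taken from the paper):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical, perfectly proportional EDXRF vs. AAS readings give r = 1:
print(pearson([10.0, 20.0, 30.0], [9.8, 19.6, 29.4]))
```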

  18. Sampling and Analytical Method for Alpha-Dicarbonyl Flavoring Compounds via Derivatization with o-Phenylenediamine and Analysis Using GC-NPD

    Directory of Open Access Journals (Sweden)

    Stephanie M. Pendergrass

    2016-01-01

    A novel methodology is described for the sampling and analysis of diacetyl, 2,3-pentanedione, 2,3-hexanedione, and 2,3-heptanedione. These analytes were collected on o-phenylenediamine-treated silica gel tubes and quantitatively recovered as the corresponding quinoxaline derivatives. After derivatization, the sorbent was desorbed in 3 mL of ethanol solvent and analyzed using gas chromatography with nitrogen-phosphorus detection (GC/NPD). The limits of detection (LODs) achieved for each analyte were determined to be in the range of 5–10 nanograms/sample. Evaluation of the on-tube derivatization procedure indicated that it is unaffected by humidities ranging from 20% to 80% and that the derivatization procedure was quantitative for analyte concentrations ranging from 0.1 μg to approximately 500 μg per sample. Storage stability studies indicated that the derivatives were stable for 30 days when stored at both ambient and refrigerated temperatures. Additional studies showed that the quinoxaline derivatives were quantitatively recovered when sampling up to a total volume of 72 L at a sampling rate of 50 cc/min. This method will be important for evaluating and monitoring worker exposures in the food and flavoring industry. Samples can be collected over an 8-hour shift with up to 288 L total volume collected regardless of time, sampling rate, and/or the effects of humidity.

  19. Analytics for Knowledge Creation: Towards Epistemic Agency and Design-Mode Thinking

    Science.gov (United States)

    Chen, Bodong; Zhang, Jianwei

    2016-01-01

    Innovation and knowledge creation call for high-level epistemic agency and design-mode thinking, two competencies beyond the traditional scopes of schooling. In this paper, we discuss the need for learning analytics to support these two competencies, and more broadly, the demand for education for innovation. We ground these arguments on a…

  20. Development of an analytical procedure for plutonium in the concentration range of femtogram/gram and its application to environmental samples

    International Nuclear Information System (INIS)

    Schuettelkopf, H.

    1981-09-01

    To study the behaviour of plutonium in the environment and to measure plutonium in the vicinity of nuclear facilities, a quick, sensitive analytical method is required which can be applied to all sample materials found in the environment. For a sediment contaminated with plutonium, a boiling-out method using first HNO3/HF and subsequently HNO3/Al(NO3)3 was found to be successful. The leaching solution was then extracted with TOPO and the plutonium back-extracted with ascorbic acid/HCl. Several purification steps and finally electroplating using ammonium oxalate led to an optimum sample for α-spectroscopic determination of plutonium. An analytical method was worked out for plutonium which can be applied to all materials found in the environment. The sample size is 100 g but may also be much greater. The average chemical yield is between 70 and 80%. The detection limit for soil samples is 0.1 fCi/g and for plant samples 0.5 fCi/g. One technician can perform eight analyses per working day. The analytical procedure was applied to a large number of environmental samples and the results of these analyses are indicated. (orig./RB)

  1. Analytical results and sample locality map for rock, stream-sediment, and soil samples, Northern and Eastern Coloado Desert BLM Resource Area, Imperial, Riverside, and San Bernardino Counties, California

    Science.gov (United States)

    King, Harley D.; Chaffee, Maurice A.

    2000-01-01

    INTRODUCTION In 1996-1998 the U.S. Geological Survey (USGS) conducted a geochemical study of the Bureau of Land Management's (BLM) 5.5 million-acre Northern and Eastern Colorado Desert Resource Area (usually referred to as the NECD in this report), Imperial, Riverside, and San Bernardino Counties, southeastern California (figure 1). This study was done in support of the BLM's Coordinated Management Plan for the area. This report presents analytical data from this study. To provide comprehensive coverage of the NECD, we compiled and examined all available geochemical data, in digital form, from previous studies in the area, and made sample-site plots to aid in determining where sample-site coverage and analyses were sufficient, which samples should be re-analyzed, and where additional sampling was needed. Previous investigations conducted in parts of the current study area included the National Uranium Resource Evaluation (NURE) program studies of the Needles and Salton Sea 1° x 2° quadrangles; USGS studies of 12 BLM Wilderness Study Areas (WSAs) (Big Maria Mountains, Chemehuevi Mountains, Chuckwalla Mountains, Coxcomb Mountains, Mecca Hills, Orocopia Mountains, Palen-McCoy, Picacho Peak, Riverside Mountains, Sheephole Valley (also known as Sheep Hole/Cadiz), Turtle Mountains, and Whipple Mountains); and USGS studies in the Needles and El Centro 1° x 2° quadrangles done during the early 1990s as part of a project to identify the regional geochemistry of southern California. Areas where we did new sampling of rocks and stream sediments are mainly in the Chocolate Mountain Aerial Gunnery Range and in Joshua Tree National Park, which extends into the west-central part of the NECD, as shown in figure 1 and figure 2. This report contains analytical data for 132 rock samples and 1,245 stream-sediment samples collected by the USGS, and 362 stream-sediment samples and 189 soil samples collected during the NURE program. All samples are from the Northern and Eastern Colorado Desert Resource Area.

  2. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  3. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
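For contrast with the cluster designs compared in the paper, the standard (clustering-ignored) binomial LQAS decision rule the abstract mentions can be sketched as follows. The thresholds and risk levels below are illustrative assumptions, not values from the paper:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_rule(n, p_hi, p_lo, alpha=0.10, beta=0.10):
    """Smallest acceptance threshold d ('accept the lot if >= d successes
    in n') meeting both risk constraints, or None if n is too small:
    accept a high-coverage lot (true coverage p_hi) with probability
    >= 1 - alpha, and a low-coverage lot (p_lo) with probability <= beta."""
    for d in range(n + 1):
        p_accept_hi = 1.0 - binom_cdf(d - 1, n, p_hi)
        p_accept_lo = 1.0 - binom_cdf(d - 1, n, p_lo)
        if p_accept_hi >= 1.0 - alpha and p_accept_lo <= beta:
            return d
    return None

# A classic design: sample 19 children, upper threshold 80%, lower 50%.
print(lqas_rule(19, 0.8, 0.5))
```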

  4. Final Report on the Analytical Results for Tank Farm Samples in Support of Salt Dissolution Evaluation

    International Nuclear Information System (INIS)

    Hobbs, D.T.

    1996-01-01

    Recent processing of dilute solutions through the 2H-Evaporator system caused dissolution of salt in Tank 38H, the concentrate receipt tank. This report documents analytical results for samples taken from this evaporator system.

  5. Recent bibliography on analytical and sampling problems of a PWR primary coolant Suppl. 3

    International Nuclear Information System (INIS)

    Illy, H.

    1985-03-01

    The present supplement to the bibliography on analytical and sampling problems of PWR primary coolant covers the literature published in 1984 and includes some references overlooked in the previous volumes dealing with the publications of the last 10 years. References are divided into topics characterized by the following headings: boric acid; chloride; chlorine; carbon dioxide; general; gas analysis; hydrogen isotopes; iodine; iodide; nitrogen; noble gases and radium; ammonia; ammonium; oxygen; other elements; radiation monitoring; reactor safety; sampling; water chemistry. Under a given subject, bibliographical information is listed in alphabetical order of the authors. (V.N.)

  6. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling; designated work positions... UNDERGROUND COAL MINES Sampling Procedures § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  7. An analysis of basic design students' intuitive and analytic attitudes in colour decisions

    OpenAIRE

    Akbay, Saadet

    2003-01-01

    Colour can be defined as a subjective preference, an experience and an intuitive sense, or as a theory and a science. Design education treats colour as a scientific theory by means of reasoning. Colour education in basic design aims to develop and cultivate design students' colour decisions, values, and intuitive attitudes, supported and equipped by knowledge oriented towards analytical attitudes. Thus, the major concern o...

  8. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not fully comparable or even discordant. This is due, at least in part, to the whole set of conditions related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often overlooked; in routine diagnostics, however, about 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  9. Recommendations for sampling for prevention of hazards in civil defense. On analytics of chemical, biological and radioactive contaminations. Brief instruction for the CBRN (chemical, biological, radioactive, nuclear) sampling

    International Nuclear Information System (INIS)

    Bachmann, Udo; Biederbick, Walter; Derakshani, Nahid

    2010-01-01

    The recommendation for sampling for prevention of hazards in civil defense describes the analytics of chemical, biological and radioactive contaminations and includes detailed information on sampling, protocol preparation and documentation procedures. The volume includes a separate brief instruction for CBRN (chemical, biological, radioactive, nuclear) sampling.

  10. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...

  11. Overview of analytical models for the design of linear and planar motors

    NARCIS (Netherlands)

    Jansen, J.W.; Smeets, J.P.C.; Overboom, T.T.; Rovers, J.M.M.; Lomonova, E.A.

    2014-01-01

    In this paper, an overview of analytical techniques for the modeling of linear and planar permanent-magnet motors is given. These models can be used complementary to finite element analyses for fast evaluations of topologies, but they are indispensable for the design of magnetically levitated planar motors.

  12. Box-Behnken design in modeling of solid-phase tea waste extraction for the removal of uranium from water samples

    Energy Technology Data Exchange (ETDEWEB)

    Khajeh, Mostafa; Jahanbin, Elham; Ghaffari-Moghaddam, Mansour; Moghaddam, Zahra Safaei [Zabol Univ. (Iran, Islamic Republic of). Dept. of Chemistry; Bohlooli, Mousa [Zabol Univ. (Iran, Islamic Republic of). Dept. of Biology

    2015-07-01

    In this study, the solid-phase tea waste procedure was used for separation, preconcentration and determination of uranium from water samples by UV-Vis spectrophotometer. In addition, Box-Behnken experimental design was employed to investigate the influence of six variables, including pH, mass of adsorbent, eluent volume, amount of 1-(2-pyridylazo)-2-naphthol (PAN), and sample and eluent flow rates, on the extraction of the analyte. A high coefficient of determination (R²) of 0.972 and adjusted R² of 0.943 indicated a satisfactory fit of the polynomial regression model. This method was used for the extraction of uranium from real water samples.
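The Box-Behnken construction itself is easy to sketch: each pair of factors is run at the four (±1, ±1) corners while every other factor sits at its centre level, plus replicated centre runs. This pairwise form is the textbook design for three to five factors; published designs for six or more factors (as in this study) use incomplete-block variants, so treat this as an illustration only:

```python
from itertools import combinations, product

def box_behnken(k, n_center=3):
    """Coded (-1, 0, +1) Box-Behnken runs for k factors: the four
    (+/-1, +/-1) settings for every factor pair with the remaining
    factors at 0, followed by n_center replicated centre points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k for _ in range(n_center)]
    return runs

# The classic 3-factor design: 12 edge runs + 3 centre runs = 15 runs.
print(len(box_behnken(3)))
```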

  13. Box-Behnken design in modeling of solid-phase tea waste extraction for the removal of uranium from water samples

    International Nuclear Information System (INIS)

    Khajeh, Mostafa; Jahanbin, Elham; Ghaffari-Moghaddam, Mansour; Moghaddam, Zahra Safaei; Bohlooli, Mousa

    2015-01-01

    In this study, the solid-phase tea waste procedure was used for separation, preconcentration and determination of uranium from water samples by UV-Vis spectrophotometer. In addition, Box-Behnken experimental design was employed to investigate the influence of six variables, including pH, mass of adsorbent, eluent volume, amount of 1-(2-pyridylazo)-2-naphthol (PAN), and sample and eluent flow rates, on the extraction of the analyte. A high coefficient of determination (R²) of 0.972 and adjusted R² of 0.943 indicated a satisfactory fit of the polynomial regression model. This method was used for the extraction of uranium from real water samples.

  14. Magnetically-driven medical robots: An analytical magnetic model for endoscopic capsules design

    Science.gov (United States)

    Li, Jing; Barjuei, Erfan Shojaei; Ciuti, Gastone; Hao, Yang; Zhang, Peisen; Menciassi, Arianna; Huang, Qiang; Dario, Paolo

    2018-04-01

    Magnetic-based approaches are highly promising for providing innovative solutions in the design of medical devices for diagnostic and therapeutic procedures, such as in the endoluminal districts. Due to their intrinsic magnetic properties (no current needed) and high strength-to-size ratio compared with electromagnetic solutions, permanent magnets are usually embedded in medical devices. In this paper, a set of analytical formulas has been derived to model the magnetic forces and torques exerted by an arbitrary external magnetic field on a permanent magnetic source embedded in a medical robot. In particular, the authors modelled cylindrical permanent magnets, a general solution often embedded in magnetically driven medical devices. The analytical model can be applied to axially and diametrically magnetized, solid and annular cylindrical permanent magnets without severe computational complexity. Using a cylindrical permanent magnet as the selected solution, the model was applied to a robotic endoscopic capsule as a pilot study in the design of magnetically driven robots.
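In the point-dipole limit, the torque on an embedded magnet reduces to the standard expression τ = m × B (the paper's analytical model goes much further, covering forces and finite cylindrical geometries). A minimal sketch of that limiting case, with illustrative values:

```python
def dipole_torque(m, B):
    """Torque tau = m x B (N*m) on a point magnetic dipole with moment
    m (A*m^2) in an external flux density B (T): a plain cross product."""
    return (
        m[1] * B[2] - m[2] * B[1],
        m[2] * B[0] - m[0] * B[2],
        m[0] * B[1] - m[1] * B[0],
    )

# A dipole along +x in a field along +y is twisted about the +z axis:
print(dipole_torque((1.0, 0.0, 0.0), (0.0, 0.1, 0.0)))
```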

  15. Optimized Analytical Method to Determine Gallic and Picric Acids in Pyrotechnic Samples by Using HPLC/UV (Reverse Phase)

    International Nuclear Information System (INIS)

    Garcia Alonso, S.; Perez Pastor, R. M.

    2013-01-01

    A study on the optimization and development of a chromatographic method for the determination of gallic and picric acids in pyrotechnic samples is presented. To this end, both the HPLC analytical conditions with diode array detection and the extraction step for a selected sample were studied. (Author)

  16. New vistas in refractive laser beam shaping with an analytic design approach

    Science.gov (United States)

    Duerr, Fabian; Thienpont, Hugo

    2014-05-01

    Many commercial, medical and scientific applications of the laser have been developed since its invention. Some of these applications require a specific beam irradiance distribution to ensure optimal performance. Often, it is possible to apply geometrical methods to design laser beam shapers. This common design approach is based on the ray mapping between the input plane and the output beam. Geometric ray mapping designs with two plano-aspheric lenses have been thoroughly studied in the past. Even though analytic expressions for various ray mapping functions do exist, the surface profiles of the lenses are still calculated numerically. In this work, we present an alternative novel design approach that allows direct calculation of the rotationally symmetric lens profiles described by analytic functions. Starting from the example of a basic beam expander, a set of functional differential equations is derived from Fermat's principle. This formalism allows calculating the exact lens profiles described by Taylor series coefficients up to very high orders. To demonstrate the versatility of this new approach, two further cases are solved: a Gaussian to flat-top irradiance beam shaping system, and a beam shaping system that generates a more complex dark-hollow Gaussian (donut-like) irradiance profile with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of all calculated solutions and indicate the potential of this design approach for refractive beam shaping applications.

  17. Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course

    Science.gov (United States)

    Asamoah, Daniel Adomako; Sharda, Ramesh; Hassan Zadeh, Amir; Kalgotra, Pankush

    2017-01-01

    In this article, we present an experiential perspective on how a big data analytics course was designed and delivered to students at a major Midwestern university. In reference to the "MSIS 2006 Model Curriculum," we designed this course as a level 2 course, with prerequisites in databases, computer programming, statistics, and data…

  18. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results, of which 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and it was suggested that laboratories check calculations and procedures for these results.

  19. Stability of purgeable VOCs in water samples during pre-analytical holding: Part 1, Analysis by a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.L.; Scarborough, S.S. [Oak Ridge National Lab., TN (United States); Bottrell, D.W. [USDOE, Washington, DC (United States)

    1996-10-01

    This study was undertaken to examine the hypothesis that prevalent and priority purgeable VOCs in properly preserved water samples are stable for at least 28 days. (VOCs are considered stable if concentrations do not change by more than 10%.) Surface water was spiked with 44 purgeable VOCs. Results showed that the measurement of 35 out of 44 purgeable VOCs in properly preserved water samples (4°C, 250 mg NaHSO4, no headspace in 40 mL VOC vials with 0.010-in. Teflon-lined silicone septum caps) will not be affected by sample storage for 28 days. Larger changes (>10%) and low practical reporting times were observed for a few analytes, e.g., acrolein, CS2, vinyl acetate, etc.; these also involve other analytical problems. Advantages of a 28-day (compared to 14-day) holding time are pointed out.
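The 10% stability criterion used above is a straightforward relative-change check against the day-0 concentration. A minimal sketch (the concentrations are illustrative):

```python
def is_stable(c_day0: float, c_later: float, tol: float = 0.10) -> bool:
    """An analyte is considered 'stable' over the holding time if its
    measured concentration changes by no more than tol (10%) from day 0."""
    return abs(c_later - c_day0) / c_day0 <= tol

# 100 ug/L falling to 92 ug/L is within 10%; falling to 85 ug/L is not.
print(is_stable(100.0, 92.0), is_stable(100.0, 85.0))
```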

  20. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    Science.gov (United States)

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  1. Recent bibliography on analytical and sampling problems of a PWR primary coolant Pt. 1

    International Nuclear Information System (INIS)

    Illy, H.

    1981-12-01

    The first bibliography on analytical and sampling problems of a PWR primary coolant (KFKI Report-1980-48) was published in 1980 and covered the literature published in the previous 8-10 years. The present supplement reviews the subsequent literature up to December 1981. It also includes some references overlooked in the first volume. The serial numbers are continued from the first bibliography. (author)

  2. Comparison of two microextraction methods based on solidification of floating organic droplet for the determination of multiclass analytes in river water samples by liquid chromatography tandem mass spectrometry using Central Composite Design.

    Science.gov (United States)

    Asati, Ankita; Satyanarayana, G N V; Patel, Devendra K

    2017-09-01

    Two low-density organic solvent based liquid-liquid microextraction methods, namely vortex-assisted liquid-liquid microextraction based on solidification of a floating organic droplet (VALLME-SFO) and dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), have been compared for the determination of multiclass analytes (pesticides, plasticizers, pharmaceuticals and personal care products) in river water samples using liquid chromatography tandem mass spectrometry (LC-MS/MS). The effects of various experimental parameters on the efficiency of the two methods and their optimum values were studied with the aid of Central Composite Design (CCD) and Response Surface Methodology (RSM). Under optimal conditions, VALLME-SFO was validated in terms of limit of detection (0.011-0.219 ng mL(-1)), limit of quantification (0.035-0.723 ng mL(-1)), dynamic linearity range (0.050-0.500 ng mL(-1)), coefficient of determination (R² = 0.992-0.999), enrichment factor (40-56) and extraction recovery (80-106%). For DLLME-SFO validated under optimal conditions, the corresponding values were 0.025-0.377 ng mL(-1), 0.083-1.256 ng mL(-1), 0.100-1.000 ng mL(-1), R² = 0.990-0.999, 35-49 and 69-98%, respectively. Interday and intraday precisions, calculated as percent relative standard deviation (%RSD), were ≤15% for both the VALLME-SFO and DLLME-SFO methods. Both methods were successfully applied to the determination of multiclass analytes in river water samples.

  3. Water sample-collection and distribution system

    Science.gov (United States)

    Brooks, R. R.

    1978-01-01

    Collection and distribution system samples water from six designated stations, filtered if desired, and delivers it to various analytical sensors. System may be controlled by Water Monitoring Data Acquisition System or operated manually.

  4. Analytical Method to Estimate the Complex Permittivity of Oil Samples

    Directory of Open Access Journals (Sweden)

    Lijuan Su

    2018-03-01

    In this paper, an analytical method to estimate the complex dielectric constant of liquids is presented. The method is based on the measurement of the transmission coefficient of an embedded microstrip line loaded with a complementary split ring resonator (CSRR), which is etched in the ground plane. From this response, the dielectric constant and loss tangent of the liquid under test (LUT) can be extracted, provided that the CSRR is surrounded by the LUT and the liquid level extends beyond the region where the electromagnetic fields generated by the CSRR are present. For that purpose, a liquid container acting as a pool is added to the structure. The main advantage of this method, which is validated through measurement of the complex dielectric constant of olive and castor oil, is that reference samples for calibration are not required.
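
    A minimal sketch of how a CSRR notch response could map to liquid properties. The model here is our simplifying assumption, not the paper's stated extraction procedure: the resonator capacitance is taken to split evenly between the substrate and the liquid half-space, so f ∝ 1/√(εsub + εLUT), and the extra loss with the liquid present is attributed entirely to the LUT.

```python
def lut_permittivity(f_air, f_lut, eps_sub):
    """Estimate the LUT dielectric constant from the CSRR notch frequency,
    assuming the resonator capacitance splits evenly between substrate and
    liquid half-spaces: f ~ 1/sqrt(eps_sub + eps_lut).
    f_air: notch frequency with air above the CSRR; f_lut: with the liquid."""
    return (f_air / f_lut) ** 2 * (eps_sub + 1.0) - eps_sub

def loss_tangent(q_air, q_lut):
    """Rough loss tangent from air-loaded vs. liquid-loaded quality factors,
    attributing the additional loss entirely to the liquid."""
    return 1.0 / q_lut - 1.0 / q_air
```

    With air as the "liquid" (f_lut = f_air) the formula returns 1, as a sanity check on the assumed model.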

  5. An analytical study of photoacoustic and thermoacoustic generation efficiency towards contrast agent and film design optimization

    Directory of Open Access Journals (Sweden)

    Fei Gao

    2017-09-01

    Photoacoustic (PA) and thermoacoustic (TA) effects have been explored in many applications, such as bio-imaging, laser-induced ultrasound generation, and sensitive electromagnetic (EM) wave film sensors. In this paper, we propose a compact analytical PA/TA generation model that incorporates EM, thermal and mechanical parameters. From the derived analytical model, both intuitive predictions and quantitative simulations are performed. These show that, beyond improving EM absorption, many other physical parameters deserve careful consideration when designing contrast agents or film composites; this is followed by a simulation study. Lastly, several sets of experimental results are presented to demonstrate the feasibility of the proposed analytical model. Overall, the proposed compact model can serve as clear guidance and prediction for improved PA/TA contrast agents and film generator/sensor designs in this domain.
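
    The paper's full model is not reproduced in the abstract. As a reference point only, the textbook photoacoustic source term under stress and thermal confinement is p0 = Γ·μa·F, with Grüneisen parameter Γ = βc²/Cp; the authors' model adds EM and film-specific terms beyond this. A sketch with illustrative water-like values:

```python
def grueneisen(beta, c, cp):
    """Grueneisen parameter Gamma = beta * c^2 / cp, from thermal expansion
    coefficient beta [1/K], sound speed c [m/s], specific heat cp [J/(kg*K)]."""
    return beta * c ** 2 / cp

def initial_pressure(beta, c, cp, mu_a, fluence):
    """Textbook PA source term p0 = Gamma * mu_a * F [Pa], with optical
    absorption mu_a [1/m] and laser fluence F [J/m^2]."""
    return grueneisen(beta, c, cp) * mu_a * fluence

# Water-like medium, 10 mJ/cm^2 = 100 J/m^2 fluence (illustrative numbers):
p0 = initial_pressure(beta=4e-4, c=1500.0, cp=4000.0, mu_a=100.0, fluence=100.0)
```

    This makes concrete why EM absorption (μa) is only one of several levers: the thermo-mechanical parameters enter through Γ with equal weight.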

  6. Determination of melamine in soil samples using surfactant-enhanced hollow fiber liquid phase microextraction followed by HPLC–UV using experimental design

    Directory of Open Access Journals (Sweden)

    Ali Sarafraz Yazdi

    2015-11-01

    Surfactant-enhanced hollow fiber liquid-phase microextraction (SE-HF-LPME) was applied for the extraction of melamine in conjunction with high performance liquid chromatography with UV detection (HPLC–UV). Sodium dodecyl sulfate (SDS) was first added to the sample solution at pH 1.9 to form a hydrophobic ion pair with protonated melamine. The protonated melamine–dodecyl sulfate ion pair (Mel–DS) was then extracted from the aqueous phase into the organic phase immobilized in the pores and lumen of the hollow fiber. After extraction, the analyte-enriched 1-octanol was withdrawn into the syringe and injected into the HPLC. First, a one-variable-at-a-time method was applied to select the type of extraction solvent. Then, in a screening step, the other variables that may affect the extraction efficiency of the analyte were studied using a fractional factorial design. In the next step, a central composite design was applied to optimize the significant factors having positive effects on extraction efficiency. The optimum operational conditions were: sample volume, 5 mL; surfactant concentration, 1.5 mM; pH 1.9; stirring rate, 1500 rpm; and extraction time, 60 min. Under these conditions, the method was analytically evaluated. The detection limit, relative standard deviation and linear range were 0.005 μg mL−1, 4.0% (3 μg mL−1, n = 5) and 0.01–8 μg mL−1, respectively. The performance of the procedure in the extraction of melamine from soil samples was good, with relative recoveries of 95–109% at different spiking levels.
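
    The figures of merit quoted above follow standard definitions. A small sketch of how relative recovery and enrichment factor are conventionally computed (illustrative helper names, not from the paper):

```python
def enrichment_factor(c_organic, c_initial):
    """EF: analyte concentration in the acceptor (organic) phase divided by
    the initial concentration in the aqueous sample."""
    return c_organic / c_initial

def relative_recovery(found_spiked, found_unspiked, spike_added):
    """RR%: fraction of the added spike recovered from a matrix sample,
    as used for the soil-sample validation."""
    return 100.0 * (found_spiked - found_unspiked) / spike_added

# A spiked soil extract (hypothetical numbers): 6 ug/mL added, 7 found vs 1 native
rr = relative_recovery(7.0, 1.0, 6.0)  # 100% recovery
```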

  7. Collection of analytes from microneedle patches.

    Science.gov (United States)

    Romanyuk, Andrey V; Zvezdin, Vasiliy N; Samant, Pradnya; Grenader, Mark I; Zemlyanova, Marina; Prausnitz, Mark R

    2014-11-04

    Clinical medicine and public health would benefit from simplified acquisition of biological samples from patients that can be easily obtained at point of care, in the field, and by patients themselves. Microneedle patches are designed to serve this need by collecting dermal interstitial fluid containing biomarkers without the dangers, pain, or expertise needed to collect blood. This study presents novel methods to collect biomarker analytes from microneedle patches for analysis by integration into conventional analytical laboratory microtubes and microplates. Microneedle patches were made out of cross-linked hydrogel composed of poly(methyl vinyl ether-alt-maleic acid) and poly(ethylene glycol) prepared by micromolding. Microneedle patches were shown to swell with water up to 50-fold in volume, depending on degree of polymer cross-linking, and to collect interstitial fluid from the skin of rats. To collect analytes from microneedle patches, the patches were mounted within the cap of microcentrifuge tubes or formed the top of V-bottom multiwell microplates, and fluid was collected in the bottom of the tubes under gentle centrifugation. In another method, microneedle patches were attached to form the bottom of multiwell microplates, thereby enabling in situ analysis. The simplicity of biological sample acquisition using microneedle patches, coupled with the simplicity of analyte collection from microneedle patches integrated into conventional analytical equipment, could broaden the reach of future screening, diagnosis, and monitoring of biomarkers in healthcare and environmental/workplace settings.

  8. An analytical inductor design procedure for three-phase PWM converters in power factor correction applications

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Niroumand, Farideh Javidi; Haase, Frerk

    2015-01-01

    This paper presents an analytical method for designing the inductor of three-phase power factor correction converters (PFCs). The complex behavior of the inductor current complicates the inductor design procedure as well as the core loss and copper loss calculations. Therefore, this paper analyzes ... circuit is used to provide the inductor current harmonic spectrum. Using the harmonic spectrum, the low and high frequency copper losses are calculated. The high frequency minor B-H loops in one switching cycle are also analyzed. Then, the loss map provided by the measurement setup is used to calculate the core loss in the PFC application. To investigate the impact of the dc link voltage level, two inductors for different dc voltage levels are designed and the results are compared.
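
    The copper-loss step described above sums contributions across the inductor-current harmonic spectrum. A hedged sketch of that summation; the √f skin-effect scaling of Rac used here is a common textbook simplification and stands in for the winding model and measured loss maps the paper actually uses.

```python
import math

def copper_loss(harmonics, r_dc, skin_ref_hz):
    """Sum I_rms^2 * R_ac over the inductor-current harmonic spectrum.
    harmonics: iterable of (frequency_hz, i_rms) pairs.
    R_ac is approximated as r_dc * sqrt(f / skin_ref_hz) above the
    reference frequency where skin effect becomes significant (simplified)."""
    total = 0.0
    for f, i_rms in harmonics:
        r_ac = r_dc * max(1.0, math.sqrt(f / skin_ref_hz))
        total += i_rms ** 2 * r_ac
    return total

# Fundamental plus a switching-frequency ripple harmonic (illustrative values):
p_cu = copper_loss([(50.0, 10.0), (20000.0, 0.5)], r_dc=0.1, skin_ref_hz=1000.0)
```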

  9. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

    In this paper, three ratio estimators in double sampling design are proposed with the intention of finding an alternative to the conventional ratio estimator in double sampling design discussed by Cochran (1997), Okafor (2002), Raj (1972) and Raj and Chandhok (1999).
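
    For context, the conventional double-sampling (two-phase) ratio estimator that the proposed estimators are compared against is usually written (ȳ/x̄)·x̄′, where x̄′ comes from the large first-phase sample and ȳ, x̄ from the second-phase subsample. A minimal sketch:

```python
def double_sampling_ratio(y_sub, x_sub, x_first_phase):
    """Conventional double-sampling ratio estimate of the mean of y:
    (ybar / xbar) * xbar', with xbar' from the large first-phase sample
    (where only x is observed) and ybar, xbar from the subsample."""
    ybar = sum(y_sub) / len(y_sub)
    xbar = sum(x_sub) / len(x_sub)
    xbar_first = sum(x_first_phase) / len(x_first_phase)
    return ybar / xbar * xbar_first

# Subsample (y, x) pairs and a cheap auxiliary x measured on a larger sample:
est = double_sampling_ratio([2.0, 4.0], [1.0, 2.0], [1.0, 2.0, 3.0])
```

    The estimator gains precision over the plain subsample mean when y and x are strongly positively correlated.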

  10. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    Science.gov (United States)

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. The Environmental Protection Agency (EPA) Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than the others. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods, or a combination of the two. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding a preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than in discrete samples.
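
    The variability comparisons above rest on percent relative standard deviation. A one-function sketch of that metric, applied to hypothetical replicate results from the two sampling approaches:

```python
import statistics

def rsd_percent(values):
    """Percent relative standard deviation (coefficient of variation x 100),
    the variability metric used to compare ISM and discrete sampling results."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate metal concentrations (mg/kg):
ism_rsd = rsd_percent([9.0, 10.0, 11.0])        # tight ISM replicates
discrete_rsd = rsd_percent([4.0, 10.0, 16.0])   # scattered discrete grabs
```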

  11. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites in the world where the environment is still affected by contamination from uranium production carried out in the past. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of safety assessment procedures and of the priority-action analyses needed for remediation planning. During recent decades, the analytical laboratories of many of the enterprises currently responsible for establishing site-specific environmental monitoring programs have significantly improved their sampling and analytical capacities. However, a lack of experience in optimal site-specific sampling strategy planning, together with insufficient experience in applying the required analytical techniques (such as modern alpha-beta radiometers, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment), prevents these laboratories from developing and conducting monitoring programs efficiently as a basis for further safety assessment in decision-making procedures. This paper presents conclusions gained from experience establishing monitoring programs in Ukraine and proposes practical steps for optimizing sampling strategy planning and the analytical procedures to be applied to areas requiring safety assessment and justification for potential remediation and safe management. (authors)

  12. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.
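
    Although the article develops its own inferential procedure, it builds on the standard MI combining step. As background, a sketch of Rubin's rules for pooling m imputed point estimates and their variances:

```python
def rubin_combine(estimates, variances):
    """Combine m multiply-imputed analyses by Rubin's rules:
    pooled estimate qbar, within-imputation variance W, between-imputation
    variance B, and total variance T = W + (1 + 1/m) * B."""
    m = len(estimates)
    qbar = sum(estimates) / m
    w = sum(variances) / m
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)
    t = w + (1.0 + 1.0 / m) * b
    return qbar, t

# Three imputed estimates of a mean with their sampling variances:
pooled, total_var = rubin_combine([1.0, 2.0, 3.0], [0.5, 0.5, 0.5])
```

    The between-imputation term B is what carries the missing-data uncertainty into the final standard error.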

  13. Sources of pre-analytical variations in yield of DNA extracted from blood samples: analysis of 50,000 DNA samples in EPIC.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

    The European Prospective Investigation into Cancer and Nutrition (EPIC) is a long-term, multi-centric prospective study in Europe investigating the relationships between cancer and nutrition. This study has served as a basis for a number of Genome-Wide Association Studies (GWAS) and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples have been extracted using an automated DNA extraction platform. Here we have evaluated the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as center of subject recruitment, age, gender, body mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were observed with degree of haemolysis and with center of subject recruitment. Age, gender, body mass index, cancer case or control status and tobacco consumption also significantly impacted DNA yield. Feedback from laboratories which have analyzed DNA with different SNP genotyping technologies demonstrates that the vast majority of samples (approximately 88%) performed adequately in different types of assays. To our knowledge this study is the largest to date to evaluate the sources of pre-analytical variation in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.

  14. Hanford high level waste: Sample Exchange/Evaluation (SEE) Program

    International Nuclear Information System (INIS)

    King, A.G.

    1994-08-01

    The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Evaluation (SEE) Program was initiated. The SEE Program is a self-assessment program designed to compare the analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE Program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design/evaluate/implement each phase of the SEE Program

  15. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    A literature review and gap analysis identify key limitations of industry best practice in the modelling of supply chains. To address these limitations, the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD, and SM techniques, the integration of which is achieved via common modelling concepts and decomposition principles. Thereby the methodology facilitates: (i) graphical representation and description of key “processing”, “resourcing” and “work flow” properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery, and return processes; (iii) predictive quantification of the relative performance of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described, along with recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case-tested to quantify potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  16. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate; for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and was subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14%, and 9-13% for PPP reagent. This variation can be reduced only slightly by using an internal standard, and mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and it is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when PPPlow is used as reagent.
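
    The internal-standard normalization described above can be sketched as a simple rescaling: each sample's ETP on a plate is multiplied by the ratio of the internal standard's reference value to its measured value on that plate, removing plate-to-plate analytical drift. Function and variable names here are illustrative, not from the paper.

```python
def normalize_to_internal_standard(sample_etp, plate_is_etp, reference_is_etp):
    """Normalize a sample's ETP to the single-donor internal standard run on
    the same plate: if the standard reads high on a plate, every sample on
    that plate is scaled down proportionally (and vice versa)."""
    return sample_etp * reference_is_etp / plate_is_etp

# A plate where the internal standard read 10% above its reference value:
corrected = normalize_to_internal_standard(
    sample_etp=1100.0, plate_is_etp=1100.0, reference_is_etp=1000.0)
```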

  17. Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud

    Science.gov (United States)

    Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.

    2017-12-01

    We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to use Spark to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time averaged map, and correlation map.
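
    The area-averaged time series pattern described above can be sketched without Spark: each tile contributes a (timestamp, partial sum, count) record, and partial sums are reduced per timestamp exactly as Spark would combine them across partitions. This is a stand-in illustration, not the NEXUS implementation.

```python
from collections import defaultdict

def area_averaged_time_series(tiles):
    """Map-reduce sketch of an area-averaged time series over spatial tiles.
    tiles: iterable of (timestamp, sum_of_values, count) partial aggregates,
    one per tile. Returns {timestamp: mean over all tiles at that time}."""
    acc = defaultdict(lambda: [0.0, 0])
    for t, s, n in tiles:              # "map" side: emit per-tile partial sums
        acc[t][0] += s
        acc[t][1] += n
    return {t: s / n for t, (s, n) in acc.items()}  # "reduce": mean per step

# Two tiles at time 0 and one at time 1 (illustrative sums/counts):
series = area_averaged_time_series([(0, 10.0, 2), (0, 20.0, 3), (1, 5.0, 1)])
```

    Because sums and counts combine associatively, the same reduction runs unchanged whether tiles arrive from one worker or hundreds, which is what makes the time-partitioned design pattern scale.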

  18. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision-making.

  19. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meet the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design Compliance matrix documents (PI-CP-008-00, Rev. 0)

  20. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  2. A Proposed Concentration Curriculum Design for Big Data Analytics for Information Systems Students

    Science.gov (United States)

    Molluzzo, John C.; Lawler, James P.

    2015-01-01

    Big Data is becoming a critical component of the Information Systems curriculum. Educators are enhancing gradually the concentration curriculum for Big Data in schools of computer science and information systems. This paper proposes a creative curriculum design for Big Data Analytics for a program at a major metropolitan university. The design…

  3. Road Transportable Analytical Laboratory system. Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Finger, S.M.; Keith, V.F.; Spertzel, R.O.; De Avila, J.C.; O'Donnell, M.; Vann, R.L.

    1993-09-01

    This developmental effort clearly shows that a Road Transportable Analytical Laboratory System is a worthwhile and achievable goal. The RTAL is designed to fully analyze 20 samples per day (for radioanalytes and organic and inorganic chemical analytes) at the highest levels of quality assurance and quality control. It dramatically reduces the turnaround time for environmental sample analysis from 45 days (at a central commercial laboratory) to 1 day. At the same time, each RTAL system will save the DOE over $12 million per year in sample analysis costs compared to the costs at a central commercial laboratory. If RTAL systems were used at the eight largest DOE facilities (at Hanford, Savannah River, Fernald, Oak Ridge, Idaho, Rocky Flats, Los Alamos, and the Nevada Test Site), the annual savings would be $96,589,000. The DOE's internal study of sample analysis needs projects 130,000 environmental samples requiring analysis in FY 1994, clearly supporting the need for the RTAL system. The cost and time savings achievable with the RTAL system will accelerate and improve the efficiency of cleanup and remediation operations throughout the DOE complex.

  4. Analytical results for 544 water samples collected in the Attean Quartz Monzonite in the vicinity of Jackman, Maine

    Science.gov (United States)

    Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.

    1983-01-01

    Water samples were collected in the vicinity of Jackman, Maine as a part of the study of the relationship of dissolved constituents in water to the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report as well as a table of the analytical results.

  5. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.

  6. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  7. Atmospheric Deposition: Sampling Procedures, Analytical Methods, and Main Recent Findings from the Scientific Literature

    Directory of Open Access Journals (Sweden)

    M. Amodio

    2014-01-01

    The atmosphere is a carrier on which natural and anthropogenic organic and inorganic chemicals are transported, and wet and dry deposition events are the most important processes removing those chemicals, depositing them on soil and water. A wide variety of collectors have been tested to evaluate the site specificity, seasonality, and daily variability of settleable particle concentrations. Deposition fluxes of POPs showed spatial and seasonal variations, and diagnostic ratios of PAHs on deposited particles allowed discrimination between pyrolytic and petrogenic sources. Congener pattern analysis and bulk deposition fluxes at rural sites confirmed long-range atmospheric transport of PCDDs/Fs. Increasingly sophisticated and newly designed deposition samplers have been used to characterize deposited mercury, demonstrating the importance of rain scavenging and the relatively high magnitude of Hg deposition from Chinese anthropogenic sources. Recently, biological monitors demonstrated that PAH concentrations in lichens were comparable with concentrations measured by a conventional active sampler in an outdoor environment. In this review, the authors explore the methodological approaches used to assess atmospheric deposition, from the sampling methods to the analytical procedures for chemical characterization of pollutants and the main results in the scientific literature.

  8. Analytical Chemistry Laboratory (ACL) procedure compendium

    International Nuclear Information System (INIS)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control; ALO data report/package preparation, review, and control; the single-shell tank (SST) project sample tracking system; sample receiving; analytical balances; duties and responsibilities of the sample custodian; sample refrigerator temperature monitoring; security; assignment of staff responsibilities; sample storage; data reporting; and general requirements for glassware

  9. Assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry

    Science.gov (United States)

    Taylor, Howard E.; Garbarino, John R.

    1988-01-01

    A thorough assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry was conducted for selected analytes of importance in water quality applications and hydrologic research. A multielement calibration-curve technique was designed to produce accurate and precise results in analysis times of approximately one minute. The suite of elements included Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Hg, Li, Mn, Mo, Ni, Pb, Se, Sr, V, and Zn. A study of the effects of sample matrix composition on the accuracy of the determinations showed that matrix elements (such as Na, Ca, Mg, and K) present in natural water samples at concentrations greater than 50 mg/L suppressed the analyte ion current by as much as 10%. Operational detection limits are presented.
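
    Per element, the multielement calibration-curve technique described above reduces to fitting a straight line of detector counts versus standard concentration and inverting it for unknowns. A minimal sketch, with invented standards and counts rather than the paper's data:

```python
# Illustrative one-element linear calibration of the kind used in
# multielement ICP-MS work; standards and counts are invented numbers.
def fit_line(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# calibration standards (ug/L) vs. detector counts (hypothetical)
std_conc = [0.0, 10.0, 50.0, 100.0]
counts   = [12.0, 1020.0, 5050.0, 10010.0]

slope, intercept = fit_line(std_conc, counts)

def to_conc(signal):
    """Back-calculate a sample concentration from its signal."""
    return (signal - intercept) / slope

print(round(to_conc(2520.0), 1))  # sample concentration in ug/L
```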

  10. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.
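
    A common first stage in multistage area-probability designs like the one described is systematic probability-proportional-to-size (PPS) selection of primary sampling units. The sketch below is illustrative only, not the actual RECS frame:

```python
import random

def pps_systematic(units, sizes, n):
    """Select n primary sampling units with probability proportional to size
    (systematic PPS). Illustrative only -- not the actual RECS procedure.
    Large units can legitimately be selected more than once (certainty units).
    """
    total = sum(sizes)
    step = total / n
    start = random.uniform(0, step)          # random start in the first interval
    points = [start + i * step for i in range(n)]  # all points fall below total
    chosen, cum = [], 0.0
    it = iter(zip(units, sizes))
    unit, size = next(it)
    for p in points:
        # advance through cumulative size intervals until p falls inside one
        while cum + size < p:
            cum += size
            unit, size = next(it)
        chosen.append(unit)
    return chosen

sample = pps_systematic(list(range(10)), [1.0] * 10, 5)
print(sample)
```

    With equal sizes this reduces to ordinary systematic sampling; unequal sizes tilt selection toward larger areas.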

  11. Visual analytics for multimodal social network analysis: a design study with social scientists.

    Science.gov (United States)

    Ghani, Sohaib; Kwon, Bum Chul; Lee, Seungyoon; Yi, Ji Soo; Elmqvist, Niklas

    2013-12-01

    Social network analysis (SNA) is becoming increasingly concerned not only with actors and their relations, but also with distinguishing between different types of such entities. For example, social scientists may want to investigate asymmetric relations in organizations with strict chains of command, or incorporate non-actors such as conferences and projects when analyzing coauthorship patterns. Multimodal social networks are those where actors and relations belong to different types, or modes, and multimodal social network analysis (mSNA) is accordingly SNA for such networks. In this paper, we present a design study that we conducted with several social scientist collaborators on how to support mSNA using visual analytics tools. Based on an open-ended, formative design process, we devised a visual representation called parallel node-link bands (PNLBs) that splits modes into separate bands and renders connections between adjacent ones, similar to the list view in Jigsaw. We then used the tool in a qualitative evaluation involving five social scientists whose feedback informed a second design phase that incorporated additional network metrics. Finally, we conducted a second qualitative evaluation with our social scientist collaborators that provided further insights on the utility of the PNLBs representation and the potential of visual analytics for mSNA.

  12. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Project's (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design of the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins

  13. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area

  14. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of daily routine in the paediatric outpatient clinic. The construction of such references requires the collection of extensive data. There are different kinds of reference data: cross-sectional references, based on data collected from a large representative cross-sectional sample of the population; longitudinal references, based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed-longitudinal references, which combine longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts used in paediatric practice are possibly no longer appropriate. The Saarland Growth Study therefore served two purposes: (a) to create current regional reference data and (b) to create a database for future studies on secular trends in the growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its implications for the design of the present study.

  15. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    Directory of Open Access Journals (Sweden)

    Richard J. Venedam

    2005-02-01

    The capabilities of a "universal platform" for deploying analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform had previously been used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium(VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed to collect data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the sampling and analysis methods previously employed at the Site.

  16. Evolution in the design of a low sheath-flow interface for CE-MS and application to biological samples.

    Science.gov (United States)

    González-Ruiz, Víctor; Codesido, Santiago; Rudaz, Serge; Schappler, Julie

    2018-03-01

    Although several interfaces for CE-MS hyphenation are commercially available, the development of new versatile, simple, and yet efficient and sensitive alternatives remains an important field of research. In a previous work, a simple low sheath-flow interface was developed from inexpensive parts. This interface features a design that is easy to build, maintain, and adapt to particular needs. The present work introduces an improved design of the previous interface. By reducing the diameter of the separation capillary and the emitter, a smaller Taylor cone is spontaneously formed, minimizing zone dispersion while the analytes pass through the interface and leading to less peak broadening associated with the ESI process. Numerical modeling allowed the mixing and diffusion processes taking place in the Taylor cone to be studied. The analytical performance of this new interface was tested with pharmaceutically relevant molecules and endogenous metabolites. The interface was eventually applied to the analysis of neural cell culture samples, allowing the identification of a panel of neurotransmission-related molecules. Excellent migration time repeatability was obtained (intra-day RSD 10 with an injected volume of 6.7 nL of biological extract). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Hanford analytical services quality assurance requirements documents. Volume 1: Administrative Requirements

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1997-01-01

    The Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and for field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement), regulatory permit applications, and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This provides a uniform quality umbrella over analytical site activities, predicated on the concepts contained in the HASQARD. Using the HASQARD will ensure data of known quality and the technical defensibility of the methods used to obtain those data. The HASQARD comprises four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements.

  18. Possibilities for decreasing detection limits of analytical methods for determination of transformation products of unsymmetrical dimethylhydrazine in environmental samples

    Directory of Open Access Journals (Sweden)

    Bulat Kenessov

    2015-12-01

    Most medium- and heavy-class rockets launched from Kazakhstan, Russia, China and other countries still use highly toxic unsymmetrical dimethylhydrazine (UDMH) as a liquid propellant. Studies of the migration, distribution and accumulation of UDMH transformation products in the environment, and health impact assessments of space rocket activity, are currently complicated by the absence of analytical methods capable of detecting trace concentrations of these compounds in the analyzed samples. This paper reviews methods and approaches that can be applied to the development of such methods. Detection limits at the part-per-trillion (ppt) level may be achieved using the most selective and sensitive methods, based on gas or liquid chromatography in combination with tandem or high-resolution mass spectrometry. In addition, 1000-fold concentration of samples or integrated sample preparation methods, e.g., dynamic headspace extraction, are required. Special attention during the development and application of such methods must be paid to the purity of laboratory air, reagents, glassware and analytical instruments.
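
    Whatever the instrumental platform, reported detection limits are usually derived from calibration noise via the standard IUPAC-style relations LOD ≈ 3.3σ/S and LOQ ≈ 10σ/S. A small sketch with invented numbers, not values from the review:

```python
# IUPAC-style estimate of detection/quantification limits from the standard
# deviation of the blank (or calibration residuals) and the calibration slope.
def detection_limits(sd_blank, slope):
    lod = 3.3 * sd_blank / slope   # limit of detection
    loq = 10.0 * sd_blank / slope  # limit of quantification
    return lod, loq

# e.g. blank SD of 0.6 signal units on a slope of 400 units per ng/mL
lod, loq = detection_limits(0.6, 400.0)
print(f"LOD = {lod * 1000:.2f} pg/mL, LOQ = {loq * 1000:.2f} pg/mL")
```

    Preconcentration enters this picture directly: a 1000-fold enrichment step effectively multiplies the slope by 1000, lowering both limits proportionally.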

  19. Analytical strategies for uranium determination in natural water and industrial effluents samples

    International Nuclear Information System (INIS)

    Santos, Juracir Silva

    2011-01-01

    The work was developed under project 993/2007, 'Development of analytical strategies for uranium determination in environmental and industrial samples - Environmental monitoring in the Caetite city, Bahia, Brazil', made possible through a partnership established between the Universidade Federal da Bahia and the Comissao Nacional de Energia Nuclear. Strategies were developed for uranium determination in natural water and in effluents from a uranium mine. The first was a critical evaluation of uranium determination by inductively coupled plasma optical emission spectrometry (ICP OES), performed using factorial and Doehlert designs involving the factors acid concentration, radio-frequency power and nebulizer gas flow rate. Five emission lines were studied simultaneously (namely 367.007, 385.464, 385.957, 386.592 and 409.013 nm) in the presence of HNO3, CH3COOH or HCl. Determinations in HNO3 medium were the most sensitive. Among the factors studied, the gas flow rate was the most significant for all five emission lines. Calcium interfered with the emission intensity for some lines, whereas iron did not interfere (at least up to 10 mg L-1) in the five lines studied. The presence of 13 other elements did not affect the emission intensity of uranium for the lines chosen. The optimized method, using the line at 385.957 nm, allows the determination of uranium with a limit of quantification of 30 μg L-1 and a precision, expressed as RSD, lower than 2.2% for uranium concentrations of 500 and 1000 μg L-1. In the second strategy, a highly sensitive flow-based procedure for uranium determination in natural waters is described. A 100-cm optical path flow cell based on a liquid-core waveguide (LCW) was exploited to increase the sensitivity of the arsenazo III method, aiming to achieve the limits established by environmental regulations. The flow system was designed with solenoid micro-pumps in order to improve mixing and minimize reagent consumption, as well as

  20. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    The analysis of active samples on a regular basis, for ambient air activity and floor contamination from radiochemical labs, accounts for a major share of the operational workload of the health physicist. The need for daily air sample analysis, with immediate and delayed counting from various labs in addition to smear-swipe samples, motivated the development of a system that can handle multiple samples in a time-programmed manner from a single loading. A multiple alpha/beta counting system was designed and fabricated. It accommodates 10 samples in ordered slots, which are counted in a time-programmed manner, with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has reduced the man-hours consumed in counting and in recording results

  1. Using Learning Analytics to Understand the Design of an Intelligent Language Tutor – Chatbot Lucy

    OpenAIRE

    Yi Fei Wang; Stephen Petrina

    2013-01-01

    The goal of this article is to explore how learning analytics can be used to predict and advise the design of an intelligent language tutor, chatbot Lucy. With its focus on using student-produced data to understand the design of Lucy to assist English language learning, this research can be a valuable component for language-learning designers seeking to improve second language acquisition. In this article, we present students' learning journey and data trails, the chatting log architecture and result...

  2. 105-N Basin sediment disposition phase-one sampling and analysis plan

    International Nuclear Information System (INIS)

    1997-01-01

    The sampling and analysis plan (SAP) for Phase 1 of the 105-N Basin sediment disposition project defines the sampling and analytical activities that will be performed for the engineering assessment phase (Phase 1) of the project. A separate SAP defines the sampling and analytical activities that will be performed for the characterization phase (Phase 2) of the 105-N sediment disposition project. The Phase-1 SAP is presented in the introduction (Section 1.0), the field sampling plan (FSP) (Section 2.0), and the quality assurance project plan (QAPjP) (Section 3.0). The FSP defines the sampling and analytical methodologies to be performed. The QAPjP provides information on the quality assurance/quality control (QA/QC) parameters related to the sampling and analytical methodologies. This SAP defines the strategy and methods that will be used to sample and analyze the sediment on the floor of the 105-N Basin. The resulting data will be used to develop and evaluate engineering designs for collecting and removing sediment from the basin

  3. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    Science.gov (United States)

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…
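
    The plausible-value machinery referred to here combines the m imputed estimates with Rubin's rules: the pooled estimate is the mean of the per-imputation estimates, and the total variance adds the between-imputation component inflated by (1 + 1/m). A sketch with invented values, not NAEP data:

```python
# Rubin's rules for combining estimates from m plausible-value (multiple
# imputation) datasets. The five point estimates and variances are invented.
def rubin_combine(estimates, variances):
    m = len(estimates)
    qbar = sum(estimates) / m                    # combined point estimate
    ubar = sum(variances) / m                    # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation
    total = ubar + (1 + 1 / m) * b               # total variance
    return qbar, total

est = [250.1, 251.3, 249.8, 250.6, 250.2]   # e.g. mean scale scores
var = [4.0, 4.2, 3.9, 4.1, 4.0]             # their sampling variances
qbar, t = rubin_combine(est, var)
print(qbar, t)
```

    Under a complex sample design the within-imputation variances themselves would come from a design-based estimator (e.g. jackknife over PSUs), which is where the two-stage sampling enters.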

  4. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design applicable to ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient of 95%; sampling error (d) of 0.05; and a proportion (p) of 0.5. Applying this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods must be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences are to be made from the data obtained. This can be achieved by applying the different existing probability sampling methods or, better still, a combination of such methods.
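
    The sample size reported in the abstract follows from the standard finite-population formula for estimating a proportion; plugging in the stated inputs reproduces n = 290:

```python
import math

# Finite-population sample size for a proportion, with the abstract's inputs:
# N = 1179 families, 95% confidence (z = 1.96), error d = 0.05, p = 0.5.
def sample_size(N, z, p, d):
    num = N * z**2 * p * (1 - p)
    den = d**2 * (N - 1) + z**2 * p * (1 - p)
    return math.ceil(num / den)

n = sample_size(1179, 1.96, 0.5, 0.05)
print(n)  # 290, matching the abstract
```

    Choosing p = 0.5 maximizes p(1 − p) and therefore gives the most conservative (largest) sample size.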

  5. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to use a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable-frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable-frequency submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory for preventing surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples

  6. Development of analytical techniques for water and environmental samples (2)

    Energy Technology Data Exchange (ETDEWEB)

    Eum, Chul Hun; Jeon, Chi Wan; Jung, Kang Sup; Song, Kyung Sun; Kim, Sang Yeon [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    The purpose of this study is to develop new analytical methods with good detection limits for toxic inorganic and organic compounds. Analyses of CN, organic acids and particulate materials in environmental samples were performed using several methods, including ion chromatography, SPE, SPME, GC/MS, GC/FID and SPLITT (split-flow thin-cell fractionation), during the second year of this project. The advantages and disadvantages of several distillation methods (KS, JIS, EPA) for CN analysis in wastewater were investigated. As a result, we propose a new distillation apparatus for CN analysis, which proved to be simpler and faster, and to give better recovery, than conventional apparatus. An ion chromatography/pulsed amperometric detection (IC/PAD) system was set up in place of colorimetry for CN detection to overcome matrix interference. SPE (solid-phase extraction) and SPME (solid-phase microextraction), as liquid-solid extraction techniques, were applied to the analysis of phenols in wastewater. Optimum experimental conditions and the factors influencing analytical results were determined. From these results, it could be concluded that C18 cartridges and polystyrene-divinylbenzene disks (for SPE) and polyacrylate fibers (for SPME) were suitable solid-phase adsorbents for phenol. Optimum conditions for the simultaneous analysis of phenol derivatives were established. Continuous SPLITT fractionation (CSF) is a new preparative separation technique that is useful for the fractionation of particulate and macromolecular materials. CSF is carried out in a thin, ribbon-like channel equipped with splitters at both the inlet and outlet of the channel. In this work, we set up a new CSF system and tested it using polystyrene latex standard particles, and then fractionated particles contained in air and groundwater on the basis of their sedimentation coefficients. (author). 27 refs., 13 tabs., 31 figs.

  7. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored green, yellow, or red to depict low, medium, or high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
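
    In its simplest form, the clustering adjustment the authors formalize can be approximated by inflating the SRS-based sample size with a design effect DEFF = 1 + (m − 1)·ICC, where m is the per-cluster take and ICC the intracluster correlation. The numbers below are illustrative, not from the paper:

```python
import math

# Inflate an SRS-based LQAS sample size for a two-stage cluster design via
# the usual design-effect approximation. Inputs are illustrative only.
def clustered_sample_size(n_srs, cluster_take, icc):
    deff = 1 + (cluster_take - 1) * icc   # design effect
    n = n_srs * deff                      # inflated total sample size
    clusters = math.ceil(n / cluster_take)
    return math.ceil(n), clusters

# e.g. an SRS design of 192 children, 10 children per village, ICC = 0.1
n_total, n_clusters = clustered_sample_size(n_srs=192, cluster_take=10, icc=0.1)
print(n_total, n_clusters)
```

    The paper's nonparametric procedure goes further (handling finite numbers of clusters), but this approximation conveys why cluster sampling requires a larger n than SRS for the same classification errors.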

  9. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    Science.gov (United States)

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented, we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles, a graphical decision-making tool, were constructed from the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will fall within the defined bias limits. The relative bias ranged between -1.3 and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g-1 in the sample) for both methyl and isopropyl p-toluenesulfonate. As proof of concept, the validated method was successfully applied to the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
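
    A β-expectation tolerance interval of the kind used in total-error validation can be sketched as mean bias ± z·SD·√(1 + 1/n). The version below uses a normal approximation in place of the exact Student-t formulation used in accuracy profiles, and the recovery data are invented, not the paper's:

```python
from statistics import NormalDist, mean, stdev

# Simplified beta-expectation tolerance interval (normal approximation of the
# Student-t version used in total-error validation). Recovery data invented.
def beta_expectation_interval(recoveries, beta=0.95):
    n = len(recoveries)
    m, s = mean(recoveries), stdev(recoveries)
    z = NormalDist().inv_cdf((1 + beta) / 2)     # two-sided beta quantile
    half = z * s * (1 + 1 / n) ** 0.5            # prediction-style half-width
    return m - half, m + half

rec = [99.1, 100.4, 98.7, 101.2, 99.9, 100.6]    # recoveries at one level, %
lo, hi = beta_expectation_interval(rec)
# the level "passes" if the interval stays within the acceptance limits
print(90 < lo and hi < 110)
```

    In a real accuracy profile this is computed at each concentration level and compared against the ±10% (here, 90-110% recovery) acceptance limits.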

  10. Chapter 12. Sampling and analytical methods

    International Nuclear Information System (INIS)

    Busenberg, E.; Plummer, L.N.; Cook, P.G.; Solomon, D.K.; Han, L.F.; Groening, M.; Oster, H.

    2006-01-01

    When water samples are taken for the analysis of CFCs, regardless of the sampling method used, contamination of samples by contact with atmospheric air (with its 'high' CFC concentrations) is a major concern. This is because groundwaters usually have lower CFC concentrations than waters that have been exposed to modern air. Some groundwaters might not contain CFCs at all and are therefore most sensitive to trace contamination by atmospheric air. Thus, extreme precautions are needed to obtain uncontaminated samples when groundwaters, particularly those with older ages, are sampled. It is recommended at the start of any CFC investigation that samples from a CFC-free source be collected and analysed as a check on the sampling equipment and methodology. The CFC-free source might be a deep monitoring well or, alternatively, CFC-free water could be carefully prepared in the laboratory. It is especially important that all tubing, pumps and connections that will be used in the sampling campaign be checked in this manner.

  11. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of approximately 7 cm3 each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy to melt 7 cm3 of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect a 0.1 g sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardnesses while maintaining sample integrity.
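The quoted melting energy can be cross-checked with a back-of-envelope calculation. The starting temperature (~100 K, typical of the Europan surface) and the mean specific heat of ice over that range are assumed values for this sketch, not figures from the paper:

```python
# Assumed constants (not from the paper):
RHO_ICE = 0.917      # g/cm^3, density of ice
C_ICE_MEAN = 1.6     # J/(g K), specific heat of ice averaged over 100-273 K
L_FUSION = 334.0     # J/g, latent heat of fusion

def melt_energy_wh(volume_cm3, t_start_k=100.0):
    """Energy (Wh) to warm a sample of ice from t_start_k to 0 C and melt it."""
    mass = volume_cm3 * RHO_ICE
    joules = mass * (C_ICE_MEAN * (273.15 - t_start_k) + L_FUSION)
    return joules / 3600.0  # 1 Wh = 3600 J

e = melt_energy_wh(7.0)  # ~1.1 Wh, consistent with the ~1.2 Wh quoted above
```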

  12. Analytical procedure in aseismic design of eccentric structure using response spectrum

    International Nuclear Information System (INIS)

    Takemori, T.; Kuwabara, Y.; Suwabe, A.; Mitsunobu, S.

    1977-01-01

    In this paper, the responses are evaluated by the following two methods, using typical torsional analytical models in which the masses, rigidities, eccentricities between their centers, and several actual earthquake waves are taken as parameters: (1) the root mean square of responses using the response spectra derived from the earthquake waves, and (2) time history analysis using the earthquake waves. The earthquake waves are chosen to present different frequency content and magnitude in their response spectra. The typical results derived from the study are as follows: (a) the response accelerations of the mass center in the input earthquake direction by method (1) coincide comparatively well with those by method (2); (b) the response accelerations perpendicular to the input earthquake direction by method (1) are 2 to 3 times those by method (2); (c) the amplification of the response accelerations at arbitrary points distributed on the spread mass, relative to those of the center of the lumped mass, by method (1) is remarkably large compared with that by method (2) in both directions. These problems with response spectrum analysis of such eccentric structures are discussed, and an improved analytical method, applying amplification coefficients of responses derived from this parametric time history analysis, is proposed for actual seismic design using the given design ground response spectrum with the root-mean-square technique.
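Method (1), combining peak modal responses read off a response spectrum, reduces to a root-sum-of-squares (RMS/SRSS) rule; a minimal sketch with illustrative numbers:

```python
import math

def srss(modal_responses):
    """Root-sum-of-squares combination of peak modal responses: an estimate
    of the peak total response when modal peaks do not occur simultaneously."""
    return math.sqrt(sum(r * r for r in modal_responses))

# peak accelerations (m/s^2) of three modes read from a design response
# spectrum -- illustrative numbers only
peaks = [2.4, 1.1, 0.5]
total = srss(peaks)  # ~2.69 m/s^2
```

The SRSS estimate is a statistical combination, which is why it can diverge from a time history result for torsionally coupled (eccentric) structures, as items (b) and (c) above report.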

  13. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  14. SPIDIA-DNA: An External Quality Assessment for the pre-analytical phase of blood samples used for DNA-based analyses

    Czech Academy of Sciences Publication Activity Database

    Malentacchi, F.; Pazzagli, M.; Simi, L.; Orlando, C.; Wyrich, R.; Hartmann, C.C.; Verderio, P.; Pizzamiglio, S.; Ciniselli, C.M.; Tichopád, Aleš; Kubista, Mikael; Gelmini, S.

    -, č. 424 (2013), s. 274-286 ISSN 0009-8981 Institutional research plan: CEZ:AV0Z50520701 Keywords : Pre-analytical phase * DNA quality * Blood samples Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.764, year: 2013

  15. Sample Handling and Processing on Mars for Future Astrobiology Missions

    Science.gov (United States)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time-, resource- and manpower-consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers the sample to multiple instruments for analysis (including analysis for non-organic soluble species).

  16. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Potential sources of analytical bias and error in selected trace element data-quality analyses

    Science.gov (United States)

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    unfiltered sample for all trace elements except selenium. Accounting for the small dilution effect (2 percent) from the addition of HCl, as required for the in-bottle digestion procedure for unfiltered samples, may be one step toward decreasing the number of instances where trace-element concentrations are greater in filtered samples than in paired unfiltered samples. The laboratory analyses of arsenic, cadmium, lead, and zinc did not appear to be influenced by instrument biases. These trace elements showed similar results on both instruments used to analyze filtered and unfiltered samples. The results for aluminum and molybdenum tended to be higher on the instrument designated to analyze unfiltered samples; the results for selenium tended to be lower. The matrices used to prepare calibration standards were different for the two instruments. The instrument designated for the analysis of unfiltered samples was calibrated using standards prepared in a nitric:hydrochloric acid (HNO3:HCl) matrix. The instrument designated for the analysis of filtered samples was calibrated using standards prepared in a matrix acidified only with HNO3. Matrix chemistry may have influenced the responses of aluminum, molybdenum, and selenium on the two instruments. The best analytical practice is to calibrate instruments using calibration standards prepared in matrices that reasonably match those of the samples being analyzed. Filtered and unfiltered samples were spiked over a range of trace-element concentrations from less than 1 to 58 times ambient concentrations. The greater the magnitude of the trace-element spike concentration relative to the ambient concentration, the greater the likelihood spike recoveries will be within data control guidelines (80-120 percent). Greater variability in spike recoveries occurred when trace elements were spiked at concentrations less than 10 times the ambient concentration. Spike recoveries that were considerably lower than 90 percent often were associated with
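The spike-recovery figure of merit discussed above is simply the recovered spike expressed as a percentage of the amount added. A minimal sketch (concentrations are illustrative, not data from the study) that also shows why low spike-to-ambient ratios inflate variability:

```python
def spike_recovery_pct(spiked_result, ambient, spike_added):
    """Recovered spike as a percentage of the amount added (all in the
    same concentration units)."""
    return 100.0 * (spiked_result - ambient) / spike_added

# spike well above ambient: a small measurement error barely moves recovery
r_high = spike_recovery_pct(12.4, 2.5, 10.0)   # 99.0 %

# spike far below ambient: a modest 0.3-unit error drops recovery to 70 %,
# well outside the 80-120 percent guideline
r_low = spike_recovery_pct(50.7, 50.0, 1.0)
```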

  18. SALE, Quality Control of Analytical Chemical Measurements

    International Nuclear Information System (INIS)

    Bush, W.J.; Gentillon, C.D.

    1985-01-01

    1 - Description of problem or function: The Safeguards Analytical Laboratory Evaluation (SALE) program is a statistical analysis program written to analyze the data received from laboratories participating in the SALE quality control and evaluation program. The system is aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated and participants are informed of the accuracy and precision of their results. 2 - Method of solution: Various statistical techniques produce the SALE output. Assuming an unbalanced nested design, an analysis of variance is performed, resulting in a test of significance for time and analyst effects. A trend test is performed. Both within-laboratory and between-laboratory standard deviations are calculated. 3 - Restrictions on the complexity of the problem: Up to 1500 pieces of data for each nuclear material sampled by a maximum of 75 laboratories may be analyzed.

  19. Design and Analytical Evaluation of a New Self-Centering Connection with Bolted T-Stub Devices

    Directory of Open Access Journals (Sweden)

    Mahbobeh Mirzaie Aliabadi

    2013-01-01

    A new posttensioned T-stub connection (PTTC) for earthquake-resistant steel moment resisting frames (MRFs) is introduced. The proposed connection consists of high-strength posttensioned (PT) strands and bolted T-stubs. The post-tensioning strands run through the column and are anchored against the flange of the exterior column. The T-stubs, providing energy dissipation, are bolted to the flanges of the beam and column, and no field welding is required. The strands compress the T-stubs against the column flange to develop the resisting moment under service loads and to provide a restoring force that returns the structure to its initial position following an earthquake. An analytical model based on fiber elements is developed in OpenSees to model PTTCs. The analytical model can predict the expected behavior of the new proposed connection under cyclic loading. The PTTC exhibits the characteristic behavior of posttensioned connections. Both the theoretical behavior and design methods are proposed, and the design methods are verified through parametric studies and comparison with analytical results. The parametric studies demonstrate the desired self-centering behavior of the PTTC and show that this connection can reduce or eliminate plastic rotation through its self-centering behavior while providing the required strength and stiffness under large earthquake rotations.

  20. Development of analytical methods for the separation of plutonium, americium, curium and neptunium from environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Salminen, S.

    2009-07-01

    In this work, separation methods have been developed for the analysis of the anthropogenic transuranium elements plutonium, americium, curium and neptunium in environmental samples contaminated by global nuclear weapons testing and the Chernobyl accident. The analytical methods utilized in this study are based on extraction chromatography. Highly varying atmospheric plutonium isotope concentrations and activity ratios were found at both Kurchatov (Kazakhstan), near the former Semipalatinsk test site, and Sodankylae (Finland). The origin of the plutonium is almost impossible to identify at Kurchatov, since hundreds of nuclear tests were performed at the Semipalatinsk test site. In Sodankylae, plutonium in the surface air originated from nuclear weapons testing, conducted mostly by the USSR and USA before the sampling year 1963. The variation in americium, curium and neptunium concentrations was likewise great in peat samples collected in southern and central Finland in 1986 immediately after the Chernobyl accident. The main source of transuranium contamination in the peats was global nuclear test fallout, although there are wide regional differences in the fraction of Chernobyl-derived activity (of the total activity) for americium, curium and neptunium. The separation methods developed in this study yielded good chemical recoveries for the elements investigated and adequately pure fractions for radiometric activity determination. The extraction chromatographic methods were faster than older methods based on ion exchange chromatography. In addition, extraction chromatography is a more environmentally friendly separation method than ion exchange, because less acidic waste solution is produced during the analytical procedures. (orig.)

  1. Analytical Design of Passive LCL Filter for Three-phase Two-level Power Factor Correction Rectifiers

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Nymand, Morten

    2017-01-01

    This paper proposes a comprehensive analytical LCL filter design method for three-phase two-level power factor correction rectifiers (PFCs). The high frequency converter current ripple generates the high frequency current harmonics that need to be attenuated with respect to the grid standards...
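A central constraint in any LCL filter design procedure is placing the filter resonance safely between the grid frequency and the switching frequency. A minimal sketch of that check (the component values and frequency limits are illustrative assumptions, not figures from the paper):

```python
import math

def lcl_resonance_hz(l_conv, l_grid, c_f):
    """Resonance frequency of an LCL filter, from the converter-side
    inductance, grid-side inductance and filter capacitance (SI units)."""
    return math.sqrt((l_conv + l_grid) / (l_conv * l_grid * c_f)) / (2.0 * math.pi)

# illustrative component values for a 50 Hz grid, 20 kHz switching PFC
f_res = lcl_resonance_hz(l_conv=2.5e-3, l_grid=0.9e-3, c_f=10e-6)  # ~1.96 kHz

# common design rule of thumb: 10*f_grid < f_res < f_switching / 2
design_ok = 10 * 50.0 < f_res < 20e3 / 2.0
```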

  2. Study design and percent recoveries of anthropogenic organic compounds with and without the addition of ascorbic acid to preserve water samples containing free chlorine, 2004-06

    Science.gov (United States)

    Valder, Joshua F.; Delzer, Gregory C.; Price, Curtis V.; Sandstrom, Mark W.

    2008-01-01

    The National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey (USGS) began implementing Source Water-Quality Assessments (SWQAs) in 2002 that focus on characterizing the quality of source water and finished water of aquifers and major rivers used by some of the larger community water systems in the United States. As used for SWQA studies, source water is the raw (ambient) water collected at the supply well prior to water treatment (for ground water) or the raw (ambient) water collected from the river near the intake (for surface water). Finished water is the water that is treated, which typically involves, in part, the addition of chlorine or other disinfection chemicals to remove pathogens, and is ready to be delivered to consumers. Finished water is collected before the water enters the distribution system. This report describes the study design and percent recoveries of anthropogenic organic compounds (AOCs) with and without the addition of ascorbic acid to preserve water samples containing free chlorine. The percent recoveries were determined by using analytical results from a laboratory study conducted in 2004 by the USGS's National Water Quality Laboratory (NWQL) and from data collected during 2004-06 for a field study currently (2008) being conducted by the USGS's NAWQA Program. The laboratory study was designed to determine if preserving samples with ascorbic acid (quenching samples) adversely affects analytical performance under controlled conditions. During the laboratory study, eight samples of reagent water were spiked for each of five analytical schedules evaluated. Percent recoveries from these samples were then compared in two ways: (1) four quenched reagent spiked samples analyzed on day 0 were compared with four quenched reagent spiked samples analyzed on day 7 or 14, and (2) the combined eight quenched reagent spiked samples analyzed on day 0, 7, or 14 were compared with eight laboratory reagent spikes (LRSs). Percent

  3. Comparison of the acetyl bromide spectrophotometric method with other analytical lignin methods for determining lignin concentration in forage samples.

    Science.gov (United States)

    Fukushima, Romualdo S; Hatfield, Ronald D

    2004-06-16

    Present analytical methods to quantify lignin in herbaceous plants are not totally satisfactory. A spectrophotometric method, acetyl bromide soluble lignin (ABSL), has been employed to determine lignin concentration in a range of plant materials. In this work, lignin extracted with acidic dioxane was used to develop standard curves and to calculate the derived linear regression equation (slope equals absorptivity value or extinction coefficient) for determining the lignin concentration of respective cell wall samples. This procedure yielded lignin values that were different from those obtained with Klason lignin, acid detergent acid insoluble lignin, or permanganate lignin procedures. Correlations with in vitro dry matter or cell wall digestibility of samples were highest with data from the spectrophotometric technique. The ABSL method employing as standard lignin extracted with acidic dioxane has the potential to be employed as an analytical method to determine lignin concentration in a range of forage materials. It may be useful in developing a quick and easy method to predict in vitro digestibility on the basis of the total lignin content of a sample.
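The ABSL procedure rests on a standard curve: the absorbance of acetyl bromide-solubilized lignin standards is regressed on concentration, and the slope (the absorptivity, or extinction coefficient) then converts sample absorbances to lignin concentrations. A minimal sketch with made-up standard-curve numbers:

```python
def absorptivity(concentrations, absorbances):
    """Least-squares slope through the origin of absorbance vs. concentration:
    the absorptivity (extinction coefficient) of the lignin standard."""
    num = sum(c * a for c, a in zip(concentrations, absorbances))
    den = sum(c * c for c in concentrations)
    return num / den

# acidic dioxane lignin standards (mg/mL) vs. measured absorbance -- made up
conc = [0.1, 0.2, 0.3, 0.4]
absb = [0.18, 0.37, 0.55, 0.73]
eps = absorptivity(conc, absb)   # slope of the standard curve, ~1.83
unknown = 0.46 / eps             # concentration of an unknown sample (mg/mL)
```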

  4. Isotope dilution and sampling factors of the quality assurance and TQM of environmental analysis

    International Nuclear Information System (INIS)

    Macasek, F.

    1999-01-01

    Sampling and preparatory treatment of environmental objects are discussed from the viewpoint of their information content, functional speciation of the pollutant, statistical distribution treatment and uncertainty assessment. During homogenization of large samples, substantial information may be lost, and the validity of the environmental information becomes vague. Isotope dilution analysis is discussed as the most valuable tool for both the validity of the analysis and the evaluation of sample variance. Data collection for a non-parametric statistical treatment of series of 'non-representative' sub-samples, together with physico-chemical speciation of the analyte, may actually better fulfill the criteria of similarity and representativeness. Large samples are often required due to the detection limits of the analysis, but the representativeness of environmental samples should be understood not only in terms of the mean analyte concentration, but also its spatial and temporal variance. Hence, heuristic analytical scenarios and the interpretation of results must be designed through cooperation between environmentalists and analytical chemists. (author)
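The appeal of isotope dilution for analytical validity is that the result is independent of chemical recovery: in the classic direct form, a tracer of known mass and specific activity is equilibrated with the sample, and only the specific activity of the isolated (possibly incomplete) fraction needs to be measured. A sketch with illustrative numbers:

```python
def isotope_dilution_mass(spike_mass, spike_specific_activity,
                          mixed_specific_activity):
    """Classic direct isotope dilution: mass of (inactive) analyte in the
    sample, from the dilution of the tracer's specific activity after
    equilibration. Recovery losses cancel out because specific activity
    is a ratio."""
    return spike_mass * (spike_specific_activity / mixed_specific_activity - 1.0)

# 1.0 mg of tracer at 5000 Bq/mg added; the isolated fraction, whatever
# its yield, measures 500 Bq/mg -- illustrative numbers
m_x = isotope_dilution_mass(1.0, 5000.0, 500.0)  # 9.0 mg of analyte
```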

  5. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower and user-defined frequency to keep network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues we present the performance of the sampling system evaluated on two different platforms: on a VME based system using VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low cost PC compatible hardware environment with free and open operating system.
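The local-caching scheme described above (sample at a high sustained rate, transmit in packets at a lower user-defined rate) can be modelled as follows; this is an illustrative sketch, not the actual ACS/CORBA Notification Channel API:

```python
class PacketingSampler:
    """Cache samples locally and hand them off in fixed-size packets,
    so the transport rate stays below the sampling rate (hypothetical
    stand-in for the ACS sampling system's batching behaviour)."""

    def __init__(self, packet_size, send):
        self.packet_size = packet_size
        self.send = send      # callback that delivers one packet downstream
        self.cache = []       # samples collected since the last packet

    def sample(self, value):
        self.cache.append(value)
        if len(self.cache) >= self.packet_size:
            self.send(self.cache)
            self.cache = []

packets = []
s = PacketingSampler(packet_size=4, send=packets.append)
for v in range(10):
    s.sample(v)
# two full packets delivered; the last two samples are still cached
```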

  6. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-06-01

    The HSQMP establishes quality requirements in response to DOE Order 5700.6C and to 10 Code of Federal Regulations 830.120. The HSQMP is designed to meet the needs of the Richland Operations Office for controlling the quality of services provided by sampling operations. It is issued through the Analytical Services Program of the Waste Programs Division. This document describes the Environmental Sampling and Analysis Program activities considered to represent the best management activities necessary to achieve a sampling program with adequate control.

  7. SPIDIA-RNA: First external quality assessment for the pre-analytical phase of blood samples used for RNA based analyses

    Czech Academy of Sciences Publication Activity Database

    Pazzagli, M.; Malentacchi, F.; Simi, L.; Wyrich, R.; Guenther, K.; Hartmann, C.; Verderio, P.; Pizzamiglio, S.; Ciniselli, C.M.; Tichopád, Aleš; Kubista, Mikael; Gelmini, S.

    2013-01-01

    Roč. 59, č. 1 (2013), s. 20-31 ISSN 1046-2023 Institutional research plan: CEZ:AV0Z50520701 Keywords : Pre-analytical phase * RNA quality * Blood samples Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 3.221, year: 2013

  8. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  9. SALE: Safeguards Analytical Laboratory Evaluation computer code

    International Nuclear Information System (INIS)

    Carroll, D.J.; Bush, W.J.; Dolan, C.A.

    1976-09-01

    The Safeguards Analytical Laboratory Evaluation (SALE) program implements an industry-wide quality control and evaluation system aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated, and each participant is informed of the accuracy and precision of his results in a timely manner. The SALE computer code which produces the report is designed to facilitate rapid transmission of this information in order that meaningful quality control will be provided. Various statistical techniques comprise the output of the SALE computer code. Assuming an unbalanced nested design, an analysis of variance is performed in subroutine NEST, resulting in a test of significance for time and analyst effects. A trend test is performed in subroutine TREND. Microfilm plots are obtained from subroutine CUMPLT. Within-laboratory standard deviations are calculated in the main program or subroutine VAREST, and between-laboratory standard deviations are calculated in SBLV. Other statistical tests are also performed. Up to 1,500 pieces of data for each nuclear material sampled by 75 (or fewer) laboratories may be analyzed with this code. The input deck necessary to run the program is shown, and input parameters are discussed in detail. Printed output and microfilm plot output are described. Output from a typical SALE run is included as a sample problem.
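The within- and between-laboratory standard deviations SALE reports are variance components of a nested ANOVA. A minimal balanced one-way sketch (the actual code handles unbalanced designs and additional nesting levels), with illustrative duplicate concentration results:

```python
from statistics import mean

def lab_variance_components(data):
    """Balanced one-way random-effects ANOVA. data is a list of per-laboratory
    replicate lists of equal length; returns (s_within, s_between)."""
    k = len(data)            # number of laboratories
    n = len(data[0])         # replicates per laboratory
    lab_means = [mean(lab) for lab in data]
    grand = mean(lab_means)
    ms_within = sum(sum((x - m) ** 2 for x in lab)
                    for lab, m in zip(data, lab_means)) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in lab_means) / (k - 1)
    # between-lab variance component, floored at zero
    s2_between = max(0.0, (ms_between - ms_within) / n)
    return ms_within ** 0.5, s2_between ** 0.5

# three labs, duplicate uranium concentration results (illustrative numbers)
labs = [[10.1, 10.3], [9.8, 9.9], [10.6, 10.4]]
s_within, s_between = lab_variance_components(labs)
```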

  10. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  11. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    Science.gov (United States)

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. The specific problems of Al speciation are addressed, and some important methods are highlighted, in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  12. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study

  13. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  14. Molecularly imprinted polymers--potential and challenges in analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Mahony, J.O. [Dublin City University, School of Chemical Sciences, Glasnevin, Dublin 9 (Ireland); Nolan, K. [Dublin City University, School of Chemical Sciences, Glasnevin, Dublin 9 (Ireland); Smyth, M.R. [Dublin City University, School of Chemical Sciences, Glasnevin, Dublin 9 (Ireland); Mizaikoff, B. [Georgia Institute of Technology, School of Chemistry and Biochemistry, 770 State Street, Boggs Building, Atlanta, GA 30332-0400 (United States)]. E-mail: boris.mizaikoff@chemistry.gatech.edu

    2005-04-04

    Among the variety of biomimetic recognition schemes utilizing supramolecular approaches molecularly imprinted polymers (MIPs) have proven their potential as synthetic receptors in numerous applications ranging from liquid chromatography to assays and sensor technology. Their inherent advantages compared to biochemical/biological recognition systems include robustness, storage endurance and lower costs. However, until recently only few contributions throughout the relevant literature describe quantitative analytical applications of MIPs for practically relevant analyte molecules and real-world samples. Increased motivation to thoroughly evaluate the true potential of MIP technology is clearly attributed to the demands of modern analytical chemistry, which include enhanced sensitivity, selectivity and applicability of molecular recognition building blocks at decreasing costs. In particular, the areas of environmental monitoring, food and beverage analysis and industrial process surveillance require analytical tools capable of discriminating chemicals with high molecular specificity considering increasing numbers of complex environmental contaminants, pollution of raw products and rigorous quality control requested by legislation and consumer protection. Furthermore, efficient product improvement and development of new products requires precise qualitative and quantitative analytical methods. Finally, environmental, food and process safety control issues favor the application of on-line in situ analytical methods with high molecular selectivity. While biorecognition schemes frequently suffer from degrading bioactivity and long-term stability when applied in real-world sample environments, MIPs serving as synthetic antibodies have successfully been applied as stationary phase separation matrix (e.g. HPLC and SPE), recognition component in bioassays (e.g. ELISA) or biomimetic recognition layer in chemical sensor systems. Examples such as MIP-based selective analysis of

  15. Molecularly imprinted polymers--potential and challenges in analytical chemistry

    International Nuclear Information System (INIS)

    Mahony, J.O.; Nolan, K.; Smyth, M.R.; Mizaikoff, B.

    2005-01-01

    Among the variety of biomimetic recognition schemes utilizing supramolecular approaches, molecularly imprinted polymers (MIPs) have proven their potential as synthetic receptors in numerous applications ranging from liquid chromatography to assays and sensor technology. Their inherent advantages compared to biochemical/biological recognition systems include robustness, storage endurance and lower costs. However, until recently only a few contributions in the relevant literature described quantitative analytical applications of MIPs for practically relevant analyte molecules and real-world samples. The increased motivation to thoroughly evaluate the true potential of MIP technology is clearly attributable to the demands of modern analytical chemistry, which include enhanced sensitivity, selectivity and applicability of molecular recognition building blocks at decreasing costs. In particular, the areas of environmental monitoring, food and beverage analysis and industrial process surveillance require analytical tools capable of discriminating chemicals with high molecular specificity, given the increasing numbers of complex environmental contaminants, the pollution of raw products, and the rigorous quality control requested by legislation and consumer protection. Furthermore, efficient product improvement and the development of new products require precise qualitative and quantitative analytical methods. Finally, environmental, food and process safety control issues favor the application of on-line in situ analytical methods with high molecular selectivity. While biorecognition schemes frequently suffer from degrading bioactivity and limited long-term stability when applied in real-world sample environments, MIPs serving as synthetic antibodies have successfully been applied as stationary-phase separation matrices (e.g. HPLC and SPE), as recognition components in bioassays (e.g. ELISA), or as biomimetic recognition layers in chemical sensor systems.
Examples such as MIP-based selective analysis of

  16. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described

  17. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.

  18. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  19. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  20. Micro-Crater Laser Induced Breakdown Spectroscopy--an Analytical approach in metal samples

    Energy Technology Data Exchange (ETDEWEB)

    Piscitelli, Vincent [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela); Lawrence Berkeley National laboratory, Berkeley, US (United States); Gonzalez, Jhanis; Xianglei, Mao; Russo, Richard [Lawrence Berkeley National laboratory, Berkeley, US (United States); Fernandez, Alberto [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela)

    2008-04-15

    Laser ablation has been gaining popularity as a technique for chemical analysis, owing to its great potential for the analysis of solid samples. As a contribution to the development of the technique, in this work we studied laser induced breakdown spectroscopy (LIBS) under micro-ablation conditions for future studies of coatings and micro-crater analysis. Craters between 2 and 7 micrometers in diameter were made using an Nd-YAG nanosecond laser at its fundamental emission of 1064 nm. To create these craters, we used an objective lens with a long working distance and a numerical aperture of 0.45. The atomic emission versus the laser energy, and its effect on crater size, was studied. We found that below 3 micrometers, although there was evidence of material removal through the formation of a crater, no atomic emission was detectable with our instruments. To understand this, curves of crater size versus plasma temperature were constructed using Boltzmann plots of the copper emission lines in the visible region. In addition, calibration curves for copper and aluminum were made in two different matrices: a Cu/Zn alloy and a zinc matrix. The atomic lines Cu I (521.78 nm) and Al I (396.15 nm) were used. From the calibration curves, the analytical limit of detection and other analytical parameters were obtained.
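The Boltzmann-plot step mentioned in the abstract (an excitation temperature derived from relative line intensities) can be sketched as follows. The level energies, statistical weights, transition probabilities and wavelengths below are placeholders rather than actual Cu I data, and the intensities are synthesized from an assumed temperature purely to verify that the fit recovers it.

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(intensity, wavelength_nm, g, A, E_upper_eV):
    """Excitation temperature from a Boltzmann plot: ln(I*lambda/(g*A))
    plotted against the upper-level energy E has slope -1/(k_B*T)."""
    y = np.log(intensity * wavelength_nm / (g * A))
    slope = np.polyfit(E_upper_eV, y, 1)[0]
    return -1.0 / (K_B_EV * slope)

# Synthetic check: generate line intensities for an assumed temperature,
# then recover it from the plot. All atomic constants are placeholders.
T_true = 10_000.0                            # K, assumed plasma temperature
E = np.array([3.8, 5.1, 6.2, 7.0])           # upper-level energies, eV
g = np.array([4, 6, 2, 4])                   # statistical weights
A = np.array([2e6, 6e7, 1e7, 3e7])           # transition probabilities, s^-1
wl = np.array([521.8, 515.3, 510.6, 406.3])  # wavelengths, nm

intensity = (g * A / wl) * np.exp(-E / (K_B_EV * T_true))
T_est = boltzmann_temperature(intensity, wl, g, A, E)
print(f"recovered T = {T_est:.0f} K")  # recovers the assumed 10000 K
```

With noise-free synthetic intensities the fit is exact; with real line intensities the scatter of the points about the fitted line indicates how well the plasma satisfies the local-thermodynamic-equilibrium assumption behind the method.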

  1. Analytical evaluation of BEA zeolite for the pre-concentration of polycyclic aromatic hydrocarbons and their subsequent chromatographic analysis in water samples.

    Science.gov (United States)

    Wilson, Walter B; Costa, Andréia A; Wang, Huiyong; Dias, José A; Dias, Sílvia C L; Campiglia, Andres D

    2012-07-06

    The analytical performance of BEA, a commercial zeolite, is evaluated for the pre-concentration of fifteen Environmental Protection Agency polycyclic aromatic hydrocarbons and their subsequent HPLC analysis in tap and lake water samples. The pre-concentration factors obtained with BEA have led to a method with excellent analytical figures of merit. One-milliliter aliquots were sufficient to obtain excellent precision of measurements at the parts-per-trillion concentration level, with relative standard deviations varying from 4.1% (dibenzo[a,h]anthracene) to 13.4% (pyrene). The limits of detection were excellent as well and varied between 1.1 (anthracene) and 49.9 ng L(-1) (indeno[1,2,3-cd]pyrene). The recovery values of all the studied compounds meet the criterion for regulated polycyclic aromatic hydrocarbons, which mandates relative standard deviations equal to or lower than 25%. The small volume of organic solvents (100 μL per sample) and amount of BEA (2 mg per sample) make sample pre-concentration environmentally friendly and cost effective. The extraction procedure is well suited for numerous samples, as the small working volume (1 mL) facilitates the implementation of simultaneous sample extraction. These are attractive features when routine monitoring of numerous samples is contemplated. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.

    Science.gov (United States)

    Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J

    2014-12-02

    The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The following work involves superhydrophobic surfaces that have as a framework deterministic or stochastic silicon pillar arrays formed by lithographic or metal dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, simple and flow stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥ 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10(-12) M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up uses in real world applications. Finally, a concentration process involving transport and sequestration based on surface treatment selective wicking is demonstrated.

  3. Exploring the Design Space of Immersive Urban Analytics

    OpenAIRE

    Chen, Zhutian; Wang, Yifang; Sun, Tianchen; Gao, Xiang; Chen, Wei; Pan, Zhigeng; Qu, Huamin; Wu, Yingcai

    2017-01-01

    Recent years have witnessed the rapid development and wide adoption of immersive head-mounted devices, such as HTC VIVE, Oculus Rift, and Microsoft HoloLens. These immersive devices have the potential to significantly extend the methodology of urban visual analytics by providing critical 3D context information and creating a sense of presence. In this paper, we propose a theoretical model to characterize the visualizations in immersive urban analytics. Furthermore, based on our comprehensiv...

  4. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
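The precision comparison described above can be sketched with a small Monte Carlo simulation. Everything below is hypothetical: the synthetic daily counts, the single seasonal pulse, and the four seasonal strata merely stand in for the Indian Point data, and proportional allocation across strata is assumed.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical year of daily impingement counts with one strong seasonal
# pulse, standing in for the kind of variability described in the study.
days = np.arange(365)
daily_counts = rng.poisson(50 + 400 * np.exp(-((days - 120) ** 2) / (2 * 20**2)))
annual_total = daily_counts.sum()

n_sample = 110  # ~30% of the days in a year, as in the reduced program

def estimate_simple(counts, n):
    """Expand the mean of a simple random sample of days to an annual total."""
    idx = rng.choice(counts.size, size=n, replace=False)
    return counts[idx].mean() * counts.size

def estimate_stratified(counts, strata, n):
    """Stratified estimate with (approximately) proportional allocation."""
    total = 0.0
    for lo, hi in strata:
        stratum = counts[lo:hi]
        n_h = max(2, round(n * stratum.size / counts.size))
        idx = rng.choice(stratum.size, size=n_h, replace=False)
        total += stratum[idx].mean() * stratum.size
    return total

seasons = [(0, 90), (90, 180), (180, 270), (270, 365)]
srs = [estimate_simple(daily_counts, n_sample) for _ in range(2000)]
strat = [estimate_stratified(daily_counts, seasons, n_sample) for _ in range(2000)]

print(f"simple random CV:         {np.std(srs) / np.mean(srs):.3f}")
print(f"seasonally stratified CV: {np.std(strat) / np.mean(strat):.3f}")
```

With the pulse concentrated in one season, stratification removes the between-season component of the sampling variance, which is the direction of the effect the study reports.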

  5. Pre-Analytical Considerations for Successful Next-Generation Sequencing (NGS): Challenges and Opportunities for Formalin-Fixed and Paraffin-Embedded Tumor Tissue (FFPE) Samples

    Directory of Open Access Journals (Sweden)

    Gladys Arreaza

    2016-09-01

    In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cells (CTCs), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of the isolation of analytes are critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request a higher DNA sample mass than what is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve the current practice in translational research.

  6. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  7. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which makes it possible to simulate a large panel of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools

  8. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of total uncertainty of analytical methods for the measurements of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly, in segmental hair analysis pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7 folds larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
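The decomposition used above, where the analytical component is subtracted from the duplicate-based total variation, relies on independent coefficients of variation combining in quadrature (CV_T^2 = CV_A^2 + CV_P^2). A minimal sketch with illustrative numbers in the reported ranges (not the paper's per-analyte values):

```python
import math

def pre_analytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV obtained by subtracting the analytical variance
    component in quadrature: CV_P = sqrt(CV_T^2 - CV_A^2)."""
    if cv_total < cv_analytical:
        raise ValueError("total CV cannot be smaller than the analytical CV")
    return math.sqrt(cv_total**2 - cv_analytical**2)

# Illustrative values: total CV 30% and analytical CV 10%, roughly within
# the ranges reported (29-70% total, 7-13% analytical).
cv_p = pre_analytical_cv(0.30, 0.10)
print(f"pre-analytical CV = {cv_p:.1%}")             # 28.3%
print(f"95% interval half-width = 2*CV_T = {2 * 0.30:.2f}")
```

The example makes the paper's point concrete: when the total CV is dominated by sampling, dropping the pre-analytical term would understate the 95%-uncertainty interval substantially.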

  9. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  10. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    Science.gov (United States)

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). 
Total fluorine results may be used

  11. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book explains analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments covering basic operations of the chemical laboratory.

  12. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book explains analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments covering basic operations of the chemical laboratory.

  13. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  14. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, the trace elemental concentrations of samples collected monthly at urban and rural sites were determined, and then statistical calculations and factor analysis were carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  15. Information Management Platform for Data Analytics and Aggregation (IMPALA) System Design Document

    Science.gov (United States)

    Carnell, Andrew; Akinyelu, Akinyele

    2016-01-01

    The System Design document tracks the design activities that are performed to guide the integration, installation, verification, and acceptance testing of the IMPALA Platform. The inputs to the design document are derived from the activities recorded in Tasks 1 through 6 of the Statement of Work (SOW), with the proposed technical solution being the completion of Phase 1-A. With the documentation of the architecture of the IMPALA Platform and the installation steps taken, the SDD will be a living document, capturing the details about capability enhancements and system improvements to the IMPALA Platform to support users in the development of accurate and precise analytical models. The IMPALA Platform infrastructure team, data architecture team, system integration team, security management team, project manager, NASA data scientists and users are the intended audience of this document. The IMPALA Platform is an assembly of commercial-off-the-shelf (COTS) products installed on an Apache Hadoop platform. User interface details for the COTS products will be sourced from the COTS tool vendors' documentation. The SDD is a focused explanation of the inputs, design steps, and projected outcomes of every design activity for the IMPALA Platform through installation and validation.

  16. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows comparison of the survival curve of the patients under treatment with a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method. This is done in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test may help to rescue an underpowered trial and at the same time lower the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
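For orientation, a common large-sample form of the one-sample log-rank test compares the observed number of events O with the expected number E under the reference hazard, via Z = (O - E)/sqrt(E). This is a generic sketch of that textbook form, not the paper's adaptive two-stage procedure; the exponential reference curve and the patient data below are made up.

```python
import math

def one_sample_logrank(times, events, ref_cum_hazard):
    """One-sample log-rank test against a reference survival curve.
    O = observed events; E = sum of the reference cumulative hazard
    evaluated at each patient's follow-up time. Under the null,
    Z = (O - E)/sqrt(E) is approximately standard normal (large samples)."""
    O = sum(events)
    E = sum(ref_cum_hazard(t) for t in times)
    z = (O - E) / math.sqrt(E)
    return O, E, z

# Hypothetical reference: exponential survival with a 12-month median,
# i.e. cumulative hazard H0(t) = ln(2) * t / 12.
H0 = lambda t: math.log(2) * t / 12.0

# Made-up follow-up times (months) and event indicators (1 = death).
times  = [3.1, 7.4, 12.0, 14.5, 20.2, 24.0, 5.5, 9.9, 18.3, 24.0]
events = [1,   1,   0,    1,    0,    0,    1,   1,   0,    0]

O, E, z = one_sample_logrank(times, events, H0)
print(f"O = {O}, E = {E:.2f}, Z = {z:.2f}")
```

A negative Z here means fewer deaths were observed than the reference curve predicts, i.e. the data favor the treatment; the adaptive designs in the paper build their stagewise statistics on this same O-versus-E comparison.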

  17. ANALYTICAL RESULTS OF MOX COLEMANITE CONCRETE SAMPLES POURED AUGUST 29, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Best, D.; Cozzi, A.; Reigel, M.

    2012-12-20

The Mixed Oxide Fuel Fabrication Facility (MFFF) will use colemanite-bearing concrete neutron absorber panels credited with attenuating neutron flux in the criticality design analyses and with shielding operators from radiation. The Savannah River National Laboratory is tasked with measuring the total density, partial hydrogen density, and partial boron density of the colemanite concrete. Samples poured on 8/29/2012 were received on 9/20/2012 and analyzed. The average total density of each of the samples, measured by ASTM method C 642, met the lower bound of 1.88 g/cm{sup 3}. The average partial hydrogen density of samples 8.6.1, 8.7.1, and 8.5.3, measured using ASTM method E 1311, met the lower bound of 6.04E-02 g/cm{sup 3}; the average partial hydrogen density of samples 8.5.1, 8.6.3, and 8.7.3 did not. The average measured partial boron density of each sample met the lower bound of 1.65E-01 g/cm{sup 3}, measured by ASTM method C 1301. The samples, as received, were not wrapped in a moist towel as previous samples had been and appeared somewhat drier, which may explain the lower partial hydrogen density with respect to previous samples.

  18. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms, and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items, such as the breather inlet filter, are covered by a different design baseline; in this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix.

  19. Innovative technology summary report: Road Transportable Analytical Laboratory (RTAL)

    International Nuclear Information System (INIS)

    1998-10-01

The Road Transportable Analytical Laboratory (RTAL) has been used in support of US Department of Energy (DOE) site and waste characterization and remediation planning at the Fernald Environmental Management Project (FEMP) and is being considered for implementation at other DOE sites, including the Paducah Gaseous Diffusion Plant. The RTAL laboratory system consists of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific analysis needs. The prototype RTAL, deployed at FEMP Operable Unit 1 Waste Pits, has been designed to be synergistic with existing analytical laboratory capabilities, thereby reducing the occurrence of unplanned rush samples that are disruptive to efficient laboratory operations.

  20. Mechanical-Stress Analytical Modeling for the Design of Coils in Power Applications

    Directory of Open Access Journals (Sweden)

    Bellan D.

    2014-12-01

Modern electrical-power systems are often exploited for transmitting high-frequency carrier signals for communications purposes. Series-connected air-core coils are the fundamental component enabling such applications by providing proper filtering in the frequency domain. They must also be designed, however, to withstand the line short-circuit current. When a high-magnitude current flows through a coil, strong mechanical stresses are produced within the conductor, possibly damaging the coil. In this paper, an approximate analytical model is derived for the relationship between the maximum mechanical stress and the electrical/geometrical parameters of the coil. Such a model provides the guidelines for a fast and safe coil design, whereas numerical simulations are only needed for design refinement. The presented approach can be extended to other applications, such as the mechanical stress resulting from inrush currents in the coils of power transformers.

  1. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study.
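The weighted partial likelihood that corrects for biased sampling can be sketched for a single covariate as follows (hypothetical function, inverse-probability weights, and no tied event times assumed; not the authors' estimator):

```python
import numpy as np

def weighted_cox_loglik(beta, time, event, x, w):
    """Weighted Cox partial log-likelihood for a single covariate.

    Each subject with an observed event contributes
        w_i * (beta*x_i - log(sum_{j in risk set} w_j * exp(beta*x_j))),
    where w_i is the inverse of the subject's sampling probability under
    the outcome-dependent sampling design.
    """
    ll = 0.0
    for i in range(len(time)):
        if event[i]:
            at_risk = time >= time[i]  # risk set at the i-th event time
            denom = np.sum(w[at_risk] * np.exp(beta * x[at_risk]))
            ll += w[i] * (beta * x[i] - np.log(denom))
    return ll
```

Maximizing this function over beta (e.g. by Newton's method on its score) yields the weighted estimator; with all weights equal to one it reduces to the ordinary partial likelihood.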

  2. Designing Game Analytics For A City-Builder Game

    OpenAIRE

    Korppoo, Karoliina

    2015-01-01

The video game industry continues to grow. Competition is tough as games become more popular and easier for users to obtain, thanks to digital distribution and the social media platforms that support games. Because of readily available internet connections and the games that use them, data on player behaviour can be acquired. This is where game analytics comes in. What sort of player actions provide meaningful information that can be used to iterate on the game? Typically game analytics is appl...

  3. Analytical Solution for Optimum Design of Furrow Irrigation Systems

    Science.gov (United States)

    Kiwan, M. E.

    1996-05-01

An analytical solution for the optimum design of furrow irrigation systems is derived. The non-linear calculus optimization method is used to formulate a general form for designing the optimum system elements under circumstances that maximize the water application efficiency of the system during irrigation. Different system bases and constraints are considered in the solution. A full irrigation water depth is assumed to be achieved at the tail of the furrow line. The solution is based on neglecting the recession and depletion times after irrigation cutoff; this assumption is valid for open-end (free gradient) furrow systems rather than closed-end (closed dike) systems. Illustrative examples for different systems are presented, and the results are compared with the output obtained using an iterative numerical solution method. The final derived solution is expressed as a function of the furrow length ratio (the furrow length to the water travelling distance). The water-travelling function developed by Reddy et al. is used in reaching the optimum solution. As a practical result of the study, the optimum furrow elements for free gradient systems (furrow length, water inflow rate and cutoff irrigation time) can be estimated so as to achieve the maximum application efficiency.

  4. Is a pre-analytical process for urinalysis required?

    Science.gov (United States)

    Petit, Morgane; Beaudeux, Jean-Louis; Majoux, Sandrine; Hennequin, Carole

    2017-10-01

For reliable urinary measurement of calcium, phosphate and uric acid, a pre-analytical step of adding acid or base to urine samples at the laboratory is recommended in order to dissolve precipitated solutes. Several studies on different kinds of samples and analysers have previously shown that such pre-analytical treatment is unnecessary. The objective was to study the necessity of pre-analytical treatment of urine samples collected using the V-Monovette® (Sarstedt) system and measured on the Architect C16000 analyser (Abbott Diagnostics). Sixty urinary samples from hospitalized patients were selected (n=30 for calcium and phosphate, and n=30 for uric acid). After acidification of urine samples for measurement of calcium and phosphate, and alkalinisation for measurement of uric acid, differences between results before and after the pre-analytical treatment were compared to the acceptable limits recommended by the French Society of Clinical Biology (SFBC). No difference in concentration between before and after pre-analytical treatment of urine samples exceeded the acceptable limits from the SFBC for measurement of calcium and uric acid. For phosphate, only one sample exceeded these acceptable limits, showing a result paradoxically lower after acidification. In conclusion, in agreement with previous studies, our results show that acidification or alkalinisation of urine samples from 24-h urine collections or from single urinations is not a pre-analytical necessity for measurement of calcium, phosphate and uric acid.
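The comparison against SFBC-style acceptable limits can be sketched as a simple paired check (hypothetical function and percentage-based limit; the actual SFBC limits are analyte-specific):

```python
def exceeds_acceptable_limit(before, after, limit_pct):
    """Flag samples whose relative change after pre-analytical treatment
    exceeds the acceptable limit.

    Returns the indices i where |after_i - before_i| / before_i * 100
    is greater than limit_pct, mirroring a before/after comparison
    against an acceptable-limit criterion.
    """
    flagged = []
    for i, (b, a) in enumerate(zip(before, after)):
        if abs(a - b) / b * 100.0 > limit_pct:
            flagged.append(i)
    return flagged
```

If the returned list is empty for every analyte, the treatment step makes no analytically relevant difference, which is the study's conclusion for calcium and uric acid.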

  5. Designing Robust Process Analytical Technology (PAT) Systems for Crystallization Processes: A Potassium Dichromate Crystallization Case Study

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan

    2013-01-01

    The objective of this study is to test and validate a Process Analytical Technology (PAT) system design on a potassium dichromate crystallization process in the presence of input uncertainties using uncertainty and sensitivity analysis. To this end a systematic framework for managing uncertaintie...

  6. Core-shell nanoparticles optical sensors - Rational design of zinc ions fluorescent nanoprobes of improved analytical performance

    Science.gov (United States)

    Woźnica, Emilia; Gasik, Joanna; Kłucińska, Katarzyna; Kisiel, Anna; Maksymiuk, Krzysztof; Michalska, Agata

    2017-10-01

In this work the effect of the affinity of an analyte for a receptor on the response of nanostructural fluorimetric probes is discussed. Core-shell nanoparticle sensors are prepared that benefit from the properties of the phases involved, leading to improved analytical performance. The optical transduction system chosen is independent of pH; thus a change of sample pH can be used to control the analyte-receptor affinity through the "conditional" binding constant prevailing within the lipophilic phase. It is shown that by affecting the "conditional" binding constant the performance of the sensor can be fine-tuned. As expected, an increase in the "conditional" affinity of the ligand embedded in the lipophilic phase for the analyte results in higher sensitivity over a narrow concentration range (bulk reaction) and a sigmoidal response of emission intensity vs. the logarithm of concentration. To induce a linear dependence of emission intensity on the logarithm of analyte concentration covering a broad concentration range, a spatial confinement of the reaction zone is proposed through the application of core-shell nanostructures. The core material, polypyrrole nanospheres, is effectively impermeable to the analyte-ligand complex; thus the reaction is limited to the outer shell layer of the polymer, prepared from poly(maleic anhydride-alt-1-octadecene). For the system introduced herein, a linear dependence of emission intensity on the logarithm of Zn2+ concentration was obtained within the range from 10-7 to 10-1 M.

  7. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfies specific design criteria. The new design method will graphically show how the discrete

  8. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Science.gov (United States)

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood to be the reference markers of bone turnover for fracture risk prediction and monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily due to their poor within-subject and between-lab reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce the pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of available publications and pragmatic considerations to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I for its greater sample stability. Sample collection conditions for PINP are less critical, as PINP has minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. The uncontrollable aspects (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation.
Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical

  9. Designing for Student-Facing Learning Analytics

    Science.gov (United States)

    Kitto, Kirsty; Lupton, Mandy; Davis, Kate; Waters, Zak

    2017-01-01

    Despite a narrative that sees learning analytics (LA) as a field that aims to enhance student learning, few student-facing solutions have emerged. This can make it difficult for educators to imagine how data can be used in the classroom, and in turn diminishes the promise of LA as an enabler for encouraging important skills such as sense-making,…

  10. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  11. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Directory of Open Access Journals (Sweden)

    Brady T West

Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.

  12. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Science.gov (United States)

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817
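A common analytic error of the kind described above is treating a stratified sample as if it were a simple random sample. A minimal illustration of a design-based estimate of a mean and its standard error (hypothetical function and data; with-replacement variance approximation):

```python
import numpy as np

def stratified_mean_and_se(values, strata, pop_shares):
    """Design-based estimate of a population mean from a stratified sample.

    values: sample observations; strata: stratum label per observation;
    pop_shares: dict mapping stratum label -> population share W_h.
    The stratified mean is sum_h W_h * ybar_h and its (with-replacement
    approximation) variance is sum_h W_h**2 * s_h**2 / n_h.
    """
    values = np.asarray(values, dtype=float)
    strata = np.asarray(strata)
    mean, var = 0.0, 0.0
    for h, W in pop_shares.items():
        yh = values[strata == h]          # observations in stratum h
        mean += W * yh.mean()
        var += W ** 2 * yh.var(ddof=1) / len(yh)
    return mean, np.sqrt(var)
```

Ignoring the stratum weights and simply averaging the pooled sample gives a biased estimate whenever the sampling fractions differ across strata, which is exactly the failure mode the meta-analysis documents.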

  13. Analytical methods and laboratory facility for the Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Dewberry, R.A.; Lethco, A.J.; Denard, C.D.

    1985-01-01

This paper describes the analytical methods, instruments, and laboratory that will support vitrification of defense waste. The Defense Waste Processing Facility (DWPF) is now being constructed at the Savannah River Plant (SRP). Beginning in 1989, SRP high-level defense waste will be immobilized in borosilicate glass for disposal in a federal repository. The DWPF will contain an analytical laboratory for performing process control analyses. Additional analyses will be performed for process history and process diagnostics. The DWPF analytical facility will consist of a large shielded sampling cell, three shielded analytical cells, a laboratory for instrumental analysis and chemical separations, and a counting room. Special instrumentation is being designed for use in the analytical cells, including microwave drying/dissolution apparatus and remote pipetting devices. The instrumentation laboratory will contain inductively coupled plasma and atomic absorption spectrometers, a Mössbauer spectrometer, a carbon analyzer, and ion chromatography equipment. Counting equipment will include intrinsic germanium detectors, scintillation counters, Phoswich alpha/beta/gamma detectors, and a low-energy photon detector.

  14. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents, and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Development of reliable analytical methods for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is also described.

  15. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide, and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  16. Placement Design of Changeable Message Signs on Curved Roadways

    Directory of Open Access Journals (Sweden)

    Zhongren Wang, Ph.D. P.E. T.E.

    2015-01-01

This paper presents a fundamental framework for Changeable Message Sign (CMS) placement design along roadways with horizontal curves. The analytical framework determines the distance available for motorists to read and react to CMS messages based on CMS character height, the driver's cone of vision, the CMS pixels' cone of legibility, the horizontal curve radius, and the CMS lateral and vertical placement. Sample design charts were developed to illustrate how the analytical framework may facilitate CMS placement design.
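The placement geometry can be explored numerically: a sign is readable only while it is both within the legibility distance and inside the driver's cone of vision about the direction of travel. The sketch below is an illustrative model under simplifying assumptions (circular arc, sign at the end of the curve with a lateral offset, uniform step scan), not the paper's framework:

```python
import math

def available_reading_distance(radius, offset, legibility_dist,
                               cone_half_angle_deg, step=1.0, max_dist=500.0):
    """Estimate the length of roadway over which a CMS is readable.

    The driver travels a circular arc of the given radius toward a sign
    placed at the end of the arc, offset laterally to the outside of the
    curve. A position counts as readable when the sign is within the
    legibility distance AND the line of sight lies inside the driver's
    cone of vision about the tangent of travel. Illustrative model only.
    """
    sign = (radius + offset, 0.0)                   # curve center at origin
    cone = math.radians(cone_half_angle_deg)
    readable = 0.0
    d = step
    while d <= max_dist:
        theta = d / radius                          # arc angle upstream of sign
        px, py = radius * math.cos(theta), radius * math.sin(theta)
        tx, ty = math.sin(theta), -math.cos(theta)  # unit tangent (travel direction)
        sx, sy = sign[0] - px, sign[1] - py         # driver-to-sign sight vector
        dist = math.hypot(sx, sy)
        cos_a = max(-1.0, min(1.0, (tx * sx + ty * sy) / dist))
        if dist <= legibility_dist and math.acos(cos_a) <= cone:
            readable += step
        d += step
    return readable
```

Tightening the curve (smaller radius) shortens the readable window, which is the effect the sample design charts quantify.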

  17. Matrix effects break the LC behavior rule for analytes in LC-MS/MS analysis of biological samples.

    Science.gov (United States)

Fang, Nianbai; Yu, Shanggong; Ronis, Martin JJ; Badger, Thomas M

    2015-04-01

High-performance liquid chromatography (HPLC) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) are generally accepted as the preferred techniques for detecting and quantitating analytes of interest in biological matrices, on the basis of the rule that one chemical compound yields one LC peak with a reliable retention time (Rt). In the current study, however, we found that under the same LC-MS conditions, the Rt and shape of the LC peaks of bile acids in urine samples from animals fed dissimilar diets differed significantly. To verify this matrix effect, 17 authentic bile acid standards were dissolved in pure methanol or in methanol containing extracts of urine from pigs consuming either breast milk or infant formula and analyzed by LC-MS/MS. The matrix components in urine from piglets fed formula significantly reduced the Rt and areas of the bile acid LC peaks. This is the first characterization of this matrix effect on Rt in the literature. Moreover, the matrix effect resulted in an unexpected LC behavior: one single compound yielded two LC peaks, breaking the rule of one LC peak per compound. The three bile acid standards that exhibited this unconventional LC behavior were chenodeoxycholic acid, deoxycholic acid, and glycocholic acid. One possible explanation for this effect is that some matrix components may have bonded loosely to analytes, changing the time analytes were retained on the chromatography column and interfering with the ionization of analytes in the MS ion source to alter the peak area. 
This study indicates that a comprehensive understanding of matrix effects is needed towards improving the use of HPLC and LC-MS/MS techniques for qualitative and quantitative analyses of analytes in pharmacokinetics, proteomics/metabolomics, drug development, and sports drug testing, especially when LC-MS/MS data are analyzed by automation software where identification of an analyte is based on its exact molecular weight and Rt

  18. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine a knowledge discovery process, the concept of pedagogical knowledge, the ethics of learning analytics, and microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  19. Analysis of water and soil from the wetlands of Upper Three Runs Creek. Volume 2A, Analytical data packages September--October 1991 sampling

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, L.A.; Rogers, V.A. [Westinghouse Savannah River Co., Aiken, SC (United States); Riordan, C.J. [Metcalf and Eddy, Inc. (United States); Eidson, G.W.; Herring, M.K. [Normandeau Associates, Inc. (United States)

    1992-08-01

Shallow water and soils along Upper Three Runs Creek (UTRC) and associated wetlands between SRS Road F and Cato Road were sampled for nonradioactive and radioactive constituents. The sampling program is associated with risk evaluations being performed for various regulatory documents in these areas of the Savannah River Site (SRS). WSRC selected fifty sampling sites bordering the Mixed Waste Management Facility (MWMF), F- and H-Area Seepage Basins (FHSB), and the Sanitary Landfill (SL). The analytical results from this study provided information on the water and soil quality in UTRC and its associated wetlands. The analytical results from this investigation indicated that the primary constituents and radiological indicators detected in the shallow water and soils were tritium, gross alpha, radium-226, total radium, and strontium-90. This investigation involved the collection of shallow water samples during the Fall of 1991 and the Spring of 1992 at fifty (50) sampling locations. Sampling was performed during these periods to incorporate high and low water table periods. Samples were collected from three sections along UTRC denoted as Phase I (MWMF), Phase II (FHSB) and Phase III (SL). One vibracored soil sample was also collected in each phase during the Fall of 1991. This document consists solely of the experimental data obtained from the sampling procedures.

  20. Analysis of water and soil from the wetlands of Upper Three Runs Creek. Volume 2B: Analytical data packages, January--February 1992 sampling

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, L.A.; Rogers, V.A. [Westinghouse Savannah River Co., Aiken, SC (United States); Riordan, C.J. [Metcalf and Eddy (United States); Eidson, G.W.; Herring, M.K. [Normandeau Associates, Inc., Aiken, SC (United States)

    1992-08-01

Shallow water and soils along Upper Three Runs Creek (UTRC) and associated wetlands between SRS Road F and Cato Road were sampled for nonradioactive and radioactive constituents. The sampling program is associated with risk evaluations being performed for various regulatory documents in these areas of the Savannah River Site (SRS). WSRC selected fifty sampling sites bordering the Mixed Waste Management Facility (MWMF), F- and H-Area Seepage Basins (FHSB), and the Sanitary Landfill (SL). The analytical results from this study provided information on the water and soil quality in UTRC and its associated wetlands. The analytical results from this investigation indicated that the primary constituents and radiological indicators detected in the shallow water and soils were tritium, gross alpha, radium-226, total radium, and strontium-90. This investigation involved the collection of shallow water samples during the Fall of 1991 and the Spring of 1992 at fifty (50) sampling locations. Sampling was performed during these periods to incorporate high and low water table periods. Samples were collected from three sections along UTRC denoted as Phase I (MWMF), Phase II (FHSB) and Phase III (SL). One vibracored soil sample was also collected in each phase during the Fall of 1991. This document consists of the experimental data obtained from the sampling procedures.

  1. A factor analytic investigation of the Tripartite model of affect in a clinical sample of young Australians

    Directory of Open Access Journals (Sweden)

    Cosgrave Elizabeth M

    2008-09-01

Background: The Mood and Anxiety Symptom Questionnaire (MASQ) was designed to specifically measure the Tripartite model of affect and is proposed to offer a delineation between the core components of anxiety and depression. Factor analytic data from adult clinical samples have shown mixed results; however, no studies employing confirmatory factor analysis (CFA) have supported the predicted structure of distinct Depression, Anxiety and General Distress factors. The Tripartite model has not been validated in a clinical sample of older adolescents and young adults. The aim of the present study was to examine the validity of the Tripartite model using scale-level data from the MASQ and correlational and confirmatory factor analysis techniques. Methods: 137 young people (M = 17.78, SD = 2.63) referred to a specialist mental health service for adolescents and young adults completed the MASQ and a diagnostic interview. Results: All MASQ scales were highly inter-correlated, with the lowest correlation between the depression- and anxiety-specific scales (r = .59). This pattern of correlations was observed for all participants meeting criteria for an Axis-I disorder but not for participants without a current disorder (r = .18). Confirmatory factor analyses were conducted to evaluate the model fit of a number of solutions. The predicted Tripartite structure was not supported. A 2-factor model demonstrated superior model fit and parsimony compared to 1- or 3-factor models. These broad factors represented Depression and Anxiety and were highly correlated (r = .88). Conclusion: The present data lend support to the notion that the Tripartite model does not adequately explain the relationship between anxiety and depression in all clinical populations. Indeed, in the present study this model was found to be inappropriate for a help-seeking community sample of older adolescents and young adults.

  2. eAnalytics: Dynamic Web-based Analytics for the Energy Industry

    Directory of Open Access Journals (Sweden)

    Paul Govan

    2016-11-01

    Full Text Available eAnalytics is a web application built on top of R that provides dynamic data analytics to energy industry stakeholders. The application allows users to dynamically manipulate chart data and style through the Shiny package’s reactive framework. eAnalytics currently supports a number of features including interactive datatables, dynamic charting capabilities, and the ability to save, download, or export information for further use. Going forward, the goal for this project is that it will serve as a research hub for discovering new relationships in the data. The application is illustrated with a simple tutorial of the user interface design.

  3. Analytical Validation of a New Enzymatic and Automatable Method for d-Xylose Measurement in Human Urine Samples

    Directory of Open Access Journals (Sweden)

    Israel Sánchez-Moreno

    2017-01-01

    Full Text Available Hypolactasia, or intestinal lactase deficiency, affects more than half of the world population. Currently, xylose quantification in urine after gaxilose oral administration for the noninvasive diagnosis of hypolactasia is performed with the hand-operated, nonautomatable phloroglucinol reaction. This work demonstrates that a new enzymatic xylose quantification method, based on the activity of xylose dehydrogenase from Caulobacter crescentus, represents an excellent alternative to the manual phloroglucinol reaction. The new method is automatable and facilitates the use of the gaxilose test for hypolactasia diagnosis in clinical practice. The analytical validation of the new technique was performed in three different autoanalyzers, using buffer or urine samples spiked with different xylose concentrations. For the comparison between the phloroglucinol and the enzymatic assays, 224 urine samples of patients to whom the gaxilose test had been prescribed were assayed by both methods. A mean bias of −16.08 mg of xylose was observed when comparing the results obtained by both techniques. After adjusting the cut-off of the enzymatic method to 19.18 mg of xylose, the Kappa coefficient was found to be 0.9531, indicating an excellent level of agreement between both analytical procedures. This new assay represents the first automatable enzymatic technique validated for xylose quantification in urine.
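    The level of agreement between the two assays is summarized above by a Kappa coefficient. As a generic illustration of how such a statistic is computed, here is a minimal sketch of Cohen's kappa for two paired binary (positive/negative) methods; the counts are invented for illustration and are not the study's data.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired binary ratings (lists of 0/1)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_a1 = sum(a) / n                                      # rate of positives, method A
    p_b1 = sum(b) / n                                      # rate of positives, method B
    p_exp = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)          # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical positive/negative calls from two assays on the same samples
phloroglucinol = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
enzymatic      = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
print(round(cohens_kappa(phloroglucinol, enzymatic), 3))  # → 0.8
```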

  4. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments and prepare a validated RCRA-type data package

  5. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    Full Text Available In this study, comparison has been made for different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSU). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSU), i.e., households, are selected by systematic sampling with a random start. They have used a single study variable. We have compared the HIES technique with some other designs, which are: Stratified Simple Random Sampling, Stratified Systematic Sampling, Stratified Ranked Set Sampling, and Stratified Two-Phase Sampling. Ratio and regression methods were applied with two study variables, which are: Income (y) and Household size (x). Jackknife and bootstrap are used for replication-based variance estimation. Simple Random Sampling with sample sizes (462 to 561) gave moderate variances both by jackknife and bootstrap. By applying Systematic Sampling, we obtained moderate variance with sample size (467). In jackknife with Systematic Sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes (467 to 631). At sample size (952) the variance of the ratio estimator becomes greater than that of the regression estimator. The most efficient design turns out to be Ranked Set Sampling compared with the other designs. Ranked Set Sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-Phase Sampling gave poor performance. Multi-stage sampling applied by HIES gave large variances, especially if used with a single study variable.
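    The ratio and regression estimators compared above have simple closed forms, and bootstrap variance estimation can be sketched in a few lines. A generic illustration with invented income and household-size data (the numbers and the known population mean are assumptions, not the HIES figures):

```python
import random

def ratio_estimate(y, x, X_mean):
    """Ratio estimator of the population mean of y using auxiliary x."""
    return X_mean * (sum(y) / sum(x))

def regression_estimate(y, x, X_mean):
    """Simple linear regression estimator of the population mean of y."""
    n = len(y)
    xb, yb = sum(x) / n, sum(y) / n
    b = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y)) / \
        sum((xi - xb) ** 2 for xi in x)
    return yb + b * (X_mean - xb)

def bootstrap_var(estimator, y, x, X_mean, B=2000, seed=1):
    """Bootstrap variance of an estimator over resampled (y, x) pairs."""
    rng = random.Random(seed)
    n = len(y)
    reps = []
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]
        reps.append(estimator([y[i] for i in idx], [x[i] for i in idx], X_mean))
    m = sum(reps) / B
    return sum((r - m) ** 2 for r in reps) / (B - 1)

# Hypothetical sample: y = household income, x = household size
x = [3, 5, 4, 6, 2, 7, 4, 5, 3, 6]
y = [30, 52, 41, 63, 22, 70, 39, 48, 33, 58]
X_mean = 4.4  # assumed known population mean of household size
print(bootstrap_var(ratio_estimate, y, x, X_mean),
      bootstrap_var(regression_estimate, y, x, X_mean))
```

Comparing the two bootstrap variances for a given sample mirrors the design comparison described in the abstract.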

  6. Analytical validation of Gentian NGAL particle-enhanced turbidimetric immunoassay (PETIA)

    Directory of Open Access Journals (Sweden)

    Gian Luca Salvagno

    2017-08-01

    Full Text Available Objectives: This study was designed to validate the analytical performance of the new Gentian particle-enhanced turbidimetric immunoassay (PETIA) for measuring neutrophil gelatinase-associated lipocalin (NGAL) in serum samples. Design and methods: Analytical validation of the Gentian NGAL assay was carried out on a Roche Cobas c501 and was based on assessment of limit of blank (LOB), limit of detection (LOD), functional sensitivity, imprecision, linearity and concordance with the BioPorto NGAL test. Results: The LOB and LOD of Gentian NGAL were found to be 3.8 ng/mL and 6.3 ng/mL, respectively. An analytical coefficient of variation (CV) of 20% corresponded to a NGAL value of 10 ng/mL. The intra-assay and inter-assay imprecision (CV) was between 0.4 and 5.2% and 0.6 and 7.1%, and the total imprecision (CV) was 3.7%. The linearity was optimal at NGAL concentrations between 37 and 1420 ng/mL (r=1.00; p<0.001). An excellent correlation was observed between values measured with Gentian NGAL and BioPorto NGAL in 74 routine serum samples (r=0.993). The mean percentage bias of the Gentian assay versus the BioPorto assay was +3.1% (95% CI, +1.6% to +4.5%). Conclusions: These results show that Gentian NGAL may be a viable alternative to other commercial immunoassays for both routine and urgent assessment of serum NGAL. Keywords: Neutrophil gelatinase-associated lipocalin, NGAL, Analytical validation, Acute kidney injury
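    Figures of merit such as LOB and LOD are conventionally estimated from replicate measurements of blank and low-concentration samples (the parametric approach of CLSI EP17). A minimal sketch with invented replicate values, not the study's measurements:

```python
from statistics import mean, stdev

def lob(blank_results):
    """Limit of blank: mean + 1.645 * SD of blank replicates (parametric)."""
    return mean(blank_results) + 1.645 * stdev(blank_results)

def lod(blank_results, low_sample_results):
    """Limit of detection: LoB + 1.645 * SD of a low-concentration sample."""
    return lob(blank_results) + 1.645 * stdev(low_sample_results)

# Invented replicate measurements (ng/mL), for illustration only
blanks = [1.1, 0.8, 1.4, 0.9, 1.2, 1.0]
lows   = [5.9, 6.6, 6.1, 6.8, 6.2, 6.4]
print(round(lob(blanks), 2), round(lod(blanks, lows), 2))
```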

  7. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
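    The Monte Carlo comparison of sampling designs described above can be sketched generically: draw repeated samples from a known population of angler counts and compare the mean square error of the expanded estimates. The population values below are invented for illustration; this is not the study's simulation code.

```python
import random

def srs(population, n, rng):
    """Simple random sample of size n without replacement."""
    return rng.sample(population, n)

def systematic(population, n, rng):
    """Systematic sample: every k-th unit from a random start."""
    k = len(population) // n
    start = rng.randrange(k)
    return population[start::k][:n]

def mse_of_total(design, population, n, sims=3000, seed=7):
    """Monte Carlo MSE of the expanded-total estimate under a design."""
    rng = random.Random(seed)
    true_total = sum(population)
    N = len(population)
    errs = []
    for _ in range(sims):
        s = design(population, n, rng)
        est = N * sum(s) / len(s)  # expansion estimator of the total
        errs.append((est - true_total) ** 2)
    return sum(errs) / sims

# Hypothetical hourly angler counts over a 100-hour fishery
rng0 = random.Random(0)
counts = [max(0, int(rng0.gauss(20, 8))) for _ in range(100)]
print(mse_of_total(srs, counts, 10), mse_of_total(systematic, counts, 10))
```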

  8. Sample triage : an overview of Environment Canada's program

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, P.; Goldthorp, M.; Fingas, M. [Environment Canada, Ottawa, ON (Canada). Emergencies Science and Technology Division, Environmental Technology Centre, Science and Technology Branch

    2006-07-01

    The Chemical, biological and radiological/nuclear Research and Technology Initiative (CRTI) is a program led by Canada's Department of National Defence in an effort to improve the capability of providing technical and analytical support in the event of a terrorist-related event. This paper summarized the findings from the CRTI Sample Triage Working Group and reviewed information on Environment Canada's triage program and its mobile sample inspection facility, which was designed to help examine samples of hazardous materials in a controlled environment to minimize the risk of exposure. A sample triage program is designed to deal with administrative, health and safety issues by facilitating the safe transfer of samples to an analytical laboratory. It refers to the collation of all results, including field screening information, intelligence and observations, for the purpose of prioritizing and directing the sample to the appropriate laboratory for analysis. A central component of Environment Canada's Emergency Response Program has been its capacity to respond on site during an oil or chemical spill. As such, the Emergencies Science and Technology Division acquired a new mobile sample inspection facility in 2004. It is constructed to work with a custom-designed decontamination unit and Ford F450 tow vehicle. The criteria and general design of the trailer facility were described. This paper also outlined the steps taken following a spill of hazardous materials into the environment so that potentially dangerous samples could be safely assessed. Several field trials will be carried out in order to develop standard operating procedures for the mobile sample inspection facility. 6 refs., 6 figs., 4 appendices.

  9. Double-contained receiver tank 244-TX, grab samples, 244TX-97-1 through 244TX-97-3 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for the double-contained receiver tank (DCRT) 244-TX grab samples. Three grab samples were collected from riser 8 on May 29, 1997. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO). The analytical results are presented in a table

  10. Squid-inspired vehicle design using coupled fluid-solid analytical modeling

    Science.gov (United States)

    Giorgio-Serchi, Francesco; Weymouth, Gabriel

    2017-11-01

    The need for enhanced automation in the marine and maritime fields is fostering research into robust and highly maneuverable autonomous underwater vehicles. To address these needs we develop design principles for a new generation of soft-bodied aquatic vehicles similar to octopi and squids. In particular, we consider the capability of pulsed-jetting bodies to boost thrust by actively modifying their external body-shape and in this way benefit from the contribution of added-mass variation. We present an analytical formulation of the coupled fluid-structure interaction between the elastic body and the ambient fluid. The model incorporates a number of new salient contributions to the soft-body dynamics. We highlight the role of added-mass variation effects of the external fluid in enhancing thrust and assess how the shape-changing actuation is impeded by a confinement-related unsteady inertial term and by an external shape-dependent fluid stiffness contribution. We show how the analysis of these combined terms has guided us to the design of a new prototype of a squid-inspired vehicle in which the natural frequency of the coupled fluid-solid system is tuned to optimize its actuation routine.

  11. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. 
Both placed traps by expert opinion and moved traps between capture

  12. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km(2) cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions
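    Abundance estimation under the equal-capture-probability assumption discussed above is classically done with the Lincoln-Petersen estimator; Chapman's bias-corrected form for two capture sessions is sketched below with hypothetical counts (not the study's data).

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator:
    n1 animals marked in session 1, n2 captured in session 2,
    m2 of which were recaptures (already marked)."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical two-session capture data for a bear population
print(chapman_estimate(30, 25, 12))  # → 61.0
```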

  13. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. 
Both placed traps by expert opinion and moved traps between capture sessions, which

  14. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Full Text Available Analytical chemistry has experienced, as well as other areas of science, a big change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Now, nanotechnology is increasingly proving to be a powerful ally of analytical chemistry to achieve its objectives, and to simplify analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties including large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly for sample preparation techniques (magnetic solid-phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotube, graphene, polymer, octadecylsilane) and its automation, microextraction techniques, enantioseparation and chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation and chemosensors. Also, some selected articles recently published (2010-2016) have been reviewed and discussed.

  15. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Peixin; Chai, Feng [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Bi, Yunlong [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Pei, Yulong, E-mail: peiyulong1@163.com [Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China); Cheng, Shukang [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001 (China); Department of Electrical Engineering, Harbin Institute of Technology, Harbin 150001 (China)

    2016-11-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization. - Highlights: • The no-load magnetic field of spoke-type motors is calculated by an analytical method for the first time. • A magnetic circuit model and iterative method are employed to calculate the permeability. • The analytical expression for each subdomain is derived. • The proposed method can effectively reduce the duration of the predesign stages.
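    For reference, the separation-of-variables solution used in such subdomain methods takes a standard form: in each annular subdomain the scalar potential satisfies Laplace's equation in polar coordinates, whose general separated solution is a harmonic series (notation here is generic, not the paper's):

```latex
\nabla^2 \varphi
 = \frac{\partial^2 \varphi}{\partial r^2}
 + \frac{1}{r}\frac{\partial \varphi}{\partial r}
 + \frac{1}{r^2}\frac{\partial^2 \varphi}{\partial \theta^2} = 0,
\qquad
\varphi(r,\theta) = A_0 + B_0 \ln r
 + \sum_{n=1}^{\infty}\left(A_n r^{\,n} + B_n r^{-n}\right)
   \left(C_n \cos n\theta + D_n \sin n\theta\right)
```

The coefficients are then fixed by matching boundary conditions on the interfaces between adjacent subdomains.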

  16. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    International Nuclear Information System (INIS)

    Liang, Peixin; Chai, Feng; Bi, Yunlong; Pei, Yulong; Cheng, Shukang

    2016-01-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization. - Highlights: • The no-load magnetic field of spoke-type motors is calculated by an analytical method for the first time. • A magnetic circuit model and iterative method are employed to calculate the permeability. • The analytical expression for each subdomain is derived. • The proposed method can effectively reduce the duration of the predesign stages.

  17. Analytic free-form lens design for imaging applications with high aspect ratio

    Science.gov (United States)

    Duerr, Fabian; Benítez, Pablo; Miñano, Juan Carlos; Meuret, Youri; Thienpont, Hugo

    2012-10-01

    A new three-dimensional analytic optics design method is presented that enables the coupling of three ray sets with only two free-form lens surfaces. Closely related to the Simultaneous Multiple Surface method in three dimensions (SMS3D), it is derived directly from Fermat's principle, leading to multiple sets of functional differential equations. The general solution of these equations makes it possible to calculate more than 80 coefficients for each implicit surface function. Ray tracing simulations of these free-form lenses demonstrate superior imaging performance for applications with high aspect ratio, compared to conventional rotational symmetric systems.

  18. Analytical Chemistry Laboratory (ACL) procedure compendium. Volume 1, Administrative

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, single shell tank (PST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware.

  19. Analytical Methods for Cs-137 and Other Radionuclides in Solvent Samples

    International Nuclear Information System (INIS)

    Pennebaker, F.M.

    2002-01-01

    Accurate characterization of individual waste components is critical to ensure design and operation of effective treatment processes and compliance with waste acceptance criteria. Current elemental analysis of organic matrices consists of converting the organic sample to an aqueous matrix by digestion, which is inadequate in many cases. Direct analysis of the organic would increase sensitivity and decrease contamination and analysis time. For this project, we evaluated an Aridus membrane-desolvation sample introduction system for the direct analysis of organic solvents by Inductively Coupled Plasma - Mass Spectrometry (ICP-MS). The desolvator-ICP-MS successfully analyzed solvent from the caustic-side solvent extraction (CSSX) process and tri-butyl phosphate (TBP) organic tank waste from F-canyon for a variety of elements. Detection limits for most elements were determined in the part per trillion (ppt) range. This technology should increase accuracy in support of SRTC activities involving CSSX and other site processes involving organic compounds

  20. Ethical and Privacy Issues in the Design of Learning Analytics Applications

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hoel, Tore; Cooper, Adam; Kismihok, Gabor; Berg, Alan; Scheffel, Maren; Chen, Weiqin; Ferguson, Rebecca

    2017-01-01

    Issues related to Ethics and Privacy have become a major stumbling block in application of Learning Analytics technologies on a large scale. Recently, the learning analytics community at large has more actively addressed the EP4LA issues, and we are now starting to see learning analytics solutions

  1. 40 CFR 91.421 - Dilute gaseous exhaust sampling and analytical system description.

    Science.gov (United States)

    2010-07-01

    ... Pump—Constant Volume Sampler (PDP-CVS) system with a heat exchanger, or a Critical Flow Venturi... gas mixture temperature, measured at a point immediately ahead of the critical flow venturi, must be.... (a) General. The exhaust gas sampling system described in this section is designed to measure the...

  2. A new analytical application of nylon-induced room-temperature phosphorescence: Determination of thiabendazole in water samples

    Energy Technology Data Exchange (ETDEWEB)

    Correa, R.A. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario, Suipacha 531 (2000) Rosario (Argentina); Escandar, G.M. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario, Suipacha 531 (2000) Rosario (Argentina)]. E-mail: gescanda@fbioyf.unr.edu.ar

    2006-06-30

    This paper discusses the first analytical determination of the widely used fungicide thiabendazole by nylon-induced phosphorimetry. Nylon was investigated as a novel solid-matrix for inducing room-temperature phosphorescence of thiabendazole, which was enhanced under the effect of external heavy-atom salts. Among the investigated salts, lead(II) acetate was the most effective in yielding a high phosphorescence signal. An additional enhancement of the phosphorescence emission was attained when the measurements were carried out under a nitrogen atmosphere. There was only a moderate increase in the presence of cyclodextrins. The room-temperature phosphorescence lifetimes of the adsorbed thiabendazole were measured under different working conditions and, in all cases, two decaying components were detected. On the basis of the obtained results, a very simple and sensitive phosphorimetric method for the determination of thiabendazole was established. The analytical figures of merit obtained under the best experimental conditions were: linear calibration range from 0.031 to 0.26 µg ml⁻¹ (the lowest value corresponds to the quantitation limit), relative standard deviation, 2.4% (n = 5) at a level of 0.096 µg ml⁻¹, and limit of detection calculated according to 1995 IUPAC Recommendations equal to 0.010 µg ml⁻¹ (0.03 ng/spot). The potential interference from common agrochemicals was also studied. The feasibility of determining thiabendazole in real samples was successfully evaluated through the analysis of spiked river, tap and mineral water samples.
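    Detection limits of the kind reported above are conventionally computed as a multiple of the blank signal's standard deviation divided by the calibration slope. A minimal sketch with invented blank readings and slope; the 1995 IUPAC Recommendations define the statistic more rigorously, and k = 3 is only the common choice:

```python
from statistics import stdev

def lod_iupac(blank_signals, slope, k=3):
    """Detection limit as k * SD of blank signals divided by the
    calibration slope (the common simplified form; k = 3)."""
    return k * stdev(blank_signals) / slope

# Invented phosphorescence blank readings and a hypothetical calibration slope
blanks = [2.1, 1.9, 2.3, 2.0, 2.2, 1.8]
slope = 55.0  # signal units per (µg/ml), assumed for illustration
print(round(lod_iupac(blanks, slope), 4))
```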

  3. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  4. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity.
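    The contrast between analytical and sampling-based uncertainty propagation can be illustrated with a toy linear model: for a dose that is a weighted sum of independent Gaussian inputs, the output moments are available in closed form and can be checked against random sampling. This is only a schematic analogue of APM; the weights and uncertainties below are invented.

```python
import random

def analytic_moments(w, mu, sigma):
    """Closed-form mean and std of d = sum(w_i * x_i) for independent
    Gaussian inputs x_i ~ N(mu_i, sigma_i^2) (toy analogue of APM)."""
    m = sum(wi * mi for wi, mi in zip(w, mu))
    s = sum((wi * si) ** 2 for wi, si in zip(w, sigma)) ** 0.5
    return m, s

def sampled_moments(w, mu, sigma, n=50000, seed=3):
    """Monte Carlo estimate of the same moments by random sampling."""
    rng = random.Random(seed)
    vals = [sum(wi * rng.gauss(mi, si) for wi, mi, si in zip(w, mu, sigma))
            for _ in range(n)]
    m = sum(vals) / n
    s = (sum((v - m) ** 2 for v in vals) / (n - 1)) ** 0.5
    return m, s

# Toy pencil-beam weights and per-beam Gaussian input uncertainties (hypothetical)
w = [0.5, 1.2, 0.8]
mu = [1.0, 2.0, 1.5]
sigma = [0.1, 0.2, 0.15]
print(analytic_moments(w, mu, sigma))
print(sampled_moments(w, mu, sigma))
```

The closed-form result is exact and constant-time, while the sampled moments converge to it only as the number of scenarios grows, which is the trade-off the abstract describes.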

  5. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

    A turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was designed. Its function is the automatic sending of samples contained in polyethylene capsules, via a pneumatic system, for irradiation in the grid position of the reactor core and subsequent analysis by neutron activation. This study presents the structural design analysis and the calculations used in selecting motors and actuators. This development will improve analysis efficiency, reducing both manual handling by the workers and their radiation exposure time. (authors)

  6. Analytical Model of the Nonlinear Dynamics of Cantilever Tip-Sample Surface Interactions for Various Acoustic-Atomic Force Microscopies

    Science.gov (United States)

    Cantrell, John H., Jr.; Cantrell, Sean A.

    2008-01-01

    A comprehensive analytical model of the interaction of the cantilever tip of the atomic force microscope (AFM) with the sample surface is developed that accounts for the nonlinearity of the tip-surface interaction force. The interaction is modeled as a nonlinear spring coupled at opposite ends to linear springs representing cantilever and sample surface oscillators. The model leads to a pair of coupled nonlinear differential equations that are solved analytically using a standard iteration procedure. Solutions are obtained for the phase and amplitude signals generated by various acoustic-atomic force microscope (A-AFM) techniques including force modulation microscopy, atomic force acoustic microscopy, ultrasonic force microscopy, heterodyne force microscopy, resonant difference-frequency atomic force ultrasonic microscopy (RDF-AFUM), and the commonly used intermittent contact mode (TappingMode) generally available on AFMs. The solutions are used to obtain a quantitative measure of image contrast resulting from variations in the Young modulus of the sample for the amplitude and phase images generated by the A-AFM techniques. Application of the model to RDF-AFUM and intermittent soft contact phase images of LaRC-cp2 polyimide polymer is discussed. The model predicts variations in the Young modulus of the material of 24 percent from the RDF-AFUM image and 18 percent from the intermittent soft contact image. Both predictions are in good agreement with the literature value of 21 percent obtained from independent, macroscopic measurements of sheet polymer material.

  7. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134
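
    A minimal sketch of the weighted partial likelihood idea for a single covariate (Breslow-type risk sets, no tied event times) might look as follows; the Newton-Raphson solver and the data in the test are illustrative, not the authors' implementation.

```python
import math

def weighted_cox_beta(times, events, x, w, iters=50):
    # One-covariate Cox fit from a weighted partial likelihood,
    # solved by Newton-Raphson. w[i] would be the inverse probability
    # that subject i was included under the biased sampling scheme.
    order = sorted(range(len(times)), key=lambda i: times[i])
    beta = 0.0
    for _ in range(iters):
        score = info = 0.0
        s0 = s1 = s2 = 0.0  # weighted risk-set sums, built from latest time down
        for i in reversed(order):
            r = w[i] * math.exp(beta * x[i])
            s0 += r
            s1 += r * x[i]
            s2 += r * x[i] * x[i]
            if events[i]:
                score += w[i] * (x[i] - s1 / s0)
                info += w[i] * (s2 / s0 - (s1 / s0) ** 2)
        beta += score / info
    return beta
```

    Rescaling all weights by a common factor leaves the estimate unchanged, since the factor cancels in the score equation.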

  8. Analytical design method for a truss-bolt system for reinforcement of fractured coal mine roofs - illustrated with a case study

    Energy Technology Data Exchange (ETDEWEB)

    Liu, B.; Yue, Z.Q.; Tham, L.G. [University of Hong Kong, Hong Kong (China). Dept. of Civil Engineering

    2005-02-01

    This paper presents an analytical design method for the truss-bolt system used to reinforce underground fractured rock roofs in coal mines. The method is based on the mechanical analysis of the fractured rock roof reinforced by inclined roof bolts and a horizontal tie-rod. The mechanical analysis includes a non-linear bending model for the inclined roof bolts, together with upper and lower bounds on the truss pre-tightening force. The lateral resistance of the inclined roof bolts in a truss-bolt-supported roadway is examined using the classical theory of a non-linear beam in bending. The paper analyses the arching action produced by the lateral behavior of the inclined roof bolts in reinforcing the fractured roof. Based on these mechanical models, design formulas for the lateral bolt forces, the tie-rod tension in the truss system and the reinforcement behavior have been derived. To ensure that the roof truss-bolt system reinforces the coal roof effectively, a lower-bound pre-tightening force must be applied to the tie-rod to stabilize the fractured roof by arching action. The pre-tightening force exerted via the tie-rod also cannot exceed its upper bound, since an excessive tightening force will cause localized failure in the rock near the bolt tail at the abutment of the fractured roof beam. Analytical formulas for both the lower and upper bounds of the truss pre-tightening force are put forward in this paper. Furthermore, the paper also presents analytical equations for designing the axial forces and dimensions of the bolts in this kind of system.

  9. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure, since it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack thereof, of the entire process. Eliminating sample treatment steps, reducing the amount of sample required, strongly cutting the consumption of hazardous reagents and energy, avoiding large amounts of organic solvents, and maximizing safety for operators and the environment together form the basis for greening sample preparation and analytical methods. In the last decade, the development and use of greener and more sustainable microextraction techniques has emerged as an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid-phase microextraction, stir bar sorptive extraction, hollow-fiber liquid-phase microextraction, dispersive liquid-liquid microextraction, etc.) will be presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. A review of blood sample handling and pre-processing for metabolomics studies.

    Science.gov (United States)

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exists a fundamental need to consider pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the variation introduced by different sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    Science.gov (United States)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily on exposure to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient and validated analytical method for NIF and its degradants. A 2² full factorial design with a curvature (center point) was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors evaluated against the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Altering the MPC significantly affected retention time, while increasing the FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1 - 16 µg/mL, showing good linearity, precision and accuracy, and is efficient, with an analysis time within 10 min.
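
    The effect estimates in a 2² full factorial design with a center point, as described here, can be sketched as follows; the response values in the test are invented for illustration and are not the study's data.

```python
def factorial_effects(runs):
    # runs: list of (a, b, y) with coded factor levels a, b in {-1, +1}.
    # Each effect is the contrast sum scaled by half the number of runs.
    n = len(runs)
    eff_a = sum(a * y for a, b, y in runs) / (n / 2.0)
    eff_b = sum(b * y for a, b, y in runs) / (n / 2.0)
    eff_ab = sum(a * b * y for a, b, y in runs) / (n / 2.0)
    return eff_a, eff_b, eff_ab

def curvature(runs, center_responses):
    # Difference between the mean center-point response and the factorial
    # mean; a large value signals that a first-order model is inadequate.
    f_mean = sum(y for _, _, y in runs) / len(runs)
    c_mean = sum(center_responses) / len(center_responses)
    return c_mean - f_mean
```
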

  12. [Pre-analytical stability before centrifugation of 7 biochemical analytes in whole blood].

    Science.gov (United States)

    Perrier-Cornet, Andreas; Moineau, Marie-Pierre; Narbonne, Valérie; Plee-Gautier, Emmanuelle; Le Saos, Fabienne; Carre, Jean-Luc

    2015-01-01

    The pre-analytical stability of 7 biochemical parameters (parathyroid hormone -PTH-, vitamins A, C, E and D, 1,25-dihydroxyvitamin D and insulin) at +4°C was studied in whole blood samples before centrifugation. The impact of freezing at -20°C was also assessed for PTH and vitamin D. Assay results for whole blood samples from 9 healthy adults, kept for different intervals between sampling and analysis, were compared using a Student t test. The 7 analytes investigated remained stable for up to 4 hours at +4°C in whole blood. This study showed that it is possible to accept uncentrifuged whole blood specimens kept at +4°C before analysis. PTH is affected by freezing whereas vitamin D is not.
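
    A paired Student t test of this kind, comparing baseline and delayed measurements of the same specimens, can be sketched as follows; the data in the test are invented, and in practice |t| would be compared to a t critical value with n-1 degrees of freedom.

```python
import math

def paired_t(before, after):
    # Paired Student t statistic for measurements of the same samples
    # at two time points (e.g. immediately and after delayed analysis).
    d = [b - a for a, b in zip(before, after)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    return mean / math.sqrt(var / n)
```
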

  13. An analytical approach for optimizing the leaf design of a multi-leaf collimator in a linear accelerator

    International Nuclear Information System (INIS)

    Topolnjak, R; Heide, U A van der

    2008-01-01

    In this study, we present an analytical approach for optimizing the leaf design of a multi-leaf collimator (MLC) in a linear accelerator. Because leaf designs vary between vendors, our goal is to characterize and quantify the effects of the different compromises that have to be made between performance parameters. Subsequently, an optimal leaf design is determined for an earlier proposed six-bank MLC which combines a high-resolution field-shaping ability with a large field size. To this end a model of the linac is created that includes the following parameters: the source size, the maximum field size, the distance between source and isocenter, and the leaf design parameters. First, the optimal radius of the leaf tip was found. This optimum was defined by the requirement that the fluence intensity should fall from 80% of the maximum value to 20% over a minimal distance, defining the width of the fluence penumbra. A second requirement was that this penumbra width should be constant as a leaf moves from one side of the field to the other. The geometric, transmission and total penumbra widths (80-20%) were calculated as functions of the design parameters. The analytical model is in agreement with Elekta, Varian and Siemens collimator designs. For leaves thinner than 4 cm, the transmission penumbra becomes dominant, and for leaves close to the source the geometric penumbra plays a role. Finally, by choosing leaf thicknesses of 3.5 cm, 4 cm and 5 cm from the lowest to the highest bank, respectively, an optimal leaf design for a six-bank MLC is achieved.
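
    As a hedged illustration of the geometric penumbra concept only: the 80-20% width of a field edge blurred by a Gaussian source, projected from the collimator plane to the isocenter plane, can be computed numerically. The parameter names and geometry below are assumptions for illustration, not the authors' linac model.

```python
import math

def fluence(x, sigma_p):
    # Step field edge blurred by a projected Gaussian source: an erf profile.
    return 0.5 * (1.0 + math.erf(x / (sigma_p * math.sqrt(2.0))))

def crossing(level, sigma_p, lo=-50.0, hi=50.0):
    # Bisection for the position where the fluence crosses a given level.
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if fluence(mid, sigma_p) < level:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def penumbra_80_20(source_sigma_mm, scd_mm, sad_mm):
    # Geometric penumbra: the source sigma projected from the collimator
    # plane (source-collimator distance) to the isocenter plane.
    sigma_p = source_sigma_mm * (sad_mm - scd_mm) / scd_mm
    return crossing(0.8, sigma_p) - crossing(0.2, sigma_p)
```

    For a Gaussian-blurred edge the 80-20% width equals about 1.683 times the projected sigma, so moving the leaf closer to the source (smaller SCD) widens the geometric penumbra, as the abstract notes.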

  14. Electrophoretic extraction of low molecular weight cationic analytes from sodium dodecyl sulfate containing sample matrices for their direct electrospray ionization mass spectrometry.

    Science.gov (United States)

    Kinde, Tristan F; Lopez, Thomas D; Dutta, Debashis

    2015-03-03

    While the use of sodium dodecyl sulfate (SDS) in separation buffers allows efficient analysis of complex mixtures, its presence in the sample matrix is known to severely interfere with the mass-spectrometric characterization of analyte molecules. In this article, we report a microfluidic device that addresses this analytical challenge by enabling inline electrospray ionization mass spectrometry (ESI-MS) of low molecular weight cationic samples prepared in SDS containing matrices. The functionality of this device relies on the continuous extraction of analyte molecules into an SDS-free solvent stream based on the free-flow zone electrophoresis (FFZE) technique prior to their ESI-MS analysis. The reported extraction was accomplished in our current work in a glass channel with microelectrodes fabricated along its sidewalls to realize the desired electric field. Our experiments show that a key challenge to successfully operating such a device is to suppress the electroosmotically driven fluid circulations generated in its extraction channel that otherwise tend to vigorously mix the liquid streams flowing through this duct. A new coating medium, N-(2-triethoxysilylpropyl) formamide, recently demonstrated by our laboratory to nearly eliminate electroosmotic flow in glass microchannels was employed to address this issue. Applying this surface modifier, we were able to efficiently extract two different peptides, human angiotensin I and MRFA, individually from an SDS containing matrix using the FFZE method and detect them at concentrations down to 3.7 and 6.3 μg/mL, respectively, in samples containing as much as 10 mM SDS. Notice that in addition to greatly reducing the amount of SDS entering the MS instrument, the reported approach allows rapid solvent exchange for facilitating efficient analyte ionization desired in ESI-MS analysis.

  15. Facilitating Multiple Intelligences Through Multimodal Learning Analytics

    Directory of Open Access Journals (Sweden)

    Ayesha PERVEEN

    2018-01-01

    This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential as multiple intelligences, based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need for online education systems to facilitate students as multiple intelligences and then suggests a framework for an advanced form of learning analytics, i.e., multimodal learning analytics, for tracing and facilitating multiple intelligences while students are engaged in online ubiquitous learning. As multimodal learning analytics is still an evolving area, it poses many challenges for technologists, educationists as well as organizational managers. Learning analytics make machines meet humans; therefore, educationists with expertise in learning theories can help technologists devise the latest technological methods for multimodal learning analytics, and organizational managers can implement them for the improvement of online education. Therefore, a careful instructional design, based on a deep understanding of students' learning abilities, is required to develop teaching plans and technological possibilities for monitoring students' learning paths. This is how learning analytics can help design an adaptive instructional design based on a quick analysis of the data gathered. Based on that analysis, academicians can critically reflect upon the quick or delayed implementation of the existing instructional design based on students' cognitive abilities, or even about single- or double-loop learning design. The researcher concludes that online education is multimodal in nature, has the capacity to endorse multiliteracies and, therefore, multiple intelligences can be tracked and facilitated through multimodal learning analytics in an online mode. However, online teachers' training both in technological implementations and

  16. Pre-analytical and analytical validations and clinical applications of a miniaturized, simple and cost-effective solid phase extraction combined with LC-MS/MS for the simultaneous determination of catecholamines and metanephrines in spot urine samples.

    Science.gov (United States)

    Li, Xiaoguang Sunny; Li, Shu; Kellermann, Gottfried

    2016-10-01

    It remains a challenge to simultaneously quantify catecholamines and metanephrines in a simple, sensitive and cost-effective manner due to pre-analytical and analytical constraints. Herein, we describe such a method, consisting of a miniaturized sample preparation and selective LC-MS/MS detection, using second morning spot urine samples. Ten microliters of second morning urine sample were subjected to solid phase extraction on an Oasis HLB microplate upon complexation with phenylboronic acid. The analytes were well resolved on a Luna PFP column followed by tandem mass spectrometric detection. Full validation, the suitability of spot urine sampling and biological variation were investigated. The extraction recovery and matrix effect are 74.1-97.3% and 84.1-119.0%, respectively. The linearity range is 2.5-500, 0.5-500, 2.5-1250, 2.5-1250 and 0.5-1250 ng/mL for norepinephrine, epinephrine, dopamine, normetanephrine and metanephrine, respectively. The intra- and inter-assay imprecisions are ≤9.4% for spiked quality control samples, and the respective recoveries are 97.2-112.5% and 95.9-104.0%. The Deming regression slope is 0.90-1.08, and the mean Bland-Altman percentage difference is from -3.29 to 11.85 between a published method and the proposed method (n=50). A significant correlation is observed between the spot and 24 h urine collections (n=20, p<0.0001, r: 0.84-0.95, slope: 0.61-0.98). No statistical differences are found in day-to-day biological variability (n=20). Reference intervals are established for an apparently healthy population (n=88). The developed method, being practical, sensitive, reliable and cost-effective, is expected to set a new stage for routine testing, basic research and clinical applications. Copyright © 2016 Elsevier B.V. All rights reserved.
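
    The two method-comparison statistics quoted here, the Deming regression slope and the mean Bland-Altman percentage difference, can be sketched as follows. This is a minimal illustration, not the authors' statistical code; λ = 1 (equal error variances for the two methods) is an assumption.

```python
import math

def deming_slope(x, y, lam=1.0):
    # Deming regression slope; lam is the ratio of the error variances
    # of y and x (lam = 1 assumes equal analytical imprecision).
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    syy = sum((v - my) ** 2 for v in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    d = syy - lam * sxx
    return (d + math.sqrt(d * d + 4.0 * lam * sxy * sxy)) / (2.0 * sxy)

def bland_altman_pct(x, y):
    # Mean percentage difference of y versus x, relative to the pairwise mean.
    diffs = [200.0 * (b - a) / (a + b) for a, b in zip(x, y)]
    return sum(diffs) / len(diffs)
```
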

  17. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Quantitative or qualitative data derived for the many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC
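
    A minimal sketch of how such an irregular analytical error might be flagged for a single sample: combine the process measurement uncertainty and the method bias, and test whether the deviation from the reference result exceeds that allowance. Both the quadrature combination and the coverage factor k are assumptions made here for illustration, not the authors' definition.

```python
import math

def is_irregular_error(result, reference, u_process, bias, k=3.0):
    # Flag a test result whose deviation from the reference measurement
    # procedure exceeds what routine measurement uncertainty plus method
    # bias can explain; k is an assumed coverage factor.
    limit = k * math.sqrt(u_process ** 2 + bias ** 2)
    return abs(result - reference) > limit
```
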

  18. Optimum shape design of incompressible hyperelastic structures with analytical sensitivity analysis

    International Nuclear Information System (INIS)

    Jarraya, A.; Wali, M.; Dammark, F.

    2014-01-01

    This paper is focused on the structural shape optimization of incompressible hyperelastic structures. An analytical sensitivity is developed for rubber-like materials. The whole shape optimization process is carried out by coupling a closed geometric shape in R² with boundaries defined by B-spline curves, exact sensitivity analysis and a mathematical programming method (S.Q.P.: sequential quadratic programming). The design variables are the control point coordinates. The objective is to minimize the von Mises stress, subject to the constraint that the total material volume of the structure remains constant. In order to validate the exact Jacobian method, the sensitivity calculation is performed both numerically, by an efficient finite difference scheme, and by the exact Jacobian method. Numerical optimization examples are presented for elastic and hyperelastic materials using the proposed method.

  19. Quality assurance in the pre-analytical phase of human urine samples by (1)H NMR spectroscopy.

    Science.gov (United States)

    Budde, Kathrin; Gök, Ömer-Necmi; Pietzner, Maik; Meisinger, Christine; Leitzmann, Michael; Nauck, Matthias; Köttgen, Anna; Friedrich, Nele

    2016-01-01

    Metabolomic approaches investigate changes in metabolite profiles, which may reflect changes in metabolic pathways and provide information correlated with a specific biological process or pathophysiology. High-resolution (1)H NMR spectroscopy is used to identify metabolites in biofluids and tissue samples qualitatively and quantitatively. This pre-analytical study evaluated the effects of storage time and temperature on (1)H NMR spectra of human urine in two settings: first, to evaluate short-term effects, probably due to acute delays in sample handling, and second, the effect of prolonged storage of up to one month, to find markers of sample mishandling. A number of statistical procedures were used to assess the differences between samples stored under different conditions, including Projection to Latent Structure Discriminant Analysis (PLS-DA), non-parametric testing as well as mixed-effect linear regression analysis. The results indicate that human urine samples can be stored at 10 °C for 24 h or at -80 °C for 1 month, as no relevant changes in (1)H NMR fingerprints were observed during these time periods and temperature conditions. However, some metabolites, most likely of microbial origin, showed alterations during prolonged storage, but without facilitating classification. In conclusion, the presented protocol for urine sample handling and semi-automatic metabolite quantification is suitable for large-scale epidemiological studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Analytic evaluation of LAMPF II Booster Cavity design

    International Nuclear Information System (INIS)

    Friedrichs, C.C.

    1985-01-01

    Through the past few decades, a great deal of sophistication has evolved in the numeric codes used to evaluate electromagnetically resonant structures. The numeric methods are extremely precise, even for complicated geometries, whereas analytic methods require a simple uniform geometry and a simple, known mode configuration if the same precision is to be obtained. The code SUPERFISH, which is near the present state of the art of numeric methods, does have the following limitations: no circumferential geometry variations are permissible; there are no provisions for magnetic or dielectric losses; and finally, it is impractical (because of the complexity of the code) to modify it to extract particular bits of data one might want that are not provided by the code as written. This paper describes how SUPERFISH was used as an aid in deriving an analytic model of the LAMPF II Booster Cavity. Once a satisfactory model was derived, simple FORTRAN codes were generated to provide whatever data were required. The analytic model is made up of TEM- and radial-mode transmission-line sections, as well as lumped elements where appropriate. Radial transmission-line equations that include losses were not found in any literature, and the extension of the lossless equations to include magnetic and dielectric losses is included in this paper

  1. Forensic analysis of high explosives residues in post-blast water samples employing solid phase extraction for analyte pre-concentration

    International Nuclear Information System (INIS)

    Umi Kalsom Ahmad; Rajendran, Sumathy; Ling, Lee Woan

    2008-01-01

    Nitroaromatic, nitramine and nitrate ester compounds are the major groups of high-order explosives, better known as military explosives. Octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN) and 2,4-dinitrotoluene (2,4-DNT) are secondary high explosives classified among the most commonly used explosive components. There is an increasing demand for pre-concentration of these compounds in water samples, as the sensitivity achievable by instrumental analytical methods for these high-explosive residues is the main drawback in their application at trace levels for forensic analysis. Hence, a simple cartridge solid-phase extraction (SPE) procedure was optimized as the off-line extraction and pre-concentration method to enhance the detection limits of high-explosive residues using micellar electrokinetic chromatography (MEKC) and gas chromatography with electron-capture detection (GC-ECD). The SPE cartridges utilized LiChrolut EN as the SPE adsorbent. By employing pre-concentration using SPE, the detection limits of the target analytes in water samples were lowered by more than 1000 times, with good recoveries (>87%), for the MEKC method, and lowered by 120 times, with recovery greater than 2%, for the GC-ECD method. In order to test the feasibility of the developed method on real cases, post-blast water samples were analyzed. The post-blast water samples, collected from the Baling Bom training range, Ulu Kinta, Perak, contained RDX and PETN in the ranges of 0.05 - 0.17 ppm and 0.0124 - 0.0390 ppm respectively. (author)
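
    The pre-concentration arithmetic behind such an SPE step can be sketched as follows; the volumes, recovery and detection limits in the test are invented for illustration, not values from the study.

```python
def enrichment_factor(sample_volume_ml, eluate_volume_ml, recovery_pct):
    # Practical pre-concentration factor of an SPE step: the volume ratio
    # scaled by the fraction of analyte actually recovered.
    return (sample_volume_ml / eluate_volume_ml) * recovery_pct / 100.0

def lod_after_spe(direct_lod, factor):
    # Detection limit achievable after pre-concentration.
    return direct_lod / factor
```
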

  2. An Analytical Design Method for a Regenerative Braking Control System for DC-electrified Railway Systems under Light Load Conditions

    Science.gov (United States)

    Saito, Tatsuhito; Kondo, Keiichiro; Koseki, Takafumi

    A DC-electrified railway system that is fed by diode rectifiers at a substation is unable to return electric power to the AC grid. Accordingly, the braking cars have to restrict their regenerative braking power when the power consumption of the powering cars is not sufficient. However, the characteristics of a DC-electrified railway system, including the powering cars, are not known, and a mathematical model for designing a controller has not been established yet. Hence, the objective of this study is to obtain a mathematical model for an analytical design method for the regenerative braking control system. In the first part of this paper, the static characteristics of this system are presented to show the position of the equilibrium point. Linearization of this system at the equilibrium point is then performed to describe the dynamic characteristics of the system. An analytical design method is then proposed on the basis of these characteristics. The proposed design method is verified by experimental tests with a 1 kW class miniature model, and by numerical simulations.

  3. Polymeric ionic liquid-based portable tip microextraction device for on-site sample preparation of water samples.

    Science.gov (United States)

    Chen, Lei; Pei, Junxian; Huang, Xiaojia; Lu, Min

    2018-06-05

    On-site sample preparation is highly desirable because it avoids the transportation of large-volume samples and ensures the accuracy of the analytical results. In this work, a portable prototype tip microextraction device (TMD) was designed and developed for on-site sample pretreatment. The assembly procedure of the TMD is quite simple. Firstly, a polymeric ionic liquid (PIL)-based adsorbent was prepared in situ in a pipette tip. After that, the tip was connected to a syringe driven by a bidirectional motor. The flow rates in the adsorption and desorption steps were controlled accurately by the motor. To evaluate the practicability of the developed device, the TMD was used for on-site preparation of water samples and combined with high-performance liquid chromatography with diode array detection to measure trace estrogens in water samples. Under the most favorable conditions, the limits of detection (LODs, S/N = 3) for the target analytes were in the range of 4.9-22 ng/L, with good coefficients of determination. A confirmatory study provides good evidence that the extraction performance of the TMD is comparable to that of the traditional laboratory solid-phase extraction process, but the proposed TMD is simpler and more convenient. At the same time, the TMD avoids complicated sampling and transfer steps for large-volume water samples. Copyright © 2018 Elsevier B.V. All rights reserved.
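
    A detection limit at S/N = 3, as quoted here, is conventionally estimated from the baseline noise and the slope of the calibration line; a minimal sketch with invented calibration data (not the study's) is:

```python
def calibration_slope(conc, signal):
    # Least-squares slope of the calibration line through the data.
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    sxy = sum((c - mx) * (s - my) for c, s in zip(conc, signal))
    sxx = sum((c - mx) ** 2 for c in conc)
    return sxy / sxx

def lod_sn3(noise_sd, slope):
    # LOD at S/N = 3: the concentration whose signal equals three times
    # the baseline noise standard deviation.
    return 3.0 * noise_sd / slope
```
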

  4. Analytical model and design of spoke-type permanent-magnet machines accounting for saturation and nonlinearity of magnetic bridges

    Science.gov (United States)

    Liang, Peixin; Chai, Feng; Bi, Yunlong; Pei, Yulong; Cheng, Shukang

    2016-11-01

    Based on a subdomain model, this paper presents an analytical method for predicting the no-load magnetic field distribution, back-EMF and torque in general spoke-type motors with magnetic bridges. Taking into account the saturation and nonlinearity of the magnetic material, the magnetic bridges are treated as equivalent fan-shaped saturation regions. To obtain standard boundary conditions, a lumped-parameter magnetic circuit model and an iterative method are employed to calculate the permeability. The final field domain is divided into five types of simple subdomains. Based on the method of separation of variables, the analytical expression for each subdomain is derived. The analytical results for the magnetic field distribution, back-EMF and torque are verified by the finite element method, which confirms the validity of the proposed model for facilitating motor design and optimization.

  5. Analytical fuzzy approach to biological data analysis

    Directory of Open Access Journals (Sweden)

    Weiping Zhang

    2017-03-01

    Full Text Available The assessment of the physiological state of an individual requires an objective evaluation of biological data while taking into account both measurement noise and uncertainties arising from individual factors. We suggest representing multi-dimensional medical data by means of an optimal fuzzy membership function. A carefully designed data model is introduced in a completely deterministic framework where uncertain variables are characterized by fuzzy membership functions. The study derives analytical expressions for the fuzzy membership functions of the variables of the multivariate data model by maximizing the over-uncertainties-averaged log-membership values of data samples around an initial guess. The analytical solution lends itself to a practical modeling algorithm that facilitates data classification. Experiments performed on the heartbeat interval data of 20 subjects verified that the proposed method is a competitive alternative to commonly used pattern recognition and machine learning algorithms.

  6. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

    This presentation describes the design and development work being carried out on an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system in which samples are taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternative automated systems are also examined (1). (author). 4 refs., 2 figs

  7. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software, yet it is still common practice to develop individual solutions for different instruments. In contrast, we present here a single LabVIEW-based program that can be applied directly to various analytical tasks without changing the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. The capabilities of the program are demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using an SIA system with FTIR detection.
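
    The script-driven control idea can be sketched, outside LabVIEW, as a small command dispatcher that maps text commands to instrument handlers. All names here (ScriptRunner, the VALVE/PUMP commands) are hypothetical illustrations, not the paper's actual command set:

```python
from typing import Callable, Dict, List

class ScriptRunner:
    """Dispatch simple text script commands to registered instrument handlers."""
    def __init__(self) -> None:
        self.handlers: Dict[str, Callable[..., None]] = {}
        self.log: List[str] = []

    def register(self, name: str, fn: Callable[..., None]) -> None:
        self.handlers[name] = fn

    def run(self, script: str) -> None:
        # one command per line: COMMAND arg1 arg2 ...
        for line in script.strip().splitlines():
            cmd, *args = line.split()
            self.handlers[cmd](*args)  # unknown commands raise KeyError

runner = ScriptRunner()
runner.register("VALVE", lambda pos: runner.log.append(f"valve->{pos}"))
runner.register("PUMP", lambda rate: runner.log.append(f"pump@{rate}uL/min"))
runner.run("VALVE inject\nPUMP 50")
```

    New instruments are supported by registering new handlers, without touching the dispatch code, which mirrors the paper's goal of one program serving many analytical setups.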

  8. Enzyme Biosensors for Biomedical Applications: Strategies for Safeguarding Analytical Performances in Biological Fluids

    Science.gov (United States)

    Rocchitta, Gaia; Spanu, Angela; Babudieri, Sergio; Latte, Gavinella; Madeddu, Giordano; Galleri, Grazia; Nuvoli, Susanna; Bagella, Paola; Demartis, Maria Ilaria; Fiore, Vito; Manetti, Roberto; Serra, Pier Andrea

    2016-01-01

    Enzyme-based chemical biosensors rely on biological recognition: to operate, the enzymes must be available to catalyze a specific biochemical reaction and remain stable under the normal operating conditions of the biosensor. Biosensor design draws on knowledge of the target analyte as well as the complexity of the matrix in which the analyte has to be quantified. This article reviews the problems arising from the interaction of enzyme-based amperometric biosensors with complex biological matrices containing the target analyte(s). One of the most challenging disadvantages of amperometric enzyme-based biosensor detection is signal reduction caused by fouling agents and interference from chemicals present in the sample matrix. The article therefore examines the operating principles of enzymatic biosensors, their analytical performance over time and the strategies used to optimize that performance. Moreover, the composition of biological fluids and its influence on biosensing will be presented. PMID:27249001

  9. Microfluidic devices for sample clean-up and screening of biological samples

    NARCIS (Netherlands)

    Tetala, K.K.R.

    2009-01-01

    Analytical chemistry plays an important role in the separation and identification of analytes from raw samples (e.g. plant extracts, blood), but the whole analytical process is tedious, difficult to automate and time consuming. To overcome these drawbacks, the concept of μTAS (miniaturized total

  10. Analytical design of an industrial two-term controller for optimal regulatory control of open-loop unstable processes under operational constraints.

    Science.gov (United States)

    Tchamna, Rodrigue; Lee, Moonyong

    2018-01-01

    This paper proposes a novel optimization-based approach for the design of an industrial two-term proportional-integral (PI) controller for the optimal regulatory control of unstable processes subjected to three common operational constraints related to the process variable, manipulated variable and its rate of change. To derive analytical design relations, the constrained optimal control problem in the time domain was transformed into an unconstrained optimization problem in a new parameter space via an effective parameterization. The resulting optimal PI controller has been verified to yield optimal performance and stability of an open-loop unstable first-order process under operational constraints. The proposed analytical design method explicitly takes into account the operational constraints in the controller design stage and also provides useful insights into the optimal controller design. Practical procedures for designing optimal PI parameters and a feasible constraint set exclusive of complex optimization steps are also proposed. The proposed controller was compared with several other PI controllers to illustrate its performance. The robustness of the proposed controller against plant-model mismatch has also been investigated. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
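
    The effect of a constrained PI law on an open-loop unstable first-order process can be illustrated with a minimal simulation; the process parameters, gains and constraint bounds below are hypothetical placeholders, not the paper's optimal design:

```python
import numpy as np

# Open-loop unstable first-order process: dy/dt = a*y + b*u, with a > 0
a, b = 0.5, 1.0
dt, T = 0.01, 10.0
Kp, Ki = 3.0, 2.0          # illustrative PI gains (not the paper's optimum)
u_min, u_max = -5.0, 5.0   # manipulated-variable constraint

y, integ, setpoint = 1.0, 0.0, 0.0
ys = []
for _ in range(int(T / dt)):
    e = setpoint - y
    integ += e * dt
    u = np.clip(Kp * e + Ki * integ, u_min, u_max)  # constrained PI law
    y += (a * y + b * u) * dt                       # explicit Euler step
    ys.append(y)
```

    With these gains the closed-loop characteristic polynomial s^2 + (b*Kp - a)s + b*Ki has roots in the left half-plane, so the unstable process is regulated to the setpoint while the input stays within its bounds.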

  11. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    Our objective was to explore classification rules, derived with data mining methodologies, for defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the proposed stratification and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available by the Health Insurance Review and Assessment Service of South Korea, together with population data from Statistics Korea. From this database, we used the 2011 data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology. Data mining resulted in five strata in general surgery with two stratification variables (the number of inpatients per specialist and the population density of the provider location) and five strata in ophthalmology with two stratification variables (the number of inpatients per specialist and the number of beds). The stratification explained 22% and 8% of the variance in annual changes in specialist productivity in general surgery and ophthalmology, respectively, whereas conventional stratification by type of provider location and number of beds explained only 2% and 0.2%, respectively. This study demonstrated that data mining methods can be used to design efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method widely used in healthcare provider surveys in South Korea.
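
    The first step of this approach (cluster providers with similar characteristics, then derive rules from the cluster labels) can be sketched with synthetic provider data; the feature values below are invented, and a plain NumPy k-means stands in for the authors' toolchain (in practice a shallow decision tree fit on the cluster labels would then yield human-readable stratification rules):

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical provider features: inpatients per specialist, number of beds
X = np.column_stack([rng.gamma(2.0, 50.0, 500), rng.uniform(0.0, 300.0, 500)])
Xs = (X - X.mean(0)) / X.std(0)  # standardize before clustering

def kmeans(data, k, iters=50):
    """Minimal Lloyd's-algorithm k-means returning a cluster label per row."""
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        d = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = data[labels == j].mean(0)
    return labels

strata = kmeans(Xs, k=5)  # five strata, as in the study
```
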

  12. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  13. ISCO Grab Sample Ion Chromatography Analytical Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — ISCO grab samples were collected from river, wastewater treatment plant discharge, and public drinking water intakes. Samples were analyzed for major ions (ppb)...

  14. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  15. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    Science.gov (United States)

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
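
    The Poisson lognormal abundance model underlying this method is easy to simulate: each taxon's latent abundance rate is lognormal, and its observed count is Poisson given that rate. A minimal sketch (parameter values are illustrative only, not fit to any real 16S survey):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_poisson_lognormal(mu, sigma, size):
    """Draw counts from a Poisson lognormal distribution:
    latent rates are lognormal; counts are Poisson given the rate."""
    rates = rng.lognormal(mean=mu, sigma=sigma, size=size)
    return rng.poisson(rates)

# simulated taxon counts for one 16S rRNA sample
counts = sample_poisson_lognormal(mu=1.0, sigma=1.5, size=10_000)
```

    The heavy right tail produced by the lognormal rates is what lets this distribution mimic the few-dominant, many-rare shape typical of taxa count data.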

  16. On Designing a Generic Framework for Cloud-based Big Data Analytics

    OpenAIRE

    Khan, Samiya; Alam, Mansaf

    2017-01-01

    Big data analytics has gathered immense research attention lately because of its ability to harness useful information from heaps of data. Cloud computing has been adjudged as one of the best infrastructural solutions for implementation of big data analytics. This research paper proposes a five-layer model for cloud-based big data analytics that uses dew computing and edge computing concepts. Besides this, the paper also presents an approach for creation of custom big data stack by selecting ...

  17. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

    A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually-oriented buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine whether contamination is present, its magnitude and extent throughout the building, and whether decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; the ability to insert doors, windows and annotations into a room; and 3-D graphic room views with surfaces labeled and floor plans that show building zones with separate air handling units. The paper also discusses the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.

  18. Integrating motivational, social, and contextual work design features: a meta-analytic summary and theoretical extension of the work design literature.

    Science.gov (United States)

    Humphrey, Stephen E; Nahrgang, Jennifer D; Morgeson, Frederick P

    2007-09-01

    The authors developed and meta-analytically examined hypotheses designed to test and extend work design theory by integrating motivational, social, and work context characteristics. Results from a summary of 259 studies and 219,625 participants showed that 14 work characteristics explained, on average, 43% of the variance in the 19 worker attitudes and behaviors examined. For example, motivational characteristics explained 25% of the variance in subjective performance, 2% in turnover perceptions, 34% in job satisfaction, 24% in organizational commitment, and 26% in role perception outcomes. Beyond motivational characteristics, social characteristics explained incremental variances of 9% of the variance in subjective performance, 24% in turnover intentions, 17% in job satisfaction, 40% in organizational commitment, and 18% in role perception outcomes. Finally, beyond both motivational and social characteristics, work context characteristics explained incremental variances of 4% in job satisfaction and 16% in stress. The results of this study suggest numerous opportunities for the continued development of work design theory and practice. (c) 2007 APA.

  19. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  20. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  1. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  2. Principal component analysis applied to Fourier transform infrared spectroscopy for the design of calibration sets for glycerol prediction models in wine and for the detection and classification of outlier samples.

    Science.gov (United States)

    Nieuwoudt, Helene H; Prior, Bernard A; Pretorius, Isak S; Manley, Marena; Bauer, Florian F

    2004-06-16

    Principal component analysis (PCA) was used to identify the main sources of variation in the Fourier transform infrared (FT-IR) spectra of 329 wines of various styles. The FT-IR spectra were gathered using a specialized WineScan instrument. The main sources of variation included the reducing sugar and alcohol content of the samples, as well as the stage of fermentation and the maturation period of the wines. The implications of the variation between the different wine styles for the design of calibration models with accurate predictive abilities were investigated using glycerol calibration in wine as a model system. PCA enabled the identification and interpretation of samples that were poorly predicted by the calibration models, as well as the detection of individual samples in the sample set that had atypical spectra (i.e., outlier samples). The Soft Independent Modeling of Class Analogy (SIMCA) approach was used to establish a model for the classification of the outlier samples. A glycerol calibration for wine was developed (reducing sugar content 8% v/v) with satisfactory predictive ability (SEP = 0.40 g/L). The RPD value (ratio of the standard deviation of the data to the standard error of prediction) was 5.6, indicating that the calibration is suitable for quantification purposes. A calibration for glycerol in special late harvest and noble late harvest wines (RS 31-147 g/L, alcohol > 11.6% v/v) with a prediction error SECV = 0.65 g/L, was also established. This study yielded an analytical strategy that combined the careful design of calibration sets with measures that facilitated the early detection and interpretation of poorly predicted samples and outlier samples in a sample set. The strategy provided a powerful means of quality control, which is necessary for the generation of accurate prediction data and therefore for the successful implementation of FT-IR in the routine analytical laboratory.
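
    The PCA-based detection of atypical (outlier) spectra can be sketched with synthetic data; the spectra, component count and threshold below are illustrative, and a Hotelling-style T-squared statistic stands in for the SIMCA classification used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical FT-IR-like data: 100 samples x 200 wavenumber variables
spectra = rng.normal(size=(100, 200))
spectra[0] += 5.0  # make sample 0 deliberately atypical (an outlier)

# PCA via SVD on mean-centered data
Xc = spectra - spectra.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
scores = U[:, :k] * S[:k]  # projections onto the first k principal components

# Hotelling-style T^2: sum of squared standardized scores; large values
# flag samples far from the bulk of the calibration set
t2 = ((scores / scores.std(0)) ** 2).sum(1)
outliers = np.where(t2 > t2.mean() + 3 * t2.std())[0]
```

    Samples flagged this way are candidates for removal or for separate class modeling before the calibration model is built.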

  3. An analytical laboratory to facilitate international safeguards

    International Nuclear Information System (INIS)

    Clark, B.E.; Muellner, P.; Deron, S.

    1976-01-01

    Member States which have concluded safeguards agreements accept safeguards on part or all of their nuclear facilities and nuclear materials. The agreements enable the Agency to make inspections in order to verify the location, identity, quantity and composition of all safeguarded nuclear material. The independent analysis of samples of safeguarded material is an essential part of the verification process. A new analytical laboratory has been made available to the Agency by the Austrian Government. This facility is staffed by the Agency with scientists and technicians from five Member States. Design criteria for the laboratory were defined by the Agency. Construction was carried out under the project management of the Oesterreichische Studiengesellschaft fuer Atomenergie Ges.m.b.H. Scientific equipment was procured by the Agency. Samples of feed and product material from the nuclear fuel cycle will constitute the main workload. Irradiated and unirradiated samples of uranium, plutonium and mixtures of both will be analysed for concentration and isotopic composition. Since highly diluted solutions of spent fuel will be the most active beta-gamma samples, shielded and remote manipulation facilities are not necessary. Potentiometry, mass spectrometry and coulometry are the main techniques to be employed. Gravimetry, alpha and gamma spectrometry and emission spectroscopy will also be utilized as required. It is not intended that this laboratory should carry the whole burden of the Agency's safeguards analytical work, but that it should function as a member of a network of international laboratories which has been set up by the Agency for this purpose. (author)

  4. Analytical Dancoff factor evaluations for reactor designs loaded with TRISO particle fuel

    International Nuclear Information System (INIS)

    Ji, Wei; Liang, Chao; Pusateri, Elise N.

    2014-01-01

    Highlights: • The Dancoff factors for randomly distributed TRISO fuel particles are evaluated. • A new “dual-sphere” model is proposed to predict Dancoff factors. • The new model accurately accounts for the coating regions of fuel particles. • High accuracy is achieved over a broad range of design parameters. • The new model can be used to analyze reactors with double heterogeneity. - Abstract: A new mathematical model, the dual-sphere model, is proposed to analytically evaluate Dancoff factors of TRISO fuel kernels based on the chord method. The accurate evaluation of fuel kernel Dancoff factors is needed when one analyzes nuclear reactors loaded with TRISO particle fuel. In these reactor designs, fuel kernels are randomly distributed and shield each other, causing a shadowing effect. The Dancoff factor is a quantitative measure of this effect and is determined by the spatial distribution of fuel kernels. A TRISO fuel particle usually consists of four layers that form a coating region outside the fuel kernel. When fuel particles are loaded in the reactor, the spatial distribution of fuel kernels can be affected by the thickness of the coating region. Therefore, the coating region should be taken into account in the calculation of Dancoff factors. However, the previous model, the single-sphere model, assumes no coating regions in the Dancoff factor predictions. To address this model deficiency, the dual-sphere model is proposed by deriving a new chord length distribution function between two fuel kernels that explicitly accounts for coating regions. The new model is employed to derive analytical solutions of infinite medium, intra-fuel pebble and intra-fuel compact/pin Dancoff factors over a wide range of volume packing fractions of TRISO fuel particles, varying from 2% to 60%. Comparisons are made with the predictions from the single-sphere model and reference Monte Carlo simulations. A significant improvement of the accuracy, over the ranges of

  5. Analytical Study on the Beyond Design Seismic Capacity of Reinforced Concrete Shear Walls

    Energy Technology Data Exchange (ETDEWEB)

    Nugroho, Tino Sawaldi Adi [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Chi, Ho-Seok [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    The OECD-NEA has organized an international benchmark program to better understand this critical issue. The benchmark program provides the test specimen geometry, test setup, material properties, loading conditions, recorded measurements, and observations of the test specimens. The main objective of this research is to assess the beyond-design seismic capacity of the reinforced concrete shear walls tested at the European Laboratory for Structural Assessment between 1997 and 1998, through participation in the OECD-NEA benchmark program. In this study, the beyond-design seismic capacity of reinforced concrete shear walls is assessed analytically by comparing numerical results with experimental results. The seismic shear capacity of the reinforced concrete shear wall was predicted reasonably well using the ABAQUS program. However, proper calibration of the concrete material model was necessary for better prediction of the behavior of the reinforced concrete shear walls, since the response was influenced significantly by the material constitutive model.

  6. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for various management tasks. The planning of pressure observations, in terms of spatial distribution and number, is called sampling design and has traditionally been addressed in the context of model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed through optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly on the basis of network topology and of weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  7. Role of analytical chemistry in environmental monitoring

    International Nuclear Information System (INIS)

    Kayasth, S.; Swain, K.

    2004-01-01

    Basic aspects of pollution and the role of analytical chemistry in environmental monitoring are highlighted and exemplified, with emphasis on trace elements. Sources and pathways of natural and, especially, man-made polluting substances, as well as their physico-chemical characteristics, are given. Attention is paid to adequate sampling in various compartments of the environment, comprising both the lithosphere and the biosphere. Trace analysis using a variety of analytical techniques is discussed, including criteria for choosing suitable techniques, as well as aspects of analytical quality assurance and control. Finally, some data on trace element levels in soil and water samples from India are presented. (author)

  8. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has for a decade been participating in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in specialist journals, and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project ''Development of physico-chemical analytical methods''. In most cases, the techniques are intended for non-nuclear applications. Some can find wider application, especially in analyses of environmental samples. Others have been developed for specific sample analyses or require special instrumentation (mass spectrometer), which partly restricts their applicability in other institutions. (author)

  9. Biomonitoring of air pollution in Jamaica through trace-element analysis of epiphytic plants using nuclear and related analytical techniques

    International Nuclear Information System (INIS)

    Vutchkov, Mitko

    2001-01-01

The main goal of the Coordinated Research Project (No:9937/R0), entitled 'Biomonitoring of Air Pollution in Jamaica Through Trace-Element Analysis of Epiphytic Plants Using Nuclear and Related Analytical Techniques', is to identify and validate site-specific epiphytic plants for biomonitoring the atmospheric pollution in Jamaica using nuclear analytical techniques at the International Centre for Environmental and Nuclear Sciences (ICENS). The specific objectives for the second year of the project were: Development of HOP for sampling epiphytic plants in Jamaica; Sampling design and sample collection; Sample preparation and analysis; Development of an in-house SRM and participation in the NAT-5 inter-laboratory study; Data analysis and interpretation of the results; Development of a work plan for the third year of the project

  10. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients.

    Science.gov (United States)

    Ramírez, Juan Carlos; Cura, Carolina Inés; da Cruz Moreira, Otacilio; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Marcos da Matta Guedes, Paulo; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Maria da Cunha Galvão, Lúcia; Jácome da Câmara, Antonia Cláudia; Espinoza, Bertha; Alarcón de Noya, Belkisyole; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G

    2015-09-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  11. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.
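The inverse probability weighting alternative mentioned in the abstract can be sketched in a few lines: cases are sampled with probability 1, noncases with a known subcohort fraction, and the design is undone by Horvitz-Thompson weights when estimating a quantity for the secondary response. The cohort size, selection probabilities, and outcome model below are entirely illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000
x = rng.normal(size=N)                      # covariate measured only on sampled subjects
p_case = 1.0 / (1.0 + np.exp(-(-3.0 + 1.5 * x)))
case = rng.random(N) < p_case               # event type defining the cohort sampling
y2 = 2.0 + 0.5 * x + rng.normal(size=N)     # secondary (non-time-to-event) response

# Case-cohort style design: keep every case plus a 10% random subcohort of noncases
p_sel = np.where(case, 1.0, 0.10)
selected = rng.random(N) < p_sel

# Horvitz-Thompson / inverse-probability-weighted estimate of E[y2]
w = 1.0 / p_sel[selected]
ipw_mean = np.sum(w * y2[selected]) / np.sum(w)

naive_mean = y2[selected].mean()            # ignores the design and over-weights cases
print(ipw_mean, naive_mean)
```

Because case status here depends on `x`, and `y2` depends on `x` too, the unweighted mean over the sampled subjects is biased upward, while the weighted estimate recovers the cohort mean; the conditional-likelihood approach of the paper addresses the same distortion through the likelihood rather than through weights.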

  12. Comparison of the analytical methods used to determine natural and artificial radionuclides from environmental samples by gamma, alpha and beta spectrometry

    DEFF Research Database (Denmark)

    Pöllänen, Roy; Virtanen, Sinikka; Kämäräinen, Meerit

    In CAMNAR, an extensive interlaboratory exercise on the analytical methods used to determine several radionuclides present in the environmental samples was organized. Activity concentration of different natural radionuclides, such as Rn-222, Pb-210, Po-210, K-40, Ra-226, Ra-228 and isotopes...... of uranium, in addition to artificial Cs-137 and Am-241 were analysed from lake sediment samples and drinking water. The measurement techniques were gamma-ray spectrometry, alpha spectrometry, liquid scintillation counting and inductively coupled plasma mass spectrometry. Twenty six laboratories from nine...

  13. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
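The method-of-moments (Matheron) variogram estimator that the study benchmarks against residual maximum likelihood can be sketched as follows. The simulated field, plot size, sample size, and lag bins below are illustrative assumptions, not the study's data; a white-noise field is used so the expected semivariance is simply the field variance at every lag.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) method-of-moments variogram estimator.

    coords: (n, 2) array of sampling locations
    values: (n,) array of observations (e.g. throughfall volumes)
    bin_edges: increasing lag-distance bin boundaries
    Returns lag-bin centers and the semivariance estimate per bin.
    """
    n = len(values)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))                 # all pairwise distances
    sq_inc = (values[:, None] - values[None, :]) ** 2   # squared increments
    iu = np.triu_indices(n, k=1)                        # count each pair once
    d, s = dist[iu], sq_inc[iu]

    centers, gamma = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(0.5 * s[mask].mean())          # semivariance
    return np.array(centers), np.array(gamma)

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(150, 2))   # 150 random locations on a 100 m plot
vals = rng.normal(10, 2, size=150)         # spatially uncorrelated field
h, g = empirical_variogram(pts, vals, np.linspace(0, 50, 11))
print(h, g)
```

Robust variants (e.g. Cressie-Hawkins) replace the squared-increment mean with a less outlier-sensitive statistic, which is exactly the distinction the study draws for non-Gaussian throughfall data.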

  14. System effects in sample self-stacking CZE: Single analyte peak splitting of salt-containing samples

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Gebauer, Petr; Boček, Petr

    2009-01-01

    Roč. 30, č. 5 (2009), s. 866-874 ISSN 0173-0835 R&D Projects: GA ČR GA203/08/1536; GA AV ČR IAA400310609; GA AV ČR IAA400310703 Institutional research plan: CEZ:AV0Z40310501 Keywords : CZE * peak splitting * self-stacking Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.077, year: 2009

  15. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

The determination of soil background is one of the most important activities supporting environmental restoration and waste management on the Hanford Site. Background compositions serve as the basis for identifying soil contamination, and also as a baseline in risk assessment processes used to determine soil cleanup and treatment levels. These uses of soil background require an understanding of the extent to which analytes of concern occur naturally in the soils. This report documents the results of sampling and analysis activities designed to characterize the composition of soil background at the Hanford Site, and to evaluate the feasibility for use as Sitewide background. The compositions of naturally occurring soils in the vadose zone have been determined for nonradioactive inorganic and organic analytes and related physical properties. These results confirm that a Sitewide approach to the characterization of soil background is technically sound and is a viable alternative to the determination and use of numerous local or area backgrounds that yield inconsistent definitions of contamination. Sitewide soil background consists of several types of data and is appropriate for use in identifying contamination in all soils in the vadose zone on the Hanford Site. The natural concentrations of nearly every inorganic analyte extend to levels that exceed calculated health-based cleanup limits. The levels of most inorganic analytes, however, are well below these health-based limits. The highest measured background concentrations occur in three volumetrically minor soil types, the most important of which are topsoils adjacent to the Columbia River that are rich in organic carbon. No organic analyte levels above detection were found in any of the soil samples

  16. Soil sample collection and analysis for the Fugitive Dust Characterization Study

    Science.gov (United States)

    Ashbaugh, Lowell L.; Carvacho, Omar F.; Brown, Michael S.; Chow, Judith C.; Watson, John G.; Magliano, Karen C.

A unique set of soil samples was collected as part of the Fugitive Dust Characterization Study. The study was carried out to establish whether or not source profiles could be constructed using novel analytical methods that could distinguish soil dust sources from each other. The soil sources sampled included fields planted in cotton, almond, tomato, grape, and safflower, dairy and feedlot facilities, paved and unpaved roads (both urban and rural), an agricultural staging area, disturbed land with salt buildup, and construction areas where the topsoil had been removed. The samples were collected using a systematic procedure designed to reduce sampling bias, and were stored frozen to preserve possible organic signatures. For this paper the samples were characterized by particle size (percent sand, silt, and clay), dry silt content (used in EPA-recommended fugitive dust emission factors), carbon and nitrogen content, and potential to emit both PM10 and PM2.5. These are not the "novel analytical methods" referred to above; rather, it was the basic characterization of the samples to use in comparing analytical methods by other scientists contracted to the California Air Resources Board. The purpose of this paper is to document the methods used to collect the samples, the collection locations, the analysis of soil type and potential to emit PM10, and the sample variability, both within field and between fields of the same crop type.

  17. Sampling practices and analytical techniques used in the monitoring of steam and water in CEGB nuclear boilers

    International Nuclear Information System (INIS)

    Goodfellow, G.I.

    1978-01-01

The steam and water in CEGB Magnox and AGR nuclear boilers are continuously monitored, using both laboratory techniques and on-line instrumentation, in order to maintain the chemical quality within pre-determined limits. The sampling systems in use and some of the difficulties associated with sampling requirements are discussed. The relative merits of chemical instruments installed either locally in various parts of the plant or in centralized instrument rooms are reviewed. The quality of water in nuclear boilers, as with all high-pressure steam-raising plant, is extremely high; consequently very sensitive analytical procedures are required, particularly for monitoring the feed-water of 'once-through boiler' systems. Considerable progress has been made in this field and examples are given of some of the techniques developed for analyses at the 'μg/kg' level together with some of the current problems. (author)

  18. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others

    1997-12-01

Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  19. Exploring the design space of immersive urban analytics

    Directory of Open Access Journals (Sweden)

    Zhutian Chen

    2017-06-01

    Full Text Available Recent years have witnessed the rapid development and wide adoption of immersive head-mounted devices, such as HTC VIVE, Oculus Rift, and Microsoft HoloLens. These immersive devices have the potential to significantly extend the methodology of urban visual analytics by providing critical 3D context information and creating a sense of presence. In this paper, we propose a theoretical model to characterize the visualizations in immersive urban analytics. Furthermore, based on our comprehensive and concise model, we contribute a typology of combination methods of 2D and 3D visualizations that distinguishes between linked views, embedded views, and mixed views. We also propose a supporting guideline to assist users in selecting a proper view under certain circumstances by considering visual geometry and spatial distribution of the 2D and 3D visualizations. Finally, based on existing work, possible future research opportunities are explored and discussed.

  20. A Review on the Design Structure Matrix as an Analytical Tool for Product Development Management

    OpenAIRE

    Mokudai, Takefumi

    2006-01-01

    This article reviews fundamental concepts and analytical techniques of design structure matrix (DSM) as well as recent development of DSM studies. The DSM is a matrix representation of relationships between components of a complex system, such as products, development organizations and processes. Depending on targets of analysis, there are four basic types of DSM: Component-based DSM, Team-based DSM, Task-based DSM, and Parameter-based DSM. There are two streams of recent DSM studies: 1) ...

  1. ANALYTICAL TECHNIQUES FOR THE DETERMINATION OF MELOXICAM IN PHARMACEUTICAL FORMULATIONS AND BIOLOGICAL SAMPLES

    Directory of Open Access Journals (Sweden)

    Aisha Noreen

    2016-06-01

Full Text Available Meloxicam (MX) belongs to the family of oxicams, which is the most important group of non-steroidal anti-inflammatory drugs (NSAIDs), and is widely used for its analgesic and antipyretic activities. It inhibits both COX-I and COX-II enzymes with less gastric and local tissue irritation. A number of analytical techniques have been used for the determination of MX in pharmaceutical formulations as well as in biological fluids. These techniques include titrimetry, spectrometry, chromatography, flow injection spectrometry, fluorescence spectrometry, capillary zone electrophoresis and electrochemical techniques. Many of these techniques have also been used for the simultaneous determination of MX with other compounds. A comprehensive review of these analytical techniques has been done which could be useful for analytical chemists and quality control pharmacists.

  2. DWPF Sample Vial Insert Study-Statistical Analysis of DWPF Mock-Up Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.P. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-09-18

This report is prepared as part of Technical/QA Task Plan WSRC-RP-97-351, which was issued in response to Technical Task Request HLW/DWPF/TTR-970132 submitted by DWPF. Presented in this report is a statistical analysis of DWPF Mock-up test data for evaluation of two new analytical methods which use insert samples from the existing Hydragard™ sampler. The first is a new hydrofluoric acid based method called the Cold Chemical Method (Cold Chem) and the second is a modified fusion method. Either new DWPF analytical method could result in a two- to three-fold improvement in sample analysis time. Both new methods use the existing Hydragard™ sampler to collect a smaller insert sample from the process sampling system. The insert testing methodology applies to the DWPF Slurry Mix Evaporator (SME) and the Melter Feed Tank (MFT) samples. The insert sample is named after the initial trials, which placed the container inside the sample (peanut) vials. Samples in small 3 ml containers (inserts) are analyzed by either the cold chemical method or a modified fusion method. The current analytical method uses a Hydragard™ sample station to obtain nearly full 15 ml peanut vials. The samples are prepared by a multi-step process for Inductively Coupled Plasma (ICP) analysis by drying, vitrification, grinding and finally dissolution by either mixed acid or fusion. In contrast, the insert sample is placed directly in the dissolution vessel, thus eliminating the drying, vitrification and grinding operations for the Cold Chem method. Although the modified fusion still requires drying and calcine conversion, the process is rapid due to the decreased sample size and the fact that no vitrification step is required. A slurry feed simulant material was acquired from the TNX pilot facility from the test run designated as PX-7. The Mock-up test data were gathered on the basis of a statistical design presented in SRT-SCS-97004 (Rev. 0). Simulant PX-7 samples were taken in the DWPF Analytical Cell Mock

  3. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique involving simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed using previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by the freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania in the northern part of Poland can be easily observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by the municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC based analytical approach can be applied as an effective method for the internal standard (IS) substance search. Generally, the described methodology can be applied for fast fractionation or screening of the

  4. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

Full Text Available Rapid progress in intelligent sensing technology creates new interest in the development of analysis and design of non-conventional sampling schemes. The investigation of the event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. The related works in adaptive sampling are summarized. The analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and the asymptotic integral sampling effectiveness, are derived. The simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
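The pure send-on-delta/level-crossing scheme that the integral criterion extends can be sketched in a few lines: a sample is transmitted only when the monitored signal has moved by at least a threshold `delta` from the last transmitted value. The signal, threshold, and time grid below are illustrative choices, not the paper's simulation setup.

```python
import numpy as np

def send_on_delta(t, x, delta):
    """Send-on-delta (level-crossing) sampler: emit an event whenever the
    signal deviates by at least `delta` from the last transmitted value."""
    events = [(t[0], x[0])]            # the initial state is always reported
    last = x[0]
    for ti, xi in zip(t[1:], x[1:]):
        if abs(xi - last) >= delta:
            events.append((ti, xi))
            last = xi
    return events

t = np.linspace(0.0, 1.0, 1001)
x = np.sin(2 * np.pi * t)              # monitored state trajectory
events = send_on_delta(t, x, delta=0.25)
print(len(events), "events vs", len(t), "periodic samples")
```

For a smooth signal the mean event rate is governed by the signal's total variation divided by `delta`, which is the quantity the paper's closed-form traffic formulas generalize to the integral (send-on-area) criterion.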

  5. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Science.gov (United States)

    Najat, Dereen

    2017-01-01

Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. In inappropriate samples, the percentage error was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani
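The proportional Z test mentioned in the abstract is the standard two-proportion z-test with a pooled variance estimate. A minimal sketch, using made-up rejection counts for two hypothetical labs rather than the study's data:

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test (pooled variance); returns z and two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                  # pooled proportion under H0: p1 == p2
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))    # standard normal CDF at |z|
    return z, 2 * (1 - phi)

# e.g. hemolysis rejections: 54/600 samples in lab A vs 27/550 in lab B (invented counts)
z, p = two_proportion_z(54, 600, 27, 550)
print(round(z, 2), round(p, 4))
```

With these invented counts the difference (9.0% vs 4.9%) is significant at the usual 5% level, which is the kind of between-lab comparison the survey reports.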

  6. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Directory of Open Access Journals (Sweden)

    Dereen Najat

Full Text Available Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. In inappropriate samples, the percentage error was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes

  7. Design and evaluation of a new Peltier-cooled laser ablation cell with on-sample temperature control.

    Science.gov (United States)

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; Sanz-Medel, Alfredo

    2014-01-27

A new custom-built Peltier-cooled laser ablation cell is described. The proposed cryogenic cell combines a small internal volume (20 cm³) with a unique and reliable on-sample temperature control. The use of a flexible temperature sensor, directly located on the sample surface, ensures a rigorous sample temperature control throughout the entire analysis time and allows instant response to any possible fluctuation. In this way sample integrity and, therefore, reproducibility can be guaranteed during the ablation. The refrigeration of the proposed cryogenic cell combines an internal refrigeration system, controlled by a sensitive thermocouple, with an external refrigeration system. Cooling of the sample is directly carried out by 8 small (1 cm×1 cm) Peltier elements placed in a circular arrangement in the base of the cell. These Peltier elements are located below a copper plate where the sample is placed. Due to the small size of the cooling electronics and their circular arrangement, it was possible to maintain a peephole under the sample for illumination, allowing a much better visualization of the sample, a factor especially important when working with structurally complex tissue sections. The analytical performance of the cryogenic cell was studied using a glass reference material (SRM NIST 612) at room temperature and at -20°C. The proposed cell design shows a reasonable signal washout (signal decay within less than 10 s to background level), high sensitivity and good signal stability (in the range 6.6-11.7%). Furthermore, high precision (0.4-2.6%) and accuracy (0.3-3.9%) in the isotope ratio measurements were also observed operating the cell both at room temperature and at -20°C. Finally, experimental results obtained for the cell application to qualitative elemental imaging of structurally complex tissue samples (e.g. eye sections from a native frozen porcine eye and fresh flower leaves) demonstrate that working in cryogenic conditions is critical in such

  8. Multiple analyte adduct formation in liquid chromatography-tandem mass spectrometry - Advantages and limitations in the analysis of biologically-related samples.

    Science.gov (United States)

    Dziadosz, Marek

    2018-05-01

Multiple analyte adduct formation was examined and discussed in the context of reproducible signal detection in liquid chromatography-tandem mass spectrometry applied in the analysis of biologically-related samples. Appropriate infusion solutions were prepared in H2O/methanol (3/97, v/v) with 1 mM sodium acetate and 10 mM acetic acid. An API 4000 QTrap tandem mass spectrometer was used for experiments performed in the negative scan mode (-Q1 MS) and the negative enhanced product ion mode (-EPI). γ‑Hydroxybutyrate and its deuterated form were used as model compounds to highlight both the complexity of adduct formation in popular mobile phases used and the effective signal compensation by the application of isotope-labelled analytes as internal standards. Copyright © 2018 Elsevier B.V. All rights reserved.
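The signal compensation described above works because the analyte and its isotope-labelled internal standard experience the same run-to-run adduct-formation and ionisation efficiency, which therefore cancels in the peak-area ratio. A toy numerical sketch of this cancellation, with all concentrations, areas, and efficiency factors invented for illustration:

```python
import numpy as np

# Hypothetical calibration: raw analyte peak areas fluctuate with a run-specific
# efficiency factor (adduct formation / ion suppression), but the deuterated IS,
# spiked at a fixed level, is suppressed identically in each run.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # calibrator concentrations (arbitrary units)
eff = np.array([0.7, 1.1, 0.9, 1.0, 0.8])     # run-specific efficiency per calibrator
area_analyte = 1000.0 * conc * eff            # suppressed analyte signal
area_is = 500.0 * eff                          # IS signal, same suppression factor

ratio = area_analyte / area_is                 # efficiency cancels out exactly
slope, intercept = np.polyfit(conc, ratio, 1)  # calibration line on the ratio
print(slope, intercept)
```

The raw analyte areas alone would give a noisy, non-reproducible calibration, while the area ratio is exactly linear in concentration, mirroring the abstract's point about isotope-labelled internal standards.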

  9. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications

  10. Analytic processor model for fast design-space exploration

    NARCIS (Netherlands)

    Jongerius, R.; Mariani, G.; Anghel, A.; Dittmann, G.; Vermij, E.; Corporaal, H.

    2015-01-01

    In this paper, we propose an analytic model that takes as inputs a) a parametric microarchitecture-independent characterization of the target workload, and b) a hardware configuration of the core and the memory hierarchy, and returns as output an estimation of processor-core performance. To validate

  11. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes
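
    The efficiency comparison described, systematic and stratified schemes versus simple random sampling for a time-series-like impingement process, can be illustrated by Monte Carlo; the diurnal count model below is purely illustrative, not data from the study:

```python
import random
import statistics

random.seed(42)

# Hypothetical hourly impingement counts over one week (168 h) with a
# diurnal pattern -- illustrative data only.
population = [50 + (30 if (h % 24 >= 19 or h % 24 < 6) else 0) + random.gauss(0, 5)
              for h in range(168)]
true_total = sum(population)
N, n = len(population), 24          # sample 24 of the 168 hours

def srs_estimate():
    """Expand a simple random sample of hours to an estimated total."""
    return N * statistics.mean(random.sample(population, n))

def systematic_estimate():
    """Random start, then every k-th hour."""
    k = N // n
    start = random.randrange(k)
    return N * statistics.mean(population[start::k])

reps = 2000
srs = [srs_estimate() for _ in range(reps)]
sys_ = [systematic_estimate() for _ in range(reps)]
print("SRS sd:       ", round(statistics.stdev(srs), 1))
print("systematic sd:", round(statistics.stdev(sys_), 1))
```

    With only two schemes shown this is a toy version of the comparison in the paper; stratification can be layered on in the same way.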

  12. Expressing analytical performance from multi-sample evaluation in laboratory EQA.

    Science.gov (United States)

    Thelen, Marc H M; Jansen, Rob T P; Weykamp, Cas W; Steigstra, Herman; Meijer, Ron; Cobbaert, Christa M

    2017-08-28

    To provide its participants with an external quality assessment system (EQAS) that can be used to check trueness, the Dutch EQAS organizer, the Organization for Quality Assessment of Laboratory Diagnostics (SKML), has innovated its general chemistry scheme over the last decade by introducing fresh frozen commutable samples whose values were assigned by Joint Committee for Traceability in Laboratory Medicine (JCTLM)-listed reference laboratories using reference methods where possible. Here we present some important innovations in our feedback reports that allow participants to judge whether their trueness and imprecision meet predefined analytical performance specifications. Sigma metrics are used to calculate performance indicators named 'sigma values'. Tolerance intervals are based on both total error allowable (TEa) according to biological variation data and state of the art (SA), in line with the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Milan consensus. The existing SKML feedback reports, which express trueness as the agreement between the regression line through the results of the last 12 months and the values obtained from reference laboratories and calculate imprecision from the residuals of the regression line, are now enriched with sigma values calculated from the degree to which the combination of trueness and imprecision is within tolerance limits. This information, condensed into a simple two-point scoring system, is also represented graphically in addition to the existing difference plot. By adding sigma metrics-based performance evaluation in relation to both TEa and SA tolerance intervals to its EQAS schemes, SKML provides its participants with a powerful and actionable check on accuracy.
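
    Sigma values of the kind described are commonly computed with the sigma-metric formula sigma = (TEa − |bias|) / CV; a minimal sketch (the analyte and all numbers are illustrative, not SKML data):

```python
def sigma_value(tea_pct, bias_pct, cv_pct):
    """Sigma metric: how many analytical SDs fit between the observed
    bias and the allowable total error (all inputs in percent)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical example: TEa = 2.0%, bias = 0.4%, CV = 0.5%
print(sigma_value(2.0, 0.4, 0.5))  # -> 3.2
```

    A laboratory passing a 3-sigma threshold under the TEa interval might still fail under a tighter state-of-the-art interval, which is why both tolerance intervals are reported.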

  13. Slurry sampling high-resolution continuum source electrothermal atomic absorption spectrometry for direct beryllium determination in soil and sediment samples after elimination of SiO interference by least-squares background correction.

    Science.gov (United States)

    Husáková, Lenka; Urbanová, Iva; Šafránková, Michaela; Šídová, Tereza

    2017-12-01

    In this work a simple, efficient, and environmentally-friendly method is proposed for determination of Be in soil and sediment samples employing slurry sampling and high-resolution continuum source electrothermal atomic absorption spectrometry (HR-CS-ETAAS). The spectral effects originating from SiO species were identified and successfully corrected by means of a mathematical correction algorithm. Fractional factorial design was employed to assess the parameters affecting the analytical results and, especially, to help in the development of the slurry preparation and optimization of measuring conditions. The effects of seven analytical variables, including particle size, concentration of glycerol and HNO3 for stabilization and analyte extraction, respectively, the effect of ultrasonic agitation for slurry homogenization, concentration of chemical modifier, and pyrolysis and atomization temperature, were investigated by a 2^(7-3) replicated (n = 3) design. Using the optimized experimental conditions, the proposed method allowed the determination of Be with a detection limit of 0.016 mg/kg and a characteristic mass of 1.3 pg. Optimum results were obtained after preparing the slurries by weighing 100 mg of a sample with particle size < 54 µm and adding 25 mL of 20% w/w glycerol. The use of 1 μg Rh and 50 μg citric acid was found satisfactory for analyte stabilization. Accurate data were obtained with the use of matrix-free calibration. The accuracy of the method was confirmed by analysis of two certified reference materials (NIST SRM 2702 Inorganics in Marine Sediment and IGI BIL-1 Baikal Bottom Silt) and by comparison of the results obtained for ten real samples by slurry sampling with those determined after microwave-assisted extraction by inductively coupled plasma time-of-flight mass spectrometry (TOF-ICP-MS). The reported method has a precision better than 7%. Copyright © 2017 Elsevier B.V. All rights reserved.
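
    A fractional factorial of this type can be generated from a full factorial in four factors plus generator columns for the remaining three; a sketch, assuming one common generator choice (E = ABC, F = ABD, G = ACD), which may differ from the authors':

```python
from itertools import product

# Base 2^4 full factorial in factors A-D (levels coded -1/+1); the
# three extra factors E, F, G are aliased onto interaction columns.
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e, f, g = a * b * c, a * b * d, a * c * d
    runs.append((a, b, c, d, e, f, g))

print(len(runs))  # 16 runs instead of 2**7 = 128
```

    Each of the 16 runs would then be measured in triplicate, matching the replicated (n = 3) design in the abstract.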

  14. Analytical design of proportional-integral controllers for the optimal control of first-order processes with operational constraints

    Energy Technology Data Exchange (ETDEWEB)

    Thu, Hien Cao Thi; Lee, Moonyong [Yeungnam University, Gyeongsan (Korea, Republic of)

    2013-12-15

    A novel analytical design method of industrial proportional-integral (PI) controllers was developed for the optimal control of first-order processes with operational constraints. The control objective was to minimize a weighted sum of the controlled variable error and the rate of change in the manipulated variable under the maximum allowable limits in the controlled variable, manipulated variable and the rate of change in the manipulated variable. The constrained optimal servo control problem was converted to an unconstrained optimization to obtain an analytical tuning formula. A practical shortcut procedure for obtaining optimal PI parameters was provided based on graphical analysis of global optimality. The proposed PI controller was found to guarantee global optimum and deal explicitly with the three important operational constraints.
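
    The constrained servo objective described, a weighted sum of controlled-variable error and the rate of change of the manipulated variable under amplitude and rate limits, can be illustrated with a simple discrete-time simulation; the process, PI tuning, weight and limits below are hypothetical, not the paper's analytical optimum:

```python
# First-order process y' = (-y + K*u)/tau under PI control, accumulating
# the weighted cost and checking assumed operational limits.
K, tau = 2.0, 5.0            # process gain and time constant (illustrative)
Kc, tau_i = 1.2, 4.0         # PI tuning (illustrative)
dt, horizon = 0.01, 40.0
setpoint, w = 1.0, 0.1       # w weights the rate of change of u
u_max, du_max, y_lim = 2.0, 150.0, 1.2   # assumed operational limits

y = integral = prev_u = cost = t = 0.0
feasible = True
while t < horizon:
    e = setpoint - y
    integral += e * dt
    u = Kc * (e + integral / tau_i)
    du = (u - prev_u) / dt
    feasible = feasible and abs(u) <= u_max and abs(du) <= du_max and y <= y_lim
    cost += (e * e + w * du * du) * dt
    y += (-y + K * u) / tau * dt   # forward-Euler step of the process
    prev_u, t = u, t + dt

print(round(y, 3), feasible)
```

    Tuning parameters would then be searched (or, as in the paper, derived analytically) to minimize the cost while keeping the trajectory feasible.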

  15. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...
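
    The binary LQAS classification rule and its error probabilities follow directly from the binomial distribution; a sketch assuming a classic sample size n = 19 with decision rule d = 13 (thresholds and coverage targets are illustrative, not from this paper):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Hypothetical rule: sample n = 19 individuals and classify the area as
# "acceptable" if at least d = 13 show the indicator.
n, d = 19, 13
# Risk of calling a truly poor area (50% coverage) acceptable:
alpha = 1 - binom_cdf(d - 1, n, 0.50)
# Risk of calling a truly good area (80% coverage) poor:
beta = binom_cdf(d - 1, n, 0.80)
print(round(alpha, 3), round(beta, 3))
```

    Both misclassification risks stay below 10% for this rule; the cluster designs discussed in the paper modify these calculations to account for within-cluster correlation.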

  16. Road Transportable Analytical Laboratory (RTAL) system

    International Nuclear Information System (INIS)

    Finger, S.M.

    1995-01-01

    U.S. Department of Energy (DOE) facilities around the country have, over the years, become contaminated with radionuclides and a range of organic and inorganic wastes. Many of the DOE sites encompass large land areas and were originally sited in relatively unpopulated regions of the country to minimize risk to surrounding populations. In addition, wastes were sometimes stored underground at the sites in 55-gallon drums, wood boxes or other containers until final disposal methods could be determined. Over the years, these containers have deteriorated, releasing contaminants into the surrounding environment. This contamination has spread, in some cases polluting extensive areas. Remediation of these sites requires extensive sampling to determine the extent of the contamination, to monitor clean-up and remediation progress, and for post-closure monitoring of facilities. The DOE would benefit greatly if it had reliable, road transportable, fully independent laboratory systems that could perform on-site the full range of analyses required. Such systems would accelerate and thereby reduce the cost of clean-up and remediation efforts by (1) providing critical analytical data more rapidly, and (2) eliminating the handling, shipping and manpower associated with sample shipments. The goal of the Road Transportable Analytical Laboratory (RTAL) Project is the development and demonstration of a system to meet the unique needs of the DOE for rapid, accurate analysis of a wide variety of hazardous and radioactive contaminants in soil, groundwater, and surface waters. This laboratory system has been designed to provide the field and laboratory analytical equipment necessary to detect and quantify radionuclides, organics, heavy metals and other inorganic compounds. The laboratory system consists of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific needs

  17. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1997-01-01

    Sample projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Sample projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented.

  18. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  19. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Cave animals are often adapted to digging and life underground, with claw toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front claw toe were acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting. We obtained 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity’s sampling shovel and the contours of the Himalayan marmot’s claw. Verification tests showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop’s resistance torque was 49.6% less than that of the prototype sampling scoop. When the penetration angle was 60° and the sampling speed was 0.22 r/min, the resistance torque of the bionic sampling scoop was 28.8% lower than that of the prototype sampling scoop.

  20. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
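
    The proposed two-point rule, an early sample as soon as concentrations reach the LOQ and a late sample on the declining limb at the matching plasma concentration, can be sketched numerically; the one-compartment plasma model, its parameters and the LOQ below are assumptions for illustration, not values from the study:

```python
from math import exp

# Illustrative one-compartment plasma profile with first-order absorption.
ka, ke, scale = 1.5, 0.2, 10.0      # 1/h, 1/h, mg/L (all assumed)
loq = 1.0                           # assumed BAL assay LOQ, mg/L

def plasma_conc(t):
    return scale * (exp(-ke * t) - exp(-ka * t))

# Early sample: first time on a fine grid where concentration >= LOQ.
grid = [i / 100 for i in range(1, 4801)]        # 0.01 .. 48.00 h
t_early = next(t for t in grid if plasma_conc(t) >= loq)

# Late sample: the time on the declining limb with the same plasma
# concentration as the early sample, located by bisection.
t_peak = max(grid, key=plasma_conc)
lo, hi = t_peak, 48.0
for _ in range(60):
    mid = (lo + hi) / 2
    if plasma_conc(mid) > plasma_conc(t_early):
        lo = mid
    else:
        hi = mid
t_late = (lo + hi) / 2
print(t_early, round(t_late, 2))
```

    Sampling each subject at one (or both) of these two matched-concentration time points is what allows the rate and extent of pulmonary distribution to be separated.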

  1. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    Science.gov (United States)

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Determination of 93Zr, 107Pd and 135Cs in zircaloy hulls analytical development on inactive samples

    International Nuclear Information System (INIS)

    Excoffier, E.; Bienvenu, Ph.; Combes, C.; Pontremoli, S.; Delteil, N.; Ferrini, R.

    2000-01-01

    A study involving the participation of three laboratories of the Direction of the Fuel Cycle has been undertaken within the framework of a common interest program existing between the COGEMA and the CEA. Its purpose is to develop analytical methods for the determination of long-lived radionuclides in zircaloy hulls coming from spent fuel reprocessing operations. Acting as a complement to work carried out at the DRRV in ATALANTE concerning zircaloy dissolution and direct analysis of hull solutions, a study is now being conducted at the DESD/SCCD/LARC in Cadarache on three of these radionuclides, namely: zirconium 93, palladium 107 and caesium 135. It concerns three radioisotopes having very long periods (∼10 6 y), and which stabilize mainly through emission of β particles. The analytical technique chosen for the final measurement is inductively coupled plasma mass spectrometry (ICP/MS). Prior to the measurement, chemical separation processes are used to extract the radionuclides from the matrix and separate them from interfering elements and β emitters. The method developed initially on inactive solutions is being validated on irradiated samples coming from UP2/800 - UP3 reprocessing plants. (authors)

  3. Design, analysis and presentation of factorial randomised controlled trials

    Directory of Open Access Journals (Sweden)

    Little Paul

    2003-11-01

    Background. The evaluation of more than one intervention in the same randomised controlled trial can be achieved using a parallel group design. However, this requires an increased sample size and can be inefficient, especially if there is also interest in considering combinations of the interventions. An alternative may be a factorial trial, where for two interventions participants are allocated to receive neither intervention, one or the other, or both. Factorial trials require special considerations, however, particularly at the design and analysis stages. Discussion. Using a 2 × 2 factorial trial as an example, we present a number of issues that should be considered when planning a factorial trial. The main design issue is that of sample size. Factorial trials are most often powered to detect the main effects of interventions, since adequate power to detect plausible interactions requires greatly increased sample sizes. The main analytical issues relate to the investigation of main effects and the interaction between the interventions in appropriate regression models. Presentation of results should reflect the analytical strategy with an emphasis on the principal research questions. We also give an example of how baseline and follow-up data should be presented. Lastly, we discuss the implications of the design, analytical and presentational issues covered. Summary. Difficulty in interpreting the results of factorial trials if an influential interaction is observed is the cost of the potential for efficient, simultaneous consideration of two or more interventions. Factorial trials can in principle be designed to have adequate power to detect realistic interactions, and in any case they are the only design that allows such effects to be investigated.
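
    The sample-size point can be made concrete: in a 2 × 2 factorial the interaction contrast has twice the standard error of a main-effect contrast, so detecting an interaction of the same magnitude needs about four times the sample size. A normal-approximation sketch (effect size and error rates are illustrative):

```python
from statistics import NormalDist

def n_per_group(effect_sd, alpha=0.05, power=0.8):
    """Normal-approximation sample size per arm of a two-group
    comparison, for a standardized effect size (in outcome SDs)."""
    z = NormalDist().inv_cdf
    return 2 * ((z(1 - alpha / 2) + z(power)) / effect_sd) ** 2

main = n_per_group(0.3)             # main effect: half vs half of the trial
interaction = n_per_group(0.3 / 2)  # doubled SE = halved effective effect
print(round(interaction / main, 1))  # -> 4.0
```

    This is why factorial trials powered only for main effects can rarely rule out plausible interactions.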

  4. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

    The Book of Abstracts contains short descriptions of the lectures, communications and posters presented during the 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). The scientific programme covered: basic analytical problems, sample preparation, chemometrics and metrology, miniaturization of analytical procedures, environmental analysis, medical analyses, industrial analyses, food analyses, biochemical analyses, and analysis of relicts of the past. Several posters were devoted to radiochemical separations, radiochemical analysis, the environmental behaviour of elements important for nuclear science, and proficiency tests.

  5. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use (dissolved pressurization gases); detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine the potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on the one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  6. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  7. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    Science.gov (United States)

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
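
    The probability-based definition of the design space can be illustrated with a Monte Carlo sketch: simulate the assay variability at a candidate working point and estimate the probability that a critical quality attribute meets its criterion. The response model, coefficients and noise level below are hypothetical stand-ins, not the published Box-Behnken model:

```python
import random

random.seed(1)

# Purely illustrative response model for a critical resolution as a
# function of buffer concentration (mM) and pH.
def resolution(conc_mm, ph):
    return 0.9 + 0.008 * conc_mm + 0.15 * ph + random.gauss(0, 0.05)

def prob_meeting_spec(conc_mm, ph, criterion=2.0, n_sim=5000):
    """Monte Carlo estimate of P(resolution >= criterion) at one candidate
    working point -- the quantity a probability-based DS is built from."""
    hits = sum(resolution(conc_mm, ph) >= criterion for _ in range(n_sim))
    return hits / n_sim

# Working point reported in the abstract (138 mM buffer, pH 2.74):
print(prob_meeting_spec(138, 2.74))
```

    Repeating this over a grid of parameter combinations, and keeping only points whose estimated probability exceeds a chosen threshold, yields the probability-based design space.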

  8. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition, we need dedicated status lines for assessing the validity of the input to our black box and of the output for subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)

  9. Analysis of IFR samples at ANL-E

    International Nuclear Information System (INIS)

    Bowers, D.L.; Sabau, C.S.

    1993-01-01

    The Analytical Chemistry Laboratory analyzes a variety of samples submitted by the different research groups within IFR. This talk describes the analytical work on samples generated by the Plutonium Electrorefiner, the Large Scale Electrorefiner and Waste Treatment Studies. The majority of these samples contain transuranics and necessitate facilities that safely contain these radioisotopes. Details such as sample receiving, dissolution techniques, chemical separations, instrumentation used, and reporting of results are discussed. The importance of interactions between customers and analytical personnel is also demonstrated.

  10. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Background. The increasing number of molecular tests performed on DNA extracted from various biological materials should not come without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of the internal control (IC) in standardizing the pre-analytical phase and the role of the cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: the IC, added to each extraction mix; the human gene HPRT1 (CC), quantified by RT-PCR to measure sample cellularity; and the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac per PCR. CS and TP under 300,000 cells/sample showed a significant decrease in UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, which is crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.

  11. Designing Technology-Enabled Instruction to Utilize Learning Analytics

    Science.gov (United States)

    Davies, Randall; Nyland, Robert; Bodily, Robert; Chapman, John; Jones, Brian; Young, Jay

    2017-01-01

    A key notion conveyed by those who advocate for the use of data to enhance instruction is an awareness that learning analytics has the potential to improve instruction and learning but is not currently reaching that potential. Gibbons (2014) suggested that a lack of learning facilitated by current technology-enabled instructional systems may be…

  12. Headspace vapor characterization of Hanford Waste Tank SX-102: Results from samples collected on July 19, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    McVeety, B.D.; Evans, J.C.; Clauss, T.W.; Pool, K.H.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-102 (Tank SX-102) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5046. Samples were collected by WHC on July 19, 1995, using the vapor sampling system (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  13. Headspace vapor characterization of Hanford Waste Tank AX-103: Results from samples collected on June 21, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Clauss, T.W.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-AX-103 (Tank AX-103) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5029. Samples were collected by WHC on June 21, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  14. Headspace vapor characterization of Hanford Waste Tank AX-101: Results from samples collected on June 15, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.; McVeety, B.D.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-AX-101 (Tank AX-101) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5028. Samples were collected by WHC on June 15, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  15. Critical Factors in Data Governance for Learning Analytics

    Science.gov (United States)

    Elouazizi, Noureddine

    2014-01-01

    This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…

  16. International Congress on Analytical Chemistry. Abstracts. V. 1

    International Nuclear Information System (INIS)

    1997-01-01

    The collection of materials of the international congress on analytical chemistry held in Moscow in June 1997. The main directions of investigation in such areas of analytical chemistry as quantitative and qualitative analysis, microanalysis, sample preparation and preconcentration, analytical reagents, chromatography and related techniques, flow analysis, electroanalytical and kinetic methods, and sensors are elucidated.

  17. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedures, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document.

  18. Summarizing documentation of the laboratory automation system RADAR for the analytical services of a nuclear fuel reprocessing facility

    International Nuclear Information System (INIS)

    Brandenburg, G.; Brocke, W.; Brodda, B.G.; Buerger, K.; Halling, H.; Heer, H.; Puetz, K.; Schaedlich, W.; Watzlawik, K.H.

    1981-12-01

    The essential tasks of the system are on-line open-loop process control based on in-line measurements and automation of the off-line analytical laboratory. The in-line measurements (at 55 tanks of the chemical process area) provide density, liquid-level, and temperature values. The concentration value of a single component may easily be determined if the solution consists of no more than two phases. The automation of the off-line analytical laboratory comprises laboratory organization, including sample management and data organization, as well as computer-aided sample transportation control, data acquisition and data processing at chemical and nuclear analytical devices. The computer system consists of two subsystems: a front-end system for central sample registration and in-line process control, and a central system for the off-line analytical tasks. The organization of the application-oriented system uses a centralized database. Similar data processing functions concerning different analytical management tasks are structured into the following subsystems: man-machine interface, interrupt and data acquisition system, database, protocol service and data processing. The procedures for laboratory management (organization and experiment sequences) are defined by application databases. Following the project phases, the engineering requirements, design, assembly, start-up and test-run phases are described. In addition, figures on expenditure and experiences are given, and the system concept is discussed. (orig./HP)

  19. Efficient sample preparation from complex biological samples using a sliding lid for immobilized droplet extractions.

    Science.gov (United States)

    Casavant, Benjamin P; Guckenberger, David J; Beebe, David J; Berry, Scott M

    2014-07-01

    Sample preparation is a major bottleneck in many biological processes. Paramagnetic particles (PMPs) are a ubiquitous method for isolating analytes of interest from biological samples and are used for their ability to thoroughly sample a solution and be easily collected with a magnet. There are three main methods by which PMPs are used for sample preparation: (1) removal of fluid from the analyte-bound PMPs, (2) removal of analyte-bound PMPs from the solution, and (3) removal of the substrate (with immobilized analyte-bound PMPs). In this paper, we explore the third and least-studied method for PMP-based sample preparation using a platform termed Sliding Lid for Immobilized Droplet Extractions (SLIDE). SLIDE leverages principles of surface tension and patterned hydrophobicity to create a simple-to-operate platform for sample isolation (cells, DNA, RNA, protein) and preparation (cell staining) without the need for time-intensive wash steps, use of immiscible fluids, or precise pinning geometries. Compared to other standard isolation protocols using PMPs, SLIDE is able to perform rapid sample preparation with low (0.6%) carryover of contaminants from the original sample. The natural recirculation occurring within the pinned droplets of SLIDE makes possible the performance of multistep cell staining protocols within the SLIDE by simply resting the lid over the various sample droplets. SLIDE demonstrates a simple, easy-to-use platform for sample preparation on a range of complex biological samples.

  20. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    Science.gov (United States)

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid-phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L⁻¹) compatible with toxicity threshold values. The solid-phase extraction (SPE), optimized with respect to the nature of the extraction phases, sampling volumes, and elution solvent, was validated according to ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces high variability during the SPE step, therefore requiring the use of ¹³C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty of about 25% (k = 2) at the limit of quantification). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.
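
    The isotope-dilution approach described above (an isotopically labeled internal standard added before extraction) reduces to a simple ratio calculation at the quantification step. Here is a minimal sketch; the function name and the numbers are illustrative, not taken from the paper:

```python
def quantify_with_internal_standard(area_analyte, area_is, conc_is, rrf=1.0):
    """Internal-standard quantification:

        conc_analyte = (A_analyte / A_IS) * C_IS / RRF

    where RRF is the relative response factor from calibration.
    Losses during SPE affect the analyte and the labeled standard alike,
    so the area ratio is largely insensitive to recovery variability.
    """
    return (area_analyte / area_is) * conc_is / rrf

# e.g. analyte peak half the internal-standard peak, IS spiked at 100 ng/L
print(quantify_with_internal_standard(5000, 10000, 100.0))  # 50.0 (ng/L)
```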

  1. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC), with a focus on design and implementation. We approach ILC design based on frequency-domain analysis and address ILC implementation based on sampled-data methods. This is the first book to treat ILC from frequency-domain and sampled-data methodologies. The frequency-domain design methods offer ILC users insight into convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled-data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled-data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h...

  2. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions: one is the use of special sampling methods for special distributions; the other is the use of general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open-source codes are included in the general sampling method for maximum user convenience. The results show that distributions common in particle transport can be sampled correctly with these tools, and that user convenience is assured. (authors)
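
    As an illustration of the "general sampling method for arbitrary discrete distribution" mentioned above, here is a generic inverse-transform sketch in Python. This is a standard implementation of the technique, not JMCT code:

```python
import bisect
import itertools
import random

def make_discrete_sampler(weights, rng=random):
    """Inverse-transform sampler for an arbitrary discrete distribution.

    Builds the cumulative table once; each draw is then a binary search,
    i.e. O(log n) per sampled event.
    """
    cdf = list(itertools.accumulate(weights))
    total = cdf[-1]

    def sample():
        u = rng.random() * total            # uniform on [0, total)
        return bisect.bisect_right(cdf, u)  # index of the selected outcome

    return sample

# Draw many samples and check the empirical frequencies track the weights.
sampler = make_discrete_sampler([0.2, 0.5, 0.3])
counts = [0, 0, 0]
for _ in range(100_000):
    counts[sampler()] += 1
# counts[1] / 100_000 should be close to 0.5
```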

  3. The analytical utility of thermally desorbed polydimethylsilicone membranes for in-vivo sampling of volatile organic compounds in and on human skin.

    Science.gov (United States)

    Riazanskaia, S; Blackburn, G; Harker, M; Taylor, D; Thomas, C L P

    2008-08-01

    observed to be lost from the analysis with increasing sample time, in a manner analogous to breakthrough behaviour in adsorbent traps. Finally, a 10-day storage study at 4°C suggested that microbiological factors had a significant effect on sample stability. Significant changes (up to 8×) were observed in the masses of compounds recovered after storage. These studies confirmed that polydimethylsilicone membrane sampling patches of human skin provide rich and analytically useful data. It is important to note that care in experimental design is needed to avoid sampling artefacts being introduced through sampling selectivity and/or sample instability where samples are stored for longer than 24 h at 4°C or higher.

  4. 3D-Printing for Analytical Ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Abhiksha Desai

    Full Text Available Analytical ultracentrifugation (AUC is a classical technique of physical biochemistry providing information on size, shape, and interactions of macromolecules from the analysis of their migration in centrifugal fields while free in solution. A key mechanical element in AUC is the centerpiece, a component of the sample cell assembly that is mounted between the optical windows to allow imaging and to seal the sample solution column against high vacuum while exposed to gravitational forces in excess of 300,000 g. For sedimentation velocity it needs to be precisely sector-shaped to allow unimpeded radial macromolecular migration. During the history of AUC a great variety of centerpiece designs have been developed for different types of experiments. Here, we report that centerpieces can now be readily fabricated by 3D printing at low cost, from a variety of materials, and with customized designs. The new centerpieces can exhibit sufficient mechanical stability to withstand the gravitational forces at the highest rotor speeds and be sufficiently precise for sedimentation equilibrium and sedimentation velocity experiments. Sedimentation velocity experiments with bovine serum albumin as a reference molecule in 3D printed centerpieces with standard double-sector design result in sedimentation boundaries virtually indistinguishable from those in commercial double-sector epoxy centerpieces, with sedimentation coefficients well within the range of published values. The statistical error of the measurement is slightly above that obtained with commercial epoxy, but still below 1%. Facilitated by modern open-source design and fabrication paradigms, we believe 3D printed centerpieces and AUC accessories can spawn a variety of improvements in AUC experimental design, efficiency and resource allocation.

  5. A Simple Analytic Model for Estimating Mars Ascent Vehicle Mass and Performance

    Science.gov (United States)

    Woolley, Ryan C.

    2014-01-01

    The Mars Ascent Vehicle (MAV) is a crucial component in any sample return campaign. In this paper we present a universal model for a two-stage MAV along with the analytic equations and simple parametric relationships necessary to quickly estimate MAV mass and performance. Ascent trajectories can be modeled as two-burn transfers from the surface with appropriate loss estimations for finite burns, steering, and drag. Minimizing lift-off mass is achieved by balancing optimized staging and an optimized path-to-orbit. This model allows designers to quickly find optimized solutions and to see the effects of design choices.
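
    The staging trade-off in this abstract can be illustrated with the ideal rocket equation, sizing stages bottom-up from the payload. All numbers below (delta-v split, Isp values, structural fractions) are placeholders of my own, not values from the paper:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_mass_ratio(dv, isp):
    """Tsiolkovsky: m0/mf = exp(dv / (Isp * g0)) for a single burn."""
    return math.exp(dv / (isp * G0))

def liftoff_mass(payload, dv_split, isp=(280.0, 300.0), struct_frac=(0.12, 0.10)):
    """Two-stage lift-off mass estimate, built up from the payload.

    struct_frac is each stage's inert (dry) mass as a fraction of its
    loaded mass; losses (finite burns, steering, drag) are assumed to be
    folded into dv_split.
    """
    mass = payload
    # size the upper stage first, then the lower stage
    for dv, i, f in reversed(list(zip(dv_split, isp, struct_frac))):
        r = stage_mass_ratio(dv, i)
        # loaded stage mass m_s solves (mass + m_s) / (mass + f*m_s) = r
        m_s = mass * (r - 1.0) / (1.0 - f * r)
        if m_s <= 0:
            raise ValueError("structural fraction too high for this delta-v")
        mass += m_s
    return mass

# 14 kg payload, 4 km/s total delta-v split evenly: roughly 75 kg at lift-off
print(round(liftoff_mass(14.0, (2000.0, 2000.0)), 1))
```

    Sweeping `dv_split` while holding the total fixed reproduces the kind of staging optimization the paper describes.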

  6. Enhanced spot preparation for liquid extractive sampling and analysis

    Science.gov (United States)

    Van Berkel, Gary J.; King, Richard C.

    2015-09-22

    A method for performing surface sampling of an analyte, includes the step of placing the analyte on a stage with a material in molar excess to the analyte, such that analyte-analyte interactions are prevented and the analyte can be solubilized for further analysis. The material can be a matrix material that is mixed with the analyte. The material can be provided on a sample support. The analyte can then be contacted with a solvent to extract the analyte for further processing, such as by electrospray mass spectrometry.

  7. Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.

    Science.gov (United States)

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section includes two sections. The first section aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.

  8. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  9. Headspace vapor characterization of Hanford Waste Tank 241-T-110: Results from samples collected on August 31, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    McVeety, B.D.; Thomas, B.L.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-T-110 (Tank T-110) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5056. Samples were collected by WHC on August 31, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  10. Headspace vapor characterization of Hanford Waste Tank 241-TX-111: Results from samples collected on October 12, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-06-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-TX-111 (Tank TX-111) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5069. Samples were collected by WHC on October 12, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  11. Headspace vapor characterization of Hanford Waste Tank 241-SX-109: Results from samples collected on August 1, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-109 (Tank SX-109) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5048. Samples were collected by WHC on August 1, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  12. Headspace vapor characterization of Hanford Waste Tank 241-SX-104: Results from samples collected on July 25, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Thomas, B.L.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-104 (Tank SX-104) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5049. Samples were collected by WHC on July 25, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  13. Headspace vapor characterization of Hanford Waste Tank 241-S-112: Results from samples collected on July 11, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Clauss, T.W.; Pool, K.H.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-S-112 (Tank S-112) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5044. Samples were collected by WHC on July 11, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  14. Headspace vapor characterization of Hanford Waste Tank 241-SX-105: Results from samples collected on July 26, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-105 (Tank SX-105) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5047. Samples were collected by WHC on July 26, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  15. DIANA Code: Design and implementation of an analytic core calculus code by two group, two zone diffusion

    International Nuclear Information System (INIS)

    Mochi, Ignacio

    2005-01-01

    The principal parameters of nuclear reactors are determined in the conceptual design stage. For that purpose, it is necessary to have flexible calculation tools that represent the principal dependencies of such parameters. This capability is of critical importance in the design of innovative nuclear reactors. In order to have a proper tool to assist the conceptual design of innovative nuclear reactors, we developed and implemented a neutronic core calculus code: DIANA (Diffusion Integral Analytic Neutron Analysis). To calculate the required parameters, this code generates its own cross sections using an analytic two-group, two-zone diffusion scheme based only on a minimal set of data (i.e. 2200 m/s and fission-averaged microscopic cross sections, Westcott factors and effective resonance integrals). Both to calculate cross sections and core parameters, DIANA takes into account heterogeneity effects, which are included when it evaluates each zone. Among them lies the disadvantage factor of each energy group. DIANA was implemented entirely through object-oriented programming in C++. This eases source code understanding and would allow a quick expansion of its capabilities if needed. The final product is a versatile and easy-to-use code that allows core calculations with a minimal amount of data. It also contains the tools needed to perform many variational calculations, such as the parameterisation of effective multiplication factors for different core radii. The diffusion scheme's simplicity allows an easy following of the involved phenomena, making DIANA a suitable tool to design reactors whose physics lies beyond the parameters of present reactors. All these reasons make DIANA a good candidate for future innovative reactor analysis.
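
    The kind of two-group scheme DIANA builds on can be illustrated with the infinite-medium multiplication factor, where the thermal flux follows from fast-to-thermal down-scatter. The formula is standard two-group theory, not DIANA's actual internals, and the cross-section values below are invented for illustration:

```python
def k_infinity(nu_sf1, nu_sf2, sa1, sa2, s12):
    """Two-group infinite-medium multiplication factor.

    Assumes all fission neutrons are born in the fast group (1) and the
    only transfer is down-scatter 1 -> 2, so phi2/phi1 = s12/sa2 and

        k_inf = (nu_sf1 + nu_sf2 * s12/sa2) / (sa1 + s12)
    """
    flux_ratio = s12 / sa2                 # phi2 / phi1
    production = nu_sf1 + nu_sf2 * flux_ratio
    absorption = sa1 + s12                 # equals total absorption per unit fast flux
    return production / absorption

# Invented macroscopic cross sections (cm^-1), chosen to land near critical:
k = k_infinity(nu_sf1=0.005, nu_sf2=0.10, sa1=0.01, sa2=0.08, s12=0.02)
print(round(k, 6))  # ≈ 1.0
```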

  16. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  17. Flow cytometry for feline lymphoma: a retrospective study regarding pre-analytical factors possibly affecting the quality of samples.

    Science.gov (United States)

    Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano

    2018-06-01

    Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. This study provides the basis for…

  18. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
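
    The two-stage cluster draw described here (municipalities first, then addresses within the selected municipalities only) can be sketched in a few lines. The frame below is a toy stand-in, not the actual employment register:

```python
import random

def two_stage_cluster_sample(frame, n_clusters, n_addresses, seed=0):
    """Stage 1: randomly select clusters (municipalities).
    Stage 2: draw addresses only from the selected clusters."""
    rng = random.Random(seed)
    clusters = rng.sample(sorted(frame), n_clusters)
    pool = [addr for m in clusters for addr in frame[m]]
    return clusters, rng.sample(pool, min(n_addresses, len(pool)))

# toy frame: 50 municipalities ("m0".."m49"), 400 addresses each
frame = {f"m{i}": [f"m{i}-addr{j}" for j in range(400)] for i in range(50)}
clusters, addresses = two_stage_cluster_sample(frame, n_clusters=10,
                                               n_addresses=500)
```

    Because stage 2 samples only from the chosen clusters, fieldwork is concentrated in few locations, which is the usual motivation for cluster designs.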

  19. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  20. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in

  1. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested: the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse and thus increase sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and optimum robotic handling of micro-fluidic sampling chips. (authors)

  2. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)]

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  3. An analytical approach to Sr isotope ratio determination in Lambrusco wines for geographical traceability purposes.

    Science.gov (United States)

    Durante, Caterina; Baschieri, Carlo; Bertacchini, Lucia; Bertelli, Davide; Cocchi, Marina; Marchetti, Andrea; Manzini, Daniela; Papotti, Giulia; Sighinolfi, Simona

    2015-04-15

    Geographical origin and authenticity of food are topics of interest for both consumers and producers. Among the different indicators used for traceability studies, the (87)Sr/(86)Sr isotope ratio has provided excellent results. In this study, two analytical approaches for wine sample pre-treatment, microwave and low-temperature mineralisation, were investigated to develop an accurate and precise analytical method for (87)Sr/(86)Sr determination. The two procedures led to comparable results (paired t-test). The precision of the analytical procedure was evaluated by using a control sample (wine sample) processed during each sample batch (calculated Relative Standard Deviation, RSD%, equal to 0.002%). Lambrusco PDO (Protected Designation of Origin) wines coming from four different vintages (2009, 2010, 2011 and 2012) were pre-treated according to the best procedure and their isotopic values were compared with isotopic data coming from (i) soils of their territory of origin and (ii) wines obtained from the same grape varieties cultivated in different districts. The obtained results showed no significant variability among the different vintages of wines, and perfect agreement between the isotopic ranges of the soils and wines was observed. Nevertheless, the investigated indicator was not powerful enough to discriminate between similar products. In this regard, it is worth noting that more soil samples, as well as wines coming from different districts, will be considered to obtain more trustworthy results. Copyright © 2014 Elsevier Ltd. All rights reserved.
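
    The RSD% figure quoted for the control sample is simply 100 × (sample standard deviation) / mean over repeated measurements. A minimal sketch with illustrative (87)Sr/(86)Sr replicate values (not the study's data):

```python
def rsd_percent(values):
    """Relative standard deviation in percent:
    100 * sample SD / mean, the batch-precision metric."""
    n = len(values)
    m = sum(values) / n
    sd = (sum((v - m) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / m

# toy replicate 87Sr/86Sr measurements of one control wine sample
ratios = [0.70900, 0.70901, 0.70899]
rsd = rsd_percent(ratios)
```
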

  4. Examining the Job-Related, Psychological, and Physical Outcomes of Workplace Sexual Harassment: A Meta-Analytic Review

    Science.gov (United States)

    Chan, Darius K-S.; Lam, Chun Bun; Chow, Suk Yee; Cheung, Shu Fai

    2008-01-01

    This study was designed to examine the job-related, psychological, and physical outcomes of sexual harassment in the workplace. Using a meta-analytic approach, we analyzed findings from 49 primary studies, with a total sample size of 89,382, to obtain estimates of the population mean effect size of the association between sexual harassment and…

  5. A visual analytics design for studying rhythm patterns from human daily movement data

    Directory of Open Access Journals (Sweden)

    Wei Zeng

    2017-06-01

    Full Text Available Humans' daily movements exhibit high regularity in a space–time context, typically forming circadian rhythms. Understanding the rhythms of human daily movements is of high interest to a variety of parties, from urban planners and transportation analysts to business strategists. In this paper, we present an interactive visual analytics design for understanding and utilizing data collected from tracking human movements. The resulting system identifies and visually presents frequent human movement rhythms to support interactive exploration and analysis of the data over space and time. Case studies using real-world human movement data, including massive urban public transportation data in Singapore and the MIT reality mining dataset, and interviews with transportation researchers were conducted to demonstrate the effectiveness and usefulness of our system.

  6. Analytic thinking reduces belief in conspiracy theories.

    Science.gov (United States)

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causational role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Flow cytometry for feline lymphoma: a retrospective study about pre-analytical factors possibly affecting the quality of samples

    Directory of Open Access Journals (Sweden)

    Serena Bernardi

    2017-05-01

    Full Text Available Introduction Flow cytometry (FC) is an increasingly requested technique on which veterinary oncologists rely for an accurate, fast, minimally invasive diagnosis of lymphoma or leukemia. FC has been studied and applied with great results in canine oncology, whereas in feline oncology there is still limited experience with this technique. This is mainly due to a supposed discomfort in sampling, because of the high prevalence of intra-abdominal lymphomas. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods 97 consecutive samples of suspected feline lymphoma were retrospectively selected from the authors’ institution FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, features of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of their being finally processed for FC. Results None of the investigated variables significantly influenced the quality of the submitted samples except needle size, with 21 G needles providing the highest cellularity (Table 1). Notably, sample quality did not vary between peripheral and intra-abdominal lesions. Sample cellularity alone influenced the likelihood of being processed. About half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions FC can be safely applied to cases of suspected feline lymphomas, even for intra-abdominal lesions. A 21 G needle should be preferred for sampling. This study provides the basis for the spread of this minimally invasive, fast and cost-effective technique in feline medicine.

  8. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities, and furthermore of data acquisition quantities, are constantly pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) and others are capable of delivering high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or other singular aspect of the sample. There is a high need for digital image processing methods that enable the analytical scientist, routinely confronted with such amounts of data, to get rapid insight into the composition of the sample examined, to filter the relevant data and to integrate the information of numerous separate multispectral images to get the complete picture. Sophisticated image processing methods like classification and fusion provide possible solution approaches to this challenge. Classification is a treatment by multivariate statistical means in order to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. Therefore, the overall aim of this thesis is to evaluate the possibilities of both techniques regarding the task of analytical image processing and to find solutions for the integration and condensation of multispectral analytical image data in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)

  9. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello

    2012-01-01

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), but the analytical properties of multiple category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
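
    The operating characteristic curves mentioned above follow directly from binomial probabilities: for a given true prevalence, the chance of each classification is a difference of binomial CDFs at the decision cutoffs. A minimal sketch (the cutoffs below are hypothetical, not the paper's):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def mc_lqas_class_probs(n, cutoffs, p):
    """Operating characteristics of a three-class LQAS rule at true
    prevalence p: 'low' if positives <= cutoffs[0], 'medium' if
    <= cutoffs[1], else 'high'."""
    c1, c2 = cutoffs
    low = binom_cdf(c1, n, p)
    med = binom_cdf(c2, n, p) - low
    return low, med, 1.0 - low - med

# example design: n = 15 children per school (as in the study),
# with illustrative decision cutoffs of 1 and 7 positives
low, med, high = mc_lqas_class_probs(15, (1, 7), p=0.05)
```

    Sweeping p over [0, 1] traces the full operating characteristic curves for the three categories.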

  10. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine the lower or upper bound value in a statistical approach when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method, entitled 'Two Step Tail Area Sampling', is developed; it assumes a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling is done at points separated by large intervals; second, sampling is done at points separated by small intervals, with check points determined by the first-step sampling. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than the Monte Carlo method for the same number of calculations. The new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of pressurized light water nuclear reactors
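
    The two-step idea can be sketched as follows: a coarse first pass locates check points that bracket the tail region, and a fine second pass evaluates only from the bracketing check point onward. A toy illustration on a discrete, geometric-like distribution (the stride and threshold are illustrative):

```python
def two_step_tail_point(pmf_points, tail_threshold, coarse_stride=10):
    """Find the point where the upper-tail mass of a discrete
    distribution first drops below tail_threshold.
    Step 1 scans at a coarse stride to bracket the crossing;
    step 2 rescans only that bracket at stride 1, so the fine
    sampling effort is spent on the tail alone."""
    total = sum(p for _, p in pmf_points)
    start = 0
    # step 1: coarse scan with widely spaced check points
    for i in range(0, len(pmf_points), coarse_stride):
        if sum(p for _, p in pmf_points[i:]) / total < tail_threshold:
            start = max(i - coarse_stride, 0)
            break
    # step 2: fine scan from the bracketing check point
    for j in range(start, len(pmf_points)):
        if sum(p for _, p in pmf_points[j:]) / total < tail_threshold:
            return pmf_points[j][0]
    return pmf_points[-1][0]

# geometric-like pmf: P(X = k) proportional to 0.5**(k + 1)
pmf = [(k, 0.5 ** (k + 1)) for k in range(40)]
```
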

  11. Social Learning Analytics

    Science.gov (United States)

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  12. Functionalization and Characterization of Nanomaterial Gated Field-Effect Transistor-Based Biosensors and the Design of a Multi-Analyte Implantable Biosensing Platform

    Science.gov (United States)

    Croce, Robert A., Jr.

    Advances in semiconductor research and complementary metal-oxide-semiconductor fabrication allow for the design and implementation of miniaturized metabolic monitoring systems, as well as advanced biosensor design. The first part of this dissertation will focus on the design and fabrication of nanomaterial (single-walled carbon nanotube and quantum dot) gated field-effect transistors configured as protein sensors. These novel device structures have been functionalized with single-stranded DNA aptamers, and have shown sensor operation towards the protein Thrombin. Such advanced transistor-based sensing schemes present considerable advantages over traditional sensing methodologies in view of their miniaturization, low cost, and facile fabrication, paving the way for the ultimate realization of a multi-analyte lab-on-chip. The second part of this dissertation focuses on the design and fabrication of a needle-implantable glucose sensing platform based solely on photovoltaic powering and optical communication. By employing these powering and communication schemes, this design negates the need for bulky on-chip RF-based transmitters and batteries, in an effort to attain the extreme miniaturization required for needle-implantable/extractable applications. A complete single-sensor system coupled with a miniaturized amperometric glucose sensor has been demonstrated to establish the feasibility of this technology. Furthermore, an optical selection scheme for multiple potentiostats for four different analytes (glucose, lactate, O2 and CO2), as well as the optical transmission of sensor data, has been designed for multi-analyte applications. The last part of this dissertation will focus on the development of a computational model for the amperometric glucose sensors employed in the aforementioned implantable platform. This model has been applied to single-layer single-enzyme systems, as well as multi-layer (single enzyme) systems utilizing a glucose flux limiting layer-by-layer assembled

  13. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.
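
    The power comparisons described in this record can be mimicked with a small Monte Carlo sketch: simulate yearly means with a linear trend and site-level noise, fit a least-squares slope, and count how often the trend is detected. Everything below (effect size, noise level, the normal-approximation test at |z| > 1.96) is illustrative, not the study's actual analysis:

```python
import random

def trend_power(slope, sd, n_sites, n_years, n_sims=2000, seed=1):
    """Monte Carlo power to detect a linear trend in yearly means.
    Site-level noise shrinks as 1/sqrt(n_sites), so adding sites
    raises power for a fixed per-site standard deviation sd."""
    rng = random.Random(seed)
    years = list(range(n_years))
    tbar = sum(years) / n_years
    sxx = sum((t - tbar) ** 2 for t in years)
    detections = 0
    for _ in range(n_sims):
        y = [slope * t + rng.gauss(0, sd / n_sites ** 0.5) for t in years]
        ybar = sum(y) / n_years
        # least-squares slope and its standard error
        b = sum((t - tbar) * (yi - ybar) for t, yi in zip(years, y)) / sxx
        resid = [yi - ybar - b * (t - tbar) for t, yi in zip(years, y)]
        s2 = sum(r * r for r in resid) / (n_years - 2)
        se = (s2 / sxx) ** 0.5
        if se == 0 or abs(b / se) > 1.96:
            detections += 1
    return detections / n_sims

p_few = trend_power(slope=0.1, sd=2.0, n_sites=5, n_years=10)
p_many = trend_power(slope=0.1, sd=2.0, n_sites=25, n_years=10)
```

    This kind of simulation reproduces the qualitative finding above: for equal effort, spreading samples over more sites yields greater power to detect a trend.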

  14. Analytical model for nonlinear piezoelectric energy harvesting devices

    International Nuclear Information System (INIS)

    Neiss, S; Goldschmidtboeing, F; Kroener, M; Woias, P

    2014-01-01

    In this work we propose analytical expressions for the jump-up and jump-down point of a nonlinear piezoelectric energy harvester. In addition, analytical expressions for the maximum power output at optimal resistive load and the 3 dB-bandwidth are derived. So far, only numerical models have been used to describe the physics of a piezoelectric energy harvester. However, this approach is not suitable to quickly evaluate different geometrical designs or piezoelectric materials in the harvester design process. In addition, the analytical expressions could be used to predict the jump-frequencies of a harvester during operation. In combination with a tuning mechanism, this would allow the design of an efficient control algorithm to ensure that the harvester is always working on the oscillator's high energy attractor. (paper)

  15. A ring test of in vitro neutral detergent fiber digestibility: analytical variability and sample ranking.

    Science.gov (United States)

    Hall, M B; Mertens, D R

    2012-04-01

    In vitro neutral detergent fiber (NDF) digestibility (NDFD) is an empirical measurement of fiber fermentability by rumen microbes. Variation is inherent in all assays and may be increased as multiple steps or differing procedures are used to assess an empirical measure. The main objective of this study was to evaluate variability within and among laboratories of 30-h NDFD values analyzed in repeated runs. Subsamples of alfalfa (n=4), corn forage (n=5), and grass (n=5) ground to pass a 6-mm screen passed a test for homogeneity. The 14 samples were sent to 10 laboratories on 3 occasions over 12 mo. Laboratories ground the samples and ran 1 to 3 replicates of each sample within fermentation run and analyzed 2 or 3 sets of samples. Laboratories used 1 of 2 NDFD procedures: 8 labs used procedures related to the 1970 Goering and Van Soest (GVS) procedure using fermentation vessels or filter bags, and 2 used a procedure with preincubated inoculum (PInc). Means and standard deviations (SD) of sample replicates within run within laboratory (lab) were evaluated with a statistical model that included lab, run within lab, sample, and lab × sample interaction as factors. All factors affected mean values for 30-h NDFD. The lab × sample effect suggests against a simple lab bias in mean values. The SD ranged from 0.49 to 3.37% NDFD and were influenced by lab and run within lab. The GVS procedure gave greater NDFD values than PInc, with an average difference across all samples of 17% NDFD. Because of the differences between GVS and PInc, we recommend using results in contexts appropriate to each procedure. The 95% probability limits for within-lab repeatability and among-lab reproducibility for GVS mean values were 10.2 and 13.4%, respectively. These percentages describe the span of the range around the mean into which 95% of analytical results for a sample fall for values generated within a lab and among labs. This degree of precision was supported in that the average maximum
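
    The within-lab repeatability and among-lab reproducibility limits quoted above follow the usual convention of roughly 2.8 × the respective standard deviation (as in ISO 5725). A minimal sketch with toy numbers (not the study's data), assuming a balanced lab × replicate design:

```python
from statistics import mean, variance

def precision_limits(lab_results):
    """lab_results: {lab: [replicate values]}, balanced design.
    Returns 95% repeatability and reproducibility limits, taken as
    2.8 * SD in the usual ISO 5725 convention."""
    groups = list(lab_results.values())
    n_rep = len(groups[0])
    s2_r = mean(variance(g) for g in groups)          # within-lab variance
    s2_between = variance([mean(g) for g in groups])  # variance of lab means
    s2_L = max(s2_between - s2_r / n_rep, 0.0)        # between-lab component
    s2_R = s2_r + s2_L                                # reproducibility variance
    return 2.8 * s2_r ** 0.5, 2.8 * s2_R ** 0.5

# toy 30-h NDFD values (%), three labs with duplicate runs
toy = {"A": [50.0, 52.0], "B": [54.0, 56.0], "C": [58.0, 60.0]}
r_limit, R_limit = precision_limits(toy)
```
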

  16. Quantification of process induced disorder in milled samples using different analytical techniques

    DEFF Research Database (Denmark)

    Zimper, Ulrike; Aaltonen, Jaakko; McGoverin, Cushla M.

    2012-01-01

    The aim of this study was to compare three different analytical methods to detect and quantify the amount of crystalline disorder/amorphousness in two milled model drugs. X-ray powder diffraction (XRPD), differential scanning calorimetry (DSC) and Raman spectroscopy were used as analytical methods, and indomethacin and simvastatin were chosen as the model compounds. These compounds were partly converted from crystalline to disordered forms by milling. Partial least squares regression (PLS) was used to create calibration models for the XRPD and Raman data, which were subsequently used to quantify the milling-induced crystalline disorder/amorphousness under different process conditions. In the DSC measurements the change in heat capacity at the glass transition was used for quantification. Differently prepared amorphous indomethacin standards (prepared by either melt quench cooling or cryo milling) were compared
  17. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  18. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
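
    As one concrete example of the normalization methods discussed in this record, total-sum (total-intensity) normalization rescales every sample to a common total so that differences in overall sample amount cancel and per-metabolite comparisons reflect composition. A minimal sketch (toy intensities, not real data):

```python
def total_sum_normalize(samples):
    """Rescale each sample's metabolite intensities to a common
    total (the mean total over all samples), removing differences
    in overall sample amount."""
    totals = [sum(s) for s in samples]
    target = sum(totals) / len(totals)
    return [[x * target / t for x in s] for s, t in zip(samples, totals)]

raw = [[10.0, 20.0, 30.0],   # sample 1, total intensity 60
       [30.0, 60.0, 90.0]]   # sample 2: same profile, 3x the amount
norm = total_sum_normalize(raw)
```

    After normalization the two samples are indistinguishable, as they should be: they differ only in total amount, not in composition.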

  19. Practical reporting times for environmental samples

    International Nuclear Information System (INIS)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 °C. The American Society for Testing and Materials (ASTM) uses a more technical definition: the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time, regardless of sample matrices and storage conditions.
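The ASTM-style definition above (the day the concentration falls below the lower 99% confidence bound on the day-zero value) can be sketched for an exponentially decaying analyte; the day-zero concentration, decay rate and measurement SD below are hypothetical values for illustration, not data from the study.

```python
import math

def holding_time(c0, k, sigma, z=2.576):
    """Day when an exponentially decaying analyte C(t) = c0 * exp(-k * t)
    first falls below the lower 99% confidence bound on the day-zero
    concentration, c0 - z * sigma (ASTM-style definition)."""
    threshold = c0 - z * sigma
    if threshold <= 0:
        return math.inf  # measurement noise swamps the decay entirely
    return math.log(c0 / threshold) / k

# Hypothetical: 100 ppb analyte, 2%/day decay, 3 ppb measurement SD
t = holding_time(100.0, 0.02, 3.0)  # about 4 days
```

Note how the answer depends as much on measurement precision (sigma) as on the decay rate itself, which is the motivation for an error-rate-based PRT approach.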

  20. Practical reporting times for environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 °C. The American Society for Testing and Materials (ASTM) uses a more technical definition: the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time, regardless of sample matrices and storage conditions.

  1. An Analytic Solution to the Computation of Power and Sample Size for Genetic Association Studies under a Pleiotropic Mode of Inheritance.

    Science.gov (United States)

    Gordon, Derek; Londono, Douglas; Patel, Payal; Kim, Wonkuk; Finch, Stephen J; Heiman, Gary A

    2016-01-01

    Our motivation here is to calculate the power of 3 statistical tests used when there are genetic traits that operate under a pleiotropic mode of inheritance and when qualitative phenotypes are defined by use of thresholds for the multiple quantitative phenotypes. Specifically, we formulate a multivariate function that provides the probability that an individual has a vector of specific quantitative trait values conditional on having a risk locus genotype, and we apply thresholds to define qualitative phenotypes (affected, unaffected) and compute penetrances and conditional genotype frequencies based on the multivariate function. We extend the analytic power and minimum-sample-size-necessary (MSSN) formulas for 2 categorical data-based tests (genotype, linear trend test [LTT]) of genetic association to the pleiotropic model. We further compare the MSSN of the genotype test and the LTT with that of a multivariate ANOVA (Pillai). We approximate the MSSN for statistics by linear models using a factorial design and ANOVA. With ANOVA decomposition, we determine which factors most significantly change the power/MSSN for all statistics. Finally, we determine which test statistics have the smallest MSSN. In this work, MSSN calculations are for 2 traits (bivariate distributions) only (for illustrative purposes). We note that the calculations may be extended to address any number of traits. Our key findings are that the genotype test usually has lower MSSN requirements than the LTT. More inclusive thresholds (top/bottom 25% vs. top/bottom 10%) have higher sample size requirements. The Pillai test has a much larger MSSN than both the genotype test and the LTT, as a result of sample selection. With these formulas, researchers can specify how many subjects they must collect to localize genes for pleiotropic phenotypes. © 2017 S. Karger AG, Basel.
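The paper derives exact analytic power and MSSN formulas for the genotype test and LTT under the pleiotropic model. As a much simpler illustration of how a minimum sample size scales with effect size under a normal approximation for a 1-df association test, the standard formula n = ((z₁₋α/₂ + z_power) / w)² can be sketched; the effect sizes below are hypothetical, and this is not the paper's pleiotropic formula.

```python
import math
from statistics import NormalDist

def min_sample_size(effect_w, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a 1-df association test
    with Cohen-type effect size w: n = ((z_{1-alpha/2} + z_power) / w)^2."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
    return math.ceil((z / effect_w) ** 2)

# A small effect (w = 0.1) needs roughly 100x the sample of a large one (w = 1.0)
n_small = min_sample_size(0.1)   # 785
n_large = min_sample_size(1.0)   # 8
```

The quadratic dependence on 1/w is why the abstract's finding (more inclusive thresholds dilute the effect and raise sample size requirements) has such a large practical impact.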

  2. Stability of purgeable VOCs in water samples during pre-analytical holding. Part 2: Analyses by an EPA regional laboratory

    Energy Technology Data Exchange (ETDEWEB)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.L. [Oak Ridge National Lab., TN (United States); Bottrell, D.W. [Dept. of Energy, Germantown, MD (United States)

    1997-03-01

    This study was undertaken to examine the hypothesis that prevalent and priority purgeable VOCs in properly preserved water samples are stable for at least 28 days. For the purposes of this study, VOCs were considered functionally stable if concentrations measured after 28 days did not change by more than 10% from the initial values. An extensive stability experiment was performed on freshly collected surface water spiked with a suite of 44 purgeable VOCs. The spiked water was then distributed into multiple 40-mL VOC vials with 0.010-in Teflon-lined silicone septum caps, prefilled with 250 mg of NaHSO₄ (resulting water pH ≈2). The samples were sent to a commercial laboratory [Analytical Resources, Inc. (ARI)] and an EPA (Region IV) laboratory, where they were stored at 4 °C. On days 1, 8, 15, 22, 29, 36, and 71 after sample preparation, analysts from ARI took 4 replicate samples out of storage and analyzed them for purgeable VOCs following EPA/SW846 8260A. A similar analysis schedule was followed by analysts at the EPA laboratory. This document contains the results from the EPA analyses; the ARI results are described in a separate report.

  3. Concepts in sample size determination

    Directory of Open Access Journals (Sweden)

    Umadevi K Rao

    2012-01-01

    Investigators involved in clinical, epidemiological or translational research have the drive to publish their results so that they can extrapolate their findings to the population. This begins with the preliminary steps of deciding the topic to be studied, the subjects and the type of study design. In this context, the researcher must determine how many subjects would be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. Sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test a hypothesis. The accepted risk of type I error, or alpha value, which by convention is set at the 0.05 level in biomedical research, defines the cutoff point at which the p value obtained in the study is judged as significant or not. The power in clinical research is the likelihood of finding a statistically significant result when it exists and is typically set to >80%. This is necessary since even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Conversely, a study with too large a sample size will be difficult to conduct and will waste time and resources. Thus, the goal of sample size planning is to estimate an appropriate number of subjects for a given study design. This article describes the concepts involved in estimating the sample size.
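The quantities discussed above (difference in outcome between two groups, alpha = 0.05, power of 80%) combine into the classic per-group sample size formula for comparing two means; the detectable difference δ and standard deviation σ below are hypothetical illustration values.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for comparing two means:
    n = 2 * sigma^2 * (z_{1-alpha/2} + z_power)^2 / delta^2."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
    return math.ceil(2 * (sigma * z / delta) ** 2)

# Detecting a 5-unit mean difference with SD 10 at alpha = 0.05, 80% power
n = n_per_group(delta=5.0, sigma=10.0)  # 63 per group
```

Halving the detectable difference quadruples the required sample, which is why the choice of a clinically meaningful difference dominates sample size planning.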

  4. Analytical performances of laser-induced micro-plasma of Al samples with single and double ultrashort pulses in air and with Ar-jet: A comparative study

    International Nuclear Information System (INIS)

    Semerok, A.; Dutouquet, C.

    2014-01-01

    Ultrashort-pulse laser microablation coupled with optical emission spectroscopy was studied to obtain several micro-LIBS analytical features (shot-to-shot reproducibility, spectral line intensity and lifetime, calibration curves, detection limits). Laser microablation of Al matrix samples with known Cu and Mg concentrations was performed with single and double pulses of 50 fs and 1 ps duration, in air and with an Ar-jet. The micro-LIBS analytical features obtained under the different experimental conditions were characterized and compared. The highest shot-to-shot reproducibility and gain in plasma spectral line intensity were obtained with double pulses with the Ar-jet for both 50 fs and 1 ps pulse durations. The best calibration curves were obtained with 1 ps pulse duration with the Ar-jet. Micro-LIBS with ultrashort double pulses may find effective application in surface elemental microcartography. - Highlights: • Analytical performances of micro-LIBS with ultrashort double pulses were studied. • The maximal line intensity gain of 20 was obtained with double pulses and Ar-jet. • LIBS gain was obtained without additional ablation of the sample by the second pulse. • LIBS properties were almost the same for both 50 fs and 1 ps pulses. • The micro-LIBS detection limit was around 35 ppm.

  5. Activities at Forschungszentrum Juelich in Safeguards Analytical Techniques and Measurements

    International Nuclear Information System (INIS)

    Duerr, M.; Knott, A.; Middendorp, R.; Niemeyer, I.; Kueppers, S.; Zoriy, M.; Froning, M.; Bosbach, D.

    2015-01-01

    The application of safeguards by the IAEA involves analytical measurements of samples taken during inspections. The development and advancement of analytical techniques, with support from the Member States, contributes to strengthened and more efficient verification of compliance with non-proliferation obligations. Recently, a cooperation agreement was established between Forschungszentrum Juelich and the IAEA in the field of analytical services. The current working areas of Forschungszentrum Juelich are: (i) production of synthetic micro-particles as calibration standards and reference materials for particle analysis, (ii) qualification of Forschungszentrum Juelich as a member of the IAEA network of analytical laboratories for safeguards (NWAL), and (iii) analysis of impurities in nuclear material samples. With respect to the synthesis of particles, a dedicated setup for the production of uranium particles is being developed, which addresses the urgent need for material tailored for use in quality assurance and quality control measures for particle analysis of environmental swipe samples. Furthermore, Forschungszentrum Juelich has been nominated as a candidate laboratory for membership in the NWAL network. To this end, analytical capabilities at Forschungszentrum Juelich have been joined to form an analytical service within a dedicated quality management system. Another activity is the establishment of analytical techniques for impurity analysis of uranium oxide, mainly focusing on inductively coupled plasma mass spectrometry. This contribution will present the activities at Forschungszentrum Juelich in the area of analytical measurements and techniques for nuclear verification. (author)

  6. Sample collection and sample analysis plan in support of the 105-C/190-C concrete and soil sampling activities

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-07-01

    This sampling and analysis plan describes the sample collection and sample analysis in support of the 105-C water tunnels and 190-C main pumphouse concrete and soil sampling activities. These analytical data will be used to identify the radiological contamination and presence of hazardous materials to support the decontamination and disposal activities.

  7. Performance evaluation soil samples utilizing encapsulation technology

    Science.gov (United States)

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples and method of their preparation using encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  8. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Directory of Open Access Journals (Sweden)

    Casey Olives

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date, the statistical underpinnings of Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic, using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
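As a sketch of the operating-characteristic idea underlying (MC-)LQAS, the binary case can be computed directly from the binomial distribution; the decision threshold d below is hypothetical, while n = 15 echoes the design size discussed above.

```python
from math import comb

def prob_classify_high(n, d, p):
    """Operating characteristic of a binary LQAS rule 'classify as high
    prevalence if X >= d positives out of n sampled': P(X >= d | p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

# With n = 15 children and a hypothetical threshold d = 2, a 10%-prevalence
# school is flagged "high" fairly often, while a 50% school is almost surely caught.
p_low = prob_classify_high(15, 2, 0.10)
p_high = prob_classify_high(15, 2, 0.50)
```

Plotting this probability across the prevalence range gives the operating characteristic curve; MC-LQAS generalizes the idea to two thresholds and three classification regions.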

  9. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach iterates until the difference between successive solutions satisfies pre-determined termination criteria. Its effectiveness is illustrated by an example, which shows near-optimal results with much shorter solving time than the conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also makes it possible to model and solve more realistic problems that incorporate dynamism and uncertainty.

  10. ANALYTICAL RESULTS OF MOX COLEMANITE CONCRETE SAMPLE PBC-44.2

    Energy Technology Data Exchange (ETDEWEB)

    Best, D.; Cozzi, A.; Reigel, M.

    2012-12-20

    The Mixed Oxide Fuel Fabrication Facility (MFFF) will use colemanite-bearing concrete neutron absorber panels credited with attenuating neutron flux in the criticality design analyses and shielding operators from radiation. The Savannah River National Laboratory is tasked with measuring the total density, partial hydrogen density, and partial boron density of the colemanite concrete. Sample PBC-44.2 was received on 9/20/2012 and analyzed. The average total density measured by ASTM method C 642 was 2.03 g/cm³, which met the lower bound of 1.88 g/cm³. The average partial hydrogen density was 6.64E-02 g/cm³ as measured using method ASTM E 1311 and met the lower bound of 6.04E-02 g/cm³. The average partial boron density, measured by the ASTM C 1301 method, was 1.70E-01 g/cm³, which met the lower bound of 1.65E-01 g/cm³.

  11. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol for sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical error is the pre-analytical phase. For the technique to be translated into the clinic, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some aspects of the pre-analytical phase in the development of high-throughput Fourier Transform Infrared (FTIR) spectroscopy of some of the most common biofluids, such as serum, plasma and bile. Pre-analytical considerations that can impact the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or the spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operator dependence…), and consequently the quality and reproducibility of spectral data, will be discussed in this report.

  12. Use of analytical aids for accident management

    International Nuclear Information System (INIS)

    Ward, L.W.

    1991-01-01

    The use of analytical aids by utility technical support teams can enhance the staff's ability to manage accidents. Since instrumentation is exposed to environments beyond design-basis conditions, instruments may provide ambiguous information or may even fail. While it is most likely that many instruments will remain operable, their ability to provide unambiguous information needed for the management of beyond-design-basis events and severe accidents is questionable. Furthermore, given these limitations in instrumentation, the need to ascertain and confirm current plant status and forecast future behavior to effectively manage accidents at nuclear facilities requires a computational capability to simulate the thermal and hydraulic behavior in the primary, secondary, and containment systems. With the need to extend the current preventive approach in accident management to include mitigative actions, analytical aids could be used to further enhance the current capabilities at nuclear facilities. The need for computational or analytical aids is supported by a review of the candidate accident management strategies discussed in NUREG/CR-5474. Based on that review, two major analytical aids are considered necessary to support the implementation and monitoring of many of the strategies in the document: (1) an analytical aid to predict reactor coolant and secondary system behavior under LOCA conditions, and (2) an analytical aid to predict containment pressure and temperature response with a steam, air, and noncondensable gas mixture present.

  13. Application of Analytical Quality by Design concept for bilastine and its degradation impurities determination by hydrophilic interaction liquid chromatographic method.

    Science.gov (United States)

    Terzić, Jelena; Popović, Igor; Stajić, Ana; Tumpa, Anja; Jančić-Stojanović, Biljana

    2016-06-05

    This paper deals with the development of a hydrophilic interaction liquid chromatographic (HILIC) method for the analysis of bilastine and its degradation impurities following the Analytical Quality by Design approach. This is the first method proposed for bilastine and its impurities. The main objective was to identify conditions where an adequate separation in minimal analysis time could be achieved within a robust region. The critical process parameters with the most influence on method performance were defined as the acetonitrile content in the mobile phase, the pH of the aqueous phase and the ammonium acetate concentration in the aqueous phase. A Box-Behnken design was applied to establish a relationship between critical process parameters and critical quality attributes. The defined mathematical models and Monte Carlo simulations were used to identify the design space. A fractional factorial design was applied for experimental robustness testing, and the method was validated to verify the adequacy of the selected optimal conditions: Luna® HILIC analytical column (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-aqueous phase (50 mM ammonium acetate, pH adjusted to 5.3 with glacial acetic acid) (90.5:9.5, v/v); column temperature 30 °C; mobile phase flow rate 1 mL min⁻¹; detection wavelength 275 nm. Copyright © 2016 Elsevier B.V. All rights reserved.
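The Box-Behnken design mentioned above has a simple structure in coded units; the following minimal sketch generates it for three critical process parameters (here mapped, purely for illustration, to acetonitrile content, aqueous-phase pH and buffer concentration).

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Box-Behnken design in coded units (-1, 0, +1): every pair of
    factors takes the four (+/-1, +/-1) combinations while all other
    factors sit at the center level, then center points are appended."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([0] * n_factors for _ in range(n_center))
    return runs

# Three factors (e.g. %ACN, aqueous-phase pH, buffer concentration) -> 15 runs
design = box_behnken(3)
```

With three factors the design needs only 15 runs (12 edge midpoints plus 3 center replicates), versus 27 for a full three-level factorial, while still supporting a quadratic response-surface model.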

  14. An analytical procedure for computing smooth transitions between two specified cross sections with applications to blended wing body configuration

    Science.gov (United States)

    Barger, R. L.

    1982-01-01

    An analytical procedure is described for designing smooth transition surfaces for blended wing-body configurations. Starting from two specified cross-section shapes, the procedure generates a gradual transition from one cross-section shape to the other as an analytic blend of the two shapes. The method utilizes a conformal mapping, with subsequent translation and scaling, to transform the specified end shapes into curves that can be combined more smoothly. A sample calculation is applied to a blended wing-body missile-type configuration with a top-mounted inlet.
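The paper's procedure maps each cross section conformally before combining them; the sketch below illustrates only the final blending step, a pointwise linear interpolation between two already-normalized cross sections (the toy coordinates are hypothetical, and the conformal mapping step is omitted).

```python
def blend_sections(shape_a, shape_b, t):
    """Pointwise linear blend of two cross sections sampled at matching
    parameter values: p_i(t) = (1 - t) * a_i + t * b_i, with t in [0, 1]."""
    return [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(shape_a, shape_b)]

# Halfway (t = 0.5) between a circular-ish and a flattened cross section
a = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
b = [(2.0, 0.0), (0.0, 0.5), (-2.0, 0.0), (0.0, -0.5)]
mid = blend_sections(a, b, 0.5)
```

Sweeping t from 0 to 1 along the body axis produces the gradual transition surface; the conformal mapping in the paper ensures corresponding points on the two curves are matched sensibly before this interpolation.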

  15. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems in the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  16. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol' yaninova, V.G.; Sul' dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982, four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of an interlaboratory experiment (ILE). More than 100 institutions were involved in the ILE, and the total number of independent analytical results was of the order of 10⁴. With such a volume of analytical information at their disposal, the authors were able to identify some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and to evaluate the reliability characteristics of the analytical methods used.

  17. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining importance. It focuses on the total costs of a process from investment through operation to final retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and, finally, retirement of the method. Regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Toxicologic evaluation of analytes from Tank 241-C-103

    International Nuclear Information System (INIS)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company (WHC) requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  19. Quality by design (QbD), Process Analytical Technology (PAT), and design of experiment applied to the development of multifunctional sunscreens.

    Science.gov (United States)

    Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim

    2017-02-01

    Multifunctional formulations are of great importance to ensure better skin protection from the harm caused by ultraviolet (UV) radiation. Despite the advantages of the Quality by Design (QbD) and Process Analytical Technology (PAT) approaches for the development and optimization of new products, we found only a few studies in the literature concerning their application in the cosmetic product industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of the sunscreens using chemometric analyses. Linear discriminant analysis allowed classification of unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of the ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to evaluate the compounds in isolation and in combination and to confirm the sunscreen and antioxidant actions of ferulic acid, whose presence increased the in vitro antioxidant activity by 90%.

  20. Anionic microemulsion to solvent stacking for on-line sample concentration of cationic analytes in capillary electrophoresis.

    Science.gov (United States)

    Kukusamude, Chunyapuk; Srijaranai, Supalax; Quirino, Joselito P

    2014-05-01

    The common SDS microemulsion (i.e. 3.3% SDS, 0.8% octane, and 6.6% butanol) and organic solvents were investigated for the stacking of cationic drugs in capillary zone electrophoresis using a low-pH separation electrolyte. The sample was prepared in the acidic microemulsion, and a high percentage of organic solvent was included in the electrolyte at the anodic end of the capillary. The stacking mechanism was similar to micelle-to-solvent stacking, with the micelles replaced by the microemulsion for the transport of analytes to the organic-solvent-rich boundary. This boundary forms between the microemulsion and the anodic electrolyte. The effective electrophoretic mobility of the cations reversed from the direction of the anode in the microemulsion to the cathode in the boundary. Microemulsion-to-solvent stacking was successfully achieved with 40% ACN in the anodic electrolyte and hydrodynamic sample injection for 21 s at 1000 mbar (equivalent to 30% of the effective length). The sensitivity enhancement factors in terms of peak height and corrected peak area were 15 to 35 and 21 to 47, respectively. The linearity (R² in terms of corrected peak area) was >0.999. Interday precisions (%RSD, n = 6) were 3.3-4.0% for corrected peak area and 2.0-3.0% for migration time. Application to a spiked real sample is also presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. SPIDIA-RNA: second external quality assessment for the pre-analytical phase of blood samples used for RNA based analyses.

    Directory of Open Access Journals (Sweden)

    Francesca Malentacchi

    Full Text Available One purpose of the EC funded project, SPIDIA, is to develop evidence-based quality guidelines for the pre-analytical handling of blood samples for RNA molecular testing. To this end, two pan-European External Quality Assessments (EQAs) were implemented. Here we report the results of the second SPIDIA-RNA EQA. This second study included modifications in the protocol related to the blood collection process, the shipping conditions and pre-analytical specimen handling for participants. Participating laboratories received two identical proficiency blood specimens collected in tubes with or without an RNA stabilizer. For pre-defined specimen storage times and temperatures, laboratories were asked to perform RNA extraction from whole blood according to their usual procedure and to return extracted RNA to the SPIDIA facility for further analysis. These RNA samples were evaluated for purity, yield, integrity, stability, presence of interfering substances, and gene expression levels for the validated markers of RNA stability: FOS, IL1B, IL8, GAPDH, FOSB and TNFRSF10c. Analysis of the gene expression results for FOS, IL8, FOSB, and TNFRSF10c, however, indicated that the levels of these transcripts were significantly affected by blood collection tube type and storage temperature. These results demonstrated that only blood collection tubes containing a cellular RNA stabilizer allowed reliable gene expression analysis within 48 h of blood collection for all the genes investigated. The results of these two EQAs have been proposed for use in the development of a Technical Specification by the European Committee for Standardization.

  2. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    Science.gov (United States)

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  4. Sampling, storage and sample preparation procedures for X ray fluorescence analysis of environmental materials

    International Nuclear Information System (INIS)

    1997-06-01

    The X ray fluorescence (XRF) method is one of the most commonly used nuclear analytical techniques because of its multielement and non-destructive character, speed, economy and ease of operation. From the point of view of quality assurance practices, sampling and sample preparation are the most crucial steps in all analytical techniques (including X ray fluorescence) applied to the analysis of heterogeneous materials. This technical document covers recent modes of the X ray fluorescence method and recent developments in sample preparation techniques for the analysis of environmental materials. Refs, figs, tabs

  5. Ionic liquid-based dispersive microextraction of nitro toluenes in water samples

    International Nuclear Information System (INIS)

    Berton, Paula; Regmi, Bishnu P.; Spivak, David A.; Warner, Isiah M.

    2014-01-01

    We describe a method for dispersive liquid-liquid microextraction of nitrotoluene-based compounds. This method is based on use of the room temperature ionic liquid (RTIL) 1-hexyl-4-methylpyridinium bis(trifluoromethylsulfonyl)imide as the accepting phase, and is shown to work well for extraction of 4-nitrotoluene, 2,4-dinitrotoluene, and 2,4,6-trinitrotoluene. Separation and subsequent detection of analytes were accomplished via HPLC with UV detection. Several parameters that influence the efficiency of the extraction were optimized using experimental design. In this regard, a Plackett–Burman design was used for initial screening, followed by a central composite design to further optimize the influencing variables. For a 5-mL water sample, the optimized IL-DLLME procedure requires 26 mg of the RTIL as extraction solvent and 680 μL of methanol as the dispersant. Under optimum conditions, limits of detection (LODs) are lower than 1.05 μg L⁻¹. Relative standard deviations for 6 replicate determinations at a 4 μg L⁻¹ analyte level are <4.3% (calculated using peak areas). Correlation coefficients of >0.998 were achieved. This method was successfully applied to extraction and determination of nitrotoluene-based compounds in spiked tap and lake water samples. (author)

  6. Carbon Impact Analytics - Designing low carbon indices based on Carbon Impact Analytics indicators

    International Nuclear Information System (INIS)

    2016-01-01

    Investors are increasingly exposed to carbon risks and now face the challenge of managing these risks and developing climate-resilient investment strategies. Carbon Impact Analytics (CIA), an innovative methodology for analyzing the full carbon impact of a portfolio or index, equips investors and asset managers with the tools necessary to reduce their climate-related risks and to seize the opportunities offered by the ongoing energy transition. Investors, asset managers and other financial institutions may use CIA results to: - measure and manage risks, - optimize their contribution to the energy transition, - seize opportunities associated with climate change mitigation, - report on GHG emissions and savings (for regulatory purposes or voluntarily), - engage in dialogue with companies, - reallocate investment portfolios, - and build new low-carbon indices. In this report, Carbone 4 offers a detailed look into how CIA indicators can be used to either 1) reallocate an existing portfolio or index to achieve maximal carbon performance or 2) build new low carbon indices from the ground up, drawn from Carbone 4's ever-growing database of CIA-analyzed firms. Two main levers were used to optimize CIA output: 1. Sectorial reallocation: exclusion of fossil fuel-related sectors or insertion of low carbon pure players; 2. Intra-sectorial reallocation: best-in-class approach within a sector. Sectorial and intra-sectorial methods may be applied in conjunction with one another to maximize results. For example, a best-in-class + fossil fuel-free index may be constructed by first excluding the fossil fuel sector and then applying a CIA best-in-class approach to all remaining sectors. These methods are illustrated via two preliminary examples of indices designed by Carbone 4: the reallocated

  7. Analytical Method for Carbon and Oxygen Isotope of Small Carbonate Samples with the GasBench Ⅱ-IRMS Device

    Directory of Open Access Journals (Sweden)

    LIANG Cui-cui

    2015-01-01

    Full Text Available An analytical method for measuring the carbon and oxygen isotopic compositions of trace amounts of carbonate (>15 μg) was established using a Delta V Advantage isotope ratio MS coupled with a GasBench Ⅱ. Carbonate standard samples (IAEA-CO-1) of different trace amounts (5-50 μg) were measured by GasBench Ⅱ with 12 mL and 3.7 mL vials. When the sample weight was less than 40 μg and the sample was acidified in 12 mL vials, most standard deviations of δ13C and δ18O were more than 0.1‰, which could not satisfy high-precision measurements. When the sample weight was greater than 15 μg and the sample was acidified in 3.7 mL vials, standard deviations for δ13C and δ18O were 0.01‰-0.07‰ and 0.01‰-0.08‰, respectively, which satisfied high-precision measurements. Therefore, small 3.7 mL vials were used to increase the concentration of carbon dioxide in the headspace, and carbonate samples as small as 15 μg can be analyzed routinely by a GasBench Ⅱ continuous-flow IRMS. Meanwhile, the linear relationship between sample weight and peak area was strong (R2>0.9932) and can be used to determine the carbon content of carbonate samples.

  8. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Kawamura, H.; Parr, R.M.; Dang, H.S.; Tian, W.; Barnes, R.M.; Iyengar, G.V.

    2000-01-01

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)

  9. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    International Nuclear Information System (INIS)

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-01-01

effectively remove inorganic chloride from the activated carbon adsorption tubes. With the TOX sample preparation equipment and TOX analyzers at WSCF, the nitrate wash recommended by EPA SW-846 method 9020B was found to be inadequate to remove inorganic chloride interference. Increasing the nitrate wash concentration from 10 grams per liter (g/L) to 100 g/L potassium nitrate and increasing the nitrate wash volume from 3 milliliters (mL) to 10 mL effectively removed the inorganic chloride up to at least 100 ppm chloride in the sample matrix. Excessive purging of the adsorption tubes during sample preparation was eliminated. These changes in sample preparation have been incorporated in the analytical procedure. The results using the revised sample preparation procedure show better agreement of TOX values both for replicate analyses of single samples and for the analysis of replicate samples acquired from the same groundwater well. Furthermore, less apparent column breakthrough now occurs with the revised procedure. One additional modification made to sample preparation was to discontinue the treatment of groundwater samples with sodium bisulfite. Sodium bisulfite is used to remove inorganic chlorine from the sample; inorganic chlorine is not expected to be a constituent in these groundwater samples. Several other factors were also investigated as possible sources of anomalous TOX results: (1) Instrument instability: examination of the history of results for TOX laboratory control samples and initial calibration verification standards indicates good long-term precision for the method and instrument. Determination of a method detection limit of 2.3 ppb in a deionized water matrix indicates the method and instrumentation have good stability and repeatability. (2) Non-linear instrument response: the instrument is shown to have good linear response from zero to 200 parts per billion (ppb) TOX. 
This concentration range encompasses the majority of samples received at WSCF for TOX

  10. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING AND CHARACTERIZATION FACILITY

    International Nuclear Information System (INIS)

    Douglas, J.G.; Meznarich, H.K.; Olsen, J.R.; Ross, G.A.; Stauffer, M.

    2009-01-01

remove inorganic chloride from the activated-carbon adsorption tubes. With the TOX sample preparation equipment and TOX analyzers at WSCF, the nitrate wash recommended by EPA SW-846 method 9020B was found to be inadequate to remove inorganic chloride interference. Increasing the nitrate wash concentration from 10 grams per liter (g/L) to 100 g/L potassium nitrate and increasing the nitrate wash volume from 3 milliliters (mL) to 10 mL effectively removed the inorganic chloride up to at least 100 ppm chloride in the sample matrix. Excessive purging of the adsorption tubes during sample preparation was eliminated. These changes in sample preparation have been incorporated in the analytical procedure. The results using the revised sample preparation procedure show better agreement of TOX values both for replicate analyses of single samples and for the analysis of replicate samples acquired from the same groundwater well. Furthermore, less apparent adsorption tube breakthrough now occurs with the revised procedure. One additional modification made to sample preparation was to discontinue the treatment of groundwater samples with sodium bisulfite. Sodium bisulfite is used to remove inorganic chlorine from the sample; inorganic chlorine is not expected to be a constituent in these groundwater samples. Several other factors were also investigated as possible sources of anomalous TOX results: (1) Instrument instability: examination of the history of results for TOX laboratory control samples and initial calibration verification standards indicates good long-term precision for the method and instrument. Determination of a method detection limit of 2.3 ppb in a deionized water matrix indicates the method and instrumentation have good stability and repeatability. (2) Non-linear instrument response: the instrument is shown to have good linear response from zero to 200 parts per billion (ppb) TOX. 
This concentration range encompasses the majority of samples received at WSCF for TOX

  11. An analytical method for optimal design of MR valve structures

    International Nuclear Information System (INIS)

    Nguyen, Q H; Choi, S B; Lee, Y S; Han, M S

    2009-01-01

    This paper proposes an analytical methodology for the optimal design of a magnetorheological (MR) valve structure. The MR valve structure is constrained in a specific volume and the optimization problem identifies geometric dimensions of the valve structure that maximize the yield stress pressure drop of a MR valve or the yield stress damping force of a MR damper. In this paper, the single-coil and two-coil annular MR valve structures are considered. After describing the schematic configuration and operating principle of a typical MR valve and damper, a quasi-static model is derived based on the Bingham model of a MR fluid. The magnetic circuit of the valve and damper is then analyzed by applying Kirchoff's law and the magnetic flux conservation rule. Based on quasi-static modeling and magnetic circuit analysis, the optimization problem of the MR valve and damper is built. In order to reduce the computation load, the optimization problem is simplified and a procedure to obtain the optimal solution of the simplified optimization problem is presented. The optimal solution of the simplified optimization problem of the MR valve structure constrained in a specific volume is then obtained and compared with the solution of the original optimization problem and the optimal solution obtained from the finite element method
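
    The quasi-static Bingham model referred to above gives a pressure drop with a viscous term and a field-dependent yield-stress term; the sketch below uses the parallel-plate approximation common in the MR valve literature (symbols and numbers are illustrative, not the paper's design values).

```python
import math

def mr_valve_pressure_drop(eta, Q, L, L_active, R, gap, tau_y, c=2.5):
    """Bingham-plastic pressure drop across an annular MR valve.
    eta      -- base viscosity of the MR fluid (Pa*s)
    Q        -- volumetric flow rate (m^3/s)
    L        -- total annular flow-path length (m)
    L_active -- pole length exposed to the magnetic field (m)
    R        -- mean radius of the annulus (m)
    gap      -- annular gap (m)
    tau_y    -- field-induced yield stress of the fluid (Pa)
    c        -- flow-profile coefficient, typically 2.0-3.0"""
    w = 2.0 * math.pi * R                          # mean circumference of the duct
    dp_viscous = 12.0 * eta * Q * L / (w * gap**3) # Newtonian (field-off) term
    dp_yield = c * tau_y * L_active / gap          # controllable (field-on) term
    return dp_viscous + dp_yield

# Field off (tau_y = 0) vs. field on for an illustrative geometry:
dp_off = mr_valve_pressure_drop(0.1, 1e-4, 0.05, 0.02, 0.02, 1e-3, 0.0)
dp_on = mr_valve_pressure_drop(0.1, 1e-4, 0.05, 0.02, 0.02, 1e-3, 4e4)
```

    The controllable (yield-stress) part of this pressure drop is the quantity the optimization maximizes for a valve constrained to a given volume.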

  12. Description of JNC's analytical method and its performance for FBR cores

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2000-01-01

    The description of JNC's analytical method and its performance for FBR cores includes: an outline of JNC's analytical system compared with ERANOS; a standard data base for FBR nuclear design in JNC; the JUPITER critical experiment; details of the analytical method and its effects on JUPITER; performance of the JNC analytical system (effective multiplication factor k_eff, control rod worth, and sodium void reactivity); and the design accuracy of a 600 MWe-class FBR core. JNC developed a consistent analytical system for FBR core evaluation, based on the JENDL library, the f-table method, and three-dimensional diffusion/transport theory, which includes comprehensive sensitivity tools to improve the prediction accuracy of core parameters. The JNC system was verified by analysis of the JUPITER critical experiment and other facilities. Its performance can be judged quite satisfactory for FBR core design work, though there is room for further improvement, such as more detailed treatment of cross-section resonance regions

  13. Demonstrating Success: Web Analytics and Continuous Improvement

    Science.gov (United States)

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  14. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
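
    The UCL95% described here is a one-sided upper confidence limit on the mean; a minimal sketch with hypothetical scrape-sample concentrations (the Student-t quantile for df = 5 is hard-coded, since the standard library provides no t distribution):

```python
import math
from statistics import mean, stdev

def ucl95(results, t_value):
    """One-sided upper 95% confidence limit on the mean concentration:
    UCL95 = mean + t(0.95, n-1) * s / sqrt(n)."""
    n = len(results)
    return mean(results) + t_value * stdev(results) / math.sqrt(n)

# Six hypothetical analyte concentrations (mg/kg) from the scrape samples;
# the one-sided t quantile for 95% confidence and df = 5 is 2.015.
concentrations = [10.2, 11.5, 9.8, 10.9, 11.1, 10.4]
upper_limit = ucl95(concentrations, t_value=2.015)
```

    More samples shrink the s/sqrt(n) term, which is why pooling all six scrape samples tightens the limit relative to any subset.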

  15. Remote sampling system in reprocessing: present and future perspective

    International Nuclear Information System (INIS)

    Garcha, J.S.; Balakrishnan, V.P.; Rao, M.K.

    1990-01-01

    For process and inventory control of reprocessing plant operations, it is essential to analyse samples from the various process vessels to assess plant performance and, if needed, take corrective action in the operating parameters. In view of the very high radioactive inventory in the plant, these plants are operated remotely behind thick shielding. Liquid sampling also has to be carried out by remote techniques, as no direct approach is feasible. A vacuum-assisted air-lift method is employed to obtain samples from remotely located process vessels. A brief description of the present technique, the design criteria, and the various interlocks and manual operations involved in sampling and despatching samples to the analytical laboratory is given in the paper. A design approach for making the sampling system a fully automated remote operation is also attempted in this paper. Utilisation of custom-built robots and a dedicated computer for the various operations and interlocks has been envisaged to ensure a completely remotised system for adoption in future plants. (author). 2 figs., 2 tabs

  16. Analysing task design and students' responses to context-based problems through different analytical frameworks

    Science.gov (United States)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both more conventional as well as context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items in themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable to identify students' strategies, mainly focusing on recall of memorized facts when solving chemistry test items. Almost all test items were also assessing lower order thinking. The combination of frameworks with the chemistry syllabus has been

  17. Defense Waste Processing Facility prototypic analytical laboratory

    International Nuclear Information System (INIS)

    Policke, T.A.; Bryant, M.F.; Spencer, R.B.

    1991-01-01

    The Defense Waste Processing Technology (DWPT) Analytical Laboratory is a relatively new laboratory facility at the Savannah River Site (SRS). It is a non-regulated, non-radioactive laboratory whose mission is to support research and development (R&D) and waste treatment operations by providing analytical and experimental services in a way that is safe, efficient, and produces quality results in a timely manner, so that R&D personnel can provide quality technical data and operations personnel can efficiently operate waste treatment facilities. The modules are sample receiving, chromatography I, chromatography II, wet chemistry and carbon, sample preparation, and spectroscopy

  18. Making Decisions by Analytical Chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    It has been long recognized that results of analytical chemistry are not flawless, owing to the fact that professional laboratories and research laboratories analysing the same type of samples by the same type of instruments are likely to obtain significantly different results. The European... These discrepancies are very unfortunate because erroneous conclusions may arise from an otherwise meticulous and dedicated effort of research staff. This may eventually lead to unreliable conclusions, thus jeopardizing investigations of environmental monitoring, climate changes, food safety, clinical chemistry, forensics and other fields of science where analytical chemistry is the key instrument of decision making. In order to elucidate the potential origin of the statistical variations found among laboratories, a major program was undertaken including several analytical technologies, where the purpose...

  19. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation, a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near-shore waters. The corer was successfully operated at 75 stations in water depths of up to 30 m. Simplicity...

  20. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  1. New analytical methods for quality control of St. John's wort

    International Nuclear Information System (INIS)

    Huck-Pezzei, V.

    2013-01-01

    In the present work, a novel analytical platform is introduced which enables both analysis and quality control of St. John's wort extracts and tissue. The synergistic combination of separation techniques (including thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC)) with mass spectrometry (MS) and vibrational spectroscopy is demonstrated to gain deeper insight into the composition of the ingredients. TLC was successfully employed to identify some unknown ingredients present in samples of Chinese provenience. The novel HPLC method described here allowed clear differentiation between European and Chinese samples on the one hand; on the other hand, the method could successfully be employed for the semi-preparative isolation of the unknown ingredient. Matrix-free laser desorption ionization time-of-flight mass spectrometry (mf-LDI-TOF/MS) using a specially designed titanium oxide layer was employed to identify the structure of the substance. The analytical knowledge generated so far was used to establish an infrared spectroscopic model allowing both quantitative analysis of ingredients and differentiation between European and Chinese provenience. Finally, infrared imaging spectroscopy was conducted to obtain knowledge about the highly resolved spatial distribution of ingredients. The analytical platform established can be used for fast and non-destructive quantitation and quality control to identify adulteration, being of interest according to the Deutsche Arzneimittel Codex (DAC) even for the phytopharmaceutical industry. (author) [de

  2. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    Science.gov (United States)

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growing popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals or molecules in the absence of covalent bonds. Conversely, ILs are ionic fluids comprising only cations and anions, often with immeasurably low vapor pressure, making them green or "designer" solvents. The cooperative effect between CDs and ILs, arising from their fascinating properties, has contributed to further development in analytical chemistry. This comprehensive review gives an overview of some recent studies and provides an analytical trend for the application of CDs in combination with ILs that have beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid phase extraction, magnetic solid phase extraction, cloud point extraction and microextraction, and in separation techniques including gas chromatography, high-performance liquid chromatography and capillary electrophoresis, as well as applications as electrode modifiers in electrochemical sensors, with references to recent applications. This review highlights the nature of the interactions and the synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.

  3. Towards Standardization of Sampling Methodology for Evaluation of ...

    African Journals Online (AJOL)

    This article proposes a procedure that may be adopted for comparable, representative and cost-effective soil sampling, and thereafter explores the policy issues regarding standardization of sampling activities and the analytical process as they relate to soil pollution in Nigeria. Standardized sampling and analytical data for soil ...

  4. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Sep 3, 2017 ... Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  5. Surrogate analyte approach for quantitation of endogenous NAD(+) in human acidified blood samples using liquid chromatography coupled with electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liu, Liling; Cui, Zhiyi; Deng, Yuzhong; Dean, Brian; Hop, Cornelis E C A; Liang, Xiaorong

    2016-02-01

    A high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS) assay for the quantitative determination of NAD(+) in human whole blood using a surrogate analyte approach was developed and validated. Human whole blood was acidified using 0.5N perchloric acid at a ratio of 1:3 (v:v, blood:perchloric acid) during sample collection. 25μL of acidified blood was extracted using a protein precipitation method and the resulting extracts were analyzed using reverse-phase chromatography and positive electrospray ionization mass spectrometry. (13)C5-NAD(+) was used as the surrogate analyte for authentic analyte, NAD(+). The standard curve ranging from 0.250 to 25.0μg/mL in acidified human blood for (13)C5-NAD(+) was fitted to a 1/x(2) weighted linear regression model. The LC-MS/MS response between surrogate analyte and authentic analyte at the same concentration was obtained before and after the batch run. This response factor was not applied when determining the NAD(+) concentration from the (13)C5-NAD(+) standard curve since the percent difference was less than 5%. The precision and accuracy of the LC-MS/MS assay based on the five analytical QC levels were well within the acceptance criteria from both FDA and EMA guidance for bioanalytical method validation. Average extraction recovery of (13)C5-NAD(+) was 94.6% across the curve range. Matrix factor was 0.99 for both high and low QC indicating minimal ion suppression or enhancement. The validated assay was used to measure the baseline level of NAD(+) in 29 male and 21 female human subjects. This assay was also used to study the circadian effect of endogenous level of NAD(+) in 10 human subjects. Copyright © 2015 Elsevier B.V. All rights reserved.
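
    The 1/x(2)-weighted linear calibration used for the standard curve above can be sketched in a few lines of Python. The concentrations and instrument responses below are hypothetical, idealized values chosen to span the reported 0.250-25.0 μg/mL range; they are not data from the validated assay.

```python
def fit_weighted_line(concs, responses):
    """Fit response = slope*conc + intercept by weighted least squares
    with 1/x^2 weights, as commonly used for LC-MS/MS standard curves."""
    w = [1.0 / (x * x) for x in concs]
    sw = sum(w)
    sx = sum(wi * x for wi, x in zip(w, concs))
    sy = sum(wi * y for wi, y in zip(w, responses))
    sxx = sum(wi * x * x for wi, x in zip(w, concs))
    sxy = sum(wi * x * y for wi, x, y in zip(w, concs, responses))
    slope = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    intercept = (sy - slope * sx) / sw
    return slope, intercept

def back_calculate(response, slope, intercept):
    """Invert the calibration line to recover a concentration."""
    return (response - intercept) / slope

# Hypothetical standards spanning the 0.250-25.0 ug/mL curve range,
# with idealized (noise-free) peak-area-ratio responses.
concs = [0.250, 0.500, 2.50, 10.0, 25.0]
responses = [25.0, 50.0, 250.0, 1000.0, 2500.0]
slope, intercept = fit_weighted_line(concs, responses)
```

    With noisy real data, the 1/x(2) weights prevent the high-concentration standards from dominating the fit, which protects accuracy at the low end of the curve.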

  6. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  7. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  8. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  9. Decision-Based Design Integrating Consumer Preferences into Engineering Design

    CERN Document Server

    Chen, Wei; Wassenaar, Henk Jan

    2013-01-01

    Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design. Once the related fundamentals of decision theory, economic analysis, and econometrics modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: • A rigorous framework for integrating the interests of both producers and consumers in engineering design, • Analytical techniques of consumer choice model...

  10. XRF analysis of mineralised samples

    International Nuclear Information System (INIS)

    Ahmedali, T.

    2002-01-01

    Full text: Software now supplied by instrument manufacturers has made it practical and convenient for users to analyse unusual samples routinely. Semiquantitative scanning software can be used for rapid preliminary screening of elements ranging from carbon to uranium, prior to assigning mineralised samples to an appropriate quantitative analysis routine. The general quality and precision of analytical results obtained from modern XRF spectrometers can be significantly enhanced by several means: a. Modifications in preliminary sample preparation can result in less contamination from crushing and grinding equipment, and optimised techniques of actual sample preparation can significantly increase the precision of results. b. Employment of automatic data-recording balances and the use of catch weights during sample preparation reduces technician time as well as weighing errors. c. Consistency of results can be improved significantly by the use of appropriate stable drift monitors with a statistically significant content of the analyte. d. A judicious selection of kV/mA combinations, analysing crystals, primary beam filters, collimators, peak positions, accurate background correction and peak overlap corrections, followed by the use of appropriate matrix correction procedures. e. Preventative maintenance procedures for XRF spectrometers and ancillary equipment, which can also contribute significantly to reducing instrument down times, are described. Examples of various facets of sample processing routines are given from the XRF spectrometer component of a multi-instrument analytical university facility, which provides XRF data to 17 Canadian universities. Copyright (2002) Australian X-ray Analytical Association Inc.

  11. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  12. A preliminary factor analytic investigation into the first-order factor structure of the Fifteen Factor Plus (15FQ+) on a sample of Black South African managers

    Directory of Open Access Journals (Sweden)

    Seretse Moyo

    2011-10-01

    Research purpose: The primary objective of this study was to undertake a factor analytic investigation of the first-order factor structure of the 15FQ+. Motivation for the study: Evidence of the construct validity of the 15FQ+ as a measure of personality is necessary, though not sufficient, to justify its use in personnel selection. Research design, approach and method: The researchers evaluated the fit of the measurement model implied by the structure and scoring key of the 15FQ+ in a quantitative study that used an ex post facto correlation design through structural equation modelling. They conducted a secondary data analysis, selecting a sample of 241 Black South African managers from a large 15FQ+ database. Main findings: The researchers found good measurement model fit. The measurement model parameter estimates, however, were concerning: their magnitude suggests that the items generally do not reflect their intended latent personality dimensions with a great degree of precision and are reasonably noisy measures of the latent variables they represent. Practical/managerial implications: Organisations should use the 15FQ+ carefully on Black South African managers until further local research evidence becomes available. Contribution/value-add: The study is a catalyst for the additional research needed to convincingly establish the psychometric credentials of the 15FQ+ as a valuable assessment tool in South Africa.

  13. Laser ablation for analytical sampling: what can we learn from modeling?

    International Nuclear Information System (INIS)

    Bogaerts, Annemie; Chen Zhaoyang; Gijbels, Renaat; Vertes, Akos

    2003-01-01

    The paper consists of two parts. First, a rather comprehensive introduction is given, with a brief overview of the different application fields of laser ablation, focusing mainly on the analytical applications, and an overview of the different modeling approaches available for laser ablation. Further, a discussion is presented about the expansion of the laser-evaporated plume in vacuum or in a background gas, as well as about the different mechanisms of particle formation in the laser ablation process, which is most relevant for laser ablation as a solid sampling technique for inductively coupled plasma (ICP) spectrometry. In the second part, a model is presented that describes the interaction of an ns-pulsed laser with a Cu target, as well as the resulting plume expansion and plasma formation. The results presented here include the temperature distribution in the target, the melting and evaporation of the target, the vapor density, velocity and temperature distribution in the evaporated plume, the ionization degree and the density profiles of Cu⁰ atoms, Cu⁺ and Cu²⁺ ions and electrons in the plume (plasma), as well as the resulting plasma shielding of the incoming laser beam. Results are presented as a function of time during and after the laser pulse, and as a function of position in the target or in the plume. The influence of the target reflection coefficient on the above calculation results is investigated. Finally, the effect of the laser pulse fluence on the target heating, melting and vaporization, and on the plume characteristics and plasma formation is studied. Our modeling results are in reasonable agreement with calculated and measured data from the literature.

  14. INEL Sample Management Office

    International Nuclear Information System (INIS)

    Watkins, C.

    1994-01-01

    The Idaho National Engineering Laboratory (INEL) Sample Management Office (SMO) was formed as part of the EG&G Idaho Environmental Restoration Program (ERP) in June 1990. Since then, the SMO has been recognized and sought out by other prime contractors and programs at the INEL. Since December 1991, the DOE-ID Division Directors for the Environmental Restoration Division and Waste Management Division have supported the expansion of the INEL ERP SMO into the INEL site-wide SMO. The INEL SMO serves as a point of contact for multiple environmental analytical chemistry and laboratory issues (e.g., capacity, capability). The SMO chemists work with project managers during planning to help develop data quality objectives, select appropriate analytical methods, identify special analytical services needs, identify a source for the services, and ensure that requirements for sampling and analysis (e.g., preservations, sample volumes) are clear and technically accurate. The SMO chemists also prepare work scope statements for the laboratories performing the analyses.

  15. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-01-01

    This document provides a management tool for evaluating and designing the appropriate elements of a field sampling program. It discusses the elements of such a program and is to be used as a guidance document during the preparation of project- and/or function-specific documentation. This document does not specify how a sampling program shall be organized. The HSQMP is to be used as a companion document to the Hanford Analytical Services Quality Assurance Plan (HASQAP), DOE/RL-94-55. The generation of this document was enhanced by conducting baseline evaluations of current sampling organizations, with valuable input received from members of field and Quality Assurance organizations. The HSQMP is expected to be a living document: revisions will be made as regulations and/or Hanford Site conditions warrant changes in best management practices. Appendices include: a summary of the sampling and analysis work flow process, a user's guide to the Data Quality Objective process, and a self-assessment checklist.

  16. Simulation of sampling effects in FPAs

    Science.gov (United States)

    Cook, Thomas H.; Hall, Charles S.; Smith, Frederick G.; Rogne, Timothy J.

    1991-09-01

    The use of multiplexers and large focal plane arrays in advanced thermal imaging systems has drawn renewed attention to sampling and aliasing issues in imaging applications. As evidenced by discussions in a recent workshop, there is no clear consensus among experts on whether aliasing in sensor designs can be readily tolerated or must be avoided at all cost. Further, there is no straightforward analytical method that can answer the question, particularly when considering image interpreters as different as humans and autonomous target recognizers (ATRs). However, the means exist for investigating sampling and aliasing issues through computer simulation. The U.S. Army Tank-Automotive Command (TACOM) Thermal Image Model (TTIM) provides realistic sensor imagery that can be evaluated by both human observers and ATRs. This paper briefly describes the history and current status of TTIM, explains the simulation of FPA sampling effects, presents validation results for the FPA sensor model, and demonstrates the utility of TTIM for investigating sampling effects in imagery.
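
    The aliasing at issue here can be illustrated with a generic signal-processing sketch (not part of TTIM): a tone above the Nyquist frequency is indistinguishable, at the sample instants, from a lower-frequency tone folded back into the baseband.

```python
import math

def alias_frequency(f, fs):
    """Frequency in [0, fs/2] to which a tone at f folds when sampled at fs."""
    f = f % fs
    return fs - f if f > fs / 2 else f

# A 7 Hz sine sampled at 10 Hz produces exactly the same sample values
# as a sign-flipped 3 Hz sine: the 7 Hz tone aliases to 3 Hz.
fs = 10.0
n = range(32)
tone_7hz = [math.sin(2 * math.pi * 7.0 * k / fs) for k in n]
tone_3hz = [-math.sin(2 * math.pi * 3.0 * k / fs) for k in n]
max_diff = max(abs(a - b) for a, b in zip(tone_7hz, tone_3hz))
```

    This folding is why scene content above half the sampling rate of an FPA masquerades as spurious low-frequency structure, which can mislead human observers and ATRs in different ways.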

  17. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  18. GA-4/GA-9 honeycomb impact limiter tests and analytical model

    International Nuclear Information System (INIS)

    Koploy, M.A.; Taylor, C.S.

    1991-01-01

    General Atomics (GA) has a test program underway to obtain data on the behavior of a honeycomb impact limiter. The program includes testing of small samples to obtain basic information, as well as testing of complete 1/4-scale impact limiters to obtain load-versus-deflection curves for different crush orientations. GA has used the test results to aid in the development of an analytical model to predict the impact limiter loads. The results also helped optimize the design of the impact limiters for the GA-4 and GA-9 casks.

  19. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Laboratory Management Division of the DOE. Methods are prepared for entry into DOE Methods as chapter editors, together with DOE and other participants in this program, identify analytical and sampling method needs. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations.

  20. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

    Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods which thus generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and worthwhile reduction of sequencing costs.
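
    The inverse probability weighting step described above can be sketched generically. This is a minimal Horvitz-Thompson/Hajek-style illustration with made-up values and inclusion probabilities, not the workshop analysis itself: each sampled individual is weighted by the reciprocal of its selection probability so that estimates generalize to the source population.

```python
def ht_total(values, probs):
    """Horvitz-Thompson estimator of a population total from a sample
    drawn with known, possibly unequal, inclusion probabilities."""
    return sum(y / p for y, p in zip(values, probs))

def ipw_mean(values, probs):
    """Inverse-probability-weighted (Hajek) estimator of the population mean."""
    return ht_total(values, probs) / sum(1.0 / p for p in probs)

# Hypothetical sample: the extreme phenotypes (y = 2 and y = 9) were
# oversampled (inclusion probability 0.8), while the mid-range observation
# was sampled with probability 0.2 and so carries a larger design weight.
values = [2.0, 5.0, 9.0]
probs = [0.8, 0.2, 0.8]
```

    Here the unweighted sample mean is pulled toward the oversampled extremes, while the IPW mean restores the mid-range observation's larger design weight (1/0.2 = 5) and so shifts the estimate back toward the population value.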