WorldWideScience

Sample records for source testing analytical

  1. Electrospray ion source with reduced analyte electrochemistry

    Science.gov (United States)

Kertesz, Vilmos [Knoxville, TN]; Van Berkel, Gary [Clinton, TN]

    2011-08-23

An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and the counter electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  2. Light Source Estimation with Analytical Path-tracing

    OpenAIRE

    Kasper, Mike; Keivan, Nima; Sibley, Gabe; Heckman, Christoffer

    2017-01-01

We present a novel algorithm for light source estimation in scenes reconstructed with an RGB-D camera, based on an analytically-derived formulation of path-tracing. Our algorithm traces the reconstructed scene with a custom path-tracer and computes the analytical derivatives of the light transport equation from principles in optics. These derivatives are then used to perform gradient descent, minimizing the photometric error between one or more captured reference images and renders of our curre...

  3. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82

  4. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
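
Quantitative PT results of the kind analysed above are conventionally scored with a z-score. The sketch below uses the common ISO 13528 convention with illustrative numbers; the scoring rule and the example values are assumptions, not taken from the study:

```python
def z_score(result, assigned_value, sigma_pt):
    """z-score for a quantitative PT result (ISO 13528 convention)."""
    return (result - assigned_value) / sigma_pt

def classify(z):
    """Common interpretation: |z| <= 2 satisfactory, 2 < |z| < 3 questionable,
    |z| >= 3 unsatisfactory."""
    az = abs(z)
    if az <= 2:
        return "satisfactory"
    if az < 3:
        return "questionable"
    return "unsatisfactory"

# Illustrative example: enumeration result of 4.9 log10 CFU/g against an
# assigned value of 4.5, with sigma_pt = 0.25 log10 units
z = z_score(4.9, 4.5, 0.25)
print(round(z, 2), classify(z))
```

A laboratory's trend over successive rounds (the feedback loop mentioned above) would then be a simple series of such scores.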

  5. Pentaho Business Analytics: a Business Intelligence Open Source Alternative

    Directory of Open Access Journals (Sweden)

    Diana TÂRNĂVEANU

    2012-10-01

Most organizations strive to obtain fast, interactive and insightful analytics in order to underpin the most effective and profitable decisions. They need to incorporate huge amounts of data in order to run analyses based on queries and reports with collaborative capabilities. The large variety of Business Intelligence solutions on the market makes it very difficult for organizations to select one and evaluate the impact of the selected solution on the organization. A strategy is therefore needed to help organizations choose the best solution for investment. In the past, the Business Intelligence (BI) market was dominated by closed source and commercial tools, but in recent years open source solutions have matured considerably. An Open Source Business Intelligence solution can be an option in the face of time-sensitive, sprawling requirements and tightening budgets. This paper presents a practical solution implemented in a suite of Open Source Business Intelligence products called Pentaho Business Analytics, which provides data integration, OLAP services, reporting, dashboarding, data mining and ETL capabilities. The study conducted in this paper suggests that the open source phenomenon could become a valid alternative to commercial platforms within the BI context.

  6. Shielding Characteristics Using an Ultrasonic Configurable Fan Artificial Noise Source to Generate Modes - Experimental Measurements and Analytical Predictions

    Science.gov (United States)

    Sutliff, Daniel L.; Walker, Bruce E.

    2014-01-01

    An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.
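
The Fresnel knife-edge diffraction model mentioned above has a standard closed-form loss approximation. The sketch below uses the single-edge ITU-R P.526 fit with illustrative geometry; it is not the authors' simulation code, and the example frequency and distances are assumptions:

```python
import math

def fresnel_parameter(h, d1, d2, wavelength):
    """Dimensionless diffraction parameter v for a knife edge of height h
    above the source-receiver line, at distances d1 (source) and d2 (receiver)."""
    return h * math.sqrt(2.0 * (d1 + d2) / (wavelength * d1 * d2))

def knife_edge_loss_db(v):
    """Approximate single knife-edge diffraction loss in dB
    (ITU-R P.526 fit, valid for v > -0.78; no loss below that)."""
    if v <= -0.78:
        return 0.0
    return 6.9 + 20.0 * math.log10(math.sqrt((v - 0.1) ** 2 + 1.0) + v - 0.1)

# Illustrative case: a ~2 kHz tone (wavelength ~0.17 m in air), shielding edge
# 0.5 m above the line of sight, 3 m from the source and 10 m from the receiver
v = fresnel_parameter(0.5, 3.0, 10.0, 0.17)
print(round(knife_edge_loss_db(v), 1))
```

Summing such attenuated contributions over a dense grid of point sources gives shielded sound pressure maps of the kind described in the abstract.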

  7. Test set of gaseous analytes at Hanford tank farms

    International Nuclear Information System (INIS)

    1997-01-01

DOE has stored toxic and radioactive waste materials in large underground tanks. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed to monitor the open air space above these tanks. In developing this infrared spectral monitor as a safety alert instrument, it is important to know which hazardous gases, called the Analytes of Concern, are most likely to be found in dangerous concentrations. The monitor must also consider other gases which could interfere with measurements of the Analytes of Concern. The total list of gases, called the Test Set Analytes, forms the basis for testing the pollution monitor. Prior measurements in 54 tank headspaces have detected 102 toxic air pollutants (TAPs) and over 1000 other analytes. The hazardous analytes are ranked herein by a Hazardous Atmosphere Rating, which combines their measured concentration, their density relative to air, and the concentration at which they become dangerous. The top 20 toxic air pollutants, as ranked by the Hazardous Atmosphere Rating, and the top 20 other analytes, in terms of measured concentrations, are analyzed for possible inclusion in the Test Set Analytes. Of these 40 gases, 20 are selected. To these 20 gases are added the 6 omnipresent atmospheric gases with the highest concentrations, since their spectra could interfere with measurements of the other spectra. The 26 Test Set Analytes are divided into a Primary Set and a Secondary Set. The Primary Set, gases which must be detectable by the monitor, includes the 6 atmospheric gases and the 6 hazardous gases which have been measured at dangerous concentrations. The Secondary Set gases need not be monitored at this time. The infrared spectra indicate that the pollution monitor will detect all 26 Test Set Analytes by thermal emission and will detect 15 Test Set Analytes by laser absorption.
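
The Hazardous Atmosphere Rating described above combines measured concentration, relative density, and the dangerous-concentration threshold. The report's actual weighting is not reproduced here; the sketch below is one plausible form of such a ranking, with hypothetical gas data:

```python
def hazard_rating(measured_ppm, danger_ppm, density_rel_air):
    """Illustrative hazard score: ratio of measured to dangerous concentration,
    weighted up for gases denser than air (which pool near ground level).
    The weighting actually used in the Hanford report is not reproduced here."""
    weight = max(density_rel_air, 1.0)  # heavier-than-air gases accumulate
    return (measured_ppm / danger_ppm) * weight

# Hypothetical (measured ppm, dangerous ppm, density relative to air) values
gases = {
    "gas A": hazard_rating(150.0, 300.0, 0.59),
    "gas B": hazard_rating(50.0, 1400.0, 2.55),
    "gas C": hazard_rating(400.0, 1000.0, 1.53),
}
for name, score in sorted(gases.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```

Ranking by such a score and truncating to the top entries mirrors the top-20 selection step described in the abstract.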

  8. An Analytical Method of Auxiliary Sources Solution for Plane Wave Scattering by Impedance Cylinders

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2004-01-01

    Analytical Method of Auxiliary Sources solutions for plane wave scattering by circular impedance cylinders are derived by transformation of the exact eigenfunction series solutions employing the Hankel function wave transformation. The analytical Method of Auxiliary Sources solution thus obtained...

  9. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile compounds. Analysis of the dispersed oil is crucial. This paper describes Environment Canada's ongoing studies on various traits of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just peaks. This would result in a decrease in maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. This study also examined a second feature of the Swirling Flask Test: the side spout, which was compared with a new vessel having a septum port instead of a side spout. This decreased the variability as well as the energy and mixing in the vessel. Rather than being a variation of the Swirling Flask Test, it was suggested that the spoutless vessel might be considered as a completely separate test. 7 refs., 2 tabs., 4 figs.

  10. Improvement of spatial discretization error on the semi-analytic nodal method using the scattered source subtraction method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro

    2006-01-01

In this paper, the scattered source subtraction (SSS) method is newly proposed to improve the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. In the SSS method, the scattered source is subtracted from both sides of the diffusion or the transport equation to make the spatial variation of the source term small. The same neutron balance equation is still used in the SSS method. Since the SSS method just modifies the coefficients of the node coupling equations (those used to evaluate the response of partial currents), its implementation is easy. Validity of the present method is verified through test calculations carried out in PWR multi-assembly configurations. The calculation results show that the SSS method can significantly improve the spatial discretization error. Since the SSS method does not have any negative impact on execution time, convergence behavior or memory requirements, it will be useful for reducing the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. (author)
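
The manipulation described above can be written out explicitly for the one-group diffusion equation; the notation below is assumed for illustration, not taken from the paper:

```latex
% One-group diffusion equation with total cross section \Sigma_t and
% within-group scattering source \Sigma_s \phi:
-D\nabla^2\phi + \Sigma_t\,\phi = \Sigma_s\,\phi + S
% Subtracting the scattered source \Sigma_s\phi from both sides:
\quad\Longrightarrow\quad
-D\nabla^2\phi + \left(\Sigma_t - \Sigma_s\right)\phi = S
```

The right-hand side of the second form varies less over a node, so the flat-source approximation incurs less error while the neutron balance is unchanged.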

  11. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    International Nuclear Information System (INIS)

    1975-08-01

Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study "close-in" wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed "close-in" data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.

  12. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required, such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been highly diverse in nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. To outline the variety of challenges that doping control laboratories face besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are presented.

  13. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
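
Statistical disproportionality scores of the kind the prototype generates are conventionally built from a 2x2 drug/event contingency table. The sketch below implements the proportional reporting ratio (PRR), one common choice; the tool's actual metric and the counts shown are assumptions:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 drug/event table:
    a = citations with drug and event, b = drug without event,
    c = other drugs with event,      d = other drugs without event."""
    rate_drug = a / (a + b)      # event rate among reports for this drug
    rate_other = c / (c + d)     # event rate among all other reports
    return rate_drug / rate_other

# Hypothetical MeSH co-occurrence counts for one drug-adverse event pair
score = prr(a=40, b=960, c=200, d=98800)
print(round(score, 1))
```

A PRR well above 1 for a pair flags a candidate safety signal for reviewer triage, which is the role the visual analytics layer described above plays.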

  14. SPANDOM - source projection analytic nodal discrete ordinates method

    International Nuclear Information System (INIS)

    Kim, Tae Hyeong; Cho, Nam Zin

    1994-01-01

    We describe a new discrete ordinates nodal method for the two-dimensional transport equation. We solve the discrete ordinates equation analytically after the source term is projected and represented in polynomials. The method is applied to two fast reactor benchmark problems and compared with the TWOHEX code. The results indicate that the present method accurately predicts not only multiplication factor but also flux distribution

  15. Emergency analytical testing: things to consider

    CSIR Research Space (South Africa)

    Pretorius, Cecilia J

    2017-07-01

Circumstances may dictate that samples from mining operations are analysed for unknown compounds that are potentially harmful to humans. These circumstances may be out of the ordinary, unique or isolated incidents. Emergency analytical testing may...

  16. Analytical study on model tests of soil-structure interaction

    International Nuclear Information System (INIS)

    Odajima, M.; Suzuki, S.; Akino, K.

    1987-01-01

Since nuclear power plant (NPP) structures are stiff, heavy and partly embedded, the behavior of those structures during an earthquake depends on the vibrational characteristics of not only the structure but also the soil. Accordingly, seismic response analyses considering the effects of soil-structure interaction (SSI) are extremely important for seismic design of NPP structures. Many studies have been conducted on analytical techniques concerning SSI, and various analytical models and approaches have been proposed. Based on these studies, SSI analytical codes (computer programs) for NPP structures have been improved at JINS (Japan Institute of Nuclear Safety), one of the departments of NUPEC (Nuclear Power Engineering Test Center) in Japan. These codes include a soil-spring lumped-mass code (SANLUM), a finite element code (SANSSI), and a thin layered element code (SANSOL). To support improvement of the analytical codes, in-situ large-scale forced vibration SSI tests were performed using models simulating light water reactor buildings, and simulation analyses were performed to verify the codes. This paper presents an analytical study to demonstrate the usefulness of the codes.

  17. Analytic tests and their relation to jet fuel thermal stability

    Energy Technology Data Exchange (ETDEWEB)

    Heneghan, S.P.; Kauffman, R.E. [Univ. of Dayton Research Institute, OH (United States)

    1995-05-01

The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause a test to respond, refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an accepted quantitative TS test against which to compare any analytic procedure. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  18. Analytical modeling of Schottky tunneling source impact ionization MOSFET with reduced breakdown voltage

    Directory of Open Access Journals (Sweden)

    Sangeeta Singh

    2016-03-01

In this paper, we have investigated a novel Schottky tunneling source impact ionization MOSFET (STS-IMOS) to lower the breakdown voltage of the conventional impact ionization MOS (IMOS), and developed an analytical model for it. In STS-IMOS there is a cumulative effect of both impact ionization and source-induced barrier tunneling. The silicide source offers very low parasitic resistance, the outcome of which is an increased voltage drop across the intrinsic region for the same applied bias. This reduces the operating voltage, and hence the device exhibits a significant reduction in both breakdown and threshold voltage. STS-IMOS shows high immunity against hot-electron damage, which significantly improves device reliability. The analytical model for the impact ionization current (Iii) is developed based on integration of the ionization integral (M). Similarly, to obtain the Schottky tunneling current (ITun) expression, the Wentzel-Kramers-Brillouin (WKB) approximation is employed. Analytical models for threshold voltage and subthreshold slope are optimized against Schottky barrier height (ϕB) variation. The expression for the drain current is computed as a function of gate-to-drain bias via an integral expression. It is validated by comparison with technology computer-aided design (TCAD) simulation results. In essence, this analytical framework provides the physical background for a better understanding of STS-IMOS and its performance estimation.

  19. Source-to-incident flux relation for a tokamak fusion test reactor blanket module

    International Nuclear Information System (INIS)

    Imel, G.R.

    1982-01-01

    The source-to-incident 14-MeV flux relation for a blanket module on the Tokamak Fusion Test Reactor is derived. It is shown that assumptions can be made that allow an analytical expression to be derived, using point kernel methods. In addition, the effect of a nonuniform source distribution is derived, again by relatively simple point kernel methods. It is thought that the methodology developed is valid for a variety of blanket modules on tokamak reactors
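
The point-kernel reasoning invoked above rests on the exponentially attenuated inverse-square kernel. The sketch below shows that kernel and a sum over discrete source points (uncollided flux only, no buildup factor); the numerical values are illustrative assumptions:

```python
import math

def uncollided_flux(source_strength, distance_cm, sigma_total_per_cm):
    """Uncollided point-source flux: phi = S * exp(-Sigma * r) / (4 * pi * r^2)."""
    return (source_strength * math.exp(-sigma_total_per_cm * distance_cm)
            / (4.0 * math.pi * distance_cm ** 2))

def incident_flux_from_points(points, detector, sigma):
    """Treat an extended (possibly nonuniform) source as a sum of point kernels.
    points: iterable of (x, y, z, strength) tuples; detector: (x, y, z)."""
    total = 0.0
    for x, y, z, s in points:
        r = math.dist((x, y, z), detector)
        total += uncollided_flux(s, r, sigma)
    return total

# Illustrative: a 1e12 n/s point source, detector 1 m away, Sigma = 0.05 /cm
phi = uncollided_flux(1.0e12, 100.0, 0.05)
print(f"{phi:.3e}")
```

Weighting the point strengths nonuniformly reproduces, in spirit, the nonuniform-source effect mentioned in the abstract.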

  20. Review and evaluation of spark source mass spectrometry as an analytical method

    International Nuclear Information System (INIS)

    Beske, H.E.

    1981-01-01

The analytical features and most important fields of application of spark source mass spectrometry are described with respect to the trace analysis of high-purity materials and the multielement analysis of technical alloys, geochemical and cosmochemical, biological and radioactive materials, as well as in environmental analysis. Comparisons are made to other analytical methods. The distribution of the method as well as opportunities for contract analysis are indicated and developmental tendencies discussed. (orig.)

  21. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on co-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  22. Large source test stand for H-(D-) ion source

    International Nuclear Information System (INIS)

    Larson, R.; McKenzie-Wilson, R.

    1981-01-01

The Brookhaven National Laboratory Neutral Beam Group has constructed a large source test stand for testing of the various source modules under development. The first objective of the BNL program is to develop a source module capable of delivering 10 A of H-(D-) at 25 kV, operating in the steady-state mode with satisfactory gas and power efficiency. The large source test stand contains gas supply and vacuum pumping systems, source cooling systems, magnet power supplies and magnet cooling systems, two arc power supplies rated at 25 kW and 50 kW, a large battery-driven power supply and an extractor electrode power supply. Figure 1 is a front view of the vacuum vessel showing the control racks with the 36'' vacuum valves and refrigerated baffles mounted behind. Figure 2 shows the rear view of the vessel with a BNL Mk V magnetron source mounted in the source aperture and also shows the cooled magnet coils. Currently two types of sources are under test: a large magnetron source and a hollow cathode discharge source.

  23. Analytic Approximation to Radiation Fields from Line Source Geometry

    International Nuclear Information System (INIS)

    Michieli, I.

    2000-01-01

Line sources with slab shields represent a typical source-shield configuration in gamma-ray attenuation problems. Such shielding problems often lead to generalized Secant integrals of a specific form. Besides the numerical integration approach, various expansions and rational approximations with limited applicability are in use for computing the value of such integral functions. Lately, the author developed a rapidly convergent infinite series representation of generalized Secant integrals involving incomplete Gamma functions. Validity of this representation was established for zero and positive values of the integral parameter a (a ≥ 0). In this paper, recurrence relations for generalized Secant integrals are derived, allowing simple approximate analytic calculation of the integral for arbitrary a values. It is demonstrated how the truncated series representation can be used as the basis for such calculations when possibly negative a values are encountered. (author)
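
For reference, such integrals can always be checked by direct quadrature. The sketch below evaluates the assumed form I_a(b, theta) = integral from 0 to theta of exp(-b sec t) sec^a(t) dt by the trapezoid rule; the exact definition used in the paper may differ in normalization:

```python
import math

def generalized_secant_integral(b, a, theta, n=20000):
    """Trapezoid-rule evaluation of the (assumed) generalized Secant integral
    I_a(b, theta) = int_0^theta exp(-b * sec t) * sec(t)**a dt, theta < pi/2."""
    h = theta / n
    def f(t):
        s = 1.0 / math.cos(t)
        return math.exp(-b * s) * s ** a
    total = 0.5 * (f(0.0) + f(theta))
    total += sum(f(i * h) for i in range(1, n))
    return total * h

# a = 0 reduces to the ordinary Secant (Sievert-type) integral
print(round(generalized_secant_integral(1.0, 0.0, math.pi / 4), 6))
```

A series or recurrence evaluation, as in the paper, would be validated against exactly this kind of brute-force quadrature.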

  24. Analytic solution of magnetic induction distribution of ideal hollow spherical field sources

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-12-01

The Halbach-type hollow spherical permanent magnet arrays (HSPMA) are compact, energy-efficient field sources capable of producing a multi-tesla field in the cavity of the array, which has attracted intense interest in many practical applications. Here, we present analytical solutions for the magnetic induction of the ideal HSPMA in the entire space: outside the array, within the cavity of the array, and in the interior of the magnet. We obtain the solutions using the concept of magnetic charge to solve the Poisson and Laplace equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing optimized ones.
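
The classical ideal-magnet result for a Halbach sphere, B = (4/3) Br ln(Ro/Ri), already shows how multi-tesla cavity fields arise; it is the textbook limit, not the full solution of the paper, and the material values below are illustrative:

```python
import math

def halbach_sphere_cavity_field(b_remanence, r_outer, r_inner):
    """Uniform cavity field of an ideal Halbach sphere: B = (4/3) * Br * ln(Ro/Ri).
    Classical ideal-magnet result; the paper's full solution also covers the
    field outside the array and inside the magnet material."""
    return (4.0 / 3.0) * b_remanence * math.log(r_outer / r_inner)

# Illustrative: NdFeB-like remanence Br = 1.4 T with a 5:1 radius ratio
print(round(halbach_sphere_cavity_field(1.4, 0.25, 0.05), 2))
```

The logarithmic growth with radius ratio explains why such arrays trade volume for field so efficiently.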

  5. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-Basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar flow and turbulent flow. An analytical development is presented for each flow regime, and the basic geometry and nomenclature of the postulated leak paths are defined.
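A minimal sketch of the regime decision described above; the K-Basin-specific correlations are not given in the abstract, so the hydraulic diameter, velocity, and the conventional pipe-flow transition value of Re ≈ 2300 below are generic assumptions.

```python
def reynolds_number(velocity, hydraulic_diameter, kinematic_viscosity):
    """Re = V * D_h / nu for flow through a postulated leak path."""
    return velocity * hydraulic_diameter / kinematic_viscosity

def flow_regime(re, transition=2300.0):
    """Classify the leak-path flow regime; 2300 is the usual pipe-flow value."""
    return "laminar" if re < transition else "turbulent"

# Water at ~20 C (nu ~ 1.0e-6 m^2/s) moving at 0.5 m/s through a 2 mm gap:
print(flow_regime(reynolds_number(0.5, 0.002, 1.0e-6)))  # laminar (Re = 1000)
```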

  6. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements, interpreted by statistical methods, are mandatory for batch clearance. Data analysis of these process-oriented measurements allows insight into random analytical variation and systematic calibration bias over time. However, in such a setting, no individual sample is under individual quality control; the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample can compromise the result for any analyte, and a quality-control-sample-based approach to quality assurance is obviously not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by the measurement uncertainty of the routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation threshold can be defined as a linear combination of the process measurement uncertainty and the method bias relative to the reference measurement system. Such errors should be termed irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by a single-sample processing error in the analytical process. 
Currently, the availability of reference measurement procedures is still highly limited, but LC
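A sketch of the flagging rule implied by the definition above. The coverage factor k and the exact way the process uncertainty and method bias are combined are illustrative assumptions, not taken from the paper:

```python
def is_irregular_error(result, reference_value, process_uncertainty,
                       method_bias, k=3.0):
    """Flag a single-sample result whose deviation from the reference
    measurement procedure exceeds what the routine assay's uncertainty
    budget can explain: a linear combination of the process measurement
    uncertainty (scaled by an assumed coverage factor k) and the method bias."""
    tolerance = k * process_uncertainty + abs(method_bias)
    return abs(result - reference_value) > tolerance
```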

  7. Analytical and semi-analytical formalism for the voltage and the current sources of a superconducting cavity under dynamic detuning

    CERN Document Server

    Doleans, M

    2003-01-01

    Elliptical superconducting radio frequency (SRF) cavities are sensitive to frequency detuning because they have a high Q value in comparison with normal conducting cavities and weak mechanical properties. Radiation pressure on the cavity walls, microphonics, and the tuning system are possible sources of dynamic detuning during pulsed SRF cavity operation. A general analytic relation between the cavity voltage, the dynamic detuning function, and the RF control function is developed. The voltage envelope of a cavity under dynamic detuning and dynamic RF control is expressed analytically through an integral formulation. A semi-analytical scheme is derived to calculate the voltage behavior in any practical case. Examples of voltage envelope behavior for different cases of dynamic detuning and RF control functions are shown. The RF control function for a cavity under dynamic detuning is also investigated, and as an application various filling schemes are presented.

  8. Reconstruction of sound source signal by analytical passive TR in the environment with airflow

    Science.gov (United States)

    Wei, Long; Li, Min; Yang, Debin; Niu, Feng; Zeng, Wu

    2017-03-01

    In the acoustic design of air vehicles, the time-domain signals of noise sources on the surface of air vehicles can serve as data support to reveal the noise source generation mechanism, analyze acoustic fatigue, and take measures for noise insulation and reduction. To rapidly reconstruct the time-domain sound source signals in an environment with flow, a method combining the analytical passive time reversal mirror (AP-TR) with a shear flow correction is proposed. In this method, the negative influence of flow on sound wave propagation is suppressed by the shear flow correction, yielding a corrected acoustic propagation time delay and path. The corrected time delay and path, together with the microphone array signals, are then fed to the AP-TR, reconstructing more accurate sound source signals in an environment with airflow. As an analytical method, AP-TR offers a supplementary way, alternative to numerical TR, to reconstruct the sound source signal in 3D space in an environment with airflow. Experiments on the reconstruction of the sound source signals of a pair of loudspeakers were conducted in an anechoic wind tunnel with subsonic airflow to validate the effectiveness and advantages of the proposed method. Moreover, a theoretical and experimental comparison between AP-TR and time-domain beamforming in reconstructing the sound source signal is also discussed.

  9. An analytic data analysis method for oscillatory slug tests.

    Science.gov (United States)

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extreme points, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
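The use of successive extreme points can be sketched with a generic logarithmic-decrement estimate. This is not the van der Kamp procedure itself (which maps the decrement and period to hydraulic conductivity through aquifer-specific relations); it only shows how damping is extracted from measured extrema, here taken as same-sign peak displacements:

```python
import math

def log_decrement(peaks):
    """Mean logarithmic decrement from successive same-sign peak
    displacements of an oscillatory slug-test record."""
    return sum(math.log(abs(a) / abs(b))
               for a, b in zip(peaks, peaks[1:])) / (len(peaks) - 1)

def damping_ratio(delta):
    """Damping ratio of an underdamped oscillator from its log decrement."""
    return delta / math.sqrt((2.0 * math.pi) ** 2 + delta ** 2)
```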

  10. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States); Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States); Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia); White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2016-08-15

    points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement with measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied, with a pass rate of at least 85% in the high-dose, high-gradient, and low-dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. Conclusions: A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurements. While this tool can be applied in general use for a particular linac model, it was specifically developed to provide a singular methodology to independently assess treatment plan dose distributions from clinical institutions participating in National Cancer Institute trials.

  11. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.

  12. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

    The Halbach type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-efficient field source that has attracted intense interest in many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
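As a hedged illustration of the kind of closed-form interior-field result the paper derives: the textbook formula for the ideal dipole (k = 2) Halbach cylinder, whose bore field is uniform and depends only on the remanence and the radius ratio. The multipole solutions derived in the paper are more general than this special case.

```python
import math

def halbach_dipole_field(remanence, r_outer, r_inner):
    """Uniform bore flux density of an ideal dipole (k = 2) Halbach cylinder:
    B = B_r * ln(r_outer / r_inner)."""
    return remanence * math.log(r_outer / r_inner)

# NdFeB (B_r ~ 1.4 T) with a radius ratio of 3 gives roughly 1.54 T in the bore.
```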

  13. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    Science.gov (United States)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for the assessment of a site. While this may be acceptable for the assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point-source run of an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. 
The parameterization method combined with the analytical solutions for long-term mean
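The decomposition described above (an area or line source treated as many point sources, each with a Gaussian solution) can be sketched numerically. The ground-level Gaussian plume form with full reflection and the constant dispersion parameters below are illustrative assumptions, not the paper's hypergeometric solutions:

```python
import math

def gaussian_point(q, x, y, u, sigma_y, sigma_z):
    """Ground-level concentration from a ground-level point source of
    strength q (Gaussian plume with full ground reflection)."""
    sy, sz = sigma_y(x), sigma_z(x)
    return q / (math.pi * u * sy * sz) * math.exp(-y * y / (2.0 * sy * sy))

def line_source(q_per_m, x, y0, y1, u, sigma_y, sigma_z, n=200):
    """Concentration at distance x downwind of a finite crosswind line
    source, approximated as n point sources (midpoint rule)."""
    dy = (y1 - y0) / n
    return sum(gaussian_point(q_per_m * dy, x, y0 + (i + 0.5) * dy, u,
                              sigma_y, sigma_z) for i in range(n))
```

For a long line the sum approaches the known crosswind-integrated limit q_l·sqrt(2/π)/(u·σ_z), which makes the decomposition easy to verify.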

  14. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    Science.gov (United States)

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. This study considered a very common procedure, the oral glucose tolerance test, to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested their knowledge of patient preparation. The appropriateness of the test result (QI-1) had the highest error rate. Although QI-5 for sample collection had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action, and facilitate their gradual introduction into routine practice.

  15. Testing of the analytical anisotropic algorithm for photon dose calculation

    International Nuclear Information System (INIS)

    Esch, Ann van; Tillikainen, Laura; Pyykkonen, Jukka; Tenhunen, Mikko; Helminen, Hannu; Siljamaeki, Sami; Alakuijala, Jyrki; Paiusco, Marta; Iori, Mauro; Huyskens, Dominique P.

    2006-01-01

    The analytical anisotropic algorithm (AAA) was implemented in the Eclipse (Varian Medical Systems) treatment planning system to replace the single pencil beam (SPB) algorithm for the calculation of dose distributions for photon beams. AAA was developed to improve the dose calculation accuracy, especially in heterogeneous media. The total dose deposition is calculated as the superposition of the dose deposited by two photon sources (primary and secondary) and by an electron contamination source. The photon dose is calculated as a three-dimensional convolution of Monte Carlo precalculated scatter kernels, scaled according to the electron density matrix. For the configuration of AAA, an optimization algorithm determines the parameters characterizing the multiple source model by optimizing the agreement between the calculated and measured depth dose curves and profiles for the basic beam data. We have combined the acceptance tests obtained in three different departments for 6, 15, and 18 MV photon beams. The accuracy of AAA was tested for different field sizes (symmetric and asymmetric) for open fields, wedged fields, and static and dynamic multileaf collimation fields. Depth dose behavior at different source-to-phantom distances was investigated. Measurements were performed on homogeneous, water-equivalent phantoms, on simple phantoms containing cork inhomogeneities, and on the thorax of an anthropomorphic phantom. Comparisons were made among measurements, AAA, and SPB calculations. The optimization procedure for the configuration of the algorithm was successful in reproducing the basic beam data with an overall accuracy of 3%, 1 mm in the build-up region, and 1%, 1 mm elsewhere. Testing of the algorithm in more clinical setups showed comparable results for depth dose curves, profiles, and monitor units of symmetric open and wedged beams below dmax. 
The electron contamination model was found to be suboptimal in modeling the dose around dmax, especially for physical

  16. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  17. Continuous Analytical Performances Monitoring at the On-Site Laboratory through Proficiency, Inter-Laboratory Testing and Inter-Comparison Analytical Methods

    International Nuclear Information System (INIS)

    Duhamel, G.; Decaillon, J.-G.; Dashdondog, S.; Kim, C.-K.; Toervenyi, A.; Hara, S.; Kato, S.; Kawaguchi, T.; Matsuzawa, K.

    2015-01-01

    Since 2008, as one measure to strengthen its quality management system, the On-Site Laboratory for nuclear safeguards at the Rokkasho Reprocessing Plant has increased its participation in domestic and international proficiency and inter-laboratory testing, both to determine analytical method accuracy, precision and robustness and to support method development and improvement. This paper provides a description of the testing and its scheduling, and presents the way the testing was optimized to cover most of the analytical methods at the OSL. The paper presents the methodology used for the evaluation of the obtained results, based on analysis of variance (ANOVA). Results are discussed with respect to random, systematic and long-term systematic error. (author)

  18. The analytical benchmark solution of spatial diffusion kinetics in source driven systems for homogeneous media

    International Nuclear Information System (INIS)

    Oliveira, F.L. de; Maiorino, J.R.; Santos, R.S.

    2007-01-01

    This paper describes a closed-form solution obtained by the expansion method for the general time-dependent diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. First, the one-group problem without precursors was solved analytically, followed by the case with one precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was also solved, and the results were intercompared with those obtained by the previous one-group models for a given fast homogeneous medium and different types of source transients. The results are being compared with those obtained by numerical methods. (author)
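The simplest case mentioned above, one group without precursors driven by a constant external source, has a textbook closed-form (space-independent) solution that can be sketched directly; the notation alpha for the inverse period and s for the source strength is generic, not the paper's:

```python
import math

def one_group_source_transient(n0, alpha, s, t):
    """Analytic solution of dn/dt = alpha*n + s (one group, no precursors,
    constant source): n(t) = n0*exp(alpha*t) + (s/alpha)*(exp(alpha*t) - 1).
    For a subcritical system alpha < 0 and n(t) approaches -s/alpha."""
    if alpha == 0.0:
        return n0 + s * t
    e = math.exp(alpha * t)
    return n0 * e + (s / alpha) * (e - 1.0)
```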

  19. A Generic analytical solution for modelling pumping tests in wells intersecting fractures

    Science.gov (United States)

    Dewandel, Benoît; Lanini, Sandra; Lachassagne, Patrick; Maréchal, Jean-Christophe

    2018-04-01

    The behaviour of transient flow due to pumping in fractured rocks has been studied for at least the past 80 years. Analytical solutions were proposed for solving the issue of a well intersecting and pumping from one vertical, horizontal or inclined fracture in homogeneous aquifers, but their domain of application, even if covering various fracture geometries, was restricted to isotropic or anisotropic aquifers whose potential boundaries had to be parallel or orthogonal to the fracture direction. The issue thus remains unsolved for many field cases: for example, a well intersecting and pumping a fracture in a multilayer or a dual-porosity aquifer, where intersected fractures are not necessarily parallel or orthogonal to aquifer boundaries; several fractures with various orientations intersecting the well; or the effect of pumping not only in fractures but also in the aquifer through the screened interval of the well. Using a mathematical demonstration, we show that integrating the well-known Theis analytical solution (Theis, 1935) along the fracture axis is identical to the equally well-known analytical solution of Gringarten et al. (1974) for a uniform-flux fracture fully penetrating a homogeneous aquifer. This result implies that any existing line- or point-source solution can be used for implementing one or more discrete fractures that are intersected by the well. Several theoretical examples are presented and discussed: a single vertical fracture in a dual-porosity aquifer or in a multi-layer system (with a partially intersecting fracture); one and two inclined fractures in a leaky-aquifer system with pumping either only from the fracture(s), or also from the aquifer between fracture(s) in the screened interval of the well. 
For the cases with several pumping sources, analytical solutions of flowrate contribution from each individual source (fractures and well) are presented, and the drawdown behaviour according to the length of the pumped screened interval of
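The central result, that integrating the Theis point-source solution along the fracture axis reproduces the uniform-flux fracture solution of Gringarten et al., can be sketched as below. The well-function approximations (a truncated Abramowitz-Stegun series for u ≤ 1 and their rational fit 5.1.56 for u > 1) and the midpoint discretization of the fracture are standard numerical choices, not the paper's implementation:

```python
import math

EULER_GAMMA = 0.5772156649015329

def well_function(u):
    """Theis well function W(u) = E1(u)."""
    if u <= 1.0:
        s = -EULER_GAMMA - math.log(u)
        sign, fact = 1.0, 1.0
        for k in range(1, 30):
            fact *= k
            s += sign * u ** k / (k * fact)
            sign = -sign
        return s
    num = u * u + 2.334733 * u + 0.250621
    den = u * u + 3.330657 * u + 1.681534
    return math.exp(-u) / u * num / den

def fracture_drawdown(q, transmissivity, storativity, t, x, y,
                      half_length, n=100):
    """Drawdown at (x, y) from a uniform-flux vertical fracture of length
    2*half_length centred on the origin: the pumping rate q is spread
    evenly over n Theis point sources along the fracture axis."""
    dl = 2.0 * half_length / n
    total = 0.0
    for i in range(n):
        xf = -half_length + (i + 0.5) * dl
        r2 = (x - xf) ** 2 + y ** 2
        u = r2 * storativity / (4.0 * transmissivity * t)
        total += well_function(u)
    return (q / n) / (4.0 * math.pi * transmissivity) * total
```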

  20. Analytical quality control of neutron activation analysis by interlaboratory comparison and proficiency test

    International Nuclear Information System (INIS)

    Kim, S. H.; Moon, J. H.; Jeong, Y. S.

    2002-01-01

    Two air filters (V-50, P-50) artificially loaded with urban dust were provided by the IAEA, and trace elements were determined non-destructively by instrumental neutron activation analysis for an inter-laboratory comparison and proficiency test. The standard reference material Urban Particulate Matter (NIST SRM 1648) of the National Institute of Standards and Technology was used for internal analytical quality control. About 20 elements were determined in each loaded filter sample. Our analytical data were compared with statistical results obtained by neutron activation analysis, particle-induced X-ray emission spectrometry, inductively coupled plasma mass spectrometry, etc., collected from 49 laboratories in 40 countries. From the statistically re-treated reported values, the Z-scores of our analytical values are within ±2. In addition, the proficiency test was passed, and the accuracy and precision of the analytical values are reliable. Consequently, it was proved that the analytical quality control for the analysis of air dust samples is adequate
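The ±2 acceptance criterion quoted above follows the usual proficiency-testing z-score convention, which can be sketched as:

```python
def z_score(measured, assigned, sigma):
    """Proficiency-test z-score: (x - X) / sigma_p, where X is the assigned
    value and sigma_p the standard deviation for proficiency assessment."""
    return (measured - assigned) / sigma

def rate(z):
    """Conventional interpretation: |z| <= 2 satisfactory, 2 < |z| < 3
    questionable, |z| >= 3 unsatisfactory."""
    a = abs(z)
    if a <= 2.0:
        return "satisfactory"
    return "questionable" if a < 3.0 else "unsatisfactory"
```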

  1. Sampling analytical tests and destructive tests for quality assurance

    International Nuclear Information System (INIS)

    Saas, A.; Pasquini, S.; Jouan, A.; Angelis, de; Hreen Taywood, H.; Odoj, R.

    1990-01-01

    In the context of the third programme of the European Communities on the monitoring of radioactive waste, various methods have been developed for performing sampling and measuring tests on encapsulated waste of low and medium level activity, on the one hand, and of high level activity, on the other. The purpose was to provide better quality assurance for products to be stored on an interim or long-term basis. Various sampling approaches are proposed, such as: - sampling of raw waste before conditioning and determination of the representative aliquot, - sampling of encapsulated waste at process output, - sampling of core specimens subjected to measurement before and after cutting. Equipment suitable for these sampling procedures has been developed and, in the case of core samples, a comparison of techniques has been made. The results are described for the various analytical tests carried out on the samples, such as: - mechanical tests, - radiation resistance, - fire resistance, - lixiviation, - determination of free water, - biodegradation, - water resistance, - chemical and radiochemical analysis. Whenever possible, these tests were compared with non-destructive tests on full-scale packages, and some correlations are given. This work has made it possible to improve and clarify sample optimization, with refined sampling techniques and methodologies, and to draw up characterization procedures. It also provided an occasion for a first collaboration between the laboratories responsible for these studies, which will be furthered in the scope of the 1990-1994 programme

  2. Analytical solution of spatial kinetics of the diffusion model for subcritical homogeneous systems driven by external source

    International Nuclear Information System (INIS)

    Oliveira, Fernando Luiz de

    2008-01-01

    This work describes an analytical solution obtained by the expansion method for the spatial kinetics of the diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. The one-group problem without precursors was solved analytically, followed by the case with one precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was solved, and the numerical results of a finite-difference code were compared with the exact results for different transients. (author)

  3. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been immense visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods that are continuously updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews, appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Neutron Sources for Standard-Based Testing

    Energy Technology Data Exchange (ETDEWEB)

    Radev, Radoslav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McLean, Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-10

    The DHS TC Standards and the consensus ANSI Standards use 252Cf as the neutron source for performance testing because its energy spectrum is similar to the 235U and 239Pu fission sources used in nuclear weapons. An emission rate of 20,000 ± 20% neutrons per second is used for testing of the radiological requirements both in the ANSI standards and the TCS. Determination of the accurate neutron emission rate of the test source is important for maintaining consistency and agreement between testing results obtained at different testing facilities. Several characteristics in the manufacture and the decay of the source need to be understood and accounted for in order to make an accurate measurement of the performance of the neutron detection instrument. Additionally, neutron response characteristics of the particular instrument need to be known and taken into account as well as neutron scattering in the testing environment.
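The decay correction this implies can be sketched as follows, using the ~2.645-year effective half-life of 252Cf; the tolerance check mirrors the 20,000 ± 20% n/s requirement quoted above:

```python
CF252_HALF_LIFE_YEARS = 2.645  # effective half-life of 252Cf

def emission_rate(initial_rate, years_elapsed,
                  half_life=CF252_HALF_LIFE_YEARS):
    """Decay-corrected neutron emission rate of a 252Cf test source."""
    return initial_rate * 0.5 ** (years_elapsed / half_life)

def within_tolerance(rate_n_per_s, nominal=20000.0, tol=0.20):
    """Check the 20,000 n/s +/- 20% performance-testing requirement."""
    return abs(rate_n_per_s - nominal) <= tol * nominal
```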

  5. Analytical studies related to Indian PHWR containment system performance

    International Nuclear Information System (INIS)

    Haware, S.K.; Markandeya, S.G.; Ghosh, A.K.; Kushwaha, H.S.; Venkat Raj, V.

    1998-01-01

    The build-up of pressure in a multi-compartment containment after a postulated accident and the growth, transportation and removal of aerosols in the containment are complex processes of vital importance in determining the source term. The release of hydrogen and its combustion increase the overpressure. In order to analyze these complex processes and to enable proper estimation of the source term, well-tested analytical tools are necessary. This paper gives a detailed account of the analytical tools developed/adapted for PSA level 2 studies. (author)

  6. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    Science.gov (United States)

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using Visual Molecular Dynamics (VMD), a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to the large group of scientists, educators, and students familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

  7. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  8. Application of californium-252 neutron sources for analytical chemistry

    International Nuclear Information System (INIS)

    Ishii, Daido

    1976-01-01

The research on applications of Cf-252 neutron sources to analytical chemistry during the period from 1970 to 1974, partly including 1975, is reviewed. The first part is the introduction. The second part is a general review of symposia, publications and the like; attention is directed to ERDA's periodical ''Californium-252 Progress'' and to a study group on Cf-252 utilization held by the Japanese Radioisotope Association in 1974. The third part deals with application to radioactivation analysis. The automated absolute activation analysis (AAAA) at Savannah River is briefly explained, and the joint experiment of the Savannah River operations office with the New Brunswick laboratory is mentioned. A Cf-252 radiation source was used for the non-destructive analysis of elements in river water, and fast neutrons from Cf-252 were used for the quantitative analysis of lead in paints. Many applications to industrial process control have been reported. Attention is drawn to the application of Cf-252 neutron sources to the field exploration of natural resources; for example, a logging sonde for prospecting uranium resources was developed. The fourth part deals with analysis by gamma rays from neutron capture. For example, a borehole sonde and the process-control analysis of sulfur in fuel utilized capture gamma rays. Prompt gamma rays from neutron capture may also be used for the non-destructive analysis of the environment. (Iwakiri, K.)

  9. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  10. Pulsed voltage electrospray ion source and method for preventing analyte electrolysis

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-12-27

    An electrospray ion source and method of operation includes the application of pulsed voltage to prevent electrolysis of analytes with a low electrochemical potential. The electrospray ion source can include an emitter, a counter electrode, and a power supply. The emitter can include a liquid conduit, a primary working electrode having a liquid contacting surface, and a spray tip, where the liquid conduit and the working electrode are in liquid communication. The counter electrode can be proximate to, but separated from, the spray tip. The power system can supply voltage to the working electrode in the form of a pulse wave, where the pulse wave oscillates between at least an energized voltage and a relaxation voltage. The relaxation duration of the relaxation voltage can range from 1 millisecond to 35 milliseconds. The pulse duration of the energized voltage can be less than 1 millisecond and the frequency of the pulse wave can range from 30 to 800 Hz.

  11. Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plum Constituents Under Test Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II STTR project is to develop a prototype multi-analyte sensor system to detect gaseous analytes present in the test stands during...

  12. GA-4/GA-9 honeycomb impact limiter tests and analytical model

    International Nuclear Information System (INIS)

    Koploy, M.A.; Taylor, C.S.

    1991-01-01

    General Atomics (GA) has a test program underway to obtain data on the behavior of a honeycomb impact limiter. The program includes testing of small samples to obtain basic information, as well as testing of complete 1/4-scale impact limiters to obtain load-versus-deflection curves for different crush orientations. GA has used the test results to aid in the development of an analytical model to predict the impact limiter loads. The results also helped optimize the design of the impact limiters for the GA-4 and GA-9 Casks

  13. Immunochemical faecal occult blood tests have superior stability and analytical performance characteristics over guaiac-based tests in a controlled in vitro study.

    LENUS (Irish Health Repository)

    Lee, Chun Seng

    2011-06-01

    The aims of this study were (1) to determine the measurement accuracy of a widely used guaiac faecal occult blood test (gFOBT) compared with an immunochemical faecal occult blood test (iFOBT) during in vitro studies, including their analytical stability over time at ambient temperature and at 4°C; and (2) to compare analytical imprecision and other characteristics between two commercially available iFOBT methods.

  14. Generalized Analytical Treatment Of The Source Strength In The Solution Of The Diffusion Equation

    International Nuclear Information System (INIS)

    Essa, Kh.S.M.; EI-Otaify, M.S.

    2007-01-01

The source release strength (an integral part of the mathematical formulation of the diffusion equation), together with the boundary conditions, leads to three different forms of the diffusion equation. The obtained forms have been solved analytically under different boundary conditions using transformation of axes and cosine and Fourier transforms. Three equivalent alternative mathematical formulations of the problem have been obtained. The estimated solution for the concentrations from the ground source has been compared with observed concentration data from SF6 tracer experiments in low-wind, unstable conditions at the IIT Delhi sports ground. Good agreement between estimated and observed concentrations is found.
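
The abstract does not reproduce the governing equations. As a hedged sketch of the kind of formulation involved (the symbols u, K_z, Q and h are illustrative assumptions, not taken from the paper): a steady crosswind-integrated advection-diffusion equation, with the source release strength entering through the inflow boundary condition and a zero-flux condition at the ground, reads

```latex
u \frac{\partial C}{\partial x}
  = \frac{\partial}{\partial z}\!\left( K_z \frac{\partial C}{\partial z} \right),
\qquad
u\,C(0,z) = Q\,\delta(z - h),
\qquad
\left. K_z \frac{\partial C}{\partial z} \right|_{z=0} = 0,
```

where C is the concentration, u the wind speed, K_z the vertical eddy diffusivity, Q the source strength, and h the release height. Cosine and Fourier transforms of the vertical coordinate are the standard routes to closed-form solutions of equations of this type.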

  15. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of emerging trajectory data. It offers data management capabilities and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  16. Guidelines for testing sealed radiation sources

    International Nuclear Information System (INIS)

    1989-01-01

    These guidelines are based on article 16(1) of the Ordinance on the Implementation of Atomic Safety and Radiation Protection dated 11 October 1984 (VOAS), in connection with article 36 of the Executory Provision to the VOAS, of 11 October 1984. They apply to the testing of sealed sources to verify their intactness, tightness and non-contamination as well as observance of their fixed service time. The type, scope and intervals of testing as well as the evaluation of test results are determined. These guidelines also apply to the testing of radiation sources forming part of radiation equipment, unless otherwise provided for in the type license or permit. These guidelines enter into force on 1 January 1990

  17. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of keff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Sood, Avnet; Forster, R. Arthur; Parsons, D. Kent

    2001-01-01

Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75-problem multigroup criticality verification analytic test set. MCNP was verified with these problems at the 10^-4 to 10^-5 statistical error level using 40 000 histories per cycle and 2000 active cycles. In all cases, the final boxed combined keff answer was given with the standard deviation and three confidence intervals that contained the analytic keff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary, thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined keff confidence intervals for these deliberately ill-posed problems did not include the analytic keff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that
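
The boxed combined keff with its standard deviation and three confidence intervals can be illustrated with a minimal sketch. This is the generic cycle-statistics idea only, not MCNP's actual implementation, and the function name is hypothetical:

```python
import math

def keff_confidence_intervals(cycle_keffs):
    """Combine active-cycle k_eff estimates into a mean, a standard
    deviation of the mean, and three confidence intervals (roughly the
    68/95/99% levels a criticality code reports). Illustrative only."""
    n = len(cycle_keffs)
    mean = sum(cycle_keffs) / n
    var = sum((k - mean) ** 2 for k in cycle_keffs) / (n - 1)
    sdom = math.sqrt(var / n)  # standard deviation of the mean
    z = {"68%": 1.0, "95%": 1.96, "99%": 2.576}  # normal quantiles
    cis = {lvl: (mean - zi * sdom, mean + zi * sdom) for lvl, zi in z.items()}
    return mean, sdom, cis
```

A poorly converged calculation typically shows up as an analytic (or benchmark) keff falling outside even the widest of the three intervals, which is exactly the failure mode the deliberately ill-posed runs above were designed to trigger.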

  18. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

As the amount of spatial data acquired from several geodetic sources has grown over the years and data infrastructure has become more powerful, the need for adoption of in-database analytics technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends, and spotting anomalies. Although a number of open-source spatial analysis libraries such as geopandas and shapely are available today, most of them are restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform, Bluemix. Working in-database reduces the network overload: the complete dataset need not be replicated onto the user's local system, and only a subset of it is fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4.
The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API namely - pypyodbc or jaydebeapi to establish the database connection via

  19. A two-dimensional transient analytical solution for a ponded ditch drainage system under the influence of source/sink

    Science.gov (United States)

    Sarmah, Ratan; Tiwari, Shubham

    2018-03-01

An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniform steady ponding field at the soil surface under the influence of a source/sink in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic, with finite extents in the horizontal and vertical directions. The drains are assumed to be vertical and to penetrate down to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked against a numerical code and against an existing analytical solution for a simplified case. The study highlights the significance of the source/sink influence on subsurface flow. With the imposition of the source and sink terms in the flow domain, the pathlines and travel times of water particles deviate from their original courses; moreover, the side and top discharges to the drains are also strongly influenced by the source/sink terms. The travel times and pathlines of water particles also depend on the height of water in the ditches and on the location of the source/sink activation area.

  20. Analytical support for the preparation of bundle test QUENCH-10 on air ingress

    International Nuclear Information System (INIS)

    Birchley, J.; Haste, T.; Homann, C.; Hering, W.

    2005-07-01

Bundle test QUENCH-10 is dedicated to the study of air ingress with subsequent water quench during a postulated accident in a spent fuel storage tank. It was proposed by AEKI, Budapest, Hungary and was performed on 21 July 2004 in the QUENCH facility at Forschungszentrum Karlsruhe. Preparation of the test was based on joint analytical work at Forschungszentrum Karlsruhe and Paul Scherrer Institut, Villigen, Switzerland, mainly with the severe accident codes SCDAP/RELAP5 and MELCOR, to derive the protocol for the essential test phases, namely the pre-oxidation, air ingress and quench phases. For issues that could not be tackled by this computational work, suggestions for the test conduct were made and applied during the test. Improvements to the experimental set-up and the test conduct were suggested and largely applied. In SCDAP/RELAP5, an error was found: for thick oxide scales, the output value of the oxide scale thickness is noticeably underestimated. For the aims of the test preparation, its consequences could be taken into account. Together with the related computational and other analytical support by the engaged institutions, the test is co-financed as test QUENCH-L1 by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (LACOMERA Project, contract No. FIR1-CT2002-40158). (orig.)

  1. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement with the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test-tube chemistry). The main field of application at the moment is endocrinology; further possible applications are in pharmaceutical research, environmental protection, forensic medicine, and general analytics. Radioactive sources are used only in vitro in the nanocurie range, i.e. radiation exposure is negligible.

  2. Aspects related to the testing of sealed radioactive sources

    International Nuclear Information System (INIS)

    Olteanu, C. M.; Nistor, V.; Valeca, S. C.

    2016-01-01

Sealed radioactive sources are commonly used in a wide range of applications, such as medical, industrial, agricultural and scientific research. The radioactive material is contained within the sealed source, and the device allows the radiation to be used in a controlled way. Accidents can result if control over even a small fraction of those sources is lost. Sealed nuclear sources fall under the category of special form radioactive material and therefore must meet safety requirements during transport according to the regulations. Testing sealed radioactive sources is an important step in the conformity assessment process leading to design approval. In ICN Pitesti, the Reliability and Testing Laboratory is notified by CNCAN to perform tests on sealed radioactive sources. This paper presents aspects of the verification tests on sealed capsules for Iridium-192 sources, demonstrating compliance with the regulatory requirements and the quality assurance programme of the tests performed. (authors)

  3. An analytical threshold voltage model for a short-channel dual-metal-gate (DMG) recessed-source/drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Saramekala, G. K.; Santra, Abirmoya; Dubey, Sarvesh; Jit, Satyabrata; Tiwari, Pramod Kumar

    2013-08-01

In this paper, an analytical short-channel threshold voltage model is presented for a dual-metal-gate (DMG) fully depleted recessed-source/drain (Re-S/D) SOI MOSFET. For the first time, the advantages of the recessed source/drain (Re-S/D) and of the dual-metal-gate structure are incorporated simultaneously in a fully depleted SOI MOSFET. Analytical surface potential models at the Si-channel/SiO2 interface and the Si-channel/buried-oxide (BOX) interface have been developed by solving the 2-D Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. Thereupon, a threshold voltage model is derived from the minimum surface potential in the channel. The developed model is analyzed extensively for a variety of device parameters such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the BOX, and the control-to-screen gate length ratio. The validity of the present 2-D analytical model is verified against ATLAS™, a 2-D device simulator from SILVACO Inc.
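
As a hedged illustration of the approach described (not the paper's exact expressions): the 2-D Poisson equation in the fully depleted channel and the parabolic ansatz in the transverse direction y, whose coefficients c_i(x) are fixed by the boundary conditions at the front and buried-oxide interfaces, can be written as

```latex
\frac{\partial^{2}\phi}{\partial x^{2}} + \frac{\partial^{2}\phi}{\partial y^{2}}
  = \frac{q\,N_{A}}{\varepsilon_{\mathrm{Si}}},
\qquad
\phi(x,y) \approx c_{0}(x) + c_{1}(x)\,y + c_{2}(x)\,y^{2}.
```

The threshold voltage then follows from the condition that the minimum of the front-surface potential c_0(x) along the channel reaches roughly twice the Fermi potential.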

  4. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    Science.gov (United States)

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

Most mobile health (mHealth) diagnostic devices for laboratory tests only analyze one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically accurate mobile phone microplate reader (mReader) and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested in 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

A nationwide multicenter study was conducted in China to explore sources of variation of reference values and to establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of the 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
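
The study's modified Box-Cox formula and latent abnormal values exclusion step are not reproduced here; the following is a plain illustrative sketch of a parametric reference interval using the standard (unmodified) Box-Cox transform with a fixed, user-chosen λ:

```python
import math
import statistics

def boxcox(x, lam):
    # standard Box-Cox power transform (lam = 0 reduces to the log)
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def inv_boxcox(y, lam):
    # inverse of the transform above, to map limits back to original units
    return math.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)

def parametric_reference_interval(values, lam=0.0):
    """2.5th-97.5th percentile reference interval, assuming the
    Box-Cox-transformed values are Gaussian. Illustrative sketch only."""
    t = [boxcox(v, lam) for v in values]
    mu = statistics.fmean(t)
    sd = statistics.stdev(t)
    return inv_boxcox(mu - 1.96 * sd, lam), inv_boxcox(mu + 1.96 * sd, lam)
```

For right-skewed analytes (e.g. enzymes), transforming first and back-transforming the limits yields an asymmetric interval in the original units, which is the point of the parametric approach.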

  6. Analytical formulae to calculate the solid angle subtended at an arbitrarily positioned point source by an elliptical radiation detector

    International Nuclear Information System (INIS)

    Abbas, Mahmoud I.; Hammoud, Sami; Ibrahim, Tarek; Sakr, Mohamed

    2015-01-01

In this article, we introduce a direct analytical mathematical method for calculating the solid angle, Ω, subtended at a point by closed elliptical contours. The solid angle is required in many areas of optical and nuclear physics to estimate the flux of a beam of particles or radiation and to determine the activity of a radioactive source. The validity of the derived analytical expressions was successfully confirmed by comparison with some published data (numerical method).
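
The paper's closed-form expressions are not reproduced in the abstract. As an illustrative numerical cross-check of the quantity involved (the function name and midpoint integration scheme are my own), the solid angle subtended at an on-axis point by an ellipse can be integrated directly:

```python
import math

def solid_angle_ellipse(a, b, d, n=400):
    """Solid angle subtended at a point a distance d on the axis above the
    centre of an ellipse with semi-axes a and b, by midpoint integration of
    Omega = integral of d / (x^2 + y^2 + d^2)^(3/2) over the ellipse."""
    hx, hy = 2 * a / n, 2 * b / n
    total = 0.0
    for i in range(n):
        x = -a + (i + 0.5) * hx
        for j in range(n):
            y = -b + (j + 0.5) * hy
            if (x / a) ** 2 + (y / b) ** 2 <= 1.0:  # inside the contour
                total += d / (x * x + y * y + d * d) ** 1.5
    return total * hx * hy
```

For a circular contour (a = b = R) the result can be checked against the classic closed form Ω = 2π(1 − d/√(d² + R²)), which is the kind of comparison the authors report against published numerical data.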

  7. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of the pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done offsite in central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. It is concluded that, to overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  8. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    Science.gov (United States)

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  9. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. 
The results provide the analytical evidence to support the implementation of the novel multi-marker test as

I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemical measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved

  11. Waste minimization methods for treating analytical instrumentation effluents at the source

    International Nuclear Information System (INIS)

    Ritter, J.A.; Barnhart, C.

    1995-01-01

The primary goal of this project was to reduce the amount of hazardous waste being generated by the Savannah River Site Defense Waste Processing Technology-Analytical Laboratory (DWPT-AL). A detailed characterization study was performed on 12 of the liquid effluent streams generated within the DWPT-AL. Two of the streams were not hazardous, and are now being collected separately from the 10 hazardous streams. A secondary goal of the project was to develop in-line methods, using primarily adsorption/ion exchange columns, to treat liquid effluent as it emerges from the analytical instrument as a slow, dripping flow. Samples from the 10 hazardous streams were treated by adsorption in an experimental apparatus that resembled an in-line or at-source column apparatus. The layered adsorbent bed contained activated carbon and ion exchange resin. The column technique did not work on the first three samples of the spectroscopy waste stream, but worked well on the next three samples, which were treated in a different column. It was determined that an unusual form of mercury was present in the first three samples. Similarly, two samples of a combined waste stream were rendered nonhazardous, but the last two samples contained acetonitrile that prevented analysis. The characteristics of these streams changed from the initial characterization study; therefore, continual, in-depth stream characterization is the key to making this project successful.

  12. CheapStat: an open-source, "do-it-yourself" potentiostat for analytical and educational applications.

    Directory of Open Access Journals (Sweden)

    Aaron A Rowe

    Full Text Available Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource-poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.
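As a concrete illustration of one of the techniques named above, the sketch below generates the triangular potential sweep that drives cyclic voltammetry. The start/vertex potentials, scan rate, and sample rate are illustrative values chosen for the example, not CheapStat firmware parameters.

```python
# Sketch of the triangular potential waveform a potentiostat generates for
# cyclic voltammetry: sweep from a start potential to a vertex potential at a
# fixed scan rate, then sweep back. All parameter values are illustrative.

def cyclic_voltammetry_waveform(e_start, e_vertex, scan_rate, sample_rate, cycles=1):
    """Return a list of potentials (V) sweeping e_start -> e_vertex -> e_start.

    scan_rate in V/s, sample_rate in samples/s.
    """
    step = scan_rate / sample_rate              # potential change per sample (V)
    n = int(round(abs(e_vertex - e_start) / step))
    sign = 1 if e_vertex > e_start else -1
    forward = [e_start + sign * step * i for i in range(n + 1)]
    backward = forward[-2::-1]                  # retrace without repeating the vertex
    return (forward + backward) * cycles

# A -0.5 V to +0.5 V sweep at 100 mV/s, sampled at 100 Hz.
wave = cyclic_voltammetry_waveform(-0.5, 0.5, scan_rate=0.1, sample_rate=100)
```

Square-wave or linear-sweep waveforms would follow the same pattern with a different potential program.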

  13. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform not only with several dynamic visual analytical workflows, but also with its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR . reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  14. Development and application of test apparatus for classification of sealed source

    International Nuclear Information System (INIS)

    Kim, Dong Hak; Seo, Ki Seog; Bang, Kyoung Sik; Lee, Ju Chan; Son, Kwang Je

    2007-01-01

    Sealed sources must be tested according to the classification requirements for their typical usages, in accordance with the relevant domestic notice standard and ISO 2919. After each test, the source shall be examined visually for loss of integrity and pass an appropriate leakage test. The tests used to class a sealed source are the temperature, external pressure, impact, vibration and puncture tests. The environmental test conditions for the class numbers are arranged in increasing order of severity. In this study, the apparatus for these tests, except the vibration test, were developed and applied to three kinds of sealed source. The conditions of the tests to class a sealed source were stated, the differences between the domestic notice standard and ISO 2919 were considered, and the test apparatus were built. Using the developed apparatus, we tested an Ir-192 brachytherapy sealed source and two kinds of sealed source for industrial radiography. The Ir-192 brachytherapy sealed source is classified as temperature class 5, external pressure class 3, impact class 2, and vibration and puncture class 1. The two kinds of sealed source for industrial radiography are classified as temperature class 4, external pressure class 2, impact and puncture class 5, and vibration class 1. After the tests, liquid nitrogen bubble tests and vacuum bubble tests were performed to evaluate the safety of the sealed sources.
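The per-test class numbers above can be collected into a single ISO 2919-style designation. The sketch below encodes the abstract's Ir-192 brachytherapy result; the five-digit code format and the 1-6 class range used here are illustrative assumptions for the example, not quoted from the standard.

```python
# Illustrative encoding of sealed-source performance classes. The five test
# categories and their class numbers come from the abstract; the compact
# five-digit designation format is an assumption made for this sketch.

TESTS = ("temperature", "external_pressure", "impact", "vibration", "puncture")

def class_designation(classes):
    """Build a five-digit class code from a {test_name: class_number} mapping."""
    for test in TESTS:
        if not 1 <= classes[test] <= 6:        # assumed valid class range
            raise ValueError(f"class for {test} out of range")
    return "".join(str(classes[t]) for t in TESTS)

# Classes reported for the Ir-192 brachytherapy source in the abstract.
ir192_brachytherapy = {
    "temperature": 5, "external_pressure": 3,
    "impact": 2, "vibration": 1, "puncture": 1,
}
designation = class_designation(ir192_brachytherapy)
```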

  15. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs
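The three component models can be sketched as a pipeline: the containment (WPC) model yields a failure time, the release (WPR) model yields a post-failure release, and the engineered-system (ESR) layer combines them over many sampled packages. The uniform failure-time distribution and constant fractional release rate below are toy assumptions for illustration, not values from the AREST report.

```python
import random

# Minimal sketch of an AREST-style probabilistic source-term calculation.
# wpc_* / wpr_* / esr_* mirror the component models named in the abstract;
# the distributions and rates are illustrative only.

def wpc_failure_time(rng):
    """Sample a containment failure time (years); toy uniform distribution."""
    return rng.uniform(300.0, 1000.0)

def wpr_release(failure_time, horizon, rate=1e-6):
    """Cumulative fractional release from failure until the time horizon."""
    return max(0.0, horizon - failure_time) * rate

def esr_total_release(n_packages, horizon, seed=1):
    """Combine WPC failure times with WPR releases over sampled packages."""
    rng = random.Random(seed)
    return sum(wpr_release(wpc_failure_time(rng), horizon)
               for _ in range(n_packages))

release = esr_total_release(1000, horizon=10_000.0)
```

An intact package (failure after the horizon) contributes zero release, which is what the `max(0.0, ...)` guard encodes.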

  16. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    Science.gov (United States)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying video analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which automatically detects a "visual event", and EventBrowser, which displays and lets analysts peruse the "visual details" captured at the event. To deal with both open-architecture and closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  17. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  18. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    OpenAIRE

    S. Padilla-Arellanes; F. Constantino-Casas; L. Núnez-Ochoa; J. Doubek; C. Vega-Murguia; J. Bouda

    2007-01-01

    The aim of this study was to determine the values and changes in conventional and optimised clotting tests, as well as in selected biochemical analytes during hepatic lipidosis in postpartum dairy cows. Ten healthy and ten Holstein cows with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and dil...

  19. Enhanced H- ion source testing capabilities at LANSCE

    International Nuclear Information System (INIS)

    Ingalls, W.B.; Hardy, M.W.; Prichard, B.A.; Sander, O.R.; Stelzer, J.E.; Stevens, R.R.; Leung, K.N.; Williams, M.D.

    1998-01-01

    As part of the ongoing beam-current upgrade of the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE), the current available from the H- injector will be increased from the present 16-18 mA to as much as 40 mA. A collaboration between the Ion Beam Technology Group at Lawrence Berkeley National Laboratory (LBNL) and the Ion Sources and Injectors section of LANSCE-2 at Los Alamos National Laboratory (LANL) has been formed to develop and evaluate a new ion source. A new Ion Source Test Stand (ISTS) has been constructed at LANSCE to evaluate candidate ion sources. The ISTS duplicates as closely as possible the beam transport and ancillary systems presently in use in the LANSCE H- injector, while incorporating additional beam diagnostics for source testing. The construction and commissioning of the ISTS are described, preliminary results for the proof-of-principle ion source developed by the Berkeley group are presented, and future plans for extending the test stand are outlined

  20. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper, a method for testing rate-control algorithms through the use of a video source model is suggested. The proposed method allows algorithm testing to be significantly improved over a large test set.

  1. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  2. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
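The kind of analytic benchmark these tools target can be sketched with a one-group, purely absorbing slab, where the uncollided transmission probability has the closed form exp(-Σt·L) and a trivial Monte Carlo estimate should reproduce it. The cross section and thickness below are arbitrary illustrative values.

```python
import math
import random

# One-group analytic test problem: transmission of neutrons through a purely
# absorbing slab. Analytic answer: exp(-sigma_t * L). The Monte Carlo samples
# free-flight distances from the exponential distribution and counts escapes.

def mc_transmission(sigma_t, thickness, n, seed=7):
    """Monte Carlo estimate of uncollided transmission through a slab."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if -math.log(rng.random()) / sigma_t > thickness)
    return hits / n

sigma_t, L = 0.5, 2.0                       # cm^-1 and cm, illustrative values
analytic = math.exp(-sigma_t * L)           # closed-form benchmark
estimate = mc_transmission(sigma_t, L, n=100_000)
```

Comparing `estimate` against `analytic` is exactly the style of verification item (1) in the abstract describes, just without a detailed cross-section library.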

  3. Basic Testing of the DUCHAMP Source Finder

    Science.gov (United States)

    Westmeier, T.; Popping, A.; Serra, P.

    2012-01-01

    This paper presents and discusses the results of basic source finding tests in three dimensions (using spectroscopic data cubes) with DUCHAMP, the standard source finder for the Australian Square Kilometre Array Pathfinder. For this purpose, we generated different sets of unresolved and extended HI model sources. These models were then fed into DUCHAMP, using a range of different parameters and methods provided by the software. The main aim of the tests was to study the performance of DUCHAMP on sources with different parameters and morphologies and assess the accuracy of DUCHAMP's source parametrisation. Overall, we find DUCHAMP to be a powerful source finder capable of reliably detecting sources down to low signal-to-noise ratios and accurately measuring their position and velocity. In the presence of noise in the data, DUCHAMP's measurements of basic source parameters, such as spectral line width and integrated flux, are affected by systematic errors. These errors are a consequence of the effect of noise on the specific algorithms used by DUCHAMP for measuring source parameters in combination with the fact that the software only takes into account pixels above a given flux threshold and hence misses part of the flux. In scientific applications of DUCHAMP these systematic errors would have to be corrected for. Alternatively, DUCHAMP could be used as a source finder only, and source parametrisation could be done in a second step using more sophisticated parametrisation algorithms.
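The threshold-induced flux bias described above is easy to demonstrate on a toy model: summing only pixels above a flux cutoff misses the faint wings of a source. A 1-D Gaussian line profile stands in for a real source here, and all numbers are illustrative.

```python
import math

# Demonstration of the systematic flux underestimate caused by summing only
# pixels above a flux threshold, as the abstract describes for DUCHAMP.

def gaussian_profile(amplitude, sigma, n=201):
    """Noise-free 1-D Gaussian line profile sampled on n pixels."""
    mid = n // 2
    return [amplitude * math.exp(-0.5 * ((i - mid) / sigma) ** 2)
            for i in range(n)]

def integrated_flux(profile, threshold=0.0):
    """Sum of pixel values strictly above the threshold."""
    return sum(v for v in profile if v > threshold)

profile = gaussian_profile(amplitude=1.0, sigma=10.0)
true_flux = integrated_flux(profile)                  # essentially all the flux
clipped = integrated_flux(profile, threshold=0.2)     # 20%-of-peak cutoff
bias = 1.0 - clipped / true_flux                      # fraction of flux missed
```

With noise added, the same mechanism biases the measured line width as well, since the profile's wings fall below the cutoff first.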

  4. 10 CFR 39.35 - Leak testing of sealed sources.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Leak testing of sealed sources. 39.35 Section 39.35 Energy....35 Leak testing of sealed sources. (a) Testing and recordkeeping requirements. Each licensee who uses... record of leak test results in units of microcuries and retain the record for inspection by the...

  5. A new DG nanoscale TFET based on MOSFETs by using source gate electrode: 2D simulation and an analytical potential model

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2017-08-01

    This paper proposes and investigates a double-gate (DG) MOSFET that emulates a tunnel field-effect transistor (M-TFET). We combine this novel concept into a double-gate MOSFET, which behaves as a tunneling field-effect transistor through work-function engineering. In the proposed structure, in addition to the main gate, we place another gate over the source region, with zero applied voltage and a proper work function, to convert the source region from N+ to P+. We examine the impact of varying the source-gate work function and source doping on the device parameters. The simulation results for the M-TFET indicate that it is well suited to switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; the electric field is then obtained by differentiating the potential profile. To validate the model, we use the SILVACO ATLAS device simulator; the analytical results compare well with the simulation.

  6. Kinetic calculations for miniature neutron source reactor using analytical and numerical techniques

    International Nuclear Information System (INIS)

    Ampomah-Amoako, E.

    2008-06-01

    The analytical methods (step change in reactivity and ramp change in reactivity) as well as the numerical methods (fixed-point iteration and Runge-Kutta-Gill) were used to simulate the initial build-up of neutrons in a miniature neutron source reactor, with and without the temperature feedback effect. The methods were modified to include the photoneutron concentration. PARET 7.3 was used to simulate the transient behaviour of Ghana Research Reactor-1. The PARET code was capable of simulating the transients for 2.1 mk and 4 mk insertions of reactivity, with peak powers of 49.87 kW and 92.34 kW, respectively. The PARET code, however, failed to simulate the 6.71 mk insertion of reactivity that was predicted by Akaho et al. through TEMPFED. (au)
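The kind of calculation described can be sketched with one-delayed-group point kinetics under a step reactivity insertion, integrated with a fixed-step classical fourth-order Runge-Kutta scheme (standing in here for the Runge-Kutta-Gill variant, and ignoring photoneutrons and temperature feedback). The kinetics parameters are generic illustrative values, not GHARR-1 data.

```python
# Point-kinetics sketch: one effective delayed-neutron group, step reactivity.
#   dn/dt = (rho - beta)/Lambda * n + lam * c
#   dc/dt = beta/Lambda * n - lam * c
# beta: delayed fraction, lam: precursor decay constant (1/s),
# Lambda: neutron generation time (s). Values are illustrative.

def derivs(state, rho, beta=0.007, lam=0.08, Lambda=1e-4):
    n, c = state
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return dn, dc

def rk4_step(state, rho, dt):
    def add(s, k, h):
        return (s[0] + h * k[0], s[1] + h * k[1])
    k1 = derivs(state, rho)
    k2 = derivs(add(state, k1, dt / 2), rho)
    k3 = derivs(add(state, k2, dt / 2), rho)
    k4 = derivs(add(state, k3, dt), rho)
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

# Start critical and in equilibrium (c = beta/(lam*Lambda) * n), then insert
# +0.002 reactivity (sub-prompt-critical) and integrate 1 s at dt = 1 ms.
state = (1.0, 0.007 / (0.08 * 1e-4))
for _ in range(1000):
    state = rk4_step(state, rho=0.002, dt=1e-3)
```

The expected behaviour is the classic prompt jump to roughly beta/(beta - rho) times the initial power, followed by a slow rise on the stable period.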

  7. Use of Strain Measurements from Acoustic Bench Tests of the Battleship Flowliner Test Articles To Link Analytical Model Results to In-Service Resonant Response

    Science.gov (United States)

    Frady, Greg; Smaolloey, Kurt; LaVerde, Bruce; Bishop, Jim

    2004-01-01

    The paper will discuss practical and analytical findings of a test program conducted to assist engineers in determining which analytical strain fields are most appropriate to describe the crack-initiating and crack-propagating stresses in thin-walled cylindrical hardware that serves as part of the Space Shuttle Main Engine's fuel system. In service, the hardware is excited by fluctuating dynamic pressures in a cryogenic fuel that arise from turbulent flow and pump cavitation. A bench test using a simplified system was conducted using acoustic energy in air to excite the test articles. Strain measurements were used to reveal the response characteristics of two Flowliner test articles that are assembled as a pair when installed in the engine feed system.

  8. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  9. Analytic and Unambiguous Phase-Based Algorithm for 3-D Localization of a Single Source with Uniform Circular Array

    Directory of Open Access Journals (Sweden)

    Le Zuo

    2018-02-01

    Full Text Available This paper presents an analytic algorithm for estimating three-dimensional (3-D localization of a single source with uniform circular array (UCA interferometers. Fourier transforms are exploited to expand the phase distribution of a single source and the localization problem is reformulated as an equivalent spectrum manipulation problem. The 3-D parameters are decoupled to different spectrums in the Fourier domain. Algebraic relations are established between the 3-D localization parameters and the Fourier spectrums. Fourier sampling theorem ensures that the minimum element number for 3-D localization of a single source with a UCA is five. Accuracy analysis provides mathematical insights into the 3-D localization algorithm that larger number of elements gives higher estimation accuracy. In addition, the phase-based high-order difference invariance (HODI property of a UCA is found and exploited to realize phase range compression. Following phase range compression, ambiguity resolution is addressed by the HODI of a UCA. A major advantage of the algorithm is that the ambiguity resolution and 3-D localization estimation are both analytic and are processed simultaneously, hence computationally efficient. Numerical simulations and experimental results are provided to verify the effectiveness of the proposed 3-D localization algorithm.

  10. Analytical Model of Coil Spring Damper Based on the Loading Test

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook; Park, Woong Ki [INNOSE TECH Co. LTD, Incheon (Korea, Republic of); Furuya, Osamu [Tokyo City University, Tokyo (Japan); Kurabayashi, Hiroshi [Vibro-System, Tokyo (Japan)

    2016-05-15

    One way of solving such problems is to enhance and develop an improved damping element for use in base-isolation and response control systems. Reducing the cost of dampers for large-scale structures is another important task for upgrading overall response control abilities in the near future. This study examined a response control device that uses the elastoplastic hysteresis damping of a metal material. The proposed damper is designed as a coil spring element to keep the stress in the metal uniform and to reduce low-cycle fatigue under large deformation, thereby upgrading its repetitive strength during earthquake motions. By using SS400 general structural rolled steel, the cost of the damping element can be reduced effectively. The analytical model of the elasto-plastic coil spring damper (CSD) is introduced, and its basic mechanical properties are evaluated experimentally and analytically. The paper describes the design method of the elasto-plastic coil spring damper, the basic mechanical properties evaluated from the loading test, and the analytical model of the damper. It was confirmed that the damping force and mechanical characteristics of the elasto-plastic coil spring damper almost satisfy the design specifications.

  11. Analytical performance of centrifuge-based device for clinical chemistry testing.

    Science.gov (United States)

    Suk-Anake, Jamikorn; Promptmas, Chamras

    2012-01-01

    A centrifuge-based device, the Samsung Blood Analyzer (SBA), has been introduced; verification of this analyzer is essential to meet the ISO 15189 standard. Analytical performance was evaluated according to the NCCLS EP05-A method. The results for plasma samples were compared between the SBA and a Hitachi 917 analyzer according to the NCCLS EP09-A2-IR method. Percent recovery was determined via analysis of original control serum and spiked serum. Within-run precision was found to be 0.00 - 6.61% and 0.96 - 5.99% in normal- and abnormal-level assays, respectively, while between-run precision was 1.31 - 9.09% and 0.89 - 6.92%, respectively. The correlation coefficients (r) were > 0.990. The SBA presented analytical accuracy of 96.64 +/- 3.39% to 102.82 +/- 2.75% and 98.31 +/- 4.04% to 103.61 +/- 8.28% recovery, respectively. The results obtained verify that all 13 tests performed using the SBA demonstrate good, reliable precision suitable for use in a qualified clinical chemistry laboratory service.
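Two of the figures of merit used in this verification are straightforward to compute: the coefficient of variation (the precision measure behind NCCLS EP05-style studies) and percent recovery against a control-serum target. The replicate values and target below are made up for illustration.

```python
import statistics

# Precision (coefficient of variation) and percent recovery, the two basic
# figures of merit reported in the abstract. Replicate data are illustrative.

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

def recovery_percent(measured_mean, target):
    """Measured mean as a percentage of the assigned control-serum value."""
    return measured_mean / target * 100.0

# Six hypothetical within-run glucose replicates (mmol/L), target 5.0 mmol/L.
glucose_replicates = [5.1, 5.0, 5.2, 5.1, 4.9, 5.0]
cv = cv_percent(glucose_replicates)
rec = recovery_percent(statistics.mean(glucose_replicates), target=5.0)
```

Between-run precision follows the same CV formula applied to run means collected across days.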

  12. A 2D semi-analytical model for Faraday shield in ICP source

    International Nuclear Information System (INIS)

    Zhang, L.G.; Chen, D.Z.; Li, D.; Liu, K.F.; Li, X.F.; Pan, R.M.; Fan, M.W.

    2016-01-01

    Highlights: • In this paper, a 2D model of an ICP source with a Faraday shield is proposed, taking into account the complex structure of the Faraday shield. • An analytical solution is found to evaluate the electromagnetic field in the ICP source with a Faraday shield. • The collision-free motion of electrons in the source is investigated, and the results show that the electrons will oscillate along the radial direction, which brings insight into how the RF power couples to the plasma. - Abstract: A Faraday shield is a thin copper structure with a large number of slits which is usually used in inductively coupled plasma (ICP) sources. RF power is coupled into the plasma through these slits; therefore, the Faraday shield plays an important role in ICP discharge. However, due to the complex structure of the Faraday shield, the resulting electromagnetic field is quite hard to evaluate. In this paper, a 2D model is proposed on the assumption that the Faraday shield is sufficiently long and the RF coil is uniformly distributed, and the copper is treated as an ideal conductor. Under these conditions, the magnetic field inside the source is uniform with only the axial component, while the electric field can be decomposed into a vortex field generated by the changing magnetic field together with a gradient field generated by electric charge accumulated on the Faraday shield surface, which can easily be found by solving Laplace's equation. The motion of the electrons in the electromagnetic field is investigated, and the results show that the electrons will oscillate along the radial direction when taking no account of collisions. This interesting result brings insight into how the RF power couples into the plasma.

  13. A Comparison of Two Approaches for the Ruggedness Testing of an Analytical Method

    International Nuclear Information System (INIS)

    Maestroni, Britt

    2016-01-01

    As part of an initiative under the “Red Analitica de Latino America y el Caribe” (RALACA) network the FAO/IAEA Food and Environmental Protection Laboratory validated a multi-residue method for pesticides in potato. One of the parameters to be assessed was the intra laboratory robustness or ruggedness. The objective of this work was to implement a worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method. As a conclusion to this study, it is evident that there is a need for harmonization of the definition of the terms robustness/ruggedness, the limits, the methodology and the statistical treatment of the generated data. A worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method will soon be posted on the RALACA website (www.red-ralaca.net). This study was carried out with collaborators from LVA (Austria), University of Antwerp (Belgium), University of Leuwen (The Netherlands), Universidad de la Republica (Uruguay) and Agilent technologies.

  14. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
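Step (i) of the protocol, rebalancing imbalanced cohorts, can be sketched in its simplest form as random oversampling of the minority class. The cohort below is synthetic, and real PPMI work would use more careful resampling or instance weighting; this only shows the mechanics.

```python
import random

# Minimal sketch of cohort rebalancing by random oversampling: duplicate
# randomly chosen minority-class records until all classes match the size of
# the largest class. Records and labels are synthetic.

def oversample(records, label_key, seed=0):
    """Return a class-balanced copy of `records` via random oversampling."""
    rng = random.Random(seed)
    groups = {}
    for r in records:
        groups.setdefault(r[label_key], []).append(r)
    target = max(len(g) for g in groups.values())
    balanced = []
    for g in groups.values():
        balanced.extend(g)
        balanced.extend(rng.choice(g) for _ in range(target - len(g)))
    return balanced

# A 90/10 imbalanced toy cohort: 90 PD cases, 10 controls.
cohort = ([{"id": i, "dx": "PD"} for i in range(90)]
          + [{"id": 90 + i, "dx": "control"} for i in range(10)])
balanced = oversample(cohort, "dx")
```

Because oversampling duplicates records, any cross-validation split must be done before rebalancing to avoid leaking copies of the same subject into both folds.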
We evaluated several complementary model-based predictive approaches

  15. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  16. Generalizing Source Geometry of Site Contamination by Simulating and Analyzing Analytical Solution of Three-Dimensional Solute Transport Model

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2014-01-01

Full Text Available Due to the uneven distribution of pollution and the blurred edges of pollutant areas, the source-term shape in the advection-diffusion equation model of contaminant transport is uncertain. How to generalize such irregular source terms and deal with these uncertainties is critical, but it has rarely been studied in previous research. In this study, the fate and transport of contaminant from rectangular and elliptic source geometries were simulated with a three-dimensional analytical solute transport model, and a source geometry generalization guideline was developed by comparing the migration of the contaminant. The results indicated that variation in source area size had no effect on pollution plume migration once the plume had migrated as far as five times the source side length. The migration of the pollution plume became slower as aquifer thickness increased. Contaminant concentration decreased as the scale factor rose, and the differences among the various scale factors became smaller with increasing distance from the source field.

  17. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
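The forecasting workflow this record describes (fit several time-series models to historic test volumes, rank them, and use the best for demand estimates) can be sketched in dependency-free Python. This is only an illustrative sketch, not the tool's actual code: the smoothing parameters and the in-sample mean-squared-error ranking criterion are assumptions, and the Holt-Winters multiplicative variant is omitted for brevity.

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.1):
    """One-step-ahead fitted values of an additive Holt-Winters model.

    y: observed test volumes (len(y) >= 2*m); m: season length (12 = monthly).
    """
    level = sum(y[:m]) / m                           # level at middle of season 1
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2  # average per-step trend
    season = [y[i] - (level + (i - (m - 1) / 2) * trend) for i in range(m)]
    level -= (m - 1) / 2 * trend                     # shift level back to t = 0
    fitted = []
    for i, obs in enumerate(y):
        s = season[i % m]
        fitted.append(level + trend + s)             # forecast before seeing obs
        prev = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        season[i % m] = gamma * (obs - level) + (1 - gamma) * s
    return fitted

def linear_fit(y):
    """Ordinary least-squares straight-line fit, returned as fitted values."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    slope = sum((i - xbar) * (v - ybar) for i, v in enumerate(y)) \
        / sum((i - xbar) ** 2 for i in range(n))
    return [ybar + slope * (i - xbar) for i in range(n)]

def rank_models(y, m=12):
    """Rank the candidate models by in-sample mean squared error, best first."""
    def mse(f):
        return sum((a - b) ** 2 for a, b in zip(f, y)) / len(y)
    candidates = {"holt-winters-additive": holt_winters_additive(y, m),
                  "linear-regression": linear_fit(y)}
    return sorted(candidates, key=lambda k: mse(candidates[k]))
```

For strongly seasonal monthly volumes the Holt-Winters model should rank first; for volumes with trend only, the plain linear fit can win, which is the point of ranking rather than fixing one model.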

  18. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

    Full Text Available Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.

  19. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    Directory of Open Access Journals (Sweden)

    Brian H Shirts

    2015-01-01

Full Text Available The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with its annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.

  20. Evaluation of methods to leak test sealed radiation sources

    International Nuclear Information System (INIS)

    Arbeau, N.D.; Scott, C.K.

    1987-04-01

    The methods for the leak testing of sealed radiation sources were reviewed. One hundred and thirty-one equipment vendors were surveyed to identify commercially available leak test instruments. The equipment is summarized in tabular form by radiation type and detector type for easy reference. The radiation characteristics of the licensed sources were reviewed and summarized in a format that can be used to select the most suitable detection method. A test kit is proposed for use by inspectors when verifying a licensee's test procedures. The general elements of leak test procedures are discussed

  1. Two-dimensional analytical solutions for chemical transport in aquifers. Part 1. Simplified solutions for sources with constant concentration. Part 2. Exact solutions for sources with constant flux rate

    International Nuclear Information System (INIS)

    Shan, C.; Javandel, I.

    1996-05-01

    Analytical solutions are developed for modeling solute transport in a vertical section of a homogeneous aquifer. Part 1 of the series presents a simplified analytical solution for cases in which a constant-concentration source is located at the top (or the bottom) of the aquifer. The following transport mechanisms have been considered: advection (in the horizontal direction), transverse dispersion (in the vertical direction), adsorption, and biodegradation. In the simplified solution, however, longitudinal dispersion is assumed to be relatively insignificant with respect to advection, and has been neglected. Example calculations are given to show the movement of the contamination front, the development of concentration profiles, the mass transfer rate, and an application to determine the vertical dispersivity. The analytical solution developed in this study can be a useful tool in designing an appropriate monitoring system and an effective groundwater remediation method
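The transport mechanisms listed above (horizontal advection, vertical transverse dispersion, retardation from adsorption, first-order biodegradation, with longitudinal dispersion neglected) lead to erfc-type concentration profiles below a constant-concentration source. The following is a hedged sketch of such a steady-state profile with assumed parameter names; it is not the authors' exact solution.

```python
import math

def concentration(x, z, v, Dz, lam=0.0, R=1.0, C0=1.0):
    """Steady-state concentration below a constant-concentration source at z = 0.

    x  : horizontal distance downgradient of the source (m)
    z  : vertical distance below the source plane (m)
    v  : pore-water velocity in the horizontal direction (m/d)
    Dz : transverse (vertical) dispersion coefficient (m^2/d)
    lam: first-order biodegradation rate (1/d)
    R  : retardation factor from adsorption
    C0 : source concentration
    Longitudinal dispersion is neglected, as in the simplified solution.
    """
    if x <= 0:
        return C0 if z == 0 else 0.0
    travel_time = R * x / v                 # retarded travel time to x
    spread = 2.0 * math.sqrt(Dz * x / v)    # transverse spreading length scale
    return C0 * math.erfc(z / spread) * math.exp(-lam * travel_time)
```

At the source plane (z = 0) the profile stays at C0 apart from decay; deeper points see the erfc tail, which is how a solution of this type can be inverted to estimate the vertical dispersivity from field data.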

  2. Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement.

    Science.gov (United States)

    Paisley, Suzy

    2016-06-01

    This paper proposes recommendations for a minimum level of searching for data for key parameters in decision-analytic models of cost effectiveness and describes sources of evidence relevant to each parameter type. Key parameters are defined as treatment effects, adverse effects, costs, resource use, health state utility values (HSUVs) and baseline risk of events. The recommended minimum requirement for treatment effects is comprehensive searching according to available methodological guidance. For other parameter types, the minimum is the searching of one bibliographic database plus, where appropriate, specialist sources and non-research-based and non-standard format sources. The recommendations draw on the search methods literature and on existing analyses of how evidence is used to support decision-analytic models. They take account of the range of research and non-research-based sources of evidence used in cost-effectiveness models and of the need for efficient searching. Consideration is given to what constitutes best evidence for the different parameter types in terms of design and scientific quality and to making transparent the judgments that underpin the selection of evidence from the options available. Methodological issues are discussed, including the differences between decision-analytic models of cost effectiveness and systematic reviews when searching and selecting evidence and comprehensive versus sufficient searching. Areas are highlighted where further methodological research is required.

  3. Analytic treatment of leading-order parton evolution equations: Theory and tests

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; McKay, Douglas W.

    2009-01-01

We recently derived an explicit expression for the gluon distribution function G(x,Q^2) = xg(x,Q^2) in terms of the proton structure function F_2^γp(x,Q^2) in leading-order (LO) QCD by solving the LO Dokshitzer-Gribov-Lipatov-Altarelli-Parisi equation for the Q^2 evolution of F_2^γp(x,Q^2) analytically, using a differential-equation method. We showed that accurate experimental knowledge of F_2^γp(x,Q^2) in a region of Bjorken x and virtuality Q^2 is all that is needed to determine the gluon distribution in that region. We rederive and extend the results here using a Laplace-transform technique, and show that the singlet quark structure function F_S(x,Q^2) can be determined directly in terms of G from the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi gluon evolution equation. To illustrate the method and check the consistency of existing LO quark and gluon distributions, we used the published values of the LO quark distributions from the CTEQ5L and MRST2001 LO analyses to form F_2^γp(x,Q^2), and then solved analytically for G(x,Q^2). We find that the analytic and fitted gluon distributions from MRST2001LO agree well with each other for all x and Q^2, while those from CTEQ5L differ significantly from each other for large x values, x ≳ 0.03-0.05, at all Q^2. We conclude that the published CTEQ5L distributions are incompatible in this region. Using a nonsinglet evolution equation, we obtain a sensitive test of quark distributions which holds in both LO and next-to-leading order perturbative QCD. We find in either case that the CTEQ5 quark distributions satisfy the tests numerically for small x, but fail the tests for x ≳ 0.03-0.05; their use could potentially lead to significant shifts in predictions of quantities sensitive to large x. We encountered no problems with the MRST2001LO distributions or later CTEQ distributions. We suggest caution in the use of the CTEQ5 distributions.

  4. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  5. Formative assessment and learning analytics

    NARCIS (Netherlands)

    Tempelaar, D.T.; Heck, A.; Cuypers, H.; van der Kooij, H.; van de Vrie, E.; Suthers, D.; Verbert, K.; Duval, E.; Ochoa, X.

    2013-01-01

    Learning analytics seeks to enhance the learning process through systematic measurements of learning related data, and informing learners and teachers of the results of these measurements, so as to support the control of the learning process. Learning analytics has various sources of information,

  6. A negative ion source test facility

    Energy Technology Data Exchange (ETDEWEB)

    Melanson, S.; Dehnel, M., E-mail: morgan@d-pace.com; Potkins, D.; Theroux, J.; Hollinger, C.; Martin, J.; Stewart, T.; Jackle, P.; Withington, S. [D-Pace, Inc., P.O. Box 201, Nelson, British Columbia V1L 5P9 (Canada); Philpott, C.; Williams, P.; Brown, S.; Jones, T.; Coad, B. [Buckley Systems Ltd., 6 Bowden Road, Mount Wellington, Auckland 1060 (New Zealand)

    2016-02-15

    Progress is being made in the development of an Ion Source Test Facility (ISTF) by D-Pace Inc. in collaboration with Buckley Systems Ltd. in Auckland, NZ. The first phase of the ISTF is to be commissioned in October 2015 with the second phase being commissioned in March 2016. The facility will primarily be used for the development and the commercialization of ion sources. It will also be used to characterize and further develop various D-Pace Inc. beam diagnostic devices.

  7. Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD

    DEFF Research Database (Denmark)

    Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus

    2013-01-01

    D positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based...

  8. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    Science.gov (United States)

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.

  9. MAGNETO-FRICTIONAL MODELING OF CORONAL NONLINEAR FORCE-FREE FIELDS. I. TESTING WITH ANALYTIC SOLUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keppens, R. [School of Astronomy and Space Science, Nanjing University, Nanjing 210023 (China); Xia, C. [Centre for mathematical Plasma-Astrophysics, Department of Mathematics, KU Leuven, B-3001 Leuven (Belgium); Valori, G., E-mail: guoyang@nju.edu.cn [University College London, Mullard Space Science Laboratory, Holmbury St. Mary, Dorking, Surrey RH5 6NT (United Kingdom)

    2016-09-10

    We report our implementation of the magneto-frictional method in the Message Passing Interface Adaptive Mesh Refinement Versatile Advection Code (MPI-AMRVAC). The method aims at applications where local adaptive mesh refinement (AMR) is essential to make follow-up dynamical modeling affordable. We quantify its performance in both domain-decomposed uniform grids and block-adaptive AMR computations, using all frequently employed force-free, divergence-free, and other vector comparison metrics. As test cases, we revisit the semi-analytic solution of Low and Lou in both Cartesian and spherical geometries, along with the topologically challenging Titov–Démoulin model. We compare different combinations of spatial and temporal discretizations, and find that the fourth-order central difference with a local Lax–Friedrichs dissipation term in a single-step marching scheme is an optimal combination. The initial condition is provided by the potential field, which is the potential field source surface model in spherical geometry. Various boundary conditions are adopted, ranging from fully prescribed cases where all boundaries are assigned with the semi-analytic models, to solar-like cases where only the magnetic field at the bottom is known. Our results demonstrate that all the metrics compare favorably to previous works in both Cartesian and spherical coordinates. Cases with several AMR levels perform in accordance with their effective resolutions. The magneto-frictional method in MPI-AMRVAC allows us to model a region of interest with high spatial resolution and large field of view simultaneously, as required by observation-constrained extrapolations using vector data provided with modern instruments. The applications of the magneto-frictional method to observations are shown in an accompanying paper.

  10. An analytical calculation of the peak efficiency for cylindrical sources perpendicular to the detector axis in gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Julio C. [Autoridad Regulatoria Nuclear, Laboratorio de Espectrometria Gamma-CTBTO, Av. Del Libertador 8250, C1429BNP Buenos Aires (Argentina)], E-mail: jaguiar@sede.arn.gov.ar

    2008-08-15

An analytical expression for the so-called full-energy peak efficiency ε(E) for a cylindrical source whose axis is perpendicular to that of an HPGe detector is derived, using point-source measurements. The formula covers different measuring distances, matrix compositions, densities and gamma-ray energies; the only assumption is that the radioactivity is homogeneously distributed within the source. The term for photon self-attenuation is included in the calculation. Measurements were made using three different sized cylindrical sources of ²⁴¹Am, ⁵⁷Co, ¹³⁷Cs, ⁵⁴Mn, and ⁶⁰Co, with corresponding peaks at 59.5, 122, 662, 835, 1173, and 1332 keV, respectively, and one measurement of a radioactive waste drum for 662, 1173, and 1332 keV.
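The record derives a closed-form efficiency expression; as a numerical cross-check of the underlying idea (volume-averaging a point-source efficiency weighted by photon self-attenuation), a Monte Carlo sketch can be written. The inverse-square point-efficiency model and the straight-line attenuation path toward a far detector on the +x axis are simplifying assumptions, not the paper's formula.

```python
import math
import random

def cylinder_peak_efficiency(eps_point, mu, radius, height, distance,
                             n=20000, seed=1):
    """Volume-averaged full-energy peak efficiency of a homogeneous cylinder.

    eps_point : callable giving point-source efficiency vs. distance (cm)
    mu        : linear attenuation coefficient of the source matrix (1/cm)
    radius, height : cylinder dimensions (cm), axis perpendicular to the
                     detector axis (detector far away along +x)
    distance  : cylinder centre to detector (cm)
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        while True:  # uniform point in the circular cross-section (rejection)
            x = rng.uniform(-radius, radius)
            y = rng.uniform(-radius, radius)
            if x * x + y * y <= radius * radius:
                break
        z = rng.uniform(-height / 2, height / 2)
        # matrix path length toward the detector, approximated along +x
        path = math.sqrt(radius * radius - y * y) - x
        d = math.sqrt((distance - x) ** 2 + y * y + z * z)  # to detector
        total += eps_point(d) * math.exp(-mu * path)
    return total / n
```

Setting mu = 0 recovers the unattenuated geometric average, so the ratio of the two results isolates the self-attenuation correction.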

  11. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  12. Proficiency Testing by Interlaboratory Comparison Performed in 2010-2015 for Neutron Activation Analysis and Other Analytical Techniques

    International Nuclear Information System (INIS)

    2017-12-01

The IAEA supports its Member States to increase the utilization of their research reactors. Small and medium sized reactors are mostly used for neutron activation analysis (NAA). Although the markets for NAA laboratories have been identified, demonstration of valid analytical results and organizational quality of the work process are preconditions for expanding the stakeholder community, particularly in commercial routine application of this powerful technique. The IAEA has implemented a new mechanism for supporting NAA laboratories in demonstrating their analytical performance through participation in proficiency testing schemes by interlaboratory comparison. This activity makes it possible to identify deviations and non-conformities and their causes, and to implement effective approaches to eliminate them. Over 30 laboratories participated between 2010 and 2015 in consecutive proficiency tests organized by the IAEA in conjunction with the Wageningen Evaluating Programmes for Analytical Laboratories (WEPAL) to assess their analytical performance. This publication reports the findings of this activity and includes lessons learned. An attached CD-ROM contains individual papers in which the participating laboratories share their results and the experience gained through this participation.

  13. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test.

    Science.gov (United States)

    Yang, Hannah P; Walmer, David K; Merisier, Delson; Gage, Julia C; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S; Castle, Philip E

    2011-09-01

The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women; such a test might be used to decide who needs immediate colposcopy in low-resource settings (a "triage test"). We found that the HPV16/18/45 test agreed well with two DNA tests: a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay at a cutpoint of 5000 viral copies (Kappa = 0.87). DNA sequencing on a subset of 16 HPV16/18/45-positive and 16 HPV16/18/45-negative specimens verified the analytic specificity of the research test. It is concluded that the HPV16/18/45 assay is a promising triage test with a minimum detection of approximately 5000 viral copies, the clinically relevant threshold. Published by Elsevier B.V.
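Agreement in this record is reported as Cohen's kappa. For reference, kappa can be computed from two paired result vectors as follows (a generic implementation, not the study's code):

```python
def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two paired ratings."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    labels = sorted(set(a) | set(b))
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pe = sum((a.count(lbl) / n) * (b.count(lbl) / n)
             for lbl in labels)                 # agreement expected by chance
    return (po - pe) / (1 - pe)
```

Kappa = 1 means perfect agreement and 0 means agreement no better than chance, which is why it is preferred over raw percent agreement when comparing assays.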

  14. Analytical Approximation of Spectrum for Pulse X-ray Tubes

    International Nuclear Information System (INIS)

    Vavilov, S; Fofanof, O; Koshkin, G; Udod, V

    2016-01-01

Among the main characteristics of pulsed X-ray apparatuses, the spectral energy characteristics are the most important: the spectral distribution of photon energy, and the effective and maximum energies of the quanta. Knowledge of the spectral characteristics of the radiation of pulsed sources is very important for their practical use in non-destructive testing. We have attempted an analytical approximation of the pulsed X-ray apparatus spectra reported in different experimental papers. The results of the analytical approximation of the energy spectrum for a pulsed X-ray tube are presented. The obtained formulas agree with the experimental data and can be used in designing pulsed X-ray apparatuses.
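The abstract does not reproduce the approximation formulas. As a hedged illustration of the kind of analytical spectrum approximation involved, the classical Kramers shape I(E) ∝ Z*(E_max - E)/E is a common starting point for X-ray tube continua; the tungsten anode (Z = 74) and the grid size below are assumptions.

```python
def kramers_spectrum(e_max, z=74, n=200):
    """Classical Kramers approximation of an X-ray tube bremsstrahlung spectrum.

    Returns (energies, intensities), unnormalized: I(E) ∝ Z * (E_max - E) / E.
    e_max : maximum photon energy (= tube voltage) in keV
    z     : anode atomic number (74 = tungsten)
    """
    energies = [e_max * (i + 1) / (n + 1) for i in range(n)]
    intensities = [z * (e_max - e) / e for e in energies]
    return energies, intensities

def effective_energy(energies, intensities):
    """Intensity-weighted mean photon energy of the spectrum."""
    total = sum(intensities)
    return sum(e * i for e, i in zip(energies, intensities)) / total
```

Real pulsed-tube spectra deviate from this shape through filtration and characteristic lines, which is exactly what fitted analytical approximations have to account for.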

  15. Analytical investigation of low temperature lift energy conversion systems with renewable energy source

    International Nuclear Information System (INIS)

    Lee, Hoseong; Hwang, Yunho; Radermacher, Reinhard

    2014-01-01

The efficiency of renewable-energy-powered energy conversion systems is typically low due to their moderate heat source temperatures. Therefore, improving their energy efficiency is essential. In this study, the performance of an energy conversion system with a renewable energy source was theoretically investigated in order to explore its design aspects. For this purpose, a computer model of an n-stage low temperature lift energy conversion (LTLEC) system was developed. The results showed that, under given operating conditions such as the temperatures and mass flow rates of the heat source and heat sink fluids, the unit power generation of the system increased with the number of stages and became saturated when the number of stages reached four. Investigation of several possible working fluids for the optimum-stage LTLEC system revealed that ethanol could be an alternative to ammonia. Heat exchanger effectiveness is a critical factor in system performance: power generation increased by 7.83% for the evaporator and 9.94% for the condenser with a 10% increase in heat exchanger effectiveness. When such low temperature source fluids are applied to an LTLEC system, heat exchanger performance is critical and the exchangers have to be designed accordingly. - Highlights: •Energy conversion system with renewable energy is analytically investigated. •A model of multi-stage low temperature lift energy conversion systems was developed. •The system performance increases as the stage number is increased. •The unit power generation is increased with increase of HX effectiveness. •Ethanol is found to be a good alternative to ammonia

  16. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-10-01

This paper presents an analytical threshold voltage model for back-gated, fully depleted (FD), recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed for the front and back surfaces of the channel by solving the two-dimensional (2-D) Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. The strong-inversion criterion is applied to the front surface potential as well as to the back one in order to obtain two separate threshold voltages for the front and back channels of the device, respectively. The device threshold voltage is taken to be that of the surface offering the lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters, such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2-D device simulator from SILVACO.

  17. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    Science.gov (United States)

    Plebani, Mario

    2017-07-01

An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet no data are available in the literature on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also provide guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. A multi-analyte biosensor for the simultaneous label-free detection of pathogens and biomarkers in point-of-need animal testing.

    Science.gov (United States)

    Ewald, Melanie; Fechner, Peter; Gauglitz, Günter

    2015-05-01

For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. This platform is the first that performs multi-analyte measurements without relying on imaging techniques. It is designed to be portable and cost-effective, and therefore allows for point-of-need testing or on-site field testing, with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but it is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems, such as a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies), as well as a panel of real samples (animal sera). A comparison of the working range and limit of detection shows no loss of performance in transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.

  19. Temperature field due to time-dependent heat sources in a large rectangular grid - Derivation of analytical solution

    International Nuclear Information System (INIS)

    Claesson, J.; Probert, T.

    1996-01-01

    The temperature field in rock due to a large rectangular grid of heat releasing canisters containing nuclear waste is studied. The solution is by superposition divided into different parts. There is a global temperature field due to the large rectangular canister area, while a local field accounts for the remaining heat source problem. The global field is reduced to a single integral. The local field is also solved analytically using solutions for a finite line heat source and for an infinite grid of point sources. The local solution is reduced to three parts, each of which depends on two spatial coordinates only. The temperatures at the envelope of a canister are given by a single thermal resistance, which is given by an explicit formula. The results are illustrated by a few numerical examples dealing with the KBS-3 concept for storage of nuclear waste. 8 refs
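The solution described above rests on superposing elementary source fields. As a structural sketch only (the paper treats time-dependent finite line sources; here the steady-state point-source kernel q/(4πkr) is used for simplicity, and all parameter values are assumptions), the superposition over a rectangular canister grid looks like:

```python
import math

def grid_temperature(point, sources, q=1000.0, k=3.0):
    """Steady-state temperature rise at `point` from a grid of point heat
    sources in an infinite rock mass, by superposition of q / (4*pi*k*r).

    sources : iterable of (x, y, z) canister positions (m)
    q       : heat release per canister (W)
    k       : rock thermal conductivity (W/m/K)
    """
    px, py, pz = point
    dT = 0.0
    for sx, sy, sz in sources:
        r = math.sqrt((px - sx) ** 2 + (py - sy) ** 2 + (pz - sz) ** 2)
        dT += q / (4.0 * math.pi * k * r)
    return dT

def rectangular_grid(nx, ny, spacing, depth):
    """Canister positions on an nx-by-ny rectangular grid at a given depth."""
    return [(i * spacing, j * spacing, depth)
            for i in range(nx) for j in range(ny)]
```

Linearity of the heat equation is what allows the global field, the local field and the line-source corrections of the full solution to be added in the same way.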

  20. Optimizing RDF Data Cubes for Efficient Processing of Analytical Queries

    DEFF Research Database (Denmark)

    Jakobsen, Kim Ahlstrøm; Andersen, Alex B.; Hose, Katja

    2015-01-01

    data warehouses and data cubes. Today, external data sources are essential for analytics and, as the Semantic Web gains popularity, more and more external sources are available in native RDF. With the recent SPARQL 1.1 standard, performing analytical queries over RDF data sources has finally become...

  1. Intuitive versus analytical decision making modulates trust in e-commerce

    Directory of Open Access Journals (Sweden)

    Paola Iannello

    2014-11-01

Full Text Available The hypothesis that intuitive and analytical processes affect trust in e-commerce differently was tested. Participants were offered products by a series of sellers via the Internet. In the intuitive condition, pictures of the sellers were followed by neutral descriptions and participants had less time to decide whether to trust the seller. In the analytical condition, participants were given an informative description of the seller and had a longer time to decide. Interactions among condition, price and trust emerged in behavioral and psychophysiological responses. EMG signals increased during analytical processing, suggesting a cognitive effort, whereas higher cardiovascular measures mirrored the emotional involvement when facing untrustworthy sellers. The study supports the fruitful application of the intuitive vs. analytical approach to e-commerce, and of combining different sources of information about the buyers while they have to choose whether to trust the seller in a financial transaction over the Internet.

  2. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
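The patient-median monitoring described here is straightforward to implement. The following Python sketch is illustrative only: the target value and the allowable-bias limit are hypothetical placeholders, not figures from the study. It compares each month's median of patient results against an allowable-bias specification and flags deviating months.

```python
from statistics import median

def monthly_median_check(results_by_month, target, allowable_bias_pct):
    """Flag months whose patient-result median deviates from a long-term
    target median by more than the allowable analytical bias (percent).

    results_by_month: dict mapping 'YYYY-MM' -> list of patient results.
    Returns dict mapping month -> (median, bias %, flagged).
    """
    flags = {}
    for month, values in sorted(results_by_month.items()):
        m = median(values)
        bias_pct = 100.0 * (m - target) / target
        flags[month] = (m, bias_pct, abs(bias_pct) > allowable_bias_pct)
    return flags

# Hypothetical serum sodium results (mmol/L); desirable bias spec ~0.3 %.
data = {"2015-01": [139, 140, 141, 138, 140],
        "2015-02": [141, 142, 143, 142, 141]}
for month, (m, bias, flagged) in monthly_median_check(data, 140.0, 0.3).items():
    print(month, m, round(bias, 2), "UNSTABLE" if flagged else "ok")
```

In practice the target would itself be derived from a long baseline of patient medians, and the bias limit from biological-variation databases.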

  3. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady-state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry-gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry-gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.

  4. Nitrogen-isotopes and multi-parameter sewage water test for identification of nitrate sources: Groundwater body Marchfeld East of Vienna

    Science.gov (United States)

    Kralik, Martin

    2017-04-01

    The application of nitrogen and oxygen isotopes in nitrate allows, under favourable circumstances, the identification of potential sources such as precipitation, chemical fertilisers and manure or sewage water. Without an additional tracer, distinguishing between nitrate from manure and nitrate from sewage water is still difficult. Even the application of boron isotopes cannot in some cases avoid ambiguous interpretation. Therefore, the Environment Agency Austria developed a new multi-parameter indicator test to allow the identification and quantification of pollution by domestic sewage water. The test analyses 8 substances well known to occur in sewage water: acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), and metoprolol, sotalol, carbamazepine and the metabolite 10,11-dihydro-10,11-dihydroxycarbamazepine (pharmaceuticals) [1]. These substances are polar, and their degradation in the aquatic system by microbiological processes is not documented. These 8 substances do not occur naturally, which makes them ideal tracers. The test can detect wastewater in the analysed water sample down to 0.1%. Coupling these analytical tests helps to identify the nitrogen sources in the groundwater body Marchfeld east of Vienna to a high confidence level. In addition, the results allow a reasonable quantification of nitrogen sources from different types of fertilizers as well as sewage water contributions close to villages and in wells recharged by bank filtration. Recent investigations of groundwater in selected wells in Marchfeld [2] indicated a clear nitrogen contribution by wastewater leakages (sewers or septic tanks) to the total nitrogen budget. However, this contribution is shrinking, and the main source still comes from agricultural activities. [1] Humer, F.; Weiss, S.; Reinnicke, S.; Clara, M.; Grath, J.; Windhofer, G. (2013): Multi parametrical indicator test for urban wastewater influence.

  5. Analytical description of photon beam phase spaces in inverse Compton scattering sources

    Directory of Open Access Journals (Sweden)

    C. Curatolo

    2017-08-01

    Full Text Available We revisit the description of inverse Compton scattering sources and the photon beams generated therein, emphasizing the behavior of their phase space density distributions and how they depend upon those of the two colliding beams of electrons and photons. The main objective is to provide practical formulas for bandwidth, spectral density and brilliance, which are valid in general for any value of the recoil factor, i.e. both in the Thomson regime of negligible electron recoil, and in the deep Compton recoil dominated region, which is of interest for gamma-gamma colliders and Compton sources for the production of multi-GeV photon beams. We adopt a description based on the center of mass reference system of the electron-photon collision, in order to underline the role of the electron recoil and how it controls the relativistic Doppler/boost effect in various regimes. Using the center of mass reference frame greatly simplifies the treatment, allowing us to derive simple formulas expressed in terms of rms momenta of the two colliding beams (emittance, energy spread, etc.) and the collimation angle in the laboratory system. Comparisons with Monte Carlo simulations of inverse Compton scattering in various scenarios are presented, showing very good agreement with the analytical formulas: in particular we find that the bandwidth dependence on the electron beam emittance, of paramount importance in the Thomson regime, as it limits the amount of focusing imparted to the electron beam, becomes much less sensitive in the deep Compton regime, allowing a stronger focusing of the electron beam to enhance luminosity without loss of mono-chromaticity. A similar effect occurs concerning the bandwidth dependence on the frequency spread of the incident photons: in the deep recoil regime the bandwidth turns out to be much less dependent on the frequency spread. The set of formulas derived here are very helpful in designing inverse Compton sources in diverse regimes, giving a

  6. Transfer of test-enhanced learning: Meta-analytic review and synthesis.

    Science.gov (United States)

    Pan, Steven C; Rickard, Timothy C

    2018-05-07

    Attempting recall of information from memory, as occurs when taking a practice test, is one of the most potent training techniques known to learning science. However, does testing yield learning that transfers to different contexts? In the present article, we report the findings of the first comprehensive meta-analytic review into that question. Our review encompassed 192 transfer effect sizes extracted from 122 experiments and 67 published and unpublished articles (N = 10,382) that together comprise more than 40 years of research. A random-effects model revealed that testing can yield transferrable learning as measured relative to a nontesting reexposure control condition (d = 0.40, 95% CI [0.31, 0.50]). That transfer of learning is greatest across test formats, to application and inference questions, to problems involving medical diagnoses, and to mediator and related word cues; it is weakest to rearranged stimulus-response items, to untested materials seen during initial study, and to problems involving worked examples. Moderator analyses further indicated that response congruency and elaborated retrieval practice, as well as initial test performance, strongly influence the likelihood of positive transfer. In two assessments for publication bias using PET-PEESE and various selection methods, the moderator effect sizes were minimally affected. However, the intercept predictions were substantially reduced, often indicating no positive transfer when none of the aforementioned moderators are present. Overall, our results motivate a three-factor framework for transfer of test-enhanced learning and have practical implications for the effective use of practice testing in educational and other training contexts. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
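The random-effects model used in meta-analytic reviews of this kind can be illustrated with a minimal DerSimonian-Laird estimator. This is a sketch of generic textbook methodology, not the authors' code; the effect sizes and variances below are made-up inputs for demonstration.

```python
from math import sqrt

def random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.

    effects: per-study effect sizes (e.g. Cohen's d)
    variances: per-study sampling variances
    Returns (pooled effect, 95% CI lower, 95% CI upper, tau^2).
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic around the fixed-effect mean.
    q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    # Re-weight each study by total (within + between) variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * di for wi, di in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

d, lo, hi, tau2 = random_effects([0.2, 0.5, 0.6], [0.02, 0.03, 0.05])
print(round(d, 3), round(lo, 3), round(hi, 3))
```

A full review such as this one would additionally fit moderator (meta-regression) models and publication-bias adjustments such as PET-PEESE, which are beyond this sketch.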

  7. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    Science.gov (United States)

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process, is a major determinant of the reliability and validity of results in haemostasis, and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical, since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  8. Verification of the analytical fracture assessments methods by a large scale pressure vessel test

    Energy Technology Data Exchange (ETDEWEB)

    Keinanen, H; Oberg, T; Rintamaa, R; Wallin, K

    1988-12-31

    This document deals with the use of fracture mechanics for the assessment of reactor pressure vessels. Tests have been carried out to verify the analytical fracture assessment methods. The analysis is focused on flaw dimensions and the scatter band of material characteristics. Analytical results are provided and compared with experimental ones. (TEC).

  9. Cost effectiveness of ovarian reserve testing in in vitro fertilization : a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    Objective: To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). Design: A Markov decision model based on data from the literature and original patient data. Setting: Decision analytic framework. Patient(s): Computer-simulated cohort of subfertile women aged

  10. Analytical evaluation on loss of off-side electric power simulation of the High Temperature Engineering Test Reactor

    International Nuclear Information System (INIS)

    Takeda, Takeshi; Nakagawa, Shigeaki; Tachibana, Yukio; Takada, Eiji; Kunitomi, Kazuhiko

    2000-03-01

    A rise-to-power test of the high temperature engineering test reactor (HTTR) started on September 28 in 1999 for establishing and upgrading the technological basis for the high temperature gas-cooled reactor (HTGR). A loss of off-site electric power test of the HTTR from the normal operation under 15 and 30 MW thermal power will be carried out in the rise-to-power test. Analytical evaluations on transient behaviors of the reactor and plant during the loss of off-site electric power were conducted. These estimations are proposed as benchmark problems for the IAEA coordinated research program on 'Evaluation of HTGR Performance'. This report describes an event scenario of transient during the loss of off-site electric power, the outline of major components and system, detailed thermal and nuclear data set for these problems and pre-estimation results of the benchmark problems by an analytical code 'ACCORD' for incore and plant dynamics of the HTGR. (author)

  11. A Test Stand for Ion Sources of Ultimate Reliability

    International Nuclear Information System (INIS)

    Enparantza, R.; Uriarte, L.; Romano, P.; Alonso, J.; Ariz, I.; Egiraun, M.; Bermejo, F. J.; Etxebarria, V.; Lucas, J.; Del Rio, J. M.; Letchford, A.; Faircloth, D.; Stockli, M.

    2009-01-01

    The rationale behind the ITUR project is to perform a comparison between different kinds of H⁻ ion sources using the same beam diagnostics setup. In particular, a direct comparison will be made in terms of the emittance characteristics of Penning-type sources, such as those currently in use in the injector for the ISIS (UK) Pulsed Neutron Source, and those of volumetric type, such as that driving the injector for the ORNL Spallation Neutron Source (TN, USA). The endeavour pursued here is thus to build an Ion Source Test Stand where virtually any type of source can be tested and its features measured, and thus compared with the results of other sources under the same gauge. It would then be possible to establish a common ground for effectively comparing different ion sources. The long-term objectives are thus to contribute towards building compact sources of minimum emittance, maximum performance, high reliability and availability, a high percentage of desired-particle production, stability and high brightness. The project consortium is led by the Tekniker-IK4 research centre, and the partners are the companies Elytt Energy and Jema Group. The technical viability is guaranteed by the collaboration between the project consortium and several scientific institutions, such as the CSIC (Spain), the University of the Basque Country (Spain), ISIS (STFC, UK), SNS (ORNL, USA) and CEA Saclay (France).

  12. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    International Nuclear Information System (INIS)

    Lowry, N.J.

    1998-01-01

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints.

  13. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    Literature revealed that the patterns/methods of scoring essay tests had been criticized for not being reliable and this unreliability is more likely to be more in internal examinations than in the external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  14. 10 CFR 34.27 - Leak testing and replacement of sealed sources.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Leak testing and replacement of sealed sources. 34.27... SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Equipment § 34.27 Leak testing and replacement... radiographic exposure device and leak testing of any sealed source must be performed by persons authorized to...

  15. Well water quality in rural Nicaragua using a low-cost bacterial test and microbial source tracking.

    Science.gov (United States)

    Weiss, Patricia; Aw, Tiong Gim; Urquhart, Gerald R; Galeano, Miguel Ruiz; Rose, Joan B

    2016-04-01

    Water-related diseases, particularly diarrhea, are major contributors to morbidity and mortality in developing countries. Monitoring water quality on a global scale is crucial to making progress in terms of population health. Traditional analytical methods are difficult to use in many regions of the world in low-resource settings that face severe water quality issues due to the inaccessibility of laboratories. This study aimed to evaluate a new low-cost method (the compartment bag test (CBT)) in rural Nicaragua. The CBT was used to quantify the presence of Escherichia coli in drinking water wells and aimed to determine the source(s) of any microbial contamination. Results indicate that the CBT is a viable method for use in remote rural regions. The overall quality of well water in Pueblo Nuevo, Nicaragua was deemed unsafe, and results led to the conclusion that animal fecal wastes may be one of the leading causes of well contamination. Elevation and depth of wells were not found to impact overall water quality. However, rope-pump wells had a 64.1% reduction in contamination when compared with simple wells.

  16. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    Full Text Available For 50 years the philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source to surpass Cartesian epistemology and to develop virtue ethics. Recently, J. Haldane has inaugurated a program of "analytical thomism" whose main result to date has been his "theory of identity mind/world". Nevertheless, none of Thomas's admirers has yet found a way to assimilate his metaphysics of being.

  17. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and

  18. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, N.J.

    1998-10-21

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints.

  19. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF.

  20. An Analytical Method for the Abel Inversion of Asymmetrical Gaussian Profiles

    International Nuclear Information System (INIS)

    Xu Guosheng; Wan Baonian

    2007-01-01

    An analytical algorithm for fast calculation of the Abel inversion for density profile measurement in tokamaks is developed. Based upon the assumptions that the particle source is negligibly small in the plasma core region, that density profiles can be approximated by an asymmetrical Gaussian distribution controlled by only one parameter V₀/D, and that V₀/D is constant along the radial direction, the analytical algorithm is presented and examined against a test profile. The validity is confirmed by benchmarking against the standard Abel inversion method and the theoretical profile. The scope of application and the error analysis are also discussed in detail.
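For context, the forward Abel transform that such inversion algorithms undo, F(y) = 2∫ᵧ^∞ f(r) r dr/√(r² − y²), has a closed form for a Gaussian radial profile: for f(r) = exp(−r²/σ²), F(y) = σ√π exp(−y²/σ²). That makes a handy numerical check. The sketch below is illustrative only and is not the authors' algorithm.

```python
from math import exp, pi, sqrt

def abel_forward(f, y, r_max=10.0, n=20000):
    """Chord-integrated profile F(y) = 2 * ∫_y^∞ f(r) r / sqrt(r²-y²) dr.

    Uses the substitution s = sqrt(r² - y²), so ds = r dr / sqrt(r² - y²),
    which removes the integrable singularity at r = y; the integral becomes
    2 * ∫_0^{s_max} f(sqrt(y² + s²)) ds, evaluated by the midpoint rule."""
    s_max = sqrt(max(r_max ** 2 - y ** 2, 0.0))
    ds = s_max / n
    total = 0.0
    for i in range(n):
        s = (i + 0.5) * ds
        total += f(sqrt(y * y + s * s)) * ds
    return 2.0 * total

sigma = 1.0
f = lambda r: exp(-r * r / sigma ** 2)                           # radial (local) profile
F_exact = lambda y: sigma * sqrt(pi) * exp(-y * y / sigma ** 2)  # closed-form transform
print(abel_forward(f, 0.5), F_exact(0.5))
```

An inversion algorithm can be validated the same way: forward-transform a known profile, invert the result, and compare with the original, which is essentially the benchmark strategy described in the abstract.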

  1. Transformational Leadership and Organizational Citizenship Behavior: A Meta-Analytic Test of Underlying Mechanisms.

    Science.gov (United States)

    Nohe, Christoph; Hertel, Guido

    2017-01-01

    Based on social exchange theory, we examined and contrasted attitudinal mediators (affective organizational commitment, job satisfaction) and relational mediators (trust in leader, leader-member exchange; LMX) of the positive relationship between transformational leadership and organizational citizenship behavior (OCB). Hypotheses were tested using meta-analytic path models with correlations from published meta-analyses (761 samples with 227,419 individuals overall). When testing single-mediator models, results supported our expectations that each of the mediators explained the relationship between transformational leadership and OCB. When testing a multi-mediator model, LMX was the strongest mediator. When testing a model with a latent attitudinal mechanism and a latent relational mechanism, the relational mechanism was the stronger mediator of the relationship between transformational leadership and OCB. Our findings help to better understand the underlying mechanisms of the relationship between transformational leadership and OCB.

  2. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  3. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification.

  4. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    International Nuclear Information System (INIS)

    1995-01-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification

  5. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  6. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated laser-induced breakdown spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum-analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with component models from different manufacturers and is constructed in modular form for easy expandability. A more robust peak-identification method with improved stability is achieved by applying additional smoothing to the calculated slope before peak identification. For element identification, an improved main-lines analysis method, which checks all elements at each spectral peak to avoid the omission of elements without strong spectral lines, is applied to the tested LIBS samples; this method also increases the identification speed. In this paper, actual applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and perform filtering, peak identification and qualitative analysis on the spectral data. (paper)
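The slope-smoothing idea described for peak identification can be sketched as follows. This is a hedged illustration, not the authors' C# implementation: peaks are taken where the smoothed first difference changes sign from positive to non-positive, and a hypothetical `min_height` threshold rejects noise-level crossings.

```python
from math import exp
import random

def moving_average(xs, w):
    """Simple boxcar smoothing with window w (odd works best)."""
    h = w // 2
    return [sum(xs[max(0, i - h):i + h + 1]) / len(xs[max(0, i - h):i + h + 1])
            for i in range(len(xs))]

def find_peaks(intensity, w=5, min_height=0.1):
    """Locate peaks as sign changes (+ to -) in the *smoothed* slope.

    Smoothing the slope before the sign test suppresses spurious peaks
    caused by noise in the raw first difference."""
    slope = [intensity[i + 1] - intensity[i] for i in range(len(intensity) - 1)]
    s = moving_average(slope, w)
    return [i for i in range(1, len(s))
            if s[i - 1] > 0 and s[i] <= 0 and intensity[i] >= min_height]

# Noisy synthetic "spectrum" with a single line centred at index 50.
random.seed(0)
spectrum = [exp(-((i - 50) ** 2) / 20.0) + 0.02 * random.random()
            for i in range(100)]
print(find_peaks(spectrum))
```

The same structure extends naturally to the described element identification step: each detected peak position would be matched against a table of known emission lines.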

  7. Infrared source test

    Energy Technology Data Exchange (ETDEWEB)

    Ott, L.

    1994-11-15

    The purpose of the Infrared Source Test (IRST) is to demonstrate the ability to track a ground target with an infrared sensor from an airplane. The system is being developed within the Advanced Technology Program's Theater Missile Defense/Unmanned Aerial Vehicle (UAV) section. The IRST payload consists of an Amber Radiance 1 infrared camera system, a computer, a gimbaled mirror, and a hard disk. The processor is a custom R3000 CPU board made by Risq Modular Systems, Inc. for LLNL. The board has Ethernet, SCSI, parallel I/O, and serial ports, a DMA channel, a video (frame buffer) interface, and eight MBytes of main memory. The real-time operating system VxWorks has been ported to the processor. The application code is written in C on a host Sun 4 UNIX workstation. The IRST is the result of a combined effort by physicists, electrical and mechanical engineers, and computer scientists.

  8. An analytical approach for a nodal formulation of a two-dimensional fixed-source neutron transport problem in heterogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Basso Barichello, Liliane; Dias da Cunha, Rudnei [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Inst. de Matematica; Becker Picoloto, Camila [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Tres, Anderson [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Matematica Aplicada

    2015-05-15

    A nodal formulation of a fixed-source two-dimensional neutron transport problem, in Cartesian geometry, defined in a heterogeneous medium, is solved by an analytical approach. Explicit expressions, in terms of the spatial variables, are derived for averaged fluxes in each region in which the domain is subdivided. The procedure is an extension of an analytical discrete ordinates method, the ADO method, for the solution of the two-dimensional homogeneous medium case. The scheme is developed from the discrete ordinates version of the two-dimensional transport equation along with the level symmetric quadrature scheme. As usual for nodal schemes, relations between the averaged fluxes and the unknown angular fluxes at the contours are introduced as auxiliary equations. Numerical results are in agreement with results available in the literature.

  9. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. 1+-n+ ECR ION SOURCE DEVELOPMENT TEST STAND

    International Nuclear Information System (INIS)

    Donald P. May

    2006-01-01

    A test stand for the investigation of 1+-n+ charge boosting using an ECR ion source is currently being assembled at the Texas A and M Cyclotron Institute. The ultimate goal is to relate the charge-boosting of ions of stable species to possible charge-boosting of ions of radioactive species extracted from the diverse, low-charge-state ion sources developed for radioactive ion beams.

  11. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    Science.gov (United States)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.

  12. Analytical support for the B4C control rod test QUENCH-07

    International Nuclear Information System (INIS)

    Homann, C.; Hering, W.; Fernandez Benitez, J.A.; Ortega Bernardo, M.

    2003-04-01

    Degradation of B4C absorber rods during a beyond-design-basis accident in a nuclear power reactor may be a safety concern. Among others, the integral test QUENCH-07 is performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to get a more profound database. Since the test differed substantially from previous QUENCH tests, much more work had to be done for pretest calculations than usual to guarantee the safety of the facility and to derive the test protocol. Several institutions contributed to this work with different computer code systems, as used for nuclear reactor safety analyses. Due to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. Especially the high temperatures and hence the small safety margin for the facility were a concern. In this report, contributions of the various authors engaged in this work are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  13. Analytical solution for the transient wave propagation of a buried cylindrical P-wave line source in a semi-infinite elastic medium with a fluid surface layer

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng

    2018-02-01

    This article develops an analytical solution for the transient wave propagation of a cylindrical P-wave line source in a semi-infinite elastic solid with a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined. The Scholte wave is the wave that propagates along the interface between the fluid and solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power series expansion method. Each item of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features in the fluid layer, the interface and the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.

  14. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models as well as large-basis models of nuclear structure to be performed.

  15. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site.

  16. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command Acquisition Cycle Time: Defining the Problem, David Tate, Institute for Defense Analyses. Schedule Analytics, Jennifer... research was comprised of the following high-level steps: Identify and review primary data sources 1... research. However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  17. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely.

  18. ANALYTICAL STUDY OF CURCUMIN CONTENT IN DIFFERENT DOSAGE FORMS CONTAINING TURMERIC EXTRACT POWDER AND TURMERIC OLEORESIN

    OpenAIRE

    Rane Rajashree; Gangolli Divya; Patil Sushma; Ingawale Kanchan; Kundalwal Sachin

    2013-01-01

    Different dosage forms namely tablets, capsules, creams and syrups were analysed for curcumin content, by the well-known spectrophotometric method. Turmeric extract powder was used as a source of curcumin in capsule and tablet formulations. Turmeric oleoresin was used as a source of curcumin in cream formulation. Additionally, syrup formulations containing turmeric extract powder as well as turmeric oleoresin, separately, were also tested for their curcumin contents. Analytical results for cu...

  19. A test on analytic continuation of thermal imaginary-time data

    International Nuclear Information System (INIS)

    Burnier, Y.; Laine, M.; Mether, L.

    2011-01-01

    Some time ago, Cuniberti et al. have proposed a novel method for analytically continuing thermal imaginary-time correlators to real time, which requires no model input and should be applicable with finite-precision data as well. Given that these assertions go against common wisdom, we report on a naive test of the method with an idealized example. We do encounter two problems, which we spell out in detail; this implies that systematic errors are difficult to quantify. On a more positive note, the method is simple to implement and allows for an empirical recipe by which a reasonable qualitative estimate for some transport coefficient may be obtained, if statistical errors of an ultraviolet-subtracted imaginary-time measurement can be reduced to roughly below the per mille level. (orig.)

  20. Type testing of devices with inserted radioactive sources

    International Nuclear Information System (INIS)

    Rolle, A.; Droste, B.; Dombrowski, H.

    2006-01-01

    In Germany, devices with inserted radioactive sources can get a type approval if they comply with specific requirements. Whoever operates a device whose type has been approved in accordance with the German Radiation Protection Ordinance does not need an individual authorization. Such type approvals for free use are granted by the Federal Office for Radiation Protection (BfS) on the basis of type testing performed by the Physikalisch-Technische Bundesanstalt (PTB), the national metrology institute, and the Bundesanstalt für Materialforschung und -prüfung (BAM), the Federal Institute for Materials Research and Testing. Main aspects of the assessment are the activity of the radioactive sources, the dose equivalent rate near the devices, the tamper-proofness and leak-tightness of the sources and the safety of the construction of the devices. With the new Radiation Protection Ordinance in 2001, more stringent requirements for a type approval were established. Experiences with the new regulations and the relevant assessment criteria applied by PTB and BAM will be presented. (authors)

  1. Exact analytical solution of time-independent neutron transport equation, and its applications to systems with a point source

    International Nuclear Information System (INIS)

    Mikata, Y.

    2014-01-01

    Highlights: • An exact solution for the one-speed neutron transport equation is obtained. • This solution as well as its derivation are believed to be new. • Neutron flux for a purely absorbing material with a point neutron source off the origin is obtained. • Spherically as well as cylindrically piecewise constant cross sections are studied. • Neutron flux expressions for a point neutron source off the origin are believed to be new. - Abstract: An exact analytical solution of the time-independent monoenergetic neutron transport equation is obtained in this paper. The solution is applied to systems with a point source. Systematic analysis of the solution of the time-independent neutron transport equation, and its applications represent the primary goal of this paper. To the best of the author’s knowledge, certain key results on the scalar neutron flux as well as their derivations are new. As an application of these results, a scalar neutron flux for a purely absorbing medium with a spherically piecewise constant cross section and an isotropic point neutron source off the origin as well as that for a cylindrically piecewise constant cross section with a point neutron source off the origin are obtained. Both of these results are believed to be new
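    For orientation, in the simplest homogeneous case addressed by such applications (an isotropic point source of strength $S$ in an infinite, purely absorbing medium with total cross section $\sigma$), the scalar flux reduces to the classical uncollided-flux expression; this is a standard result quoted for context, not a formula reproduced from the paper:

```latex
\phi(r) = \frac{S \, e^{-\sigma r}}{4 \pi r^{2}}
```

    For a piecewise-constant cross section, the exponent is replaced by the optical path length $\int_0^r \sigma(s)\,\mathrm{d}s$ accumulated along the ray from the source to the field point.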

  2. Potential sources of analytical bias and error in selected trace element data-quality analyses

    Science.gov (United States)

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements where concentrations were greater in filtered samples than in paired unfiltered samples were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS).Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues have not only been attributed to data generated by the USGS NWQL but have been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries from six States, matrix spike recoveries, and standard reference materials.Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results between filtered samples treated with HCl from those samples left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated
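    The paired-sample screen described above is straightforward to state programmatically. The sketch below is illustrative only; the function name, data values, and the tolerance allowance for analytical variability are hypothetical, not taken from the USGS study:

```python
def count_filtered_exceedances(pairs, tolerance=0.0):
    """Count paired samples where the filtered concentration exceeds the
    unfiltered one by more than a tolerance allowing for measurement
    variability. `pairs` holds (filtered, unfiltered) concentrations for
    one trace element; all numbers here are hypothetical."""
    return sum(1 for filt, unfilt in pairs if filt - unfilt > tolerance)

# Hypothetical paired results (micrograms per liter) for one trace element:
pairs = [(2.1, 2.0), (1.8, 2.2), (3.5, 3.1), (0.9, 1.0)]
n_exceed = count_filtered_exceedances(pairs, tolerance=0.2)
```

    With a zero tolerance every filtered-greater-than-unfiltered pair is counted; raising the tolerance discounts differences attributable to ordinary instrument variability.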

  3. Final report on the proficiency test of the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network

    International Nuclear Information System (INIS)

    Shakhashiro, A.; Radecki, Z.; Trinkl, A.; Sansone, U.; Benesch, T.

    2005-08-01

    This report presents the statistical evaluation of results from the analysis of 12 radionuclides in 8 samples within the framework of the First Proficiency Test of the Analytical Laboratories for the Measurement of Environmental RAdioactivity (ALMERA) network, organized in 2001-2002 by the Chemistry Unit of the Agency's Laboratory in Seibersdorf. The results were evaluated by using appropriate statistical means to assess laboratory analytical performance and to estimate the overall performance for the determination of each radionuclide. Evaluation of the analytical data for gamma emitting radionuclides showed that 68% of data obtained a 'Passed' final score for both the trueness and precision criteria applied to this exercise. However, transuranic radionuclides obtained only 58% for the same criteria. (author)

  4. Analytical admittance characterization of high mobility channel

    Energy Technology Data Exchange (ETDEWEB)

    Mammeri, A. M.; Mahi, F. Z., E-mail: fati-zo-mahi2002@yahoo.fr [Institute of Science and Technology, University of Bechar (Algeria); Varani, L. [Institute of Electronics of the South (IES - CNRS UMR 5214), University of Montpellier (France)

    2015-03-30

    In this contribution, we investigate the small-signal admittance of high electron mobility transistor field-effect channels under a continuous branching of the current between channel and gate by using an analytical model. The analytical approach takes into account the linearization of the 2D Poisson equation and the drift current along the channel. The analytical equations describe the frequency dependence of the admittance at the source and drain terminals on the geometrical transistor parameters.

  5. SOURCE IST 2.0: development and beta testing

    International Nuclear Information System (INIS)

    Barber, D.H.; Iglesias, F.C.; Hoang, Y.; Dickson, L.W.; Dickson, R.S.; Richards, M.J.; Gibb, R.A.

    1999-01-01

    SOURCE IST 2.0 is the Industry Standard fission product release code that is being developed by Ontario Power Generation, New Brunswick Power, Hydro-Quebec, and Atomic Energy of Canada Ltd. This paper is a report on recent progress on requirement specification, code development, and module verification and validation activities. The theoretical basis for each model in the code is described in a module Software Theory Manual. The development of SOURCE IST 2.0 has required code design decisions about how to implement the software requirements. Development and module testing of the β1 release of SOURCE IST 2.0 (released in July 1999) have led to some interesting insights into fission product release modelling. The beta testing process has allowed code developers and analysts to refine the software requirements for the code. The need to verify physical reference data has guided some decisions on the code and data structure design. Examples of these design decisions are provided. Module testing, and verification and validation activities are discussed. These activities include code-targeted testing, stress testing, code inspection, comparison of code with requirements, and comparison of code results with independent algebraic, numerical, or semi-algebraic calculations. The list of isotopes to be modelled by SOURCE IST 2.0 provides an example of a subset of a reference data set. Isotopes are present on the list for a variety of reasons: personnel or public dose, equipment dose (for environmental qualification), fission rate and actinide modelling, or stable (or long-lived) targets for activation processes. To accommodate controlled changes to the isotope list, the isotope list and associated nuclear data are contained in a reference data file. The questions of multiple computing platforms, and of Year 2000 compliance have been addressed by programming rules for the code. By developing and testing modules on most of the different platforms on which the code is intended

  6. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
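    One of the listed steps, rebalancing imbalanced cohorts, can be illustrated with a minimal random-oversampling sketch. This is a generic illustration under stated assumptions (the function name, record structure, and label values are hypothetical; the study itself evaluates a wide spectrum of methods):

```python
import random

def oversample_minority(records, label_key="diagnosis", seed=0):
    """Rebalance a cohort by randomly oversampling each under-represented
    class until all classes match the size of the largest one.

    `records` is a list of dicts, each carrying a class label under
    `label_key`; the structure is hypothetical, for illustration only."""
    rng = random.Random(seed)
    by_label = {}
    for rec in records:
        by_label.setdefault(rec[label_key], []).append(rec)
    target = max(len(group) for group in by_label.values())
    balanced = []
    for group in by_label.values():
        balanced.extend(group)
        # Draw the shortfall with replacement from the same class.
        balanced.extend(rng.choices(group, k=target - len(group)))
    return balanced
```

    A fixed seed keeps the resampling reproducible, which matters when reporting model parameters as the abstract emphasizes.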

  7. Analytic model of the stress waves propagation in thin wall tubes, seeking the location of a harmonic point source in its surface

    International Nuclear Information System (INIS)

    Boaratti, Mario Francisco Guerra

    2006-01-01

    Leaks in pressurized tubes generate acoustic waves that propagate through the walls of these tubes and can be captured by accelerometers or by acoustic emission sensors. Knowledge of how these walls vibrate, or in other words, how these acoustic waves propagate in the material, is fundamental to the detection and localization of the leak source. In this work an analytic model based on the equations of motion of a cylindrical shell was implemented to understand the behavior of the tube surface excited by a point source. Since the cylindrical surface is closed in the circumferential direction, waves beginning their trajectory meet others that have already completed a turn around the shell, in both the clockwise and counterclockwise directions, generating constructive and destructive interference. After sufficient propagation time, peaks and valleys form on the shell surface, which can be visualized through a graphic representation of the analytic solution created. The theoretical results were confirmed by measurements made on an experimental setup composed of a steel tube terminated in sand boxes, simulating the condition of an infinite tube. To determine the location of the point source on the surface, an inverse-solution process was adopted: given the signals from the sensors placed on the tube surface, the theoretical model determines where the source that generated these signals can be. (author)

  8. Design and qualification testing of a strontium-90 fluoride heat source

    International Nuclear Information System (INIS)

    Fullam, H.T.

    1981-12-01

    The Strontium Heat Source Development Program began at the Pacific Northwest Laboratory (PNL) in 1972 and is scheduled to be completed by the end of FY-1981. The program is currently funded by the US Department of Energy (DOE) By-Product Utilization Program. The primary objective of the program has been to develop the data and technology required to permit the licensing of power systems for terrestrial applications that utilize 90 SrF 2 -fueled radioisotope heat sources. A secondary objective of the program has been to design and qualification-test a general purpose 90 SrF 2 -fueled heat source. The effort expended in the design and testing of the heat source is described. Detailed information is included on: heat source design, licensing requirements, and qualification test requirements; the qualification test procedures; and the fabrication and testing of capsules of various materials. The results obtained in the qualification tests show that the outer capsule design proposed for the 90 SrF 2 heat source is capable of meeting current licensing requirements when Hastelloy S is used as the outer capsule material. The data also indicate that an outer capsule of Hastelloy C-4 would probably also meet licensing requirements, although Hastelloy S is the preferred material. Therefore, based on the results of this study, the general purpose 90 SrF 2 heat source will consist of a standard WESF Hastelloy C-276 inner capsule filled with 90 SrF 2 and a Hastelloy S outer capsule having a 2.375-in. inner diameter and 0.500-in. wall thickness. The end closures for the outer capsule will utilize an interlocking joint design requiring a 0.1-in. penetration closure weld

  9. Design and tests of a package for the transport of radioactive sources

    International Nuclear Information System (INIS)

    Santos, Paulo de Oliveira

    2011-01-01

    The Type A package was designed for transportation of seven cobalt-60 sources with total activity of 1 GBq. The shield thickness needed to meet the dose rate and transport index established by the radioactive transport regulation was calculated with the code MCNP (Monte Carlo N-Particle Transport Code Version 5). The sealed cobalt-60 sources were tested for leakage according to the standard ISO 9978:1992 (E). The package was tested according to the CNEN regulation on the transport of radioactive material. The leakage test results for the sources and the package tests demonstrate that the transport can be safely performed from the CDTN to the steelmaking industries.

  10. Analytical Solution of the Hyperbolic Heat Conduction Equation for Moving Semi-Infinite Medium under the Effect of Time-Dependent Laser Heat Source

    Directory of Open Access Journals (Sweden)

    R. T. Al-Khairy

    2009-01-01

    An analytical solution is presented for the temperature field of a moving semi-infinite medium under the effect of a time-dependent laser heat source, whose capacity is given by g(x, t) = I(t)(1 − R)μe{sup −μx}, while the semi-infinite body has an insulated boundary. The solution is obtained by the Laplace transform method, and the discussion of solutions for different time characteristics of the heat source capacity (constant, instantaneous, and exponential) is presented. The effect of absorption coefficients on the temperature profiles is examined in detail. It is found that the closed-form solution derived in the present study reduces to the previously obtained analytical solution when the medium velocity is set to zero.
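    For context, the underlying one-dimensional hyperbolic (Cattaneo-Vernotte) heat conduction equation with a volumetric source $g(x,t)$ takes the standard stationary-medium form below; the paper's moving-medium problem generalizes it with additional convective terms, which are not reproduced here:

```latex
\tau \frac{\partial^{2} T}{\partial t^{2}} + \frac{\partial T}{\partial t}
= \alpha \frac{\partial^{2} T}{\partial x^{2}}
+ \frac{1}{\rho c}\left( g + \tau \frac{\partial g}{\partial t} \right)
```

    Here $\tau$ is the thermal relaxation time, $\alpha$ the thermal diffusivity, and $\rho c$ the volumetric heat capacity; setting $\tau = 0$ recovers the classical parabolic heat conduction equation.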

  11. The detection of problem analytes in a single proficiency test challenge in the absence of the Health Care Financing Administration rule violations.

    Science.gov (United States)

    Cembrowski, G S; Hackney, J R; Carey, N

    1993-04-01

    The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices, having mandated (1) satisfactory PT for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" to evaluate these PT results are unworkable, as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes, potassium, creatine kinase, and iron, was simulated with varying intrainstrument and interinstrument standard deviations (si and sg, respectively) obtained from the College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated in each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs (probability of error detection versus magnitude of error). Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same +/- 1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with sg/si ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error
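    The screening rule recommended in this abstract lends itself to a compact sketch. This is a hypothetical illustration, not the authors' code; results are expressed in interlaboratory SDI units, and the sample values are invented:

```python
def screen_pt_results(sdi_values, sg_si_ratio):
    """Screen one proficiency-test challenge (results in SDI units).

    Implements the rule described in the abstract: flag the analyte when
    two or more results exceed the same +/-1 SDI limit; for flagged
    analytes with sg/si between 1.0 and 1.5, a mean beyond 1.0 SDI
    signals a significant systematic error."""
    above = sum(1 for v in sdi_values if v > 1.0)
    below = sum(1 for v in sdi_values if v < -1.0)
    flagged = above >= 2 or below >= 2
    if not flagged:
        return {"flagged": False, "systematic_error": False}
    mean_sdi = sum(sdi_values) / len(sdi_values)
    systematic = 1.0 <= sg_si_ratio <= 1.5 and abs(mean_sdi) > 1.0
    return {"flagged": True, "systematic_error": systematic}

# Hypothetical five-specimen challenge for one analyte:
result = screen_pt_results([1.4, 1.2, 0.8, 1.6, 0.9], sg_si_ratio=1.2)
```

    With three of the five hypothetical results above +1 SDI and a mean of 1.18 SDI, this challenge would be flagged as showing a significant systematic error.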

  12. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    Science.gov (United States)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics / Data Science student internship opportunities.

  13. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    Directory of Open Access Journals (Sweden)

    S. Padilla-Arellanes

    2007-01-01

    The aim of this study was to determine the values of, and changes in, conventional and optimised clotting tests, as well as selected biochemical analytes, during hepatic lipidosis in postpartum dairy cows. Ten healthy Holstein cows and ten Holstein cows with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and diluted blood plasma samples. Clotting times determined in diluted plasma samples were prolonged in cows with hepatic lipidosis, and there was a difference in the PT value at both 50% and 25% plasma dilutions between the two groups of animals (P = 0.004 and P = 0.001). Significant differences between healthy animals and cows with hepatic lipidosis were observed in blood serum values for free fatty acids (FFA), aspartate aminotransferase (AST) and triacylglycerols (P = 0.001, P = 0.007 and P = 0.044, respectively). FFA and liver biopsy are better diagnostic indicators of hepatic lipidosis than coagulation tests. The optimised PT is prolonged in cows with hepatic lipidosis and can detect this alteration, which cannot be appreciated using the conventional PT test.

  14. Analytical solution for the transient response of a fluid/saturated porous medium halfspace system subjected to an impulsive line source

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng; Jing, Liping; Li, Yongqiang

    2018-05-01

    In this paper, transient wave propagation is investigated within a fluid/saturated porous medium halfspace system with a planar interface that is subjected to a cylindrical P-wave line source. Assuming the permeability coefficient is sufficiently large, analytical solutions for the transient response of the fluid/saturated porous medium halfspace system are developed. Moreover, the analytical solutions are presented in simple closed forms wherein each term represents a transient physical wave, especially the expressions for head waves. The methodology utilised to determine where the head wave can emerge within the system is also given. The wave fields within the fluid and porous medium are first defined considering the behaviour of two compressional waves and one tangential wave in the saturated porous medium and one compressional wave in the fluid. Substituting these wave fields into the interface continuity conditions, the analytical solutions in the Laplace domain are then derived. To transform the solutions into the time domain, a suitable distortion of the contour is provided to change the integration path of the solution, after which the analytical solutions in the Laplace domain are transformed into the time domain by employing Cagniard's method. Numerical examples are provided to illustrate some interesting features of the fluid/saturated porous medium halfspace system. In particular, the interface wave and head waves that propagate along the interface between the fluid and saturated porous medium can be observed.

  15. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Science.gov (United States)

    Zhang, Sida; Liu, Wei; Zhang, Xiaohe; Duan, Yixiang

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements.
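
The core CRDS measurement described above, extracting an absorption coefficient from the exponential ringdown decay, can be sketched as follows (a simplified model assuming the absorber fills the entire cavity; function names are illustrative):

```python
import math

C_CM_PER_S = 2.998e10  # speed of light in cm/s

def ringdown_tau(times_s, intensities):
    """Fit I(t) = I0 * exp(-t / tau) by least squares on ln(I) vs t
    and return the ringdown time tau in seconds."""
    n = len(times_s)
    logs = [math.log(i) for i in intensities]
    t_bar = sum(times_s) / n
    y_bar = sum(logs) / n
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(times_s, logs))
             / sum((t - t_bar) ** 2 for t in times_s))
    return -1.0 / slope

def absorption_coeff(tau_s, tau_empty_s):
    """Absorption coefficient alpha (cm^-1) from the filled- and
    empty-cavity ringdown times: alpha = (1/tau - 1/tau_empty) / c."""
    return (1.0 / tau_s - 1.0 / tau_empty_s) / C_CM_PER_S

# A synthetic decay with tau = 2 us is recovered by the fit:
times = [i * 1e-7 for i in range(20)]
decay = [math.exp(-t / 2e-6) for t in times]
print(f"{ringdown_tau(times, decay):.2e}")  # ~2.00e-06
```

The insensitivity to source intensity fluctuations noted in the abstract follows from this scheme: only the decay constant, not the absolute intensity I0, enters the result.
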

  16. Geodesics of electrically and magnetically charged test particles in the Reissner-Nordstroem space-time: Analytical solutions

    International Nuclear Information System (INIS)

    Grunau, Saskia; Kagramanova, Valeria

    2011-01-01

    We present the full set of analytical solutions of the geodesic equations of charged test particles in the Reissner-Nordstroem space-time in terms of the Weierstrass ℘, σ, and ζ elliptic functions. Based on the study of the polynomials in the θ and r equations, we characterize the motion of test particles and discuss their properties. The motion of charged test particles in the Reissner-Nordstroem space-time is compared with the motion of neutral test particles in the field of a gravitomagnetic monopole. Electrically or magnetically charged particles in the Reissner-Nordstroem space-time with magnetic or electric charges, respectively, move on cones similar to neutral test particles in the Taub-NUT space-times.

  17. A two dimensional analytical modeling of surface potential in triple metal gate (TMG) fully-depleted Recessed-Source/Drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Priya, Anjali; Mishra, Ram Awadh

    2016-04-01

    In this paper, analytical modeling of surface potential is proposed for a new Triple Metal Gate (TMG) fully depleted Recessed-Source/Drain Silicon On Insulator (SOI) Metal Oxide Semiconductor Field Effect Transistor (MOSFET). The metal with the highest work function is arranged near the source region and the one with the lowest near the drain. The Recessed-Source/Drain SOI MOSFET has higher drain current than the conventional SOI MOSFET due to its larger source and drain regions. The surface potential model, developed from the 2D Poisson equation, is verified by comparison with simulation results from the two-dimensional ATLAS simulator. The model is compared with DMG and SMG devices and analysed for different device parameters. The ratio of metal gate lengths is varied to optimize the result.

  18. The impact of repeat-testing of common chemistry analytes at critical concentrations.

    Science.gov (United States)

    Onyenekwu, Chinelo P; Hudson, Careen L; Zemlin, Annalise E; Erasmus, Rajiv T

    2014-12-01

    Early notification of critical values by the clinical laboratory to the treating physician is a requirement for accreditation and is essential for effective patient management. Many laboratories automatically repeat a critical value before reporting it to prevent possible misdiagnosis. Given today's advanced instrumentation and quality assurance practices, we questioned the validity of this approach. We performed an audit of repeat-testing in our laboratory to assess for significant differences between initial and repeated test results, to estimate the delay caused by repeat-testing and to quantify the cost of repeating these assays. A retrospective audit of repeat-tests for sodium, potassium, calcium and magnesium in the first quarter of 2013 at Tygerberg Academic Laboratory was conducted. Data on the initial and repeat-test values and the times at which they were performed were extracted from our laboratory information system. The Clinical Laboratory Improvement Amendments criteria for allowable error were employed to assess for significant differences between results. A total of 2308 repeated tests were studied. There was no significant difference in 2291 (99.3%) of the samples. The average delay ranged from 35 min for magnesium to 42 min for sodium and calcium. At least 2.9% of laboratory running costs for these analytes was spent on repeating them. The practice of repeating a critical test result appears unnecessary as it yields similar results, delays notification to the treating clinician and increases laboratory running costs.
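
The audit's comparison criterion can be sketched as follows (the allowable-error limits below are illustrative values in the spirit of CLIA '88, not quoted from the study; verify against the current regulation before any real use):

```python
# Illustrative total-allowable-error limits (assumed, not from the study).
ALLOWABLE_ERROR = {
    "sodium": ("abs", 4.0),      # mmol/L
    "potassium": ("abs", 0.5),   # mmol/L
    "calcium": ("abs", 1.0),     # mg/dL
    "magnesium": ("pct", 25.0),  # percent of the initial result
}

def repeat_differs(analyte, initial, repeat):
    """True if the repeat result differs from the initial result by
    more than the allowable error for that analyte."""
    kind, limit = ALLOWABLE_ERROR[analyte]
    if kind == "abs":
        return abs(repeat - initial) > limit
    return abs(repeat - initial) > abs(initial) * limit / 100.0

# A critical potassium of 6.8 mmol/L repeated as 6.9 mmol/L agrees:
print(repeat_differs("potassium", 6.8, 6.9))  # False
```
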

  19. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO₂ as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel.

  20. Development of KU-band waveguide break for ECR-3 ion source

    International Nuclear Information System (INIS)

    Misra, Anuraag; Prasad, R.K.; Nabhiraj, P.Y.; Mallik, C.

    2011-01-01

    This article describes the analytical design, simulation results, engineering design and testing of a WR-62 waveguide break for the ECR-3 ion source, and emphasizes the estimation of far-field radiation using advanced 3D codes. (author)

  1. Installation and Characterization of Charged Particle Sources for Space Environmental Effects Testing

    Science.gov (United States)

    Skevington, Jennifer L.

    2010-01-01

    Charged particle sources are integral devices used by Marshall Space Flight Center's Environmental Effects Branch (EM50) to simulate space environments for accurate testing of materials and systems. By using these sources inside custom vacuum systems, materials can be tested to determine charging and discharging properties as well as resistance to sputter damage. This knowledge enables scientists and engineers to choose materials that will not fail in harsh space environments. This paper describes the steps taken to build a low-energy electron gun (the "Skevington 3000") as well as the methods used to characterize the output of both the Skevington 3000 and a manufactured xenon ion source. Such characterizations include beam flux, beam uniformity, and beam energy. Both sources were deemed suitable for simulating environments in future testing.

  2. Source effects on surface waves from Nevada Test Site explosions

    International Nuclear Information System (INIS)

    Patton, H.J.; Vergino, E.S.

    1981-11-01

    Surface waves recorded on the Lawrence Livermore National Laboratory (LLNL) digital network have been used to study five underground nuclear explosions detonated in Yucca Valley at the Nevada Test Site. The purpose of this study is to characterize the reduced displacement potential (RDP) at low frequencies and to test secondary source models of underground explosions. The observations consist of Rayleigh- and Love-wave amplitude and phase spectra in the frequency range 0.03 to 0.16 Hz. We have found that Rayleigh-wave spectral amplitudes are modeled well by a RDP with little or no overshoot for explosions detonated in alluvium and tuff. On the basis of comparisons between observed and predicted source phase, the spall closure source proposed by Viecelli does not appear to be a significant source of Rayleigh waves that reach the far field. We tested two other secondary source models: the strike-slip tectonic strain release model proposed by Toksoez and Kehrer, and the dip-slip thrust model of Masse. The surface-wave observations do not provide sufficient information to discriminate between these models at the low F-values (0.2 to 0.8) obtained for these explosions. In the case of the strike-slip model, the principal stress axes inferred from the fault slip angle and strike angle are in good agreement with the regional tectonic stress field for all but one explosion, Nessel. The results of the Nessel explosion suggest a mechanism other than tectonic strain release.

  3. Sources of Variation in Creep Testing

    Science.gov (United States)

    Loewenthal, William S.; Ellis, David L.

    2011-01-01

    Creep rupture is an important material characteristic for the design of rocket engines. It was observed during the characterization of GRCop-84 that the complete data set had nearly 4 orders of magnitude of scatter. This scatter likely confounded attempts to determine how creep performance was influenced by manufacturing. It was unclear whether this variation came from the testing, the material, or both. Sources of variation were examined by conducting tests on identically processed specimens at the same specified stresses and temperatures. Significant differences existed between the five constant-load creep frames. The specimen temperature was higher than the desired temperature by as much as 43 C, and the temperature gradient was up to 44 C. Improved specimen temperature control minimized temperature variations. The data from additional tests demonstrated that the results from all five frames were comparable. The variation decreased from 2 orders of magnitude in the baseline data set to half an order of magnitude. Independent determination of creep rates in a reference load frame closely matched the creep rates determined after the modifications. Testing in helium tended to decrease the sample temperature gradient, but helium was not a significant improvement over vacuum.
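
To see why a 43 C temperature error matters so much, a thermally activated creep law gives a quick estimate (the activation energy below is a generic placeholder for illustration, not a measured GRCop-84 value):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def creep_rate_ratio(q_joules_per_mol, t_nominal_k, t_actual_k):
    """Ratio of creep rates at the actual vs nominal temperature for a
    thermally activated rate proportional to exp(-Q/RT)."""
    return math.exp(q_joules_per_mol / R * (1.0 / t_nominal_k - 1.0 / t_actual_k))

# A 43 K overshoot on a nominal 800 K test, Q = 200 kJ/mol (placeholder):
ratio = creep_rate_ratio(200e3, 800.0, 843.0)
print(f"{ratio:.1f}x faster creep")  # 4.6x faster creep
```

Even a modest temperature error multiplies the creep rate several-fold, which is consistent with the orders-of-magnitude scatter the study traced back to frame-to-frame temperature differences.
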

  4. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40) is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes, and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on a medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained by both pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and to describe the efforts made to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  5. Testing methods of ECR ion source experimental platform

    International Nuclear Information System (INIS)

    Zhou Changgeng; Hu Yonghong; Li Yan

    2006-12-01

    The principle and structure of the ECR ion source experimental platform are introduced. The testing methods for the parameters of each main component, and for the comprehensive parameters under given beam current and beam spot diameter, are summarized for the manufacturing process. Representative test data are given. Outstanding issues (e.g., the plasma density in the discharge chamber and the exact hydrogen flow cannot be measured during operation) and their resolutions are also put forward. (authors)

  6. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test

    OpenAIRE

    Yang, Hannah P.; Walmer, David K.; Merisier, Delson; Gage, Julia C.; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S.; Castle, Philip E.

    2011-01-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women, which might be used to decide who needs immediate colposcopy in low-resource settings (“triage test”). We found that HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay (at a cutpoint of 5000 viral copies) (Kap...

  7. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    International Nuclear Information System (INIS)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C.

    2008-01-01

    The Material Analysis and Metrology Laboratory (LAMM) of the CEA's ATALANTE complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the Diamex test, and contributed to its success. The Diamex process consists in extracting the actinides and lanthanides from a Purex raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr.

  8. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single-laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. The theoretical background for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines is also described, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III give practical, worked examples of using the Horwitz approach and formulae to estimate the target standard deviation for acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
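
The Horwitz approach referenced above can be sketched as follows (the classic form of the function; cutoffs and usage conventions vary between guidelines, so treat this as an illustration rather than a normative implementation):

```python
import math

def horwitz_rsd_percent(mass_fraction):
    """Predicted interlaboratory RSD (%) from the Horwitz function,
    RSD = 2^(1 - 0.5 * log10(C)), with C the analyte mass fraction."""
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

def horrat(observed_rsd_percent, mass_fraction):
    """HorRat = observed RSD / Horwitz-predicted RSD; values near or
    below 1 indicate acceptable method precision."""
    return observed_rsd_percent / horwitz_rsd_percent(mass_fraction)

# A formulation with 25% (w/w) active ingredient: C = 0.25
print(round(horwitz_rsd_percent(0.25), 2))  # 2.46
```
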

  9. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  10. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Sida; Liu, Wei [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China); Zhang, Xiaohe [College of Water Resources and Hydropower, Sichuan University, Chengdu (China); Duan, Yixiang, E-mail: yduan@scu.edu.cn [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China)

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements. - Highlights: • Plasma-based cavity ringdown spectroscopy • High sensitivity and high resolution • Elemental and isotopic measurements.

  11. Planning for Low End Analytics Disruptions in Business School Curricula

    Science.gov (United States)

    Rienzo, Thomas; Chen, Kuanchin

    2018-01-01

    Analytics is getting a great deal of attention in both industrial and academic venues. Organizations of all types are becoming more serious about transforming data from a variety of sources into insight, and analytics is the key to that transformation. Academic institutions are rapidly responding to the demand for analytics talent, with hundreds…

  12. Analytical support for the B₄C control rod test QUENCH-07

    Energy Technology Data Exchange (ETDEWEB)

    Homann, C.; Hering, W. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Inst. fuer Reaktorsicherheit]|[Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Programm Nukleare Sicherheitsforschung; Birchley, J. [Paul Scherrer Inst. (Switzerland); Fernandez Benitez, J.A.; Ortega Bernardo, M. [Univ. Politecnica de Madrid (Spain)

    2003-04-01

    Degradation of B₄C absorber rods during a beyond-design accident in a nuclear power reactor may be a safety concern. Among others, the integral test QUENCH-07 is performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to get a more profound database. Since the test differed substantially from previous QUENCH tests, much more work had to be done for pretest calculations than usual to guarantee the safety of the facility and to derive the test protocol. Several institutions shared in this work with different computer code systems, as used for nuclear reactor safety analyses. Due to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. Especially the high temperatures and hence the small safety margin for the facility were a concern. In this report, contributions of various authors, engaged in this work, are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  13. Analytical study for frequency effects on the EPRI/USNRC piping component tests. Part 1: Theoretical basis and model development

    International Nuclear Information System (INIS)

    Adams, T.M.; Branch, E.B.; Tagart, S.W. Jr.

    1994-01-01

    As part of the engineering effort for the Advanced Light Water Reactor, the Advanced Reactor Corporation formed a Piping Technical Core Group to develop a set of improved ASME Boiler and Pressure Vessel Code, Section III design rules and approaches for ALWR plant piping and support design. The technical basis for the proposed changes to the ASME Boiler and Pressure Vessel Code developed by the Technical Core Group for the design of piping relies heavily on the failure margins determined from the EPRI/USNRC piping component test program. The majority of the component tests forming the basis for the reported margins against failure were run with input frequency to natural frequency ratios (Ω/ω) in the range of 0.74 to 0.87. One concern investigated by the Technical Core Group was the effect on measured margins if the tests had been run at higher or lower frequency ratios than the limited range tested. Specifically, the concern was that the proposed Technical Core Group Piping Stress Criteria will allow piping to be designed in the low frequency range (Ω/ω ≥ 2.0), for which there is little test data from the EPRI/USNRC test program. The purpose of this analytical study was to: (1) evaluate the potential for margin variation as a function of the frequency ratio (Rω = Ω/ω, where Ω is the forcing frequency and ω is the natural component frequency), and (2) recommend a margin reduction factor (MRF) that could be applied to margins determined from the EPRI/USNRC test program to adjust those margins for potential margin variation with frequency ratio. Presented in this paper are the analytical approach and methodology, based on inelastic analysis, that formed the basis of the study. Also discussed are the development of the analytical model, the procedure used to benchmark the model against actual test results, and the various parameter studies conducted.

  14. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite, an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  15. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  16. Orthodontic brackets removal under shear and tensile bond strength resistance tests - a comparative test between light sources

    Science.gov (United States)

    Silva, P. C. G.; Porto-Neto, S. T.; Lizarelli, R. F. Z.; Bagnato, V. S.

    2008-03-01

    We have investigated whether a new LED system delivers sufficient energy to promote adequate shear and tensile bond strength under standardized tests. LEDs emitting at 470 ± 10 nm can be used to photocure composite during bracket fixation. Advantages in tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolars and two light sources were selected: a halogen lamp and an LED system. Premolar brackets were bonded with composite resin, and the samples were submitted to standardized tests. The two sources gave similar results under the shear bond strength test; the tensile bond test, however, showed distinct results: a statistically significant difference at the 1% level between exposure times (40 and 60 seconds) and an interaction between light source and exposure time. The best result was obtained with the halogen lamp used for 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if its power density is increased.

  17. Upgrade of the BATMAN test facility for H- source development

    Science.gov (United States)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-04-01

    The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack like expansion chamber. The extraction system, called "Large Area Grid" (LAG) was derived from a positive ion accelerator from ASDEX Upgrade (AUG) using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm2. BATMAN is a well diagnosed and highly flexible test facility which will be kept operational in parallel to the half size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). Additionally to the standard three grid extraction system a repeller electrode upstream of the grounded grid can optionally be installed which is positively charged against it by 2 kV. This is designated to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space backwards into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG type racetrack RF source as driver instead of the circular one or modifying the expansion chamber for a more flexible position of the external magnet frame.

  18. Upgrade of the BATMAN test facility for H− source development

    International Nuclear Information System (INIS)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-01-01

The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack-like expansion chamber. The extraction system, called “Large Area Grid” (LAG), was derived from a positive ion accelerator from ASDEX Upgrade (AUG), using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm². BATMAN is a well-diagnosed and highly flexible test facility which will be kept operational in parallel to the half-size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). In addition to the standard three-grid extraction system, a repeller electrode upstream of the grounded grid can optionally be installed, positively charged against it by 2 kV. This is designed to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available, as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore, different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG-type racetrack RF source as driver instead of the circular one, or modifying the expansion chamber for a more flexible position of the external magnet frame

  19. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytic stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability...... in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, initial concentration...... of analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used...
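The normalization and acceptance-testing step described above can be sketched as follows; the analyte values and the 3% limit here are hypothetical illustrations, not the study's actual data or criteria:

```python
def percent_deviation(c_t, c_0):
    """Deviation of a re-analysis result from the initial (t=0) value, in percent."""
    return 100.0 * (c_t - c_0) / c_0

def stable(reanalyses, c_0, limit_pct):
    """True if every re-analysis stays within +/- limit_pct of the t=0 value."""
    return all(abs(percent_deviation(c, c_0)) <= limit_pct for c in reanalyses)

# Hypothetical sodium results (mmol/L) at 2, 4, 6, 8 and 10 h against a 3% bias limit.
print(stable([139, 140, 141, 138, 142], c_0=140, limit_pct=3.0))  # True
```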

  20. Introductory statistics and analytics a resampling perspective

    CERN Document Server

    Bruce, Peter C

    2014-01-01

Concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on application

  1. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    Laboratories to quantify the uncertainty of measurement results, and the fact that this standard is used as a basis for the development and implementation of quality management systems in many laboratories performing nuclear analytical measurements, triggered the demand for specific guidance to cover uncertainty issues of nuclear analytical methods. The demand was recognized by the IAEA and a series of examples was worked out by a group of consultants in 1998. The diversity and complexity of the topics addressed delayed the publication of a technical guidance report, but the exchange of views among the experts was also beneficial and led to numerous improvements and additions with respect to the initial version. This publication is intended to assist scientists using nuclear analytical methods in assessing and quantifying the sources of uncertainty of their measurements. The numerous examples provide a tool for applying the principles elaborated in the GUM and EURACHEM/CITAC publications to their specific fields of interest and for complying with the requirements of current quality management standards for testing and calibration laboratories. It also provides a means for the worldwide harmonization of approaches to uncertainty quantification and thereby contributes to enhanced comparability and competitiveness of nuclear analytical measurements
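The GUM approach elaborated in the publication combines the standard uncertainties of uncorrelated input quantities by a root sum of squares of sensitivity-weighted terms; a minimal sketch with hypothetical coefficients and uncertainties (not an example from the report):

```python
import math

def combined_uncertainty(sensitivities, uncertainties):
    """GUM law of propagation of uncertainty for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)^2)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Hypothetical example: two input quantities with sensitivity coefficients
# c = (2.0, 0.5) and standard uncertainties u = (0.1, 0.3).
u_c = combined_uncertainty([2.0, 0.5], [0.1, 0.3])
print(round(u_c, 4))  # sqrt(0.2**2 + 0.15**2) = 0.25
```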

  2. Project of a test stand for cyclotron ion sources

    International Nuclear Information System (INIS)

    Buettig, H.; Dietrich, J.; Merker, H.; Odrich, H.; Preusche, S.; Weissig, J.

    1978-10-01

In this work the construction of a test stand for testing and optimizing the ion sources of the Rossendorf cyclotron U-120 is presented. The design procedure and the construction of the electromagnet, the vacuum chamber with monant, the vacuum system, the power supply and the detection system are described. The results of calculations of the motion of ions in the magnetic field are presented. (author)

  3. An analytical turn-on power loss model for 650-V GaN eHEMTs

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Wang, Huai; Shen, Zhan

    2018-01-01

This paper proposes an improved analytical turn-on power loss model for 650-V GaN eHEMTs. The static characteristics, i.e., the parasitic capacitances and transconductance, are firstly modeled. Then the turn-on process is divided into multiple stages and analyzed in detail; as a result, the time-domain solutions to the drain-source voltage and drain current are obtained. Finally, double-pulse tests are conducted to verify the proposed power loss model. This analytical model enables an accurate and fast switching behavior characterization and power loss prediction.
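The paper's staged model itself is not reproduced in the abstract; as a rough first-order illustration of the quantity such a model predicts, here is a textbook V-I overlap approximation of turn-on energy (all device values hypothetical):

```python
def turn_on_energy(v_dc, i_load, t_cr, t_vf):
    """First-order hard-switching turn-on energy: triangular V-I overlap
    during the current-rise (t_cr) and voltage-fall (t_vf) intervals,
    E_on ~ 0.5 * V * I * (t_cr + t_vf)."""
    return 0.5 * v_dc * i_load * (t_cr + t_vf)

# Hypothetical 400 V / 15 A switching event with 5 ns current rise, 10 ns voltage fall.
e_on = turn_on_energy(400.0, 15.0, 5e-9, 10e-9)
print(round(e_on * 1e6, 2), "uJ")  # 45.0 uJ
```

A staged model such as the paper's refines this estimate by solving each turn-on interval with the voltage-dependent parasitic capacitances instead of assuming linear transitions.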

  4. The wire optical test: a thorough analytical study in and out of caustic surface, and advantages of a dynamical adaptation

    Science.gov (United States)

    Alejandro Juárez-Reyes, Salvador; Sosa-Sánchez, Citlalli Teresa; Silva-Ortigoza, Gilberto; de Jesús Cabrera-Rosas, Omar; Espíndola-Ramos, Ernesto; Ortega-Vidals, Paula

    2018-03-01

Among the best known non-interferometric optical tests are the wire test, the Foucault test and the Ronchi test with a low-frequency grating. Since the wire test is the seed for understanding the others, the aim of the present work is to study this test thoroughly for a lens with symmetry of revolution, for any configuration of the object and detection planes in which the planes intersect two, one or no branches of the caustic region (including the marginal and paraxial foci). To this end, we calculated the vectorial representation of the caustic region and found the analytical expression for the pattern; we report that the analytical pattern explicitly depends on the magnitude of a branch of the caustic. With the analytical pattern we computed a set of simulations of a dynamical adaptation of the optical wire test. From those simulations, we analyzed the topological structure of the pattern in detail, explaining how the multiple image formation and image collapse processes take place for each configuration, in particular when both the wire and the detection planes are placed inside the caustic region, which has not been studied before. For the first time, we remark that not only the intersections of the object and detection planes with the caustic are important in the change of pattern topology, but also the projection of the intersection between the caustic and the object plane mapped onto the detection plane, and the virtual projection of the intersection between the caustic and the detection plane mapped onto the object plane. We present that for the new configurations of the optical system, the wire image consists of curves of the Tschirnhausen cubic, piriform and deformed eight-curve types.

  5. Acoustic emission non-destructive testing of structures using source location techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Alan G.

    2013-09-01

    The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setups for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.
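Source location, the analysis emphasized above, reduces in the simplest linear case to triangulating arrival-time differences between sensors; a minimal 1-D sketch (sensor spacing and wave speed hypothetical, far simpler than the monograph's FORTRAN program):

```python
def locate_1d(d, v, dt):
    """Linear (1-D) AE source location: two sensors a distance d apart,
    dt = t1 - t2 the arrival-time difference and v the wave speed;
    returns the source position measured from sensor 1."""
    return (d + v * dt) / 2.0

# A source 0.3 m from sensor 1 on a 1 m baseline with v = 5000 m/s
# arrives earlier at sensor 1: t1 - t2 = (0.3 - 0.7) / 5000 = -8e-5 s.
print(round(locate_1d(1.0, 5000.0, -8e-5), 6))  # 0.3
```

Planar and 3-D location on real structures generalizes the same idea to least-squares fits over several sensor pairs.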

  6. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    Energy Technology Data Exchange (ETDEWEB)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C. [CEA Marcoule, DEN/DRCP/SE2A/LAMM, BP17171, 30207 Bagnols-sur-Ceze (France)

    2008-07-01

The Material Analysis and Metrology Laboratory (LAMM) of the CEA's ATALANTE complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the DIAMEX test, and contributed to its success. The DIAMEX process consists in extracting the actinides and lanthanides from a PUREX raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr.

  7. Doubling immunochemistry laboratory testing efficiency with the cobas e 801 module while maintaining consistency in analytical performance.

    Science.gov (United States)

    Findeisen, P; Zahn, I; Fiedler, G M; Leichtle, A B; Wang, S; Soria, G; Johnson, P; Henzell, J; Hegel, J K; Bendavid, C; Collet, N; McGovern, M; Klopprogge, K

    2018-06-04

The new immunochemistry cobas e 801 module (Roche Diagnostics) was developed to meet increasing demands on routine laboratories to further improve testing efficiency, while maintaining high quality and reliable data. During a non-interventional multicenter evaluation study, the overall performance, functionality and reliability of the new module was investigated under routine-like conditions. It was tested as a dedicated immunochemistry system at four sites and as a consolidator combined with clinical chemistry at three sites. We report on testing efficiency and analytical performance of the new module. Evaluation of sample workloads with site-specific routine request patterns demonstrated increased speed and almost doubled throughput (maximal 300 tests per h), thus revealing that one cobas e 801 module can replace two cobas e 602 modules while saving up to 44% floor space. Result stability was demonstrated by QC analysis per assay throughout the study. Precision testing over 21 days yielded excellent results within and between labs, and method comparison versus cobas e 602 module routine results showed high consistency for all assays under study. In a practicability assessment related to performance and handling, 99% of graded features met (44%) or even exceeded (55%) laboratory expectations, with enhanced reagent management and loading during operation being highlighted. By nearly doubling immunochemistry testing efficiency on the same footprint as a cobas e 602 module, the new module has great potential to further consolidate and enhance laboratory testing while maintaining high quality analytical performance with Roche platforms.

  8. General-Purpose Heat Source Development: Safety Test Program. Postimpact evaluation, Design Iteration Test 3

    International Nuclear Information System (INIS)

    Schonfeld, F.W.; George, T.G.

    1984-07-01

The General-Purpose Heat Source (GPHS) provides power for space missions by transmitting the heat of ²³⁸PuO₂ decay to thermoelectric elements. Because of the inevitable return of certain aborted missions, the heat source must be designed and constructed to survive both re-entry and Earth impact. The Design Iteration Test (DIT) series is part of an ongoing test program. In the third test (DIT-3), a full GPHS module was impacted at 58 m/s and 930 °C. The module impacted the target at an angle of 30° to the pole of the large faces. The four capsules used in DIT-3 survived impact with minimal deformation; no internal cracks other than in the regions indicated by Savannah River Plant (SRP) preimpact nondestructive testing were observed in any of the capsules. The 30° impact orientation used in DIT-3 was considerably less severe than the flat-on impact utilized in DIT-1 and DIT-2. The four capsules used in DIT-1 survived, while two of the capsules used in DIT-2 breached; a small quantity (≈50 μg) of ²³⁸PuO₂ was released from the capsules breached in the DIT-2 impact. All of the capsules used in DIT-1 and DIT-2 were severely deformed and contained large internal cracks. Postimpact analyses of the DIT-3 test components are described, with emphasis on weld structure and the behavior of defects identified by SRP nondestructive testing

  9. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SRFTIR) spectroscopy, and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized and have paved the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will mainly cover two techniques illustrating their capability in analytical research, namely XRF and XAS. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (protons or alpha particles), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the areas of microprobe XRF imaging and trace-level compositional characterisation of any sample. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at Indus-2 synchrotron

  10. Simulations of Liners and Test Objects for a New Atlas Advanced Radiography Source

    International Nuclear Information System (INIS)

    Morgan, D. V.; Iversen, S.; Hilko, R. A.

    2002-01-01

    The Advanced Radiographic Source (ARS) will improve the data significantly due to its smaller source width. Because of the enhanced ARS output, larger source-to-object distances are a reality. The harder ARS source will allow radiography of thick high-Z targets. The five different spectral simulations resulted in similar imaging detector weighted transmission. This work used a limited set of test objects and imaging detectors. Other test objects and imaging detectors could possibly change the MVp-sensitivity result. The effect of material motion blur must be considered for the ARS due to the expected smaller X-ray source size. This study supports the original 1.5-MVp value

  11. Safety quality classification test of the sealed neutron sources used in start-up neutron source rods for Qinshan Nuclear Power Plant

    International Nuclear Information System (INIS)

    Yao Chunbing; Guo Gang; Chao Jinglan; Duan Liming

    1992-01-01

According to the regulations listed in GB4075, safety quality classification tests have been carried out for the neutron sources. The test items include temperature, external pressure, impact, vibration and puncture. Two dummy sealed sources were used for each test item. The testing equipment used has been examined and verified as qualified by the measuring department admitted by the National Standard Bureau. The leak rate of each tested sample is measured by a UL-100 Helium Leak Detector (its minimum detectable leak rate is 1 × 10⁻¹⁰ Pa·m³·s⁻¹). Samples with a leak rate less than 1.33 × 10⁻⁸ Pa·m³·s⁻¹ are considered up to the standard. The test results show that the safety quality classification of the neutron sources has reached the class of GB/E66545, which exceeds the preset class

  12. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
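The reliability metrics named above, false positive and false negative rates, can be computed from confusion-matrix counts; a minimal sketch with hypothetical validation-set numbers (not the study's actual results):

```python
def error_rates(tp, fp, tn, fn):
    """False-positive rate (among true negatives) and false-negative rate
    (among true positives) from confusion-matrix counts."""
    return fp / (fp + tn), fn / (fn + tp)

# Hypothetical screening outcomes from a validation set of 200 samples.
fpr, fnr = error_rates(tp=95, fp=2, tn=98, fn=5)
print(fpr, fnr)  # 0.02 0.05
```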

  13. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    Science.gov (United States)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines. [Figure not available: see fulltext.]

  14. Estimating and Testing the Sources of Evoked Potentials in the Brain.

    Science.gov (United States)

    Huizenga, Hilde M.; Molenaar, Peter C. M.

    1994-01-01

    The source of an event-related brain potential (ERP) is estimated from multivariate measures of ERP on the head under several mathematical and physical constraints on the parameters of the source model. Statistical aspects of estimation are discussed, and new tests are proposed. (SLD)

  15. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges to its adoption, and to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds its application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  16. Role of analytical chemistry in environmental monitoring

    International Nuclear Information System (INIS)

    Kayasth, S.; Swain, K.

    2004-01-01

Basic aspects of pollution and the role of analytical chemistry in environmental monitoring are highlighted and exemplified, with emphasis on trace elements. Sources and pathways of natural and especially man-made polluting substances, as well as their physico-chemical characteristics, are given. Attention is paid to adequate sampling in various compartments of the environment comprising both lithosphere and biosphere. Trace analysis is dealt with using a variety of analytical techniques, including criteria for the choice of suitable techniques, as well as aspects of analytical quality assurance and control. Finally, some data on trace element levels in soil and water samples from India are presented. (author)

  17. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    International Nuclear Information System (INIS)

    Park, Sae-Hoon; Kim, Yu-Seok; Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2015-01-01

In this paper, the RF ion source, the test stand of the ion source and its test results are presented. The beam profile was measured downstream of the accelerating tube and at the beam dump using a BPM and a wire scanner. The RF ion source of the test stand is verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high-pressure vessel. The ion source in a high-pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream of the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber

  18. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sae-Hoon; Kim, Yu-Seok [Dongguk University, Gyeonju (Korea, Republic of); Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Multipurpose Accelerator Complex, Gyeongju (Korea, Republic of)

    2015-10-15

In this paper, the RF ion source, the test stand of the ion source and its test results are presented. The beam profile was measured downstream of the accelerating tube and at the beam dump using a BPM and a wire scanner. The RF ion source of the test stand is verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high-pressure vessel. The ion source in a high-pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream of the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber.

  19. Test Method for High β Particle Emission Rate of 63Ni Source Plate

    OpenAIRE

    ZHANG Li-feng

    2015-01-01

To address the difficulty of measuring the β particle emission rate of the Ni-63 source plates used in Ni-63 betavoltaic batteries, a relative test method based on the scintillation current method was established according to the measurement principle of the scintillation detector. The β particle emission rate of a homemade Ni-63 source plate was tested by this method, and the test results were analysed and evaluated; it was initially concluded that the scintillation current method is a feasible way of testing β particle emi...

  20. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been done. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer central processing unit time. To test the computations, a typical wall source was reduced to a point, a line and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that the errors under optimum conditions remain less than 6% for the fluxes calculated from this approach when compared with the analytical values for the point and line source approximations. Also, when the wall is simulated as an infinite volume source of finite thickness, the errors in computed-to-analytical flux ratios remain large for smaller wall dimensions; however, they become less than 10% when the wall dimensions are greater than ten mean free paths for 3 MeV gamma rays. Also, specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
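The point-source check described above compares computed fluxes with the standard analytical expression phi = S exp(-mu r) / (4 pi r^2); a minimal sketch of that reference formula (source strength, distances and the buildup treatment are hypothetical illustrations):

```python
import math

def point_source_flux(s, r, mu, buildup=1.0):
    """Analytical photon flux at distance r from an isotropic point source
    emitting s photons/s in a medium with linear attenuation coefficient mu:
    phi = B * s * exp(-mu * r) / (4 * pi * r**2), with an assumed buildup factor B."""
    return buildup * s * math.exp(-mu * r) / (4.0 * math.pi * r ** 2)

# Sanity check: with mu = 0 the flux follows the inverse-square law,
# so doubling the distance quarters the flux.
ratio = point_source_flux(1e6, 100.0, 0.0) / point_source_flux(1e6, 200.0, 0.0)
print(round(ratio, 6))  # 4.0
```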

  1. Seismic II over I Drop Test Program results and interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, B.

    1993-03-01

The consequences of non-seismically qualified (Category 2) objects falling and striking essential seismically qualified (Category 1) objects has always been a significant, yet analytically difficult problem, particularly in evaluating the potential damage to equipment that may result from earthquakes. Analytical solutions for impact problems are conservative and available for mostly simple configurations. In a nuclear facility, the "sources" and "targets" requiring evaluation are frequently irregular in shape and configuration, making calculations and computer modeling difficult. Few industry or regulatory rules are available on this topic even though it is a source of considerable construction upgrade costs. A drop test program was recently conducted to develop a more accurate understanding of the consequences of seismic interactions. The resulting data can be used as a means to improve the judgment of seismic qualification engineers performing interaction evaluations and to develop realistic design criteria for seismic interactions. Impact tests on various combinations of sources and targets commonly found in one Savannah River Site (SRS) nuclear facility were performed by dropping the sources from various heights onto the targets. This report summarizes results of the Drop Test Program. Force and acceleration time history data are presented as well as general observations on the overall ruggedness of various targets when subjected to impacts from different types of sources.

  3. A semi-analytical solution for slug tests in an unconfined aquifer considering unsaturated flow

    Science.gov (United States)

    Sun, Hongbing

    2016-01-01

    A semi-analytical solution considering the vertical unsaturated flow is developed for groundwater flow in response to a slug test in an unconfined aquifer in Laplace space. The new solution incorporates the effects of partial penetration, anisotropy, vertical unsaturated flow, and a moving water table boundary. Compared to the Kansas Geological Survey (KGS) model, the new solution can significantly improve the fit of the modeled to the measured hydraulic heads at the late stage of slug tests in an unconfined aquifer, particularly when the slug well has a partially submerged screen and moisture drainage above the water table is significant. The radial hydraulic conductivities estimated with the new solution are comparable to those from the KGS, Bouwer and Rice, and Hvorslev methods. In addition, the new solution can also be used to examine the vertical conductivity, specific storage, specific yield, and the moisture retention parameters in an unconfined aquifer based on slug test data.
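    For comparison, the classical Hvorslev straight-line estimate of radial hydraulic conductivity, one of the baseline methods named above, can be sketched as follows; the well geometry and the synthetic head record are illustrative assumptions:

```python
import numpy as np

def hvorslev_K(t, h, r_c, L_e, R_w):
    """Radial hydraulic conductivity from slug-test head data via the
    Hvorslev straight-line method.

    t   : times since slug insertion (s)
    h   : normalized heads h(t)/h0 (0 < h <= 1)
    r_c : casing radius (m); L_e : screen length (m); R_w : well radius (m)
    """
    # ln(h) vs t is a straight line for an exponential recovery; its slope
    # gives the basic time lag T0 (the "t37" of the classical method).
    slope = np.polyfit(t, np.log(h), 1)[0]
    T0 = -1.0 / slope
    return r_c**2 * np.log(L_e / R_w) / (2.0 * L_e * T0)

# Synthetic exponential recovery with a known time lag of 30 s:
t = np.linspace(1, 120, 40)
h = np.exp(-t / 30.0)
K = hvorslev_K(t, h, r_c=0.05, L_e=3.0, R_w=0.075)
print(f"K = {K:.2e} m/s")
```

    Real records deviate from this straight line at late times, which is precisely where the unsaturated-flow solution of the paper improves the fit.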

  4. A New 500-kV Ion Source Test Stand for HIF

    International Nuclear Information System (INIS)

    Sangster, T.C.; Ahle, L.E.; Halaxa, E.F.; Karpenko, V.P.; Oldaker, M. E.; Mitchell, J.W.; Beck, D.N.; Bieniosek, F.M.; Henestroza, E.; Kwan, J.W.

    2000-01-01

    One of the most challenging aspects of ion beam driven inertial fusion energy is the reliable and efficient generation of low emittance, high current ion beams. The primary ion source requirements include a rise time of order 1 msec, a pulse width of at least 20 msec, a flattop ripple of less than 0.1% and a repetition rate of at least 5 Hz. Naturally, at such a repetition rate, the duty cycle of the source must be greater than 10⁸ pulses. Although these specifications do not appear to exceed the state-of-the-art for pulsed power, considerable effort remains to develop a suitable high current ion source. Therefore, we are constructing a 500-kV test stand specifically for studying various ion source concepts including surface, plasma and metal vapor arc. This paper will describe the test stand design specifications as well as the details of the various subsystems and components.

  5. Evaluation of gamma dose effect on PIN photodiode using analytical model

    Science.gov (United States)

    Jafari, H.; Feghhi, S. A. H.; Boorboor, S.

    2018-03-01

    PIN silicon photodiodes are widely used in applications which may be found in radiation environments, such as space missions, medical imaging and non-destructive testing. Radiation-induced damage in these devices causes the photodiode parameters to degrade. In this work, we have used a new approach to evaluate gamma dose effects on a commercial PIN photodiode (BPX65) based on an analytical model. In this approach, the NIEL parameter has been calculated for gamma rays from a 60Co source by GEANT4. The radiation damage mechanisms have been considered by numerically solving the Poisson and continuity equations with the appropriate boundary conditions, parameters and physical models. Defects caused by radiation in silicon have been formulated in terms of the damage coefficient for the minority carriers' lifetime. The gamma-induced degradation parameters of the silicon PIN photodiode have been analyzed in detail and the results were compared with experimental measurements as well as with the results of the ATLAS semiconductor simulator to verify and parameterize the analytical model calculations. The results showed reasonable agreement for the BPX65 silicon photodiode irradiated by a 60Co gamma source at total doses up to 5 kGy under different reverse voltages.
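    A standard way to express a damage coefficient for the minority-carrier lifetime is the Messenger-Spratt relation 1/τ = 1/τ₀ + K_τ·Φ. The sketch below uses that textbook form with purely illustrative coefficients (not the paper's values) to show how lifetime and diffusion length fall with equivalent fluence:

```python
def degraded_lifetime(tau0, K_tau, phi):
    """Minority-carrier lifetime after irradiation, from the
    Messenger-Spratt relation 1/tau = 1/tau0 + K_tau * phi.

    tau0  : pre-irradiation lifetime (s)
    K_tau : lifetime damage coefficient (cm^2/s), material/particle dependent
    phi   : equivalent particle fluence (cm^-2)
    """
    return 1.0 / (1.0 / tau0 + K_tau * phi)

def diffusion_length(tau, D=36.0):
    """Diffusion length L = sqrt(D * tau); D defaults to a typical electron
    diffusivity in p-type silicon (cm^2/s)."""
    return (D * tau) ** 0.5

# Illustrative numbers only: tau0 = 10 us, K_tau = 1e-8 cm^2/s.
tau0, K_tau = 10e-6, 1e-8
for phi in (0.0, 1e12, 1e13):
    tau = degraded_lifetime(tau0, K_tau, phi)
    print(f"phi={phi:.0e}  tau={tau*1e6:6.2f} us  "
          f"L={diffusion_length(tau)*1e4:6.1f} um")
```

    The falling diffusion length is what degrades photodiode responsivity, which is the quantity the analytical model tracks against dose.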

  6. Test plan for Series 2 spent fuel cladding containment credit tests

    International Nuclear Information System (INIS)

    Wilson, C.N.

    1984-10-01

    This test plan describes a second series of tests to be conducted by Westinghouse Hanford Company (WHC) to evaluate the effectiveness of breached cladding as a barrier to radionuclide release in the NNWSI-proposed geologic repository. These tests will be conducted at the Hanford Engineering Development Laboratory (HEDL). A first series of tests, initiated at HEDL during FY 1983, demonstrated specimen preparation and feasibility of the testing concept. The second series tests will be similar to the Series 1 tests with the following exceptions: NNWSI reference groundwater obtained from well J-13 will be used as the leachant instead of deionized water; fuel from a second source will be used; and certain refinements will be made in specimen preparation, sampling, and analytical procedures. 12 references, 5 figures, 5 tables

  7. A Test Beamline on Diamond Light Source

    International Nuclear Information System (INIS)

    Sawhney, K. J. S.; Dolbnya, I. P.; Tiwari, M. K.; Alianelli, L.; Scott, S. M.; Preece, G. M.; Pedersen, U. K.; Walton, R. D.

    2010-01-01

    The test beamline B16 has been built on the 3 GeV Diamond synchrotron radiation source. The beamline covers a wide photon energy range from 2 to 25 keV. It is highly flexible and versatile in terms of the available beam size (a micron to 100 mm) and the range of energy resolution and photon flux, by virtue of its several operational modes and the different interchangeable instruments available in the experiments hutch. Diverse experimental configurations can be flexibly set up using a five-circle diffractometer, a versatile optics test bench, and a suite of detectors. Several experimental techniques including reflectivity, diffraction and imaging are routinely available. Details of the beamline and its measured performance are presented.

  8. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    Science.gov (United States)

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed at Uludag University in Bursa using Abbott reagents and an Abbott analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs obtained by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences in reference values among the seven regions were not significant for any of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m². Ranges of RIs by the non-parametric method were wider than those by the parametric method, especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
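    The two derivation routes compared in the study can be illustrated on synthetic data. The sketch below substitutes a plain log transform (the λ → 0 limit of the Box-Cox family) for the modified Box-Cox fit, and uses invented GGT-like values, so it shows the logic rather than the study's procedure:

```python
import numpy as np

def reference_interval_nonparametric(x):
    """Central 95% interval from the empirical 2.5th/97.5th percentiles."""
    return np.percentile(x, [2.5, 97.5])

def reference_interval_parametric(x):
    """Parametric interval assuming log-normality (a simple stand-in for the
    Box-Cox transform): mean +/- 1.96 SD on the log scale, back-transformed."""
    z = np.log(x)
    m, s = z.mean(), z.std(ddof=1)
    return np.exp([m - 1.96 * s, m + 1.96 * s])

rng = np.random.default_rng(7)
# Synthetic GGT-like values (U/L), log-normal around a median of 20:
ggt = np.exp(rng.normal(np.log(20.0), 0.5, size=3000))
lo_np, hi_np = reference_interval_nonparametric(ggt)
lo_p, hi_p = reference_interval_parametric(ggt)
print(f"non-parametric: {lo_np:5.1f}-{hi_np:5.1f}   "
      f"parametric: {lo_p:5.1f}-{hi_p:5.1f}")
```

    On well-behaved data the two intervals nearly coincide; skewed analytes and outliers widen the non-parametric limits, which matches the study's observation for BMI-affected analytes.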

  9. Environmental assessment of general-purpose heat source safety verification testing

    International Nuclear Information System (INIS)

    1995-02-01

    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned

  10. Designing a Marketing Analytics Course for the Digital Age

    Science.gov (United States)

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  11. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two degrees of freedom system, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the adjacent structure (source) description is more or less unknown. Marchand formulated a conservative estimation of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand discussed the formal description of obtaining C2, using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2. However, finite element models are needed to compute the spectra of the PSD of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus patch models (parallel-oscillator representation) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given the CSMA
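    The way a C2 factor turns an acceleration specification into a force limit can be sketched with the semi-empirical form used in force-limited vibration practice (the NASA-HDBK-7004 shape). The numbers below are illustrative, not taken from the paper:

```python
def force_limit_psd(f, saa, C2, M0, f0, n=2):
    """Semi-empirical force limit:
        S_FF = C2 * M0^2 * S_AA             for f <= f0
        S_FF = C2 * M0^2 * S_AA * (f0/f)^n  for f >  f0
    f   : frequency (Hz);  saa : interface acceleration PSD
    C2  : dimensionless force-limit factor;  M0 : total load mass (kg)
    f0  : fundamental (turnover) frequency of the load (Hz)
    Output units follow the inputs (e.g. N^2/Hz if saa is in (m/s^2)^2/Hz).
    """
    sff = C2 * M0**2 * saa
    if f > f0:
        sff *= (f0 / f) ** n   # roll-off above the turnover frequency
    return sff

# Illustrative spec: flat PSD 0.04, 50 kg load, f0 = 80 Hz, C2 = 4.
for f in (20.0, 80.0, 320.0):
    print(f"{f:5.0f} Hz : {force_limit_psd(f, 0.04, 4.0, 50.0, 80.0):8.2f}")
```

    The whole debate the abstract describes (STDFS, CTDFS, Marchand's bound, the CSMA) is about how to choose C2 in this formula when the source is poorly known.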

  12. The effects of free recall testing on subsequent source memory.

    Science.gov (United States)

    Brewer, Gene A; Marsh, Richard L; Meeks, Joseph T; Clark-Foos, Arlo; Hicks, Jason L

    2010-05-01

    The testing effect is the finding that prior retrieval of information from memory will result in better subsequent memory for that material. One explanation for these effects is that initial free recall testing increases the recollective details for tested information, which then becomes more available during a subsequent test phase. In three experiments we explored this hypothesis using a source-monitoring test phase after the initial free recall tests. We discovered that memory is differentially enhanced for certain recollective details depending on the nature of the free recall task. Thus further research needs to be conducted to specify how different kinds of memorial details are enhanced by free recall testing.

  13. ALMA observations of lensed Herschel sources: testing the dark matter halo paradigm

    Science.gov (United States)

    Amvrosiadis, A.; Eales, S. A.; Negrello, M.; Marchetti, L.; Smith, M. W. L.; Bourne, N.; Clements, D. L.; De Zotti, G.; Dunne, L.; Dye, S.; Furlanetto, C.; Ivison, R. J.; Maddox, S. J.; Valiante, E.; Baes, M.; Baker, A. J.; Cooray, A.; Crawford, S. M.; Frayer, D.; Harris, A.; Michałowski, M. J.; Nayyeri, H.; Oliver, S.; Riechers, D. A.; Serjeant, S.; Vaccari, M.

    2018-04-01

    With the advent of wide-area submillimetre surveys, a large number of high-redshift gravitationally lensed dusty star-forming galaxies have been revealed. Because of the simplicity of the selection criteria for candidate lensed sources in such surveys, identified as those with S500 μm > 100 mJy, uncertainties associated with the modelling of the selection function are expunged. The combination of these attributes makes submillimetre surveys ideal for the study of strong lens statistics. We carried out a pilot study of the lensing statistics of submillimetre-selected sources by making observations with the Atacama Large Millimeter Array (ALMA) of a sample of strongly lensed sources selected from surveys carried out with the Herschel Space Observatory. We attempted to reproduce the distribution of image separations for the lensed sources using a halo mass function taken from a numerical simulation that contains both dark matter and baryons. We used three different density distributions, one based on analytical fits to the haloes formed in the EAGLE simulation and two density distributions [Singular Isothermal Sphere (SIS) and SISSA] that have been used before in lensing studies. We found that we could reproduce the observed distribution with all three density distributions, as long as we imposed an upper mass transition of ∼10¹³ M⊙ for the SIS and SISSA models, above which we assumed that the density distribution could be represented by a Navarro-Frenk-White profile. We show that we would need a sample of ∼500 lensed sources to distinguish between the density distributions, which is practical given the predicted number of lensed sources in the Herschel surveys.
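    For intuition about the image-separation distribution, the SIS model used here has a simple closed form: the separation is twice the Einstein radius, Δθ = 8π(σ/c)²·D_ls/D_s. The velocity dispersion and distance ratio below are illustrative choices:

```python
import math

def sis_image_separation(sigma_v, Dls_over_Ds):
    """Image separation (arcsec) for a Singular Isothermal Sphere lens:
        delta_theta = 8 * pi * (sigma_v / c)^2 * D_ls / D_s
    sigma_v      : lens velocity dispersion (km/s)
    Dls_over_Ds  : lens-source / observer-source angular-diameter
                   distance ratio (dimensionless)
    """
    c = 299792.458  # speed of light, km/s
    rad = 8.0 * math.pi * (sigma_v / c) ** 2 * Dls_over_Ds
    return math.degrees(rad) * 3600.0

# A 250 km/s galaxy lens halfway (in distance ratio) to the source:
print(f"{sis_image_separation(250.0, 0.5):.2f} arcsec")
```

    The quadratic dependence on σ is why the separation distribution is so sensitive to the halo mass (hence density-profile) distribution the paper is testing.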

  14. Magnetic anomaly depth and structural index estimation using different height analytic signals data

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian; Su, Chao

    2016-09-01

    This paper proposes a new semi-automatic inversion method for magnetic anomaly data interpretation that uses the combination of analytic signals of the anomaly at different heights to determine the depth and the structural index N of the sources. The new method utilizes analytic signals of the original anomaly at different heights to effectively suppress the noise contained in the anomaly. Compared with other high-order derivative calculation methods based on analytic signals, our method computes only first-order derivatives of the anomaly, which can be used to obtain more stable and accurate results. Tests on synthetic noise-free and noise-corrupted magnetic data indicate that the new method can estimate the depth and N efficiently. The technique is applied to a real measured magnetic anomaly in Southern Illinois caused by a known dike, and the result is in agreement with the drilling information and inversion results within acceptable calculation error.
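    The analytic-signal machinery underlying such methods can be sketched for a single profile: for 2D structures the vertical derivative is the Hilbert transform of the horizontal derivative, so the analytic signal amplitude is just the envelope of dT/dx, and it peaks directly over the source. The source model (a Poisson-kernel field) and depth below are illustrative assumptions, not the paper's models:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic total-field profile over a 2D line source at x0 = 0, depth z0 = 50 m.
x = np.linspace(-2000.0, 2000.0, 4001)   # 1 m station spacing
x0, z0 = 0.0, 50.0
T = z0 / ((x - x0) ** 2 + z0 ** 2)

# Horizontal derivative by finite differences; hilbert() supplies the
# quadrature component, so abs() gives the analytic signal amplitude.
dTdx = np.gradient(T, x)
AS = np.abs(hilbert(dTdx))

# For this field AS(x) ~ 1/((x - x0)^2 + z0^2): the peak sits over the
# source and the half-maximum half-width equals the depth z0.
i_peak = np.argmax(AS)
half = AS > AS[i_peak] / 2
width = x[half][-1] - x[half][0]
print(f"peak at x = {x[i_peak]:.0f} m, estimated depth = {width / 2:.0f} m")
```

    Depth rules of this kind are what the multi-height combination in the paper stabilizes against noise, since higher-order derivatives amplify it.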

  15. Orthodontic brackets removal under shear and tensile bond strength resistance tests – a comparative test between light sources

    International Nuclear Information System (INIS)

    Silva, P C G; Porto-Neto, S T; Lizarelli, R F Z; Bagnato, V S

    2008-01-01

    We investigated whether a new LED system delivers enough energy to promote adequate shear and tensile bond strength under standardized tests. LEDs emitting at 470 ± 10 nm can be used to photocure composite during bracket fixation. Demonstrated advantages in resistance to tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolars and two light sources were selected: a halogen lamp and an LED system. Premolar brackets were bonded with composite resin. Samples were submitted to standardized tests. The two sources gave similar results in the shear bond strength test; however, the tensile test showed distinct results: a statistically significant difference at the 1% level between exposure times (40 and 60 seconds) and an interaction between light source and exposure time. The best result was obtained with the halogen lamp at 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if its power density is increased.

  16. OpenBAN: An Open Building ANalytics Middleware for Smart Buildings

    Directory of Open Access Journals (Sweden)

    Pandarasamy Arjunan

    2016-03-01

    Towards the realization of smart building applications, buildings are increasingly instrumented with diverse sensors and actuators. These sensors generate large volumes of data which can be analyzed for optimizing building operations. Many building energy management tasks, such as energy forecasting and disaggregation, among others, require complex analytics leveraging collected sensor data. While several standalone and cloud-based systems for archiving, sharing and visualizing sensor data have emerged, their support for analyzing sensor data streams is primitive and limited to rule-based actions based on thresholds and simple aggregation functions. We develop OpenBAN, an open source sensor data analytics middleware for buildings, to make analytics an integral component of modern smart building applications. OpenBAN provides a framework of extensible sensor data processing elements for identifying various building contexts, which different applications can leverage. We validate the capabilities of OpenBAN by developing three representative real-world applications which are deployed in our test-bed buildings: (i) household energy disaggregation, (ii) detection of sprinkler usage from water meter data, and (iii) electricity demand forecasting. We also provide a preliminary performance evaluation of OpenBAN when deployed in the cloud and locally.

  17. Data from thermal testing of the Open Source Cryostage

    DEFF Research Database (Denmark)

    Buch, Johannes Lørup; Ramløv, Hans

    2016-01-01

    The data presented here is related to the research article "An open source cryostage and software analysis method for detection of antifreeze activity" (Buch and Ramløv, 2016) [1]. The design of the Open Source Cryostage (OSC) is tested in terms of thermal limits, thermal efficiency and electrical efficiency. This article furthermore includes an overview of the electrical circuitry and a flowchart of the software program controlling the temperature of the OSC. The thermal efficiency data is presented here as degrees per volt and maximum cooling capacity.

  18. Comparison of EPRI safety valve test data with analytically determined hydraulic results

    International Nuclear Information System (INIS)

    Smith, L.C.; Howe, K.S.

    1983-01-01

    NUREG-0737 (November 1980) and all subsequent U.S. NRC generic follow-up letters require that all operating plant licensees and applicants verify the acceptability of plant specific pressurizer safety valve piping systems for valve operation transients by testing. To aid in this verification process, the Electric Power Research Institute (EPRI) conducted an extensive testing program at the Combustion Engineering Test Facility. Pertinent tests simulating dynamic opening of the safety valves for representative upstream environments were carried out. Different models and sizes of safety valves were tested at the simulated operating conditions. Transducers placed at key points in the system monitored a variety of thermal, hydraulic and structural parameters. From these data, a more complete description of the transient can be made. The EPRI test configuration was analytically modeled using a one-dimensional thermal hydraulic computer program that uses the method of characteristics approach to generate key fluid parameters as a function of space and time. The conservation equations are solved by applying both the implicit and explicit characteristic methods. Unbalanced or wave forces were determined for each straight run of pipe bounded on each side by a turn or elbow. Blowdown forces were included, where appropriate. Several parameters were varied to determine the effects on the pressure, hydraulic forces and timings of events. By comparing these quantities with the experimentally obtained data, an approximate picture of the flow dynamics is arrived at. Two cases in particular are presented. These are the hot and cold loop seal discharge tests made with the Crosby 6M6 spring-loaded safety valve. Included in the paper is a description of the hydraulic code, modeling techniques and assumptions, a comparison of the numerical results with experimental data and a qualitative description of the factors which govern pipe support loading. (orig.)
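    The method-of-characteristics approach itself can be sketched for the simplest possible system: a single frictionless pipe between an upstream reservoir and a downstream valve that closes instantaneously. This is an illustration of the solution technique only, not the EPRI code, and all parameter values are invented:

```python
import numpy as np

# Single frictionless pipe, method of characteristics at Courant number 1.
L, a, g, A = 100.0, 1000.0, 9.81, 0.01   # length (m), wave speed (m/s), g, area (m^2)
N = 50                                   # computational reaches
dx = L / N
dt = dx / a                              # time step locked to the wave speed
B = a / (g * A)                          # characteristic impedance term

H0, Q0 = 50.0, 0.01                      # initial head (m) and flow (m^3/s)
H = np.full(N + 1, H0)
Q = np.full(N + 1, Q0)

peak = H0
for _ in range(4 * N):                   # march a couple of wave periods
    Hn, Qn = H.copy(), Q.copy()
    # Interior points: intersect the C+ (from i-1) and C- (from i+1) lines.
    Cp = H[:-2] + B * Q[:-2]
    Cm = H[2:] - B * Q[2:]
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Qn[1:-1] = (Cp - Cm) / (2 * B)
    # Upstream reservoir: fixed head; the C- characteristic gives the flow.
    Hn[0] = H0
    Qn[0] = (H0 - (H[1] - B * Q[1])) / B
    # Downstream valve closed at t = 0: zero flow; C+ gives the head.
    Qn[-1] = 0.0
    Hn[-1] = H[-2] + B * Q[-2]
    H, Q = Hn, Qn
    peak = max(peak, H[-1])

joukowsky = a * (Q0 / A) / g             # expected head rise for instant closure
print(f"peak head at valve: {peak:.1f} m (H0 + Joukowsky = {H0 + joukowsky:.1f} m)")
```

    The peak head reproduces the Joukowsky rise a·ΔV/g exactly, which is the standard sanity check before adding friction, flashing, or the loop-seal effects the EPRI tests involve.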

  19. Analytical model of nanoscale junctionless transistors towards controlling of short channel effects through source/drain underlap and channel thickness engineering

    Science.gov (United States)

    Roy, Debapriya; Biswas, Abhijit

    2018-01-01

    We develop a 2D analytical subthreshold model for nanoscale double-gate junctionless transistors (DGJLTs) with gate-source/drain underlap. The model is validated using a well-calibrated TCAD simulation deck obtained by comparison with experimental data in the literature. To analyze and control short-channel effects, we calculate the threshold voltage, drain-induced barrier lowering (DIBL) and subthreshold swing of DGJLTs using our model and compare them with the corresponding simulated values at a channel length of 20 nm, with channel thickness tSi ranging from 5 to 10 nm, gate-source/drain underlap (LSD) values of 0-7 nm and source/drain doping concentrations (NSD) ranging from 5 to 12 × 10¹⁸ cm⁻³. As tSi is reduced from 10 to 5 nm, DIBL drops from 42.5 to 0.42 mV/V at NSD = 10¹⁹ cm⁻³ and LSD = 5 nm, in contrast to a decrement from 71 to 4.57 mV/V without underlap. For a lower tSi, DIBL increases only marginally with increasing NSD. The subthreshold swing reduces more rapidly with thinning of the channel than with increasing LSD or decreasing NSD.

  20. Influence of test configuration on the combustion characteristics of polymers as ignition sources

    Science.gov (United States)

    Julien, Howard L.

    1993-01-01

    The experimental evaluation of polymers as ignition sources for metals was accomplished at the NASA White Sands Test Facility (WSTF) using a standard promoted combustion test. These tests involve the transient burning of materials in high-pressure oxygen environments. They have provided data from which design decisions can be made; data include video recordings of ignition and non-ignition for specific combinations of metals and polymers. Other tests provide the measured compositions of combustion products for polymers at select burn times and an empirical basis for estimating burn rates. With the current test configuration, the detailed analysis of test results requires modeling a three-dimensional, transient convection process involving fluid motion, thermal conduction and convection, the diffusion of chemical species, and the erosion of the sample surface. At the high-pressure extremes, it even requires the analysis of turbulent, transient convection, where the physics of the problem are not well known and the computation requirements are not practical at this time. An alternative test configuration that can be analyzed with a relatively simple convection model was developed during the summer period. The principal change consists of replacing a large-diameter polymer disk at the end of the metal test rod with coaxial polymer cylinders that have a diameter nearer to that of the metal rod. The experimental objective is to assess the influence of test geometry on the promotion of metal ignition by testing with different lengths of the polymer and, with an extended effort, to analyze the surface combustion in the redesigned promoted combustion tests through analytical modeling of the process. The analysis shall use the results of cone-calorimeter tests of the polymer material to model primary chemical reactions and, with proper design of the promoted combustion test, modeling of the convection process could be conveniently limited to a quasi-steady boundary layer.

  1. An analytical approach for the Propagation Saw Test

    Science.gov (United States)

    Benedetti, Lorenzo; Fischer, Jan-Thomas; Gaume, Johan

    2016-04-01

    The Propagation Saw Test (PST) [1, 2] is an experimental in-situ technique that has been introduced to assess crack propagation propensity in weak snowpack layers buried below cohesive snow slabs. This test has attracted the interest of a large number of practitioners, being relatively easy to perform and providing useful insights for the evaluation of snow instability. The PST procedure requires isolating a snow column 30 centimeters in width and at least 1 meter long in the downslope direction. Then, once the stratigraphy is known (e.g. from a manual snow profile), a saw is used to cut a weak layer which could fail, potentially leading to the release of a slab avalanche. If the length of the saw cut reaches the so-called critical crack length, the onset of crack propagation occurs. Furthermore, depending on snow properties, the crack in the weak layer can initiate the fracture and detachment of the overlying slab. Statistical studies over a large set of field data confirmed the relevance of the PST, highlighting the positive correlation between test results and the likelihood of avalanche release [3]. Recent works provided key information on the conditions for the onset of crack propagation [4] and on the evolution of slab displacement during the test [5]. In addition, experimental studies [6] and simplified models [7] focused on the qualitative description of snowpack properties leading to different failure types, namely full propagation or fracture arrest (with or without slab fracture). However, beside current numerical studies utilizing discrete element methods [8], only little attention has been devoted to a detailed analytical description of the PST able to give a comprehensive mechanical framework of the sequence of processes involved in the test. Consequently, this work aims to give a quantitative tool for an exhaustive interpretation of the PST, focusing attention on important parameters that influence the test outcomes. First, starting from a pure

  2. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    Science.gov (United States)

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013–2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription–polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 105–9.0 × 108 genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  3. Iterative and range test methods for an inverse source problem for acoustic waves

    International Nuclear Information System (INIS)

    Alves, Carlos; Kress, Rainer; Serranho, Pedro

    2009-01-01

    We propose two methods for solving an inverse source problem for time-harmonic acoustic waves. Based on the reciprocity gap principle a nonlinear equation is presented for the locations and intensities of the point sources that can be solved via Newton iterations. To provide an initial guess for this iteration we suggest a range test algorithm for approximating the source locations. We give a mathematical foundation for the range test and exhibit its feasibility in connection with the iteration method by some numerical examples
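The reciprocity-gap step of the method admits a compact numerical illustration. The sketch below uses the static (Laplace) analogue of the authors' time-harmonic acoustic setting, and every number in it is invented: for a single point source in the unit disk, testing the boundary Cauchy data against the harmonic functions v = 1 and v = x₁ + ix₂ recovers the intensity and location in closed form.

```python
import numpy as np

# Sketch of the reciprocity-gap idea for a SINGLE point source, using the
# static Laplace analogue:  -Δu = q·δ_z in the unit disk, with Cauchy data
# (u, ∂u/∂n) measured on the boundary.  For any harmonic test function v,
#   R(v) = ∮ (u ∂v/∂n - v ∂u/∂n) ds = q·v(z),
# so v = 1 yields the intensity q and v = x1 + i·x2 yields q·z.

q_true = 2.5
z_true = 0.3 - 0.2j                         # source location (complex notation)

theta = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
x = np.exp(1j * theta)                      # boundary points of the unit circle
r = x - z_true
u = -q_true / (2.0 * np.pi) * np.log(np.abs(r))          # boundary trace of u
grad = -q_true / (2.0 * np.pi) * r / np.abs(r) ** 2      # ∇u as complex numbers
dudn = np.real(np.conj(x) * grad)           # normal derivative, n = (cosθ, sinθ)

ds = 2.0 * np.pi / theta.size               # arclength weight (trapezoid rule)

def gap(v, dvdn):
    """Reciprocity-gap functional R(v) by periodic trapezoid quadrature."""
    return np.sum(u * dvdn - v * dudn) * ds

q_est = gap(np.ones_like(theta), np.zeros_like(theta))   # v = 1, ∂v/∂n = 0
z_est = gap(x, x) / q_est                   # v = x1 + i·x2 has ∂v/∂n = x here

print(q_est, z_est)                         # ≈ 2.5 and ≈ (0.3-0.2j)
```

With several sources this closed-form inversion no longer suffices, which is where the paper's Newton iteration (seeded by the range test) comes in.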

  4. Direct Assay of Filter Media following DEOX Testing

    International Nuclear Information System (INIS)

    R.P. Lind; J.J. Giglio; D.G. Cummings; M.W. Huntley; C.D. Morgan; K.J. Bateman; D.L. Wahlquist; D.A. Sell

    2007-01-01

    The direct assay of filter media by gamma spectrometry following DEOX testing has distinct advantages over analytical chemistry. Prior to using gamma spectrometry for the quantification of cesium (Cs-137), a calibration must be established with known sources since gamma spectrometry yields relative results. Quantitative analytical chemistry, in particular ICP-MS, has been performed on the filter media for comparison to the gamma spectrometry data. The correlation of gamma spectrometry to ICP-MS data is presented to justify the continued use of gamma spectrometry for filter media
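The calibrate-then-compare workflow the abstract describes can be sketched in a few lines. All numbers below are invented (the record reports no data); the point is only how a relative gamma-spectrometry result becomes an absolute activity through known sources, and how the two assays are then correlated.

```python
import numpy as np

# 1) Calibrate: net peak counts from Cs-137 check sources of known activity,
#    all counted for the same live time (hypothetical values).
known_activity_bq = np.array([1.0e3, 5.0e3, 1.0e4, 5.0e4])
net_counts = np.array([2.1e4, 1.04e5, 2.09e5, 1.05e6])

# Relative gamma result -> absolute activity via a least-squares efficiency
# factor (counts per Bq), forcing the fit through the origin.
eff = np.sum(net_counts * known_activity_bq) / np.sum(known_activity_bq ** 2)

# 2) Apply to filter-media samples and compare with ICP-MS-derived activities
#    (also invented here).
sample_counts = np.array([6.3e4, 3.2e5, 8.4e5])
gamma_bq = sample_counts / eff
icpms_bq = np.array([3.1e3, 1.5e4, 4.1e4])

r = np.corrcoef(gamma_bq, icpms_bq)[0, 1]   # correlation of the two assays
print(eff, gamma_bq, r)
```

A correlation coefficient near 1 between the two columns is the kind of evidence the record cites to justify continued use of the direct gamma assay.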

  5. Source passing test in Vesivehmaa air field - STUK/HUT team

    International Nuclear Information System (INIS)

    Honkamaa, T.; Tiilikainen, H.; Aarnio, P.; Nikkinen, M.

    1997-01-01

Carborne radiation monitors were tested for point-source responses at distances of 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m, at speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%), and a NaI(Tl) scintillator detector (size 5″x5″) were used. The sources had nominal activities of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was under the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for the PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with the PIC if the background is well known. (au)
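The statistical claim ("detectable ... if the background is well known") is the standard counting-statistics argument. A minimal sketch with invented numbers, assuming a Currie-style 95% decision threshold and pure 1/r² geometric falloff (no air attenuation):

```python
import math

# All numbers invented; effective counting time per pass is set by the
# vehicle speed and detector field of view.
background_cps = 50.0          # background rate in the 662 keV window
t = 1.8                        # effective counting time (s) for one pass
rate_at_10m = 400.0            # hypothetical net rate from the 137Cs source

# Currie-style decision threshold for a paired, well-known background:
# a net signal is "detected" when it exceeds k * sqrt(2 * B), k = 1.645 (95%).
b_counts = background_cps * t
threshold = 1.645 * math.sqrt(2.0 * b_counts)

def detectable(distance_m):
    """Point-source 1/r^2 falloff, normalized to the 10 m rate."""
    net = rate_at_10m * (10.0 / distance_m) ** 2 * t
    return net > threshold

max_d = max(d for d in range(10, 301) if detectable(d))
print(threshold, max_d)
```

With these invented inputs the maximum detection distance lands in the tens of metres, the same order as the distances reported for the PIC and spectrometers.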

  6. Feed Preparation for Source of Alkali Melt Rate Tests

    International Nuclear Information System (INIS)

    Stone, M. E.; Lambert, D. P.

    2005-01-01

The purpose of the Source of Alkali testing was to prepare feed for melt rate testing in order to determine the maximum melt-rate for a series of batches where the alkali was increased from 0% Na₂O in the frit (low washed sludge) to 16% Na₂O in the frit (highly washed sludge). This document summarizes the feed preparation for the Source of Alkali melt rate testing. The Source of Alkali melt rate results will be issued in a separate report. Five batches of Sludge Receipt and Adjustment Tank (SRAT) product and four batches of Slurry Mix Evaporator (SME) product were produced to support Source of Alkali (SOA) melt rate testing. Sludge Batch 3 (SB3) simulant and frit 418 were used as targets for the 8% Na₂O baseline run. For the other four cases (0% Na₂O, 4% Na₂O, 12% Na₂O, and 16% Na₂O in frit), special sludge and frit preparations were necessary. The sludge preparations mimicked washing of the SB3 baseline composition, while frit adjustments consisted of increasing or decreasing Na and then re-normalizing the remaining frit components. For all batches, the target glass compositions were identical. The five SRAT products were prepared for testing in the dry fed melt-rate furnace and the four SME products were prepared for the Slurry-fed Melt-Rate Furnace (SMRF). At the same time, the impacts of washing on a baseline composition from a Chemical Process Cell (CPC) perspective could also be investigated. Five process simulations (0% Na₂O in frit, 4% Na₂O in frit, 8% Na₂O in frit or baseline, 12% Na₂O in frit, and 16% Na₂O in frit) were completed in three identical 4-L apparatus to produce the five SRAT products. The SRAT products were later dried and combined with the complementary frits to produce identical glass compositions. All five batches were produced with identical processing steps, including off-gas measurement using online gas chromatographs. Two slurry-fed melter feed batches, a 4% Na₂O in frit run (less washed sludge combined with

  7. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    Science.gov (United States)

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  8. Special aerosol sources for certification and test of aerosol radiometers

    International Nuclear Information System (INIS)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E.

    1991-01-01

The results are presented of the development and practical application of new radionuclide source types (Special Aerosol Sources (SAS)) that meet international standard recommendations and are used for certification and testing of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotopic composition, certified against the USSR national radioactive aerosol standard or by means of a reference radiometer. The original source production technology allows the particular features of sampling to be taken into account, as well as the geometry and conditions of registration of radionuclide radiation in the sample for the given type of radiometer. (author)

  9. Special aerosol sources for certification and test of aerosol radiometers

    Energy Technology Data Exchange (ETDEWEB)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E. (Union Research Institute of Instrumentation, Moscow (USSR))

    1991-01-01

The results are presented of the development and practical application of new radionuclide source types (Special Aerosol Sources (SAS)) that meet international standard recommendations and are used for certification and testing of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotopic composition, certified against the USSR national radioactive aerosol standard or by means of a reference radiometer. The original source production technology allows the particular features of sampling to be taken into account, as well as the geometry and conditions of registration of radionuclide radiation in the sample for the given type of radiometer. (author).

  10. Evaluation and Testing of Several Free/Open Source Web Vulnerability Scanners

    OpenAIRE

    Suteva, Natasa; Zlatkovski, Dragi; Mileva, Aleksandra

    2013-01-01

Web Vulnerability Scanners (WVSs) are software tools for identifying vulnerabilities in web applications. There are commercial WVSs, free/open source WVSs, and some companies offer them as a Software-as-a-Service. In this paper, we test and evaluate six free/open source WVSs using the web application WackoPicko, which contains many known vulnerabilities, primarily for false negative rates.

  11. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    International Nuclear Information System (INIS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.; Roberts, Matthew

    2014-01-01

We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both to simulated emission patterns and to experimental data, exhibiting very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining the device performance.
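The extrema-angle part of the algorithm can be illustrated with the textbook two-beam condition for an emitter and its mirror image, with maxima where 2·n·d·cosθ = mλ; the refractive index, wavelength, and distance below are invented, and the actual method works with full measured emission patterns rather than a single angle.

```python
import math

# Invented layer parameters for a minimal "forward then invert" round trip.
n = 1.8                     # organic-layer refractive index (assumed)
wavelength_nm = 520.0       # vacuum emission wavelength (assumed)
d_true_nm = 150.0           # emitter-to-cathode distance used to make "data"

# Synthetic observation: internal angle of the m = 1 interference maximum,
# from the Bragg-like condition 2*n*d*cos(theta) = m*wavelength.
m = 1
theta = math.acos(m * wavelength_nm / (2.0 * n * d_true_nm))

# Inversion: recover the mean emitter position from the observed angle.
d_est_nm = m * wavelength_nm / (2.0 * n * math.cos(theta))
print(d_est_nm)             # recovers the 150 nm used above
```

The fringe-prominence part of the paper, which yields the distribution width from the damping of these extrema, has no such one-line inverse and is not sketched here.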

  12. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, Ariel, E-mail: ariel.epstein@utoronto.ca; Tessler, Nir, E-mail: nir@ee.technion.ac.il; Einziger, Pinchas D. [Department of Electrical Engineering, Technion-Israel Institute of Technology, Haifa 32000 (Israel); Roberts, Matthew, E-mail: mroberts@cdtltd.co.uk [Cambridge Display Technology Ltd, Building 2020, Cambourne Business Park, Cambourne, Cambridgeshire CB23 6DW (United Kingdom)

    2014-06-14

We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both to simulated emission patterns and to experimental data, exhibiting very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining the device performance.

  13. Preliminary Tests Of The Decris-sc Ion Source

    CERN Document Server

    Efremov, A; Bechterev, V; Bogomolov, S L; Bondarenko, P G; Datskov, V I; Dmitriev, S; Drobin, V; Lebedev, A; Leporis, M; Malinowski, H; Nikiforov, A; Paschenko, S V; Seleznev, V; Shishov, Yu A; Smirnov, Yu; Tsvineva, G; Yakovlev, B; Yazvitsky, N Yu

    2004-01-01

A new "liquid He-free" superconducting Electron Cyclotron Resonance Ion Source DECRIS-SC, to be used as injector for the IC-100 small cyclotron, has been designed by FLNR and LHE JINR. The main feature is that a compact refrigerator of Gifford-McMahon type is used to cool the solenoid coils. Because of the very small cooling power at 4.2 K (about 1 W), our efforts were directed at optimizing the magnetic structure and minimizing external heating of the coils. The maximum magnetic field strength is 3 T and 2 T in the injection and extraction regions, respectively. For radial plasma confinement a hexapole made of NdFeB permanent magnet is used. The source will be capable of ECR plasma heating using different frequencies (14 GHz or 18 GHz). To be able to deliver usable intensities of solids, the design also allows axial access for an evaporation oven and for metal samples using the plasma sputtering technique. Very preliminary results of the source tests are presented.

  14. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T.; Tiilikainen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P.; Nikkinen, M. [Helsinki Univ. of Technology, Espoo (Finland)

    1997-12-31

Carborne radiation monitors were tested for point source responses at distances 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m using speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillator detector (size 5″x5″) were used. The sources had a nominal activity of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was under the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with PIC if the background is well known. (au).

  15. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T; Tiilikainen, H [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P; Nikkinen, M [Helsinki Univ. of Technology, Espoo (Finland)

    1998-12-31

Carborne radiation monitors were tested for point source responses at distances 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m using speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillator detector (size 5″x5″) were used. The sources had a nominal activity of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was under the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with PIC if the background is well known. (au).

  16. 8. All Polish Conference on Analytical Chemistry: Analytical Chemistry for the Community of the 21. Century

    International Nuclear Information System (INIS)

    Koscielniak, P.; Wieczorek, M.; Kozak, J.

    2010-01-01

The Book of Abstracts contains short descriptions of the lectures, communications and posters presented during the 8th All Polish Conference on Analytical Chemistry (Cracow, 4-9.07.2010). The scientific programme covered: basic analytical problems, sample preparation, chemometrics and metrology, miniaturization of analytical procedures, environmental analysis, medicinal analyses, industrial analyses, food analyses, biochemical analyses, and analysis of relicts of the past. Several posters were devoted to radiochemical separations, radiochemical analysis, the environmental behaviour of elements important for nuclear science, and proficiency tests.

  17. Analytical prediction of fuel assembly spacer grid loss coefficient

    International Nuclear Information System (INIS)

    Lim, J. S.; Nam, K. I.; Park, S. K.; Kwon, J. T.; Park, W. J.

    2002-01-01

An analytical model for predicting the fuel assembly spacer grid pressure loss coefficient has been studied. The pressure loss in the gap between the test section wall and the spacer grid was separated from the current model, and a friction drag coefficient on the spacer straps different from that of the high Reynolds number region was used for the low Reynolds number region. The analytical model has been verified against hydraulic pressure drop test results for spacer grids of three types for 5x5 and 16x16 (or 17x17) arrays. The analytical model predicts the pressure loss coefficients obtained from the test results within maximum errors of 12% and 7% for the 5x5 test bundle and the full-size bundle, respectively, at a Reynolds number of 500,000 at the core operating condition. This result shows that the analytical model can be used for research on and design changes of the nuclear fuel assembly
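For reference, the loss coefficient being predicted is the standard dimensionless pressure drop K = Δp/(½ρv²). A sketch with invented hot-loop conditions, chosen only so the Reynolds number lands near the operating point quoted in the abstract:

```python
# All values below are assumptions for illustration, not data from the record.
rho = 720.0          # coolant density, kg/m^3 (assumed hot PWR conditions)
v = 5.0              # bundle-average axial velocity, m/s (assumed)
mu = 9.0e-5          # dynamic viscosity, Pa*s (assumed)
d_h = 0.0118         # hydraulic diameter, m (assumed rod-bundle value)

dp_grid = 8.1e3      # measured pressure drop across one spacer grid, Pa

# Loss coefficient: pressure drop normalized by the dynamic pressure.
K = dp_grid / (0.5 * rho * v ** 2)

# Reynolds number based on the hydraulic diameter.
Re = rho * v * d_h / mu

print(K, Re)
```

Separating the wall-gap loss, as the abstract describes, amounts to subtracting that contribution from dp_grid before forming K.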

  18. Seismic II over I Drop Test Program results and interpretation

    International Nuclear Information System (INIS)

    Thomas, B.

    1993-03-01

The consequences of non-seismically qualified (Category 2) objects falling and striking essential seismically qualified (Category 1) objects have always been a significant, yet analytically difficult, problem, particularly in evaluating the potential damage to equipment that may result from earthquakes. Analytical solutions for impact problems are conservative and available mostly for simple configurations. In a nuclear facility, the "sources" and "targets" requiring evaluation are frequently irregular in shape and configuration, making calculations and computer modeling difficult. Few industry or regulatory rules are available on this topic even though it is a source of considerable construction upgrade costs. A drop test program was recently conducted to develop a more accurate understanding of the consequences of seismic interactions. The resulting data can be used as a means to improve the judgment of seismic qualification engineers performing interaction evaluations and to develop realistic design criteria for seismic interactions. Impact tests on various combinations of sources and targets commonly found in one Savannah River Site (SRS) nuclear facility were performed by dropping the sources from various heights onto the targets. This report summarizes the results of the Drop Test Program. Force and acceleration time history data are presented, as well as general observations on the overall ruggedness of various targets when subjected to impacts from different types of sources

  19. Design of the 'half-size' ITER neutral beam source for the test facility ELISE

    International Nuclear Information System (INIS)

    Heinemann, B.; Falter, H.; Fantz, U.; Franzen, P.; Froeschle, M.; Gutser, R.; Kraus, W.; Nocentini, R.; Riedl, R.; Speth, E.; Staebler, A.; Wuenderlich, D.; Agostinetti, P.; Jiang, T.

    2009-01-01

In 2007 the radio frequency driven negative hydrogen ion source developed at IPP in Garching was chosen by the ITER board as the new reference source for the ITER neutral beam system. In order to support the design, commissioning and operating phases of the ITER test facilities ISTF and NBTF in Padua, IPP is presently constructing a new test facility, ELISE (Extraction from a Large Ion Source Experiment). ELISE will be operated with the so-called 'half-size ITER source', which is an intermediate step between the present small IPP RF sources (1/8 ITER size) and the full-size ITER source. The source will have approximately the width but only half the height of the ITER source. The modular concept with 4 drivers will allow an easy extrapolation to the full ITER size with 8 drivers. Pulsed beam extraction and acceleration up to 60 kV (corresponding to the pre-acceleration voltage of SINGAP) are foreseen. The aim of the design of the ELISE source and extraction system was to be as close as possible to the ITER design; however, it has some modifications allowing better diagnostic access as well as more flexibility for exploring open questions. Therefore one major difference compared to the source of ITER, NBTF or ISTF is the possible operation in air. Specific requirements for RF sources as found at the IPP test facilities BATMAN and MANITU are implemented [A. Staebler, et al., Development of a RF-driven ion source for the ITER NBI system, SOFT Conference 2008, Fusion Engineering and Design, 84 (2009) 265-268].

  20. Direct assay of filter media following DEOX testing

    International Nuclear Information System (INIS)

    Westphal, B.R.; Lind, R.P.; Giglio, J.J.; Cummings, D.G.; Huntley, M.W.; Morgan, C.D.; Bateman, K.J.; Wahlquist, D.L.; Sell, D.A.

    2007-01-01

The direct assay of filter media by gamma spectrometry following DEOX testing has distinct advantages over analytical chemistry. Prior to using gamma spectrometry for the quantification of cesium (Cs-137), a calibration must be established with known sources since gamma spectrometry yields relative results. Quantitative analytical chemistry, in particular ICP-MS, has been performed on the filter media for comparison to the gamma spectrometry data. The correlation of gamma spectrometry to ICP-MS data is presented to justify the continued use of gamma spectrometry for filter media. (authors)

  1. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  2. Induced over voltage test on transformers using enhanced Z-source inverter based circuit

    Science.gov (United States)

    Peter, Geno; Sherine, Anli

    2017-09-01

The normal life of a transformer is well above 25 years. The economical operation of the distribution system has its roots in the equipment being used, the economics being such that it is financially advantageous to replace transformers with more than 15 years of service in the second perennial market. Testing of transformers is required, as it indicates the extent to which a transformer can comply with the customer's specified requirements and the respective standards (IEC 60076-3). In this paper, induced overvoltage testing of transformers using an enhanced Z-source inverter is discussed. Power electronic circuits are now essential for a whole array of industrial electronic products. The bulky motor-generator set that was used to generate the required frequency for induced overvoltage testing of transformers is nowadays replaced by a static frequency converter. First a conventional Z-source inverter and then an enhanced Z-source inverter are used to generate the required voltage and frequency for the induced overvoltage test of the transformer, and the characteristics are analysed.

  3. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  4. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    Science.gov (United States)

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵-9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  5. APL/JHU free flight tests of the General Purpose Heat Source module. Testing: 5-7 March 1984

    International Nuclear Information System (INIS)

    Baker, W.M. II.

    1984-01-01

    Purpose of the test was to obtain statistical information on the dynamics of the General Purpose Heat Source (GPHS) module at terminal speeds. Models were designed to aerodynamically and dynamically represent the GPHS module. Normal and high speed photographic coverage documented the motion of the models. This report documents test parameters and techniques for the free-spin tests. It does not include data analysis

  6. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    International Nuclear Information System (INIS)

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO₂ matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO₂ matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO₂ matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO₂ matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs
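The congruent-release definition in this abstract reduces to a one-line scaling: a nuclide leaves the UO₂ matrix at the matrix dissolution rate times its mass fraction relative to uranium. A minimal sketch with invented inventory numbers (not taken from the AREST analyses):

```python
def congruent_release_rate(m_nuclide: float, m_uranium: float,
                           uranium_release_rate: float) -> float:
    """Mass release rate of a nuclide released congruently with uranium:
    the nuclide leaves the UO2 matrix in its matrix mass ratio."""
    return (m_nuclide / m_uranium) * uranium_release_rate

m_u = 460.0e3        # g of uranium per waste package (invented)
m_tc99 = 380.0       # g of Tc-99 per package (invented)
r_u = 2.0            # g of U released from the matrix per year (invented)

rate = congruent_release_rate(m_tc99, m_u, r_u)
print(rate)          # ≈ 1.65e-3 g/yr of Tc-99
```

In the incongruent scenario this scaling is replaced by the nuclide's own solubility limit, which is why the two cases are run separately in the code.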

  7. Analytic solutions of a class of nonlinearly dynamic systems

    International Nuclear Information System (INIS)

    Wang, M-C; Zhao, X-S; Liu, X

    2008-01-01

In this paper, the homotopy perturbation method (HPM) is applied to solve a coupled system of two nonlinear differential equations: a first-order model similar to the Lotka-Volterra equations and a Bratu equation with a source term. The analytic approximate solutions are derived. Furthermore, comparison of the analytic approximate solutions obtained by the HPM with the exact solutions reveals that the present method works efficiently.

  8. Getting Open Source Right for Big Data Analytics: Software Sharing, Governance, Collaboration and Most of All, Fun!

    Science.gov (United States)

    Mattmann, C. A.

    2013-12-01

    A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies, offerings from commercial companies building products around these tools e.g., Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities including low latency support/in-memory, versus record oriented file I/O, high availability, support for the Map Reduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products - they are all released under an open source license e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache, or Berkeley AMPLab; all are developed collaboratively, and all technologies provide plug in architecture models and methodologies for allowing others to contribute, and participate via various community models. This talk will cover the open source aspects and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration including DARPA's XDATA program; NASA's CMAC program; NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.

  9. Make-up of injector test stand (ITS-1) and preliminary results with Model-I ion source

    International Nuclear Information System (INIS)

    Matsuda, S.; Ito, T.; Kondo, U.; Ohara, Y.; Oga, T.; Shibata, T.; Shirakata, H.; Sugawara, T.; Tanaka, S.

The constitution of the first injector test stand (ITS-1) in the Thermonuclear Division, JAERI, and the performance of the Model-I ion source are described. Heating a plasma by neutral beam injection is one of the most promising means in thermonuclear fusion devices, and the purpose of the test stand is to develop the ion sources used in such injection systems. The test stand, completed in February 1975, is capable of testing ion sources up to 12 A at 30 kV. A hydrogen ion beam of 5.5 A at 25 kV was obtained with the Model-I ion source.

  10. Latent structure of the Wisconsin Card Sorting Test: a confirmatory factor analytic study.

    Science.gov (United States)

    Greve, Kevin W; Stickle, Timothy R; Love, Jeffrey M; Bianchini, Kevin J; Stanford, Matthew S

    2005-05-01

The present study represents the first large-scale confirmatory factor analysis of the Wisconsin Card Sorting Test (WCST). The results generally support the three-factor solutions reported in the exploratory factor analysis literature. However, only the first factor, which reflects general executive functioning, is statistically sound. The secondary factors, while likely reflecting meaningful cognitive abilities, are less stable except when all subjects complete all 128 cards. It is likely that having two discontinuation rules for the WCST has contributed to the varied factor-analytic solutions reported in the literature, and early discontinuation may result in some loss of useful information. Continued multivariate research will be necessary to better clarify the processes underlying WCST performance and their relationships to one another.

  11. Preservatives and neutralizing substances in milk: analytical sensitivity of official specific and nonspecific tests, microbial inhibition effect, and residue persistence in milk

    Directory of Open Access Journals (Sweden)

    Livia Cavaletti Corrêa da Silva

    2015-09-01

Milk fraud has been a recurring problem in Brazil; thus, it is important to know the effect of the most frequently used preservatives and neutralizing substances as well as the detection capability of the official tests. The objective of this study was to evaluate the analytical sensitivity of the tests described in legislation and of nonspecific microbial inhibition tests, and to investigate the effect of such substances on microbial growth inhibition and the persistence of detectable residues after 24/48 h of refrigeration. Batches of raw milk, free from any contaminant, were divided into aliquots and mixed with different concentrations of formaldehyde, hydrogen peroxide, sodium hypochlorite, chlorine, chlorinated alkaline detergent, or sodium hydroxide. The analytical sensitivity of the official tests was 0.005%, 0.003%, and 0.013% for formaldehyde, hydrogen peroxide, and hypochlorite, respectively. Chlorine and chlorinated alkaline detergent were not detected by regulatory tests. In the tests for neutralizing substances, sodium hydroxide could not be detected when acidity was accurately neutralized. The yogurt culture test gave results similar to those obtained by official tests for the detection of specific substances. Concentrations of 0.05% of formaldehyde, 0.003% of hydrogen peroxide and 0.013% of sodium hypochlorite significantly reduced (P

  12. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  13. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so analytical chemistry is an essential part of the technical sciences and disciplines. The language of Croatian science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists of independent Croatia this is a duty, because language is one of the most important features of Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan, and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classical analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić, and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology of instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West, and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  14. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated at 26-69%. Thus, the pre-analytical variation was three- to sevenfold larger than the analytical variation (7-13%) and hence the dominant component of the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in setting the 95% uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
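The variance bookkeeping described in this abstract (recovering the pre-analytical component after subtracting the analytical one, then setting a ±2CV_T interval) can be sketched numerically. The numbers below are illustrative stand-ins in the reported ranges, not the paper's data; the standard assumption is that independent variance components add in quadrature.

```python
import math

def pre_analytical_cv(cv_total, cv_analytical):
    """Recover the pre-analytical CV assuming independent components:
    CV_T**2 = CV_A**2 + CV_P**2."""
    if cv_total < cv_analytical:
        raise ValueError("total CV cannot be smaller than the analytical CV")
    return math.sqrt(cv_total**2 - cv_analytical**2)

def expanded_interval(concentration, cv_total, k=2):
    """Uncertainty interval (+/- k*CV_T, ~95% for k=2) around a measurement."""
    half_width = k * cv_total * concentration
    return (concentration - half_width, concentration + half_width)

# Illustrative values in the reported ranges: CV_A ~ 10%, CV_T ~ 30%
cv_p = pre_analytical_cv(0.30, 0.10)       # dominant component, ~0.28
low, high = expanded_interval(1.0, 0.30)   # hypothetical 1.0 ng/mg result
```

Note how a CV_A of 10% barely shrinks a CV_T of 30%: the quadrature sum is why the pre-analytical term dominates the budget.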

  15. Comparison of General Purpose Heat Source testing with the ANSI N43.6-1977 (R 1989) sealed source standard

    International Nuclear Information System (INIS)

    Grigsby, C.O.

    1998-01-01

This analysis provides a comparison of the testing of Radioisotope Thermoelectric Generators (RTGs) and RTG components with the testing requirements of ANSI N43.6-1977 (R1989), ''Sealed Radioactive Sources, Categorization''. The purpose of this comparison is to demonstrate that the RTGs meet or exceed the requirements of the ANSI standard, and thus can be excluded from the radioactive inventory of the Chemistry and Metallurgy Research (CMR) building in Los Alamos per Attachment 1 of DOE STD 1027-92. The approach used in this analysis is as follows: (1) describe the ANSI sealed source classification methodology; (2) develop sealed source performance requirements for the RTG and/or RTG components based on criteria from the accident analysis for CMR; (3) compare the existing RTG or RTG component test data to the CMR requirements; and (4) determine the appropriate ANSI classification for the RTG and/or RTG components based on CMR performance requirements. The CMR requirements for treating RTGs as sealed sources are derived from the radiotoxicity of the isotope (238Pu) and the amount (13 kg) of radioactive material contained in the RTG. The accident analysis for the CMR BIO identifies the bounding accidents as wing-wide fire, explosion, and earthquake. These accident scenarios set the requirements for RTGs or RTG components stored within the CMR.

  16. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  17. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  18. Recent UCN source developments at Los Alamos

    International Nuclear Information System (INIS)

    Seestrom, S.J.; Anaya, J.M.; Bowles, T.J.

    1998-01-01

The most intense sources of ultracold neutrons (UCN) have been built at reactors, where the high average thermal neutron flux can overcome the low UCN production rate to achieve usable densities of UCN. At spallation neutron sources the average flux available is much lower than at a reactor, though the peak flux can be comparable or higher. The authors have built a UCN source that attempts to take advantage of the high peak flux available at the short-pulse spallation neutron source at the Los Alamos Neutron Science Center (LANSCE) to generate a useful number of UCN. In the source, UCN are produced by Doppler-shifted Bragg scattering, which converts 400-m/s neutrons down into the UCN regime. This source was initially tested in 1996, and various improvements were made based on the results of the 1996 running. These improvements were implemented and tested in 1997. In sections 2 and 3 they discuss the improvements that have been made and the resulting source performance. Recently an even more interesting concept was put forward by Serebrov et al. This involves combining a solid-deuterium UCN source, previously studied by Serebrov et al., with a pulsed spallation source to achieve world-record UCN densities. They have initiated a program of calculations and measurements aimed at verifying the solid-deuterium UCN source concept. The approach has been to develop an analytical capability, combined with Monte Carlo calculations of neutron production, and to perform benchmark experiments to verify the validity of the calculations. Based on the calculations and measurements they plan to test a modified version of the Serebrov UCN factory. They estimate that they could produce over 1,000 UCN/cc in a 15-liter volume, using 1 microamp of 800 MeV protons for two seconds every 500 seconds. They will discuss the results of UCN production measurements in section 4.

  19. Low-Energy Microfocus X-Ray Source for Enhanced Testing Capability in the Stray Light Facility

    Science.gov (United States)

    Gaskin, Jessica; O'Dell, Stephen; Kolodziejczak, Jeff

    2015-01-01

Research toward high-resolution, soft x-ray optics (mirrors and gratings) necessary for the next generation of large x-ray observatories requires x-ray testing using a low-energy x-ray source with fine angular size (energy microfocus (approximately 0.1 mm spot) x-ray source from TruFocus Corporation that mates directly to the Stray Light Facility (SLF). MSFC X-ray Astronomy team members are internationally recognized for their expertise in the development, fabrication, and testing of grazing-incidence optics for x-ray telescopes. One of the key MSFC facilities for testing novel x-ray instrumentation is the SLF, an approximately 100-m-long beam line equipped with multiple x-ray sources and detectors. This new source adds to the already robust complement of instrumentation, allowing MSFC to support additional internal and community x-ray testing needs.

  20. What makes us think? A three-stage dual-process model of analytic engagement.

    Science.gov (United States)

    Pennycook, Gordon; Fugelsang, Jonathan A; Koehler, Derek J

    2015-08-01

    The distinction between intuitive and analytic thinking is common in psychology. However, while often being quite clear on the characteristics of the two processes ('Type 1' processes are fast, autonomous, intuitive, etc. and 'Type 2' processes are slow, deliberative, analytic, etc.), dual-process theorists have been heavily criticized for being unclear on the factors that determine when an individual will think analytically or rely on their intuition. We address this issue by introducing a three-stage model that elucidates the bottom-up factors that cause individuals to engage Type 2 processing. According to the model, multiple Type 1 processes may be cued by a stimulus (Stage 1), leading to the potential for conflict detection (Stage 2). If successful, conflict detection leads to Type 2 processing (Stage 3), which may take the form of rationalization (i.e., the Type 1 output is verified post hoc) or decoupling (i.e., the Type 1 output is falsified). We tested key aspects of the model using a novel base-rate task where stereotypes and base-rate probabilities cued the same (non-conflict problems) or different (conflict problems) responses about group membership. Our results support two key predictions derived from the model: (1) conflict detection and decoupling are dissociable sources of Type 2 processing and (2) conflict detection sometimes fails. We argue that considering the potential stages of reasoning allows us to distinguish early (conflict detection) and late (decoupling) sources of analytic thought. Errors may occur at both stages and, as a consequence, bias arises from both conflict monitoring and decoupling failures. Copyright © 2015 Elsevier Inc. All rights reserved.
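The three stages above read like a processing pipeline, so the model's flow can be sketched as a toy state machine. Everything in this sketch (the response names, the boolean flags standing in for probabilistic stage outcomes) is a hypothetical illustration of the described flow, not code or data from the paper.

```python
def reason(type1_outputs, detects_conflict, decouples):
    """Toy walk through the three-stage dual-process model.

    type1_outputs:    intuitive responses cued by the stimulus (Stage 1),
                      strongest first.
    detects_conflict: whether conflict monitoring succeeds (Stage 2 can fail).
    decouples:        whether Type 2 processing falsifies the intuition
                      (decoupling) rather than verifying it post hoc
                      (rationalization) (Stage 3).
    Returns (response, process_label).
    """
    dominant = type1_outputs[0]                # strongest cued intuition
    conflict = len(set(type1_outputs)) > 1     # competing Type 1 outputs?
    if not conflict or not detects_conflict:
        return dominant, "Type 1 (no conflict detected)"
    if decouples:
        alternative = next(r for r in type1_outputs if r != dominant)
        return alternative, "Type 2 (decoupling)"
    return dominant, "Type 2 (rationalization)"

# Conflict problem: stereotype cues "engineer", base rates cue "lawyer";
# detection failure reproduces a biased answer despite the conflict.
biased = reason(["engineer", "lawyer"], detects_conflict=False, decouples=True)
reflective = reason(["engineer", "lawyer"], detects_conflict=True, decouples=True)
```

The two failure modes the paper distinguishes fall out directly: an error can arise because `detects_conflict` is false (monitoring failure) or because `decouples` is false (rationalization of the intuition).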

  1. Rationale for a spallation neutron source target system test facility at the 1-MW Long-Pulse Spallation Source

    International Nuclear Information System (INIS)

    Sommer, W.F.

    1995-12-01

    The conceptual design study for a 1-MW Long-Pulse Spallation Source at the Los Alamos Neutron Science Center has shown the feasibility of including a spallation neutron test facility at a relatively low cost. This document presents a rationale for developing such a test bed. Currently, neutron scattering facilities operate at a maximum power of 0.2 MW. Proposed new designs call for power levels as high as 10 MW, and future transmutation activities may require as much as 200 MW. A test bed will allow assessment of target neutronics; thermal hydraulics; remote handling; mechanical structure; corrosion in aqueous, non-aqueous, liquid metal, and molten salt systems; thermal shock on systems and system components; and materials for target systems. Reliable data in these areas are crucial to the safe and reliable operation of new high-power facilities. These tests will provide data useful not only to spallation neutron sources proposed or under development, but also to other projects in accelerator-driven transmutation technologies such as the production of tritium

  2. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and dramatically indicated the true cause of failure, impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulse arc technique did not have this impurity buildup in the ramp-down zone

  3. Electron capture detector based on a non-radioactive electron source: operating parameters vs. analytical performance

    Directory of Open Access Journals (Sweden)

    E. Bunert

    2017-12-01

Gas chromatographs with electron capture detectors are widely used for the analysis of electron-affine substances such as pesticides or chlorofluorocarbons. With detection limits in the low pptv range, electron capture detectors are the most sensitive detectors available for such compounds. Based on their operating principle, they require free electrons at atmospheric pressure, which are usually generated by a β− decay. However, the use of radioactive materials leads to regulatory restrictions regarding purchase, operation, and disposal. Here, we present a novel electron capture detector based on a non-radioactive electron source that shows detection limits similar to those of radioactive detectors but is not subject to these limitations and offers further advantages such as adjustable electron densities and energies. In this work we show first experimental results using 1,1,2-trichloroethane and sevoflurane, and investigate the effect of several operating parameters on the analytical performance of this new non-radioactive electron capture detector (ECD).

  4. Preparation of tracing source layer in simulation test of nuclide migration

    International Nuclear Information System (INIS)

    Zhao Yingjie; Ni Shiwei; Li Weijuan; Yamamoto, T.; Tanaka, T.; Komiya, T.

    1993-01-01

In cooperative research between CIRP and JAERI on safety assessment for shallow land disposal of low-level radioactive waste, a laboratory simulation test of nuclide migration was carried out, in which an undisturbed loess soil column sampled from CIRP's field test site was used as the testing material and three nuclides, Sr-85, Cs-137, and Co-60, were used as tracers. A special experiment on the tracing method was carried out, which included measuring the pH value of quartz sand in HCl solution, determining the eligible water content of quartz sand as a tracer carrier, measuring the distribution uniformity of nuclides in the tracing quartz sand, determining the elution rate of nuclides from the tracing quartz sand, and detecting the activity uniformity of the tracing source layer. The experimental results showed that the tracing source layer, in which fine quartz sand was used as the tracer carrier, satisfied the expected requirements. (1 fig.)

  5. A Simple Analytical Model for Predicting the Detectable Ion Current in Ion Mobility Spectrometry Using Corona Discharge Ionization Sources

    Science.gov (United States)

    Kirk, Ansgar Thomas; Kobelt, Tim; Spehlbrink, Hauke; Zimmermann, Stefan

    2018-05-01

Corona discharge ionization sources are often used in ion mobility spectrometers (IMS) when a non-radioactive ion source with high ion currents is required. Typically, the corona discharge is followed by a reaction region where analyte ions are formed from the reactant ions. In this work, we present a simple yet sufficiently accurate model for predicting the ion current available at the end of this reaction region when operating at reduced pressure, as in High Kinetic Energy Ion Mobility Spectrometers (HiKE-IMS) or most IMS-MS instruments. It yields excellent qualitative agreement with measurement results and is even able to calculate the ion current within an error of 15%. Additional interesting findings of this model are that the ion current at the end of the reaction region is independent of the ion current generated by the corona discharge, and that the ion current in HiKE-IMS grows quadratically when scaling down the length of the reaction region.
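The finding that the detectable ion current does not depend on the corona current is what one expects if ion-ion recombination dominates the reaction region. A minimal sketch, assuming simple second-order recombination dn/dt = -α n² (my assumption for illustration, not necessarily the authors' exact model): the density after a drift time t approaches the recombination limit 1/(α t) regardless of the initial density delivered by the corona.

```python
def density_after_drift(n0, alpha, t):
    """Analytical solution of dn/dt = -alpha * n**2:
    n(t) = n0 / (1 + alpha * n0 * t), approaching 1/(alpha*t) for large n0*t."""
    return n0 / (1.0 + alpha * n0 * t)

alpha = 1e-6   # recombination coefficient (arbitrary units)
t = 1e-3       # drift time through the reaction region (arbitrary units)

# Two corona settings differing by a factor of 100 in generated ion density...
n_weak = density_after_drift(1e11, alpha, t)
n_strong = density_after_drift(1e13, alpha, t)

# ...both land within ~1% of the recombination limit 1/(alpha*t)
limit = 1.0 / (alpha * t)
```

The same expression gives a surviving density roughly proportional to 1/t, i.e., a strong gain from shortening the reaction region, consistent in spirit with the scaling behavior reported in the abstract.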

  6. Mixed analytical-stochastic simulation method for the recovery of a Brownian gradient source from probability fluxes to small windows.

    Science.gov (United States)

    Dobramysl, U; Holcman, D

    2018-02-15

    Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
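A one-dimensional analog conveys the core idea of recovering a source position from absorption fluxes. For Brownian motion started at x between two absorbing boundaries a and b, the splitting probability of reaching b first is (x - a)/(b - a) (the classical gambler's-ruin result), so inverting the measured flux fraction recovers the source position. The simulation below is my illustrative sketch of that inversion, not the paper's two-dimensional asymptotics.

```python
import random

def splitting_fraction(x, a, b, n_walkers=2000, n_sites=20, rng=None):
    """Monte Carlo estimate of the probability that a Brownian walker started
    at x hits the absorbing boundary b before a, discretized as a symmetric
    random walk on a lattice of n_sites intervals."""
    rng = rng or random.Random(0)
    start = round((x - a) / (b - a) * n_sites)
    hits_b = 0
    for _ in range(n_walkers):
        s = start
        while 0 < s < n_sites:
            s += 1 if rng.random() < 0.5 else -1
        hits_b += (s == n_sites)
    return hits_b / n_walkers

# "Measure" the flux fraction to window b, then invert the analytical
# splitting probability P(hit b first) = (x - a) / (b - a)
a, b, x_true = 0.0, 1.0, 0.3
frac = splitting_fraction(x_true, a, b)
x_recovered = a + frac * (b - a)
```

The recovered position agrees with the true one up to Monte Carlo noise; in higher dimensions the inversion requires the asymptotic flux formulas the paper derives rather than this closed-form linear law.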

  7. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    Science.gov (United States)

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

An analytical method is developed for the light distribution properties on the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. Mirror transformation matrices and a special algorithm for removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is demonstrated by Monte Carlo ray tracing, and four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and the Gaussian light source. The analytical approach will be useful for designing and choosing hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.
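The mirror-transformation step of such an analysis, locating virtual images of the source by reflecting it across each flat mirror wall, can be sketched for the 2D cross-section. The reflection formula is standard geometry; the hexagonal cross-section and source coordinates below are hypothetical, and the sketch stops at first-order images (no void-image removal).

```python
import math

def reflect_across_line(p, q1, q2):
    """Reflect point p across the (mirror) line through q1 and q2: the
    virtual-image position of a source seen in that flat mirror."""
    (px, py), (x1, y1), (x2, y2) = p, q1, q2
    dx, dy = x2 - x1, y2 - y1
    # project p onto the line, then mirror p through the foot point
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    fx, fy = x1 + t * dx, y1 + t * dy
    return (2 * fx - px, 2 * fy - py)

def polygon_vertices(n, radius=1.0):
    """Vertices of a regular n-sided cross-section (the hollow pipe wall)."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

# First-order virtual images of an off-axis source in a hexagonal pipe
verts = polygon_vertices(6)
source = (0.1, 0.0)
images = [reflect_across_line(source, verts[k], verts[(k + 1) % 6])
          for k in range(6)]
```

Higher-order images come from reflecting these images again across the walls, which is where the paper's algorithm for discarding void (non-physical) virtual images becomes necessary.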

  8. International Test Comparisons: Reviewing Translation Error in Different Source Language-Target Language Combinations

    Science.gov (United States)

    Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming

    2018-01-01

    This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…

  9. Nuclear analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  10. Nuclear analytical chemistry

    International Nuclear Information System (INIS)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection

  11. Ultracold neutron source at the PULSTAR reactor: Engineering design and cryogenic testing

    Energy Technology Data Exchange (ETDEWEB)

    Korobkina, E., E-mail: ekorobk@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Medlin, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Wehring, B.; Hawari, A.I. [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Huffman, P.R.; Young, A.R. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Beaumont, B. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Palmquist, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States)

    2014-12-11

    Construction is completed and commissioning is in progress for an ultracold neutron (UCN) source at the PULSTAR reactor on the campus of North Carolina State University. The source utilizes two stages of neutron moderation, one in heavy water at room temperature and the other in solid methane at ∼40 K, followed by a converter stage, solid deuterium at 5 K, that allows a single down scattering of cold neutrons to provide UCN. The UCN source rolls into the thermal column enclosure of the PULSTAR reactor, where neutrons will be delivered from a bare face of the reactor core by streaming through a graphite-lined assembly. The source infrastructure, i.e., graphite-lined assembly, heavy-water system, gas handling system, and helium liquefier cooling system, has been tested and all systems operate as predicted. The research program being considered for the PULSTAR UCN source includes the physics of UCN production, fundamental particle physics, and material surface studies of nanolayers containing hydrogen. In the present paper we report details of the engineering and cryogenic design of the facility as well as results of critical commissioning tests without neutrons.

  12. Transport methods: general. 1. The Analytical Monte Carlo Method for Radiation Transport Calculations

    International Nuclear Information System (INIS)

    Martin, William R.; Brown, Forrest B.

    2001-01-01

    We present an alternative Monte Carlo method for solving the coupled equations of radiation transport and material energy. This method is based on incorporating the analytical solution to the material energy equation directly into the Monte Carlo simulation for the radiation intensity. This method, which we call the Analytical Monte Carlo (AMC) method, differs from the well known Implicit Monte Carlo (IMC) method of Fleck and Cummings because there is no discretization of the material energy equation since it is solved as a by-product of the Monte Carlo simulation of the transport equation. Our method also differs from the method recently proposed by Ahrens and Larsen since they use Monte Carlo to solve both equations, while we are solving only the radiation transport equation with Monte Carlo, albeit with effective sources and cross sections to represent the emission sources. Our method bears some similarity to a method developed and implemented by Carter and Forest nearly three decades ago, but there are substantive differences. We have implemented our method in a simple zero-dimensional Monte Carlo code to test the feasibility of the method, and the preliminary results are very promising, justifying further extension to more realistic geometries. (authors)

  13. Comparison of in-plant performance test data with analytic prediction of reactor safety system injection transient (U)

    International Nuclear Information System (INIS)

    Roy, B.N.; Neill, C.H. Jr.

    1993-01-01

    This paper compares the performance test data from injection transients for both of the subsystems of the Supplementary Safety System of the Savannah River Site production reactor with analytical predictions from an in-house thermal hydraulic computer code. The code was initially developed for design validation of the new Supplementary Safety System subsystem, but is shown to be equally capable of predicting the performance of the Supplementary Safety System existing subsystem even though the two subsystem transient injections have marked differences. The code itself was discussed and its validation using prototypic tests with simulated fluids was reported in an earlier paper (Roy and Nomm 1991)

  14. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  15. Implicit/explicit memory versus analytic/nonanalytic processing: rethinking the mere exposure effect.

    Science.gov (United States)

    Whittlesea, B W; Price, J R

    2001-03-01

    In studies of the mere exposure effect, rapid presentation of items can increase liking without accurate recognition. The effect on liking has been explained as a misattribution of fluency caused by prior presentation. However, fluency is also a source of feelings of familiarity. It is, therefore, surprising that prior experience can enhance liking without also causing familiarity-based recognition. We suggest that when study opportunities are minimal and test items are perceptually similar, people adopt an analytic approach, attempting to recognize distinctive features. That strategy fails because rapid presentation prevents effective encoding of such features; it also prevents people from experiencing fluency and a consequent feeling of familiarity. We suggest that the liking-without-recognition effect results from using an effective (nonanalytic) strategy in judging pleasantness, but an ineffective (analytic) strategy in recognition. Explanations of the mere exposure effect based on a distinction between implicit and explicit memory are unnecessary.

  16. Clinical Neuropathology practice news 1-2014: Pyrosequencing meets clinical and analytical performance criteria for routine testing of MGMT promoter methylation status in glioblastoma

    Science.gov (United States)

    Preusser, Matthias; Berghoff, Anna S.; Manzl, Claudia; Filipits, Martin; Weinhäusel, Andreas; Pulverer, Walter; Dieckmann, Karin; Widhalm, Georg; Wöhrer, Adelheid; Knosp, Engelbert; Marosi, Christine; Hainfellner, Johannes A.

    2014-01-01

    Testing of the MGMT promoter methylation status in glioblastoma is relevant for clinical decision making and research applications. Two recent and independent phase III therapy trials confirmed a prognostic and predictive value of the MGMT promoter methylation status in elderly glioblastoma patients. Several methods for MGMT promoter methylation testing have been proposed, but seem to be of limited test reliability. Therefore, and also due to feasibility reasons, translation of MGMT methylation testing into routine use has been protracted so far. Pyrosequencing after prior DNA bisulfite modification has emerged as a reliable, accurate, fast and easy-to-use method for MGMT promoter methylation testing in tumor tissues (including formalin-fixed and paraffin-embedded samples). We performed an intra- and inter-laboratory ring trial which demonstrates a high analytical performance of this technique. Thus, pyrosequencing-based assessment of MGMT promoter methylation status in glioblastoma meets the criteria of high analytical test performance and can be recommended for clinical application, provided that strict quality control is performed. Our article summarizes clinical indications, practical instructions and open issues for MGMT promoter methylation testing in glioblastoma using pyrosequencing. PMID:24359605

  17. Analytical-numerical method for treatment of turbulent diffusion of particles in the air

    International Nuclear Information System (INIS)

    Arsov, L.J.

    1976-01-01

    This work deals with the problem of air pollution around a stationary point source. A mathematical model is suggested for describing air pollution from a point source, and an analytical-numerical algorithm is given for calculating effluent concentrations. Compared with a purely analytical treatment, this mathematical model is far more flexible and complete: eddy diffusivity is represented by an arbitrary function, and an arbitrary wind velocity profile has been proposed. Absorption at the ground is introduced through a variable absorption coefficient, and sedimentation through the mean deposition velocity. A parabolic diffusion equation is used to determine the movement of particles. The method has been tested through calculation of effluent concentrations for different values of the physical parameters
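    The paper's algorithm handles arbitrary eddy-diffusivity and wind profiles numerically; the constant-coefficient special case it generalizes has a well-known closed-form solution of the parabolic (advection-diffusion) equation that can serve as a reference point. A minimal sketch, with purely illustrative parameter values not taken from the paper:

```python
import math

def plume_concentration(Q, u, Ky, Kz, x, y, z):
    """Steady-state concentration downwind of a continuous point source
    at the origin, for constant wind speed u along x and constant eddy
    diffusivities Ky, Kz: solves u dC/dx = Ky d2C/dy2 + Kz d2C/dz2."""
    if x <= 0.0:
        return 0.0  # no upwind diffusion in this approximation
    prefactor = Q / (4.0 * math.pi * x * math.sqrt(Ky * Kz))
    return prefactor * math.exp(-u * (y * y / Ky + z * z / Kz) / (4.0 * x))

# Illustrative check: the crosswind integral of C at any downwind plane
# times the wind speed recovers the source strength Q (mass conservation).
Q, u, Ky, Kz, x = 1.0, 5.0, 10.0, 10.0, 100.0
step = 2.0
grid = [i * step for i in range(-100, 101)]   # +/- 200 m, i.e. +/- 10 sigma
flux = u * sum(plume_concentration(Q, u, Ky, Kz, x, y, z) * step * step
               for y in grid for z in grid)
```

The recirculation, variable-K, and ground-absorption effects treated in the paper require the numerical part of the author's scheme; this closed form only covers the idealized limit.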

  18. Analytic information processing style in epilepsy patients.

    Science.gov (United States)

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Learning processing is relevant to the study of epileptogenesis, given the pivotal role that neuroplasticity plays in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients, matched with 25 idiopathic generalized epilepsy (IGE) patients and 25 healthy volunteers, were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominantly auditory and visual analytic style, respectively (ANOVA: p values < 0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Heat and mass release for some transient fuel source fires: A test report

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1986-10-01

    Nine fire tests using five different trash fuel source packages were conducted by Sandia National Laboratories. This report presents the findings of these tests. Data reported includes heat and mass release rates, total heat and mass release, plume temperatures, and average fuel heat of combustion. These tests were conducted as a part of the US Nuclear Regulatory Commission sponsored fire safety research program. Data from these tests were intended for use in nuclear power plant probabilistic risk assessment fire analyses. The results were also used as input to a fire test program at Sandia investigating the vulnerability of electrical control cabinets to fire. The fuel packages tested were chosen to be representative of small to moderately sized transient trash fuel sources of the type that would be found in a nuclear power plant. The highest fire intensity encountered during these tests was 145 kW. Plume temperatures did not exceed 820 °C

  20. Development of cold source moderator structure

    International Nuclear Information System (INIS)

    Aso, Tomokaze; Ishikura, Syuichi; Terada, Atsuhiko; Teshigawara, Makoto; Watanabe, Noboru; Hino, Ryutaro

    1999-01-01

    The cold and thermal neutrons generated at the target (which works as a spallation neutron source under a 5 MW proton beam condition) are filtered with cold source moderators using supercritical hydrogen. Preliminary structural analysis was carried out to clarify technical problems with the thin-walled structure concept for the cold source moderator. Structural analytical results showed that a maximum stress of 112 MPa occurred on the moderator surface, which exceeded the allowable design stresses of ordinary aluminum alloys. Flow patterns measured in water flow experiments agreed well with hydraulic analytical results, which showed that an impinging jet flow from an inner pipe of the moderator caused a large-scale recirculation flow. Based on the analytical and experimental results, new moderator structures with minute frames, blowing flow holes, etc. were proposed to maintain strength and to suppress the recirculation flow. (author)

  1. Low frequency interference between short synchrotron radiation sources

    Directory of Open Access Journals (Sweden)

    F. Méot

    2001-06-01

    A recently developed analytical formalism describing low-frequency far-field synchrotron radiation (SR) is applied to the calculation of spectral angular radiation densities from interfering short sources (edge, short magnet). This is illustrated by analytical calculation of synchrotron radiation from various assemblies of short dipoles, including an “isolated” highest-density infrared SR source.

  2. Locating gamma radiation source by self collimating BGO detector system

    Energy Technology Data Exchange (ETDEWEB)

    Orion, I; Pernick, A; Ilzycer, D; Zafrir, H [Israel Atomic Energy Commission, Yavne (Israel). Soreq Nuclear Research Center; Shani, G [Ben-Gurion Univ. of the Negev, Beersheba (Israel)

    1996-12-01

    The need for an airborne collimated gamma detector system to estimate the radiation released from a nuclear accident has been established. A BGO detector system has been developed as an array of seven separate cylindrical bismuth germanate scintillators: one central detector symmetrically surrounded by six others. In such an arrangement, each detector reduces the exposure of the other detectors in the array to radiation incident from a given spatial angle around the array. This shielding property, defined as 'self-collimation', differentiates the point-source response function of each detector. The BGO detector system has a high density and atomic number, and therefore provides efficient self-collimation. Using the response functions of the separate detectors enables locating point sources, as well as the direction of a nuclear radioactive plume, with a satisfactory angular resolution of about 10 degrees. The detector's point-source response, as a function of the source direction in a horizontal plane, was predicted by analytical calculation and verified by Monte Carlo simulation using the code EGS4. The detector's response was tested in a laboratory-scale experiment for several gamma-ray energies, and the experimental results validated the theoretical (analytical and Monte Carlo) results. (authors).

  3. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  4. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for detecting trace analytes, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate, in combination with highly water-absorbent materials. Analyte concentration, as well as sample pretreatment including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system, which does not require any equipment.

  5. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice

    NARCIS (Netherlands)

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-01-01

    OBJECTIVE: Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection

  6. 42 CFR 493.1250 - Condition: Analytic systems.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Analytic systems. 493.1250 Section 493.1250 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... quality testing. The laboratory must monitor and evaluate the overall quality of the analytic systems and...

  7. Evaluation of the Non-Transient Hydrologic Source Term from the CAMBRIC Underground Nuclear Test in Frenchman Flat, Nevada Test Site

    International Nuclear Information System (INIS)

    Tompson, A B; Maxwell, R M; Carle, S F; Zavarin, M; Pawloski, G A.; Shumaker, D E

    2005-01-01

    Hydrologic Source Term (HST) calculations completed in 1998 at the CAMBRIC underground nuclear test site were LLNL's first attempt to simulate a hydrologic source term at the NTS by linking groundwater flow and transport modeling with geochemical modeling (Tompson et al., 1999). Significant effort was applied to develop a framework that modeled in detail the flow regime and captured all appropriate chemical processes that occurred over time. However, portions of the calculations were simplified because of data limitations and a perceived need for generalization of the results. For example: (1) Transient effects arising from 16 years of pumping at the site for a radionuclide migration study were not incorporated. (2) Radionuclide fluxes across the water table, as derived from infiltration from a ditch to which pumping effluent was discharged, were not addressed. (3) Hydrothermal effects arising from residual heat of the test were not considered. (4) Background data on the ambient groundwater flow direction were uncertain and not represented. (5) Unclassified information on the Radiologic Source Term (RST) inventory, as tabulated recently by Bowen et al. (2001), was unavailable; instead, only a limited set of derived data were available (see Tompson et al., 1999). (6) Only a small number of radionuclides and geochemical reactions were incorporated in the work. (7) Data and interpretation of the RNM-2S multiple well aquifer test (MWAT) were not available. As a result, the current Transient CAMBRIC Hydrologic Source Term project was initiated as part of a broader Phase 2 Frenchman Flat CAU flow and transport modeling effort. The source term will be calculated under two scenarios: (1) A more specific representation of the transient flow and radionuclide release behavior at the site, reflecting the influence of the background hydraulic gradient, residual test heat, pumping experiment, and ditch recharge, and taking into account improved data sources and modeling

  8. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  9. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. 

  10. Dataset for Testing Contamination Source Identification Methods for Water Distribution Networks

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset includes the results of a simulation study using the source inversion techniques available in the Water Security Toolkit. The data was created to test...

  11. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  12. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    Science.gov (United States)

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  13. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which become inefficient at low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that in the four relevant configurations (a pedestrian source carrier and a vehicle source carrier, each under high and low count-rate radioactive backgrounds), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm rate, while guaranteeing the stability of its optimization parameter for signal-to-noise ratios varying between 2 and 0.8. (authors)
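    The abstract does not spell out the test statistic, so the following is only a generic sketch of the underlying idea: a tail test on the exact Poisson counting distribution versus a Gaussian test built from empirically estimated mean and variance. Thresholds, rates, and function names are illustrative assumptions, not the authors' algorithm.

```python
import math

def poisson_sf(n, mu):
    """Survival function P(N >= n) for N ~ Poisson(mu)."""
    cdf = sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))
    return 1.0 - cdf

def poisson_test(count, background_mean, alpha=1e-3):
    """Flag a source when the observed count is improbably high
    under the Poisson background-only hypothesis."""
    return poisson_sf(count, background_mean) < alpha

def empirical_test(count, background_mean, background_var, n_sigma=3.0):
    """Gaussian counterpart using empirically estimated moments."""
    return (count - background_mean) / math.sqrt(background_var) > n_sigma
```

For Poisson counting data the mean and variance coincide, so the Poisson test needs one fewer estimated parameter than its empirical counterpart, which is one plausible reason for the stability reported in the abstract.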

  14. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs

  15. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  16. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    Science.gov (United States)

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.

  17. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J. William

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP
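    For reference, the normalized factorial moments F_q and the low-order factorial cumulants discussed above follow standard combinatoric definitions; a minimal sketch is below. The actual OPAL analysis involves detector corrections not shown here, and the function names are illustrative.

```python
import math

def factorial_moments(multiplicities, qmax=3):
    """Normalized factorial moments F_q = <n(n-1)...(n-q+1)> / <n>^q
    from a list of per-event charged-particle multiplicities (F_1 == 1)."""
    n_events = len(multiplicities)
    mean = sum(multiplicities) / n_events
    F = {}
    for q in range(1, qmax + 1):
        fq = sum(math.prod(n - i for i in range(q)) for n in multiplicities) / n_events
        F[q] = fq / mean ** q
    return F

def cumulant_moments(F):
    """Low-order normalized factorial cumulants from the factorial moments
    (standard relations: K2 = F2 - 1, K3 = F3 - 3*F2 + 2)."""
    return {2: F[2] - 1.0,
            3: F[3] - 3.0 * F[2] + 2.0}
```

A Poisson multiplicity distribution gives F_q = 1 and vanishing cumulants, so deviations of K_q from zero measure genuine multi-particle correlations, which is what the QCD calculations cited above predict.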

  18. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e{sup +}e{sup -} hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skewness, and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  19. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J.W. [California Univ., Riverside, CA (United States). Dept. of Physics

    1999-03-01

    Gluon jets are identified in e{sup +}e{sup -} hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skewness, and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.) 6 refs.

  20. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J.W.

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skewness, and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.)

  1. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Science.gov (United States)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skewness, and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  2. Site study plan for geochemical analytical requirements and methodologies: Revision 1

    International Nuclear Information System (INIS)

    1987-12-01

    This site study plan documents the analytical methodologies and procedures that will be used to analyze geochemically the rock and fluid samples collected during Site Characterization. Information relating to the quality aspects of these analyses is also provided, where available. Most of the proposed analytical procedures have been used previously on the program and are sufficiently sensitive to yield high-quality analyses. In a few cases improvements in analytical methodology (e.g., greater sensitivity, fewer interferences) are desired. Suggested improvements to these methodologies are discussed. In most cases these method-development activities have already been initiated. The primary source of rock and fluid samples for geochemical analysis during Site Characterization will be the drilling program, as described in various SRP Site Study Plans. The Salt Repository Project (SRP) Networks specify the schedule under which the program will operate. Drilling will not begin until after site ground water baseline conditions have been established. The Technical Field Services Contractor (TFSC) is responsible for conducting the field program of drilling and testing. Samples and data will be handled and reported in accordance with established SRP procedures. A quality assurance program will be utilized to assure that activities affecting quality are performed correctly and that the appropriate documentation is maintained. 28 refs., 9 figs., 14 tabs

  3. Design and tests of a package for the transport of radioactive sources; Projeto e testes de uma embalagem para o transporte de fontes radioativas

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Paulo de Oliveira, E-mail: pos@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-10-26

    The Type A package was designed for transportation of seven cobalt-60 sources with a total activity of 1 GBq. The shield thickness needed to meet the dose rate and transport index established by the radioactive material transport regulations was calculated with the code MCNP (Monte Carlo N-Particle Transport Code, Version 5). The sealed cobalt-60 sources were tested for leakage according to the standard ISO 9978:1992 (E). The package was tested according to the CNEN regulation on radioactive material transport. The leakage test results for the sources and the package tests demonstrate that the transport from the CDTN to the steelmaking industries can be performed safely
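A first-order version of the shielding calculation mentioned in this record can be done analytically with narrow-beam exponential attenuation, D = D0 * exp(-mu * t), solved for the thickness t. The dose rates and the lead attenuation coefficient below are assumed values for illustration only; a full MCNP calculation also accounts for buildup and geometry, which this sketch ignores.

```python
import math

# Narrow-beam attenuation estimate (no buildup factor).
mu_pb = 0.66     # cm^-1, assumed linear attenuation coefficient of Pb near 1.25 MeV
d0 = 50.0        # unshielded dose rate at the package surface, uSv/h (invented)
d_target = 5.0   # target surface dose rate, uSv/h (invented)

# Solve D = D0 * exp(-mu * t) for the required shield thickness t.
thickness = math.log(d0 / d_target) / mu_pb
print(round(thickness, 2))  # 3.49 cm of lead
```

Because buildup is neglected, a real design would add margin on top of this estimate.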

  4. OpenSR: An Open-Source Stimulus-Response Testing Framework

    Directory of Open Access Journals (Sweden)

    Carolyn C. Matheus

    2015-01-01

    Full Text Available Stimulus–response (S–R) tests provide a unique way to acquire information about human perception by capturing automatic responses to stimuli and attentional processes. This paper presents OpenSR, a user-centered S–R testing framework providing a graphical user interface that researchers can use to customize, administer, and manage one type of S–R test, the implicit association test. OpenSR provides an extensible, open-source, Web-based framework that is platform independent and can be implemented on most computers using any operating system. In addition, it provides capabilities for automatically generating and assigning participant identifications, assigning participants to different condition groups, tracking responses, and facilitating the collection and export of data. The Web technologies and languages used in creating the OpenSR framework are discussed, namely, HTML5, CSS3, JavaScript, jQuery, Twitter Bootstrap, Python, and Django. OpenSR is available for free download.

  5. Report on the engineering test of the LBL 30 second neutral beam source for the MFTF-B project

    International Nuclear Information System (INIS)

    Vella, M.C.; Pincosy, P.A.; Hauck, C.A.; Pyle, R.V.

    1984-08-01

    Positive-ion-based neutral beam development in the US has centered on the long-pulse Advanced Positive Ion Source (APIS). APIS eventually focused on development of 30 second sources for MFTF-B. The Engineering Test was part of competitive testing of the LBL and ORNL long-pulse sources carried out for the MFTF-B Project. The test consisted of 500 beam shots with 80 kV, 30 second deuterium beams, and was carried out on the Neutral Beam Engineering Test Facility (NBETF). This report summarizes the results of LBL testing, in which the LBL APIS demonstrated that it would meet the requirements for MFTF-B 30 second sources. In part as a result of this test, the LBL design was found to be suitable as the baseline for a Common Long Pulse Source design for MFTF-B, TFTR, and Doublet Upgrade

  6. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Science.gov (United States)

    Zhang, S. Y.; Shen, G. H.; Sun, Y.; Zhou, D. Z.; Zhang, X. X.; Li, J. W.; Huang, C.; Zhang, X. G.; Dong, Y. J.; Zhang, W. J.; Zhang, B. Q.; Shi, C. Y.

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often to be considered only qualitative. This paper introduces a method, which results in more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference 90Sr/90Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
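The inversion step described here amounts to solving a linear system: if R is the simulated response matrix (probability that a particle in true energy bin j is recorded in measured bin i), then measured = R @ true, and the true spectrum is recovered by inverting R. The 3-bin response matrix and spectrum below are invented for illustration; real IRFs from Geant4 are much larger and the inversion is usually regularized.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical 3-bin instrument response (columns: true bins, rows: measured bins).
R = [[0.8, 0.1, 0.0],
     [0.2, 0.7, 0.2],
     [0.0, 0.2, 0.8]]

true_spectrum = [100.0, 50.0, 20.0]

# Forward-fold the true spectrum through the response to get "measured" counts.
measured = [sum(R[i][j] * true_spectrum[j] for j in range(3)) for i in range(3)]

# Inverting the measured counts through the response recovers the spectrum,
# which can then be compared against a reference spectrum from the same source.
recovered = solve(R, measured)
print([round(x, 6) for x in recovered])
```

In the paper's method, `recovered` would be compared against the reference spectrum from the Si(Li) detector rather than against a known input.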

  7. The feasibility of 10 keV X-ray as radiation source in total dose response radiation test

    International Nuclear Information System (INIS)

    Li Ruoyu; Li Bin; Luo Hongwei; Shi Qian

    2005-01-01

    The standard radiation source used in traditional total-dose radiation testing is 60Co, which is environmentally hazardous. X-rays, as a new radiation source, have advantages such as safety, precise control of dose rate, strong intensity, and the possibility of wafer-level or even on-line testing, which greatly reduce packaging, testing, and transportation costs. This paper discusses the feasibility of X-rays replacing 60Co as the radiation source, based on the radiation mechanism and the effects of radiation on the gate oxide. (authors)

  8. Applications of photon-in, photon-out spectroscopy with third-generation, synchrotron-radiation sources

    International Nuclear Information System (INIS)

    Lindle, D.W.; Perera, R.C.C.

    1991-01-01

    This report discusses the following topics: Mother nature's finest test probe; soft x-ray emission spectroscopy with high-brightness synchrotron radiation sources; anisotropy and polarization of x-ray emission from atoms and molecules; valence-hole fluorescence from molecular photoions as a probe of shape-resonance ionization: progress and prospects; structural biophysics on third-generation synchrotron sources; ultra-soft x-ray fluorescence-yield XAFS: an in situ photon-in, photon-out spectroscopy; and x-ray microprobe: an analytical tool for imaging elemental composition and microstructure

  9. Nodewise analytical calculation of the transfer function

    International Nuclear Information System (INIS)

    Makai, Mihaly

    1994-01-01

    The space dependence of neutron noise has so far been mostly investigated in homogeneous core models. Application of core diagnostic methods to locate a malfunction requires however that the transfer function be calculated for real, inhomogeneous cores. A code suitable for such purpose must be able to handle complex arithmetic and delta-function source. Further requirements are analytical dependence in one spatial variable and fast execution. The present work describes the TIDE program written to fulfil the above requirements. The core is subdivided into homogeneous, square assemblies. An analytical solution is given, which is a generalisation of the inhomogeneous response matrix method. (author)

  10. service line analytics in the new era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.

  11. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics of value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  12. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    Fernando Marianno, Levente Klein, Siyuan Lu, Conrad Albrecht, Marcus Freitag, Nigel Hinds, Hendrik Hamann; IBM TJ Watson Research Center, Yorktown Heights, NY 10598. A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Information Repository and Services (PAIRS) was developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and regridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of the global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
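The pixel index described in this record, a combination of location and time stamp on a grid whose resolution doubles per level, can be sketched as a composite key. The encoding below is invented for illustration (the abstract does not specify PAIRS's actual key layout); the point is that observations of the same cell at different times share a spatial key prefix, so a range scan retrieves only the data of interest.

```python
def grid_key(lat, lon, t, level):
    """Illustrative composite key: quantize lat/lon at a grid whose cell size
    halves with each level (doubling resolution), then append the time stamp."""
    cell = 180.0 / (2 ** level)           # degrees per cell at this level
    row = int((lat + 90.0) // cell)
    col = int((lon + 180.0) // cell)
    return (level, row, col, t)

# Two observations of the same location at different times share the spatial
# prefix of the key, so a time-range scan touches only one cell's data.
k1 = grid_key(40.7, -74.0, 1000, level=10)
k2 = grid_key(40.7, -74.0, 2000, level=10)
print(k1[:3] == k2[:3])  # True
```

In an HBase-style store, serializing this tuple as the row key clusters a cell's time series contiguously on disk.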

  13. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

    There are two NPPs in operation in the Czech Republic. Both NPPs operating in the CR have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, also with WESE as the leading organization. Plant-specific SAMGs for the Temelin as well as Dukovany NPPs are based on the WOG generic SAMGs. The analytical support of plant-specific SAMG development is performed by NRI Rez within the validation process. Basic conditions, as well as how they are fulfilled by NRI Rez, are focused on the analysts, the analytical tools, and their applications. A more detailed description is devoted to the approach of preparing the MELCOR code application for the evaluation of hydrogen risk, validation of the recent set of hydrogen passive autocatalytic recombiners, and definition of proposals to amend the hydrogen removal system. This kind of parametric calculation requires a very wide set of runs. It is not possible with the whole-plant model, and decoupling such calculations by storing the mass and energy sources into the containment is the only way. An example of this decoupling for the LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission products blowdown through the cold leg break, fluid blowdown through the break in the reactor pressure vessel bottom head, fission products through the break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without taking fission products into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that. Some problematic features also appeared. The stand-alone test with fission products was possible only after changes in the source code

  14. Practical web analytics for user experience how analytics can help you understand your users

    CERN Document Server

    Beasley, Michael

    2013-01-01

    Practical Web Analytics for User Experience teaches you how to use web analytics to help answer the complicated questions facing UX professionals. Within this book, you'll find a quantitative approach for measuring a website's effectiveness and the methods for posing and answering specific questions about how users navigate a website. The book is organized according to the concerns UX practitioners face. Chapters are devoted to traffic, clickpath, and content use analysis, measuring the effectiveness of design changes, including A/B testing, building user profiles based on search habits

  15. Various quantum nonlocality tests with a commercial two-photon entanglement source

    International Nuclear Information System (INIS)

    Pomarico, Enrico; Bancal, Jean-Daniel; Sanguinetti, Bruno; Rochdi, Anas; Gisin, Nicolas

    2011-01-01

    Nonlocality is a fascinating and counterintuitive aspect of nature, revealed by the violation of a Bell inequality. The standard and easiest configuration in which Bell inequalities can be measured has been proposed by Clauser-Horne-Shimony-Holt (CHSH). However, alternative nonlocality tests can also be carried out. In particular, Bell inequalities requiring multiple measurement settings can provide deeper fundamental insights about quantum nonlocality, as well as offering advantages in the presence of noise and detection inefficiency. In this paper we show how these nonlocality tests can be performed using a commercially available source of entangled photon pairs. We report the violation of a series of these nonlocality tests (I3322, I4422, and chained inequalities). With the violation of the chained inequality with 4 settings per side we put an upper limit at 0.49 on the local content of the states prepared by the source (instead of 0.63 attainable with CHSH). We also quantify the amount of true randomness that has been created during our experiment (assuming fair sampling of the detected events).
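The local-content bound from a chained inequality follows from the EPR2 decomposition: with N settings per side, local statistics satisfy S <= 2N - 2 while the algebraic maximum is 2N, so S <= p_L(2N - 2) + (1 - p_L)(2N) gives p_L <= (2N - S)/2. The sketch below evaluates this bound at the quantum maximum S = 2N cos(pi/2N); the experimental bound of 0.49 quoted in the record corresponds to a measured S below that maximum.

```python
import math

def local_content_bound(S, n_settings):
    """EPR2-style upper bound on the local fraction from the chained
    inequality: S <= p_L*(2N-2) + (1-p_L)*2N  =>  p_L <= (2N - S)/2."""
    return (2 * n_settings - S) / 2

# With N settings per side, quantum mechanics allows S up to 2N*cos(pi/(2N)).
N = 4
S_quantum = 2 * N * math.cos(math.pi / (2 * N))
print(round(local_content_bound(S_quantum, N), 3))  # ideal-state bound for N = 4
```

Increasing N drives the ideal bound toward zero, which is why chained inequalities constrain local content more tightly than CHSH.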

  16. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 °C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically-driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before

  17. Civil Society In Tanzania: An Analytical Review Of Sources Of ...

    African Journals Online (AJOL)

    Sixty percent of civil societies deal with social development programmes. Additionally, results show that most civil societies had disproportionate staffing problems; sixty-six percent depended on international sources of funding, while 46% reported that they secured funds from both local and foreign sources of financing.

  18. Experimental analytical study on heat pipes

    International Nuclear Information System (INIS)

    Ismail, K.A.R.; Liu, C.Y.; Murcia, N.

    1981-01-01

    An analytical model is developed for optimizing the thickness distribution of the porous material in heat pipes. The method was used to calculate, design and construct heat pipes with internal geometrical changes. Ordinary pipes are also constructed and tested together with the modified ones. The results showed that modified tubes are superior in performance and that the analytical model can predict their performance to within 1.5% precision. (Author) [pt

  19. (U) An Analytic Study of Piezoelectric Ejecta Mass Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tregillis, Ian Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-16

    We consider the piezoelectric measurement of the areal mass of an ejecta cloud, for the specific case where ejecta are created by a single shock at the free surface and fly ballistically through vacuum to the sensor. To do so, we define time- and velocity-dependent ejecta “areal mass functions” at the source and sensor in terms of typically unknown distribution functions for the ejecta particles. Next, we derive an equation governing the relationship between the areal mass function at the source (which resides in the rest frame of the free surface) and at the sensor (which resides in the laboratory frame). We also derive expressions for the analytic (“true”) accumulated ejecta mass at the sensor and the measured (“inferred”) value obtained via the standard method for analyzing piezoelectric voltage traces. This approach enables us to derive an exact expression for the error imposed upon a piezoelectric ejecta mass measurement (in a perfect system) by the assumption of instantaneous creation. We verify that when the ejecta are created instantaneously (i.e., when the time dependence is a delta function), the piezoelectric inference method exactly reproduces the correct result. When creation is not instantaneous, the standard piezo analysis will always overestimate the true mass. However, the error is generally quite small (less than several percent) for most reasonable velocity and time dependences. In some cases, errors exceeding 10-15% may require velocity distributions or ejecta production timescales inconsistent with experimental observations. These results are demonstrated rigorously with numerous analytic test problems.

  20. Proactive Supply Chain Performance Management with Predictive Analytics

    Directory of Open Access Journals (Sweden)

    Nenad Stefanovic

    2014-01-01

    Full Text Available Today’s business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators. Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.
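The KPI projection idea in this record can be illustrated with the simplest possible predictive model: an ordinary least-squares trend fit over a KPI's history. The delivery-performance numbers below are invented, and a real deployment would use the paper's data-mining models over the BI semantic layer rather than a one-variable regression.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

# Invented history: on-time-delivery KPI (%) over six reporting periods.
periods = [1, 2, 3, 4, 5, 6]
kpi = [91.0, 92.0, 92.5, 93.5, 94.0, 95.0]

a, b = fit_line(periods, kpi)
forecast = a + b * 7  # projected KPI for the next period
print(round(forecast, 2))  # 95.7
```

Comparing such projections against targets is what turns a performance dashboard from reactive reporting into proactive management.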

  1. Proactive supply chain performance management with predictive analytics.

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.

  2. Proactive Supply Chain Performance Management with Predictive Analytics

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605

  3. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    Science.gov (United States)

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by a lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was measured from their diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring the Hb values with a single analytical method. VOI was 8.22, 7.1 and 9.44 mm3 for probes that collected a target of 10 mg of feces, and 3.08 mm3 for one probe that targeted 2 mg of feces. The ratio between recovered and target amounts ranged from 56% to 121% across devices. Different changes in the measured Hb values were observed when adding increasing amounts of feces to the commercial buffers. The amounts of collected material are related to the design of the probes. Three out of 4 manufacturers declare the same target amount while using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step toward fecal test harmonization and fulfillment of the ISO 15189 requirements.

  4. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Geographic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL]; Piburn, Jesse O [ORNL]; Sorokine, Alexandre [ORNL]; Myers, Aaron T [ORNL]; White, Devin A [ORNL]

    2015-01-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes at a global level. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that support serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  5. Source Country Differences in Test Score Gaps: Evidence from Denmark

    Science.gov (United States)

    Rangvid, Beatrice Schindler

    2010-01-01

    We combine data from three studies for Denmark in the PISA 2000 framework to investigate differences in the native-immigrant test score gap by country of origin. In addition to the controls available from PISA data sources, we use student-level data on home background and individual migration histories linked from administrative registers. We find…

  6. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, S.Y. [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shen, G.H., E-mail: shgh@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Sun, Y., E-mail: sunying@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhou, D.Z., E-mail: dazhuang.zhou@gmail.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, X.X., E-mail: xxzhang@cma.gov.cn [National Center for Space Weather, Beijing (China); Li, J.W., E-mail: lijw@cma.gov.cn [National Center for Space Weather, Beijing (China); Huang, C., E-mail: huangc@cma.gov.cn [National Center for Space Weather, Beijing (China); Zhang, X.G., E-mail: zhangxg@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Dong, Y.J., E-mail: dyj@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, W.J., E-mail: zhangreatest@163.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, B.Q., E-mail: zhangbinquan@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shi, C.Y., E-mail: scy@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China)

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources, which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which yields more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference {sup 90}Sr/{sup 90}Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.

  7. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    International Nuclear Information System (INIS)

    Zhang, S.Y.; Shen, G.H.; Sun, Y.; Zhou, D.Z.; Zhang, X.X.; Li, J.W.; Huang, C.; Zhang, X.G.; Dong, Y.J.; Zhang, W.J.; Zhang, B.Q.; Shi, C.Y.

    2016-01-01

    Space-based electron detectors are commonly tested using radioactive β-sources, which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which yields more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference ⁹⁰Sr/⁹⁰Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
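
The inversion step described in the two records above can be sketched in miniature: if the instrument response is discretized into a matrix R that maps incident energy channels to measured channels (with each incident channel depositing its full energy or spilling into lower channels), the incident spectrum follows from the measured one by back-substitution. The 3-channel matrix below is a made-up stand-in for a Geant4-simulated IRF, not the ODPD response.

```python
def unfold(measured, response):
    """Recover an incident spectrum s from a measured spectrum m = R s,
    where R is upper triangular: incident channel j deposits fraction
    response[i][j] into measured channel i <= j."""
    n = len(measured)
    s = [0.0] * n
    for j in range(n - 1, -1, -1):        # back-substitution, highest channel first
        acc = measured[j]
        for k in range(j + 1, n):
            acc -= response[j][k] * s[k]
        s[j] = acc / response[j][j]
    return s

# Toy response: some counts from high channels spill into lower ones.
response = [[1.0, 0.1, 0.1],
            [0.0, 0.8, 0.2],
            [0.0, 0.0, 0.7]]
true_spectrum = [10.0, 20.0, 30.0]        # reference spectrum (hypothetical)
measured = [sum(response[i][j] * true_spectrum[j] for j in range(3))
            for i in range(3)]
recovered = unfold(measured, response)
```

In the paper the comparison of the unfolded spectrum against the reference spectrum, rather than exact recovery, is what validates the geometric factor; here the forward model is noiseless so recovery is exact.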

  8. A genetic algorithm-based job scheduling model for big data analytics.

    Science.gov (United States)

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes considerable energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
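
A generic sketch of the idea, not the paper's model: a genetic algorithm assigns jobs to clusters so as to minimize the makespan (the busiest cluster's total runtime). The job durations stand in for the predictions of the paper's estimation module; `ga_schedule` and all parameter values are hypothetical.

```python
import random

def makespan(assign, durations, n_clusters):
    """Completion time of a schedule = summed runtime on the busiest cluster."""
    loads = [0.0] * n_clusters
    for job, cluster in enumerate(assign):
        loads[cluster] += durations[job]
    return max(loads)

def ga_schedule(durations, n_clusters, pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    n = len(durations)
    # Each individual maps job index -> cluster index.
    pop = [[rng.randrange(n_clusters) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: makespan(a, durations, n_clusters))
        survivors = pop[: pop_size // 2]          # selection: keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                # mutation: reassign one job
                child[rng.randrange(n)] = rng.randrange(n_clusters)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda a: makespan(a, durations, n_clusters))
    return best, makespan(best, durations, n_clusters)

durations = [5, 3, 8, 2, 7, 4, 6, 1]   # predicted job runtimes (hypothetical)
best, span = ga_schedule(durations, n_clusters=3)
```

The fitness function is where the estimation module plugs in: replacing the static durations with per-cluster performance predictions turns this toy into a schedule optimizer over heterogeneous clusters.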

  9. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  10. Pollutant source identification model for water pollution incidents in small straight rivers based on genetic algorithm

    Science.gov (United States)

    Zhang, Shou-ping; Xin, Xiao-kang

    2017-07-01

    Identification of pollutant sources in river pollution incidents is an important and difficult task in emergency rescue, and an intelligent optimization method can effectively compensate for the weaknesses of traditional methods. An intelligent model for pollutant source identification has been established using the basic genetic algorithm (BGA) as an optimization search tool and applying an analytic solution formula of the one-dimensional unsteady water quality equation to construct the objective function. Experimental tests show that the identification model is effective and efficient: the model can accurately determine the pollutant amounts or positions whether there is a single pollution source or multiple sources. In particular, when the population size of the BGA is set to 10, the computed results agree closely with the analytic results for single-source amount and position identification, with relative errors of no more than 5%. For cases with multi-point sources and multiple variables, there are some errors in the computed results because many possible combinations of pollution sources exist. However, with the help of previous experience to narrow the search scope, the relative errors of the identification results are less than 5%, which shows that the established source identification model can be used to direct emergency responses.
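
The approach can be sketched for the single-source case: use the analytic 1-D solution for an instantaneous point release as the forward model, and let a genetic algorithm search for the (mass, location) pair that best reproduces sensor readings. This is a simplified stand-in for the paper's BGA model; the flow/dispersion parameters, sensor layout, and GA settings below are all hypothetical.

```python
import math
import random

def conc(M, x0, x, t, u=1.0, D=0.5, A=1.0):
    """Analytic 1-D solution for an instantaneous release of mass M at x0
    in a channel of cross-section A, velocity u, dispersion coefficient D."""
    return (M / (A * math.sqrt(4 * math.pi * D * t))
            * math.exp(-(x - x0 - u * t) ** 2 / (4 * D * t)))

def misfit(params, observations):
    """Objective: squared error between modeled and observed concentrations."""
    M, x0 = params
    return sum((conc(M, x0, x, t) - c) ** 2 for x, t, c in observations)

def ga_identify(observations, generations=200, pop_size=60, seed=7):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 20.0), rng.uniform(0.0, 10.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: misfit(p, observations))
        elite = pop[: pop_size // 3]              # selection
        pop = elite[:]
        while len(pop) < pop_size:
            (M1, x1), (M2, x2) = rng.sample(elite, 2)
            M = 0.5 * (M1 + M2) + rng.gauss(0, 0.3)   # blend crossover + mutation
            x0 = 0.5 * (x1 + x2) + rng.gauss(0, 0.2)
            pop.append((max(M, 1e-6), x0))
    return min(pop, key=lambda p: misfit(p, observations))

# Synthetic incident: mass 5.0 released at x0 = 2.0; downstream sensor readings.
truth = (5.0, 2.0)
obs = [(x, t, conc(*truth, x, t)) for x in (4.0, 6.0, 8.0) for t in (1.0, 2.0, 4.0)]
M_est, x0_est = ga_identify(obs)
```

With noiseless synthetic data the GA should recover the release mass and position closely, mirroring the small relative errors the paper reports for the single-source case.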

  11. Application of radioactive sources in analytical instruments for planetary exploration

    International Nuclear Information System (INIS)

    Economou, T.E.

    2008-01-01

    Full text: In the past 50 years or so, many types of radioactive sources have been used in space exploration. 238 Pu is often used in space missions in Radioisotope Heater Units (RHUs) and Radioisotope Thermoelectric Generators (RTGs) for heat and power generation, respectively. In the 1960s, 242 Cm alpha sources were used for the first time in a space application, on three Surveyor spacecraft, to obtain the chemical composition of the lunar surface with an instrument based on Rutherford backscattering of alpha particles from nuclei in the analyzed sample. 242 Cm is an emitter of 6.1 MeV alpha particles. Its half-life, 163 days, is short enough to allow sources to be prepared with the necessary high intensity per unit area (up to 470 mCi, and a FWHM of about 1.5% in the lunar instruments), which results in a narrow energy distribution, yet long enough that the sources have adequate lifetimes for short-duration missions. 242 Cm is readily prepared in curie quantities by irradiation of 241 Am with neutrons in nuclear reactors, followed by chemical separation of the curium from the americium and fission products. For long-duration missions, for example missions to Mars, comets, and asteroids, the isotope 244 Cm (T 1/2 = 18.1 y, E α = 5.8 MeV) is a better source because of its much longer half-life. Both of these isotopes are also excellent x-ray excitation sources and have been used for that purpose on several planetary missions. For the light elements the excitation is caused mainly by the alpha particles, while for the heavier elements (> Ca) it is mainly due to the x-rays from the Pu L-lines (E x = 14-18 keV). 244 Cm has been used in several variations of the Alpha Proton X-ray Spectrometer (APXS): PHOBOS 1 and 2, Pathfinder, the Russian Mars-96 mission, the Mars Exploration Rovers (MER), and Rosetta. Other sources used in x-ray fluorescence instruments in space are 55 Fe and 109 Cd (Viking 1 and 2, Beagle 2), and 57 Co is used in Moessbauer

  12. Integrated Array/Metadata Analytics

    Science.gov (United States)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored or solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive or non-existent in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman, that already implements SQL/MDA.

  13. Argon analytical procedures for potassium-argon dating

    International Nuclear Information System (INIS)

    Gabites, J.E.; Adams, C.J.

    1981-01-01

    A manual for the argon analytical methods involved in potassium-argon geochronology, including: i) operating procedures for the ultra-high vacuum argon extraction/purification equipment for the analysis of nanolitre quantities of radiogenic argon in rocks, minerals and gases; ii) operating procedures for the AEI-MS10 gas source mass spectrometer.

  14. Family of Quantum Sources for Improving Near Field Accuracy in Transducer Modeling by the Distributed Point Source Method

    Directory of Open Access Journals (Sweden)

    Dominique Placko

    2016-10-01

    The distributed point source method, or DPSM, developed in the last decade has been used for solving various engineering problems, such as elastic and electromagnetic wave propagation, electrostatic, and fluid flow problems. Based on a semi-analytical formulation, the DPSM solution is generally built by superimposing point source solutions or Green's functions. However, the DPSM solution can also be obtained by superimposing elemental solutions of volume sources having a source density called the equivalent source density (ESD). In earlier works mostly point sources were used. In this paper the DPSM formulation is modified to introduce a new kind of ESD, replacing the classical single point source by a family of point sources referred to as quantum sources. The proposed formulation with these quantum sources does not change the dimension of the global matrix that must be inverted to solve the problem, compared with the classical point-source-based DPSM formulation. To assess the performance of this new formulation, the ultrasonic field generated by a circular planar transducer was compared with the classical DPSM formulation and the analytical solution. The results show a significant improvement in the near-field computation.
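
The core idea of superposing point-source Green's functions over a transducer face can be illustrated numerically and checked against a closed-form result. The sketch below sums monopole contributions exp(ikr)/(2πr) over a discretized circular piston and compares the on-axis value with the exact on-axis evaluation of the same integral. It deliberately omits the DPSM matrix inversion that enforces interface conditions; the wavenumber, radius, and grid density are arbitrary choices.

```python
import cmath
import math

def piston_field_sum(z, a=1.0, k=5.0, n=200):
    """Superpose point sources distributed over a circular piston of radius a
    to approximate the on-axis field integral at distance z (midpoint rule)."""
    h = 2 * a / n                       # grid spacing over the source plane
    total = 0.0 + 0.0j
    for i in range(n):
        for j in range(n):
            x = -a + (i + 0.5) * h      # cell-centre coordinates
            y = -a + (j + 0.5) * h
            if x * x + y * y <= a * a:  # keep only points on the disc
                r = math.sqrt(x * x + y * y + z * z)
                total += cmath.exp(1j * k * r) / (2 * math.pi * r) * h * h
    return total

def on_axis_exact(z, a=1.0, k=5.0):
    """Closed-form on-axis value of the same integral, for validation:
    (1/2pi) * int_S exp(ikr)/r dS = (exp(ikR) - exp(ikz)) / (ik), R = sqrt(z^2+a^2)."""
    R = math.sqrt(z * z + a * a)
    return (cmath.exp(1j * k * R) - cmath.exp(1j * k * z)) / (1j * k)

z = 2.0
approx = piston_field_sum(z)
exact = on_axis_exact(z)
```

The agreement between the discrete sum and the closed form is exactly the kind of consistency check the paper performs, at much higher fidelity, for the quantum-source ESD near the transducer face.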

  15. An Analysis of Earth Science Data Analytics Use Cases

    Science.gov (United States)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  16. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  17. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  18. Investigation of rf plasma light sources for dye laser excitation

    International Nuclear Information System (INIS)

    Kendall, J.S.; Jaminet, J.F.

    1975-06-01

    Analytical and experimental studies were performed to assess the applicability of radio frequency (rf) induction heated plasma light sources for potential excitation of continuous dye lasers. Experimental efforts were directed toward development of a continuous light source having spectral flux and emission characteristics approaching that required for pumping organic dye lasers. Analytical studies were performed to investigate (1) methods of pulsing the light source to obtain higher radiant intensity and (2) methods of integrating the source with a reflective cavity for pumping a dye cell. (TFD)

  19. Improved Thermal-Vacuum Compatible Flat Plate Radiometric Source For System-Level Testing Of Optical Sensors

    Science.gov (United States)

    Schwarz, Mark A.; Kent, Craig J.; Bousquet, Robert; Brown, Steven W.

    2016-01-01

    In this work, we describe an improved thermal-vacuum compatible flat plate radiometric source which has been developed and utilized for the characterization and calibration of remote optical sensors. This source is unique in that it can be used in situ, in both ambient and thermal-vacuum environments, allowing it to follow the sensor throughout its testing cycle. The performance of the original flat plate radiometric source was presented at SPIE in 2009 [1]. Following the original efforts, design upgrades were incorporated into the source to improve both radiometric throughput and uniformity. The pre-thermal-vacuum (pre-TVAC) testing results of a spacecraft-level optical sensor with the improved flat plate illumination source, in both ambient and vacuum environments, are presented. We also briefly discuss potential FPI configuration changes to further improve its radiometric performance.

  20. Experimental and analytical study of natural-convection heat transfer of internally heated liquids

    International Nuclear Information System (INIS)

    Green, G.A.

    1982-08-01

    Boundary heat transfer from a liquid pool with a uniform internal heat source to a vertical or inclined boundary was investigated. The experiments were performed in an open rectangular liquid pool in which the internal heat source was generated by electrical heating. The local heat flux was measured at a boron nitride test wall which could be continuously inclined from the vertical. Gold-plated microthermocouples of 0.01 inch outside diameter were developed to measure the local surface temperature, both front and back, of the boron nitride. The local heat flux, and thus the local heat transfer coefficient, was measured at nineteen locations along the vertical axis of the test plate. A theoretical analysis of the coupled nonlinear boundary layer equations was performed. The parametric effects of the Prandtl number and the dimensionless wall temperature on the boundary heat transfer were investigated. When the analytical model was used to calculate the boundary heat transfer data, agreement with the experimental data was achieved within 3% for the local heat transfer and within 2% for the average heat transfer.

  1. Manufacturing cost study on the ion sources for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    A study of the cost of manufacturing 48 ion sources for the Mirror Fusion Test Facility is described. The estimate is built up from individual part costs and assembly operation times for the 80 kV prototype source constructed by LLL and described by LLL drawings furnished during December 1978. Recommendations for cost reduction are made

  2. Climax granite test results

    Energy Technology Data Exchange (ETDEWEB)

    Ramspott, L.D.

    1980-01-15

    The Lawrence Livermore Laboratory (LLL), as part of the Nevada Nuclear Waste Storage Investigations (NNWSI) program, is carrying out in situ rock mechanics testing in the Climax granitic stock at the Nevada Test Site (NTS). This summary addresses only those field data taken to date that bear on thermomechanical modeling for a hard-rock repository. The results discussed include thermal measurements in a heater test conducted from October 1977 through July 1978, and stress and displacement measurements made during and after excavation of the canister storage drift for the Spent Fuel Test (SFT) in the Climax granite. Associated laboratory and field measurements are summarized. The rock temperature for a given applied heat load at a point in time and space can be adequately modeled with simple analytic calculations involving superposition and integration of numerous point source solutions. The input, for locations beyond about a meter from the source, can be a constant thermal conductivity and diffusivity. The value of thermal conductivity required to match the field data differs by as much as 25% from laboratory-measured values. Therefore, unless we come to understand the mechanisms for this difference, a simple in situ test will be required to obtain a value for final repository design. Sensitivity calculations have shown that the temperature field is about ten times more sensitive to conductivity than to diffusivity under the test conditions. The orthogonal array was designed to detect anisotropy. After considering all error sources, anisotropic effects in the thermal field were less than 5 to 10%.
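
The superposition-of-point-sources calculation mentioned above can be sketched with the classical continuous point-source solution in an infinite medium, dT = q / (4*pi*k*r) * erfc(r / (2*sqrt(alpha*t))), summed over the heater positions. The heater layout and the granite-like property values below are illustrative assumptions, not the Climax test parameters.

```python
import math

def point_source_dT(q, r, t, k=3.0, alpha=1.3e-6):
    """Temperature rise (K) at distance r (m) and time t (s) from a continuous
    point heat source of strength q (W) in an infinite medium of
    conductivity k (W/m-K) and diffusivity alpha (m^2/s)."""
    return q / (4 * math.pi * k * r) * math.erfc(r / (2 * math.sqrt(alpha * t)))

def superposed_dT(sources, x, y, z, t):
    """Superpose point sources; each source is a tuple (q, sx, sy, sz)."""
    total = 0.0
    for q, sx, sy, sz in sources:
        r = math.sqrt((x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2)
        total += point_source_dT(q, r, t)
    return total

# A line of heaters approximated by point sources (hypothetical layout:
# five 1 kW heaters 1 m apart; evaluate 2 m off the line after one year).
heaters = [(1000.0, 0.0, float(i), 0.0) for i in range(5)]
year = 365 * 24 * 3600.0
dT = superposed_dT(heaters, 2.0, 2.0, 0.0, year)
```

Because the conduction equation is linear, the field of an arbitrary heater array is just this sum; fitting k (and, weakly, alpha) to measured temperatures is how the in situ conductivity discussed above is extracted.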

  3. Analytic Reflected Lightcurves for Exoplanets

    Science.gov (United States)

    Haggard, Hal M.; Cowan, Nicolas B.

    2018-04-01

    The disk-integrated reflected brightness of an exoplanet changes as a function of time due to orbital and rotational motion coupled with an inhomogeneous albedo map. We have previously derived analytic reflected lightcurves for spherical harmonic albedo maps in the special case of a synchronously-rotating planet on an edge-on orbit (Cowan, Fuentes & Haggard 2013). In this letter, we present analytic reflected lightcurves for the general case of a planet on an inclined orbit, with arbitrary spin period and non-zero obliquity. We do so for two different albedo basis maps: bright points (δ-maps) and spherical harmonics (Y_l^m-maps). In particular, we use Wigner D-matrices to express a harmonic lightcurve for an arbitrary viewing geometry as a non-linear combination of harmonic lightcurves for the simpler edge-on, synchronously rotating geometry. These solutions will enable future exploration of the degeneracies and information content of reflected lightcurves, as well as fast calculation of lightcurves for mapping exoplanets based on time-resolved photometry. To these ends we make available Exoplanet Analytic Reflected Lightcurves (EARL), a simple open-source code that allows rapid computation of reflected lightcurves.

  4. Advanced photon source low-energy undulator test line

    International Nuclear Information System (INIS)

    Milton, S.V.

    1997-01-01

    The injector system of the Advanced Photon Source (APS) consists of a linac capable of producing 450-MeV positrons or > 650-MeV electrons, a positron accumulator ring (PAR), and a booster synchrotron designed to accelerate particles to 7 GeV. There are long periods of time when these machines are not required for filling the main storage ring and instead can be used for synchrotron radiation research. We describe here an extension of the linac beam transport called the Low-Energy Undulator Test Line (LEUTL). The LEUTL will have a twofold purpose. The first is to fully characterize innovative, future generation undulators, some of which may prove difficult or impossible to measure by traditional techniques. These might include small-gap and superconducting undulators, very long undulators, undulators with designed-in internal focusing, and helical undulators. This technique also holds the promise of extending the magnetic measurement sensitivity beyond that presently attainable. This line will provide the capability to directly test undulators before their possible insertion into operating storage rings. A second use for the test line will be to investigate the generation of coherent radiation at wavelengths down to a few tens of nanometers

  5. Optimization of a coaxial electron cyclotron resonance plasma thruster with an analytical model

    Energy Technology Data Exchange (ETDEWEB)

    Cannat, F., E-mail: felix.cannat@onera.fr, E-mail: felix.cannat@gmail.com; Lafleur, T. [Physics and Instrumentation Department, Onera -The French Aerospace Lab, Palaiseau, Cedex 91123 (France); Laboratoire de Physique des Plasmas, CNRS, Sorbonne Universites, UPMC Univ Paris 06, Univ Paris-Sud, Ecole Polytechnique, 91128 Palaiseau (France); Jarrige, J.; Elias, P.-Q.; Packan, D. [Physics and Instrumentation Department, Onera -The French Aerospace Lab, Palaiseau, Cedex 91123 (France); Chabert, P. [Laboratoire de Physique des Plasmas, CNRS, Sorbonne Universites, UPMC Univ Paris 06, Univ Paris-Sud, Ecole Polytechnique, 91128 Palaiseau (France)

    2015-05-15

    A new cathodeless plasma thruster currently under development at Onera is presented and characterized experimentally and analytically. The coaxial thruster consists of a microwave antenna immersed in a magnetic field, which allows electron heating via cyclotron resonance. The magnetic field diverges at the thruster exit and forms a nozzle that accelerates the quasi-neutral plasma to generate a thrust. Different thruster configurations are tested, and in particular, the influence of the source diameter on the thruster performance is investigated. At microwave powers of about 30 W and a xenon flow rate of 0.1 mg/s (1 SCCM), a mass utilization of 60% and a thrust of 1 mN are estimated based on angular electrostatic probe measurements performed downstream of the thruster in the exhaust plume. Results are found to be in fair agreement with a recent analytical helicon thruster model that has been adapted for the coaxial geometry used here.

  6. 10 CFR 34.67 - Records of leak testing of sealed sources and devices containing depleted uranium.

    Science.gov (United States)

    2010-01-01

    ... Title 10 (Energy), Vol. 1, revised 2010-01-01: Records of leak testing of sealed sources and devices containing depleted uranium. Section 34.67, Energy, NUCLEAR REGULATORY COMMISSION, LICENSES FOR INDUSTRIAL... Requirements, § 34.67 Records of leak testing of sealed sources and devices containing depleted uranium. Each...

  7. Supply risk management functions of sourcing intermediaries – an investigation of the clothing industry

    DEFF Research Database (Denmark)

    Vedel, Mette; Ellegaard, Chris

    2013-01-01

    Purpose: The purpose of this research is to uncover how buying companies use sourcing intermediaries to manage supply risks in global sourcing. Design/methodology/approach: We carry out an explorative qualitative study of the clothing industry, interviewing key respondents that occupy different...... intermediary types, characterised by the set of functions they handle. Research limitations/implications: By analysing a limited set of in-depth interviews in one industry we have traded off broader analytical generalization for in-depth exploration and theory building. Therefore, future research should test...... by identifying the supply risk management functions that sourcing intermediaries carry out for buying companies. We also contribute by uncovering different types of sourcing intermediaries, determined by the collection of functions handled....

  8. Analytical method for determining colour intensities based on Cherenkov radiation colour quenching

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Gomez, C.; Lopez-Gonzalez, J. de D.; Ferro-Garcia, M.A. [Univ. of Granada, Granada (Spain). Faculty of Sciences, Dept. of Inorganic Chemistry, Radiochemistry Section; Consejo Superior de Investigaciones Cientificas, Granada (Spain). Dept. of Chemical Research Coordinated Centre]

    1983-01-01

    A study was made of determining colour intensities using as a non-monochromatic luminous source the Cherenkov emission produced in the walls of a glass capillary, which acts as the luminous source itself inside the coloured solution to be evaluated. The reproducibility of this method has been compared with that of the spectrophotometric assay; the relative errors of both analytical methods have been calculated for different concentrations of Congo red solution in the range of minimal error, according to Ringbom's criterion. The sensitivity of this analytical method has been studied for the two β-emitters employed: ⁹⁰Sr/⁹⁰Y and ²⁰⁴Tl.

  9. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, tradeoffs must be made among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and the volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, the authors present data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  10. Thermal modeling of multi-shape heating sources on n-layer electronic board

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2017-01-01

    Full Text Available The present work completes the toolbox of analytical solutions for resolving the steady-state temperatures of a multi-layered structure heated by one or more heat sources. The problem of heating sources with non-rectangular shapes is addressed to enlarge the capability of analytical approaches. Moreover, the various heating sources can be located on the external surfaces of the sandwiched layers as well as embedded at the interfaces of its constitutive layers. To demonstrate its relevance, the updated analytical solution has been compared with numerical simulations of a multi-layered electronic board subjected to a set of heating-source configurations. The comparison shows close agreement between analytical and numerical calculations in predicting the centroid and average temperatures. The promoted analytical approach establishes a kit of practical expressions, easy to implement, which can be combined using the superposition principle to help electronic designers detect, early in the design cycle, component or board temperatures beyond manufacturer limits. The ability to eliminate poor design candidates with minimal set-up, relevant assumptions, and low computation time can thus be easily achieved.
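    The superposition idea at the heart of such solutions can be sketched in a few lines. For illustration only, a far simpler kernel than the paper's multi-layer solution is assumed here: the steady-state temperature rise at distance r from a point source of power q on a semi-infinite body, ΔT = q/(2πkr). Because the steady heat equation is linear, the contributions of several sources simply add:

```python
import math

def delta_t_point(q_w, k_w_mk, r_m):
    """Steady-state temperature rise at distance r (m) from a point source of
    power q (W) on a semi-infinite body of conductivity k: dT = q/(2*pi*k*r)."""
    return q_w / (2.0 * math.pi * k_w_mk * r_m)

def delta_t_superposed(sources, k_w_mk, x, y):
    """Superpose the contributions of several (q, x_i, y_i) surface sources at
    probe point (x, y); linearity of the heat equation justifies the sum."""
    total = 0.0
    for q, xs, ys in sources:
        r = math.hypot(x - xs, y - ys)
        total += delta_t_point(q, k_w_mk, r)
    return total

# Two hypothetical 0.5 W components on an FR-4-like board (k ~ 0.3 W/m.K)
sources = [(0.5, 0.00, 0.0), (0.5, 0.02, 0.0)]
print(delta_t_superposed(sources, 0.3, 0.01, 0.01))
```

    The same accumulation pattern carries over to the paper's multi-layer kernel: only `delta_t_point` changes, the superposition loop does not.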

  11. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    Science.gov (United States)

    2016-06-01

    The Defense Advanced Research Projects Agency (DARPA), Air Force Research Laboratory (AFRL), Charles River Analytics Inc., and TopCoder, Inc. will be holding a contest to reward... CROWD SOURCED FORMAL VERIFICATION - AUGMENTATION (CSFV-A), Charles River Analytics, Inc., June 2016, final technical report, approved for public release.

  12. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  13. Dataset on statistical analysis of editorial board composition of Hindawi journals indexed in Emerging sources citation index

    Directory of Open Access Journals (Sweden)

    Hilary I. Okagbue

    2018-04-01

    Full Text Available This data article contains the statistical analysis of the total, percentage and distribution of editorial board composition of 111 Hindawi journals indexed in Emerging Sources Citation Index (ESCI across the continents. The reliability of the data was shown using correlation, goodness-of-fit test, analysis of variance and statistical variability tests. Keywords: Hindawi, Bibliometrics, Data analysis, ESCI, Random, Smart campus, Web of science, Ranking analytics, Statistics
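    The goodness-of-fit test mentioned above can be reproduced with a few lines of standard-library Python. The continent counts below are hypothetical placeholders, not values from the Hindawi dataset:

```python
def chi_square_gof(observed, expected):
    """Pearson chi-square statistic for a goodness-of-fit test:
    sum of (O - E)^2 / E over all categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical editor counts per continent for one journal (illustrative only)
observed = [40, 30, 20, 10]                    # e.g. Asia, Europe, Americas, Africa
expected = [sum(observed) / len(observed)] * 4  # uniform composition hypothesis
stat = chi_square_gof(observed, expected)
print(stat)  # 20.0 on 3 degrees of freedom; chi-square critical value is 7.81 at alpha = 0.05
```

    A statistic this far above the critical value would reject the hypothesis that editors are spread uniformly across continents.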

  14. Using analytic element models to delineate drinking water source protection areas.

    Science.gov (United States)

    Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve

    2006-01-01

    Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
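    The two-tailed paired t-test used to compare delineation methods pairs each water system's protection area under one method with its area under the other. A minimal standard-library sketch, with hypothetical areas in km² (not Ohio EPA data):

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t-test statistic and degrees of freedom for two matched samples
    (e.g. protection areas delineated by two methods for the same systems)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # stdev is the sample (n-1) form
    return t, n - 1

# Hypothetical protection areas (km^2) for five wellfields under two methods
aem = [12.4, 8.1, 20.3, 5.5, 15.2]
volumetric = [9.0, 7.2, 14.8, 6.1, 10.9]
t, df = paired_t(aem, volumetric)
print(f"t = {t:.2f}, df = {df}")
```

    The resulting t is compared against the two-tailed critical value for df degrees of freedom; in practice scipy.stats.ttest_rel gives the p-value directly.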

  15. General-Purpose Heat Source Safety Verification Test program: Edge-on flyer plate tests

    International Nuclear Information System (INIS)

    George, T.G.

    1987-03-01

    The radioisotope thermoelectric generator (RTG) that will supply power for the Galileo and Ulysses space missions contains 18 General-Purpose Heat Source (GPHS) modules. The GPHS modules provide power by transmitting the heat of ²³⁸Pu α-decay to an array of thermoelectric elements. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because the possibility of a launch vehicle explosion always exists, and because such an explosion could generate a field of high-energy fragments, the fueled clads within each GPHS module must survive fragment impact. The edge-on flyer plate tests were included in the Safety Verification Test series to provide information on the module/clad response to the impact of high-energy plate fragments. The test results indicate that the edge-on impact of a 3.2-mm-thick, aluminum-alloy (2219-T87) plate traveling at 915 m/s causes the complete release of fuel from capsules contained within a bare GPHS module, and that the threshold velocity sufficient to cause the breach of a bare, simulant-fueled clad impacted by a 3.5-mm-thick, aluminum-alloy (5052-T0) plate is approximately 140 m/s.

  16. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  17. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Safety Test Program Summary SNAP 19 Pioneer Heat Source Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1971-07-01

    Sixteen heat source assemblies have been tested in support of the SNAP 19 Pioneer Safety Test Program. Seven were subjected to simulated reentry heating in various plasma arc facilities followed by impact on earth or granite. Six assemblies were tested under abort accident conditions of overpressure, shrapnel impact, and solid and liquid propellant fires. Three capsules were hot impacted under Transit capsule impact conditions to verify comparability of test results between the two similar capsule designs, thus utilizing both Pioneer and Transit Safety Test results to support the Safety Analysis Report for Pioneer. The tests have shown that the fuel is contained under all nominal accident environments, with the exception of minor capsule cracks under severe impact and solid-fire environments. No catastrophic capsule failures occurred in these tests that would release large quantities of fuel. In no test was fuel visible to the eye following impact or fire. Breached capsules were defined as those exhibiting thoria contamination on their surfaces following a test, or visible cracks in the post-test metallographic analyses.

  19. Case Study : Visual Analytics in Software Product Assessments

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Lanza, M; Storey, M; Muller, H

    2009-01-01

    We present how a combination of static source code analysis, repository analysis, and visualization techniques has been used to effectively gain and communicate insight into the development and project management problems of a large industrial code base. This study is an example of how visual analytics

  20. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    Science.gov (United States)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  1. Chemical/Biological Agent Resistance Test (CBART) Test Fixture System Verification and Analytical Monitoring System Development

    Science.gov (United States)

    2011-03-15

    progress was made towards the proportional integral derivative (PID) tuning. The CBART NRT analytical system was developed, moved, replumbed, and...efficacy, or applicability of the contents hereof. The use of trade names in this report does not constitute endorsement of any commercial product ...Office MFC mass flow controller MS mass spectrometer MSD mass selective detector NRT near real-time PID proportional integral derivative

  2. Detailed design of the RF source for the 1 MV neutral beam test facility

    International Nuclear Information System (INIS)

    Marcuzzi, D.; Palma, M. Dalla; Pavei, M.; Heinemann, B.; Kraus, W.; Riedl, R.

    2009-01-01

    In the framework of the EU activities for the development of the Neutral Beam Injector for ITER, the detailed design of the radio frequency (RF) driven negative ion source to be installed in the 1 MV ITER Neutral Beam Test Facility (NBTF) has been carried out. Results from ongoing R&D on IPP test beds [A. Staebler et al., Development of a RF-Driven Ion Source for the ITER NBI System, this conference] and the design of the new ELISE facility [B. Heinemann et al., Design of the Half-Size ITER Neutral Beam Source Test Facility ELISE, this conference] brought several modifications with respect to the previous design. An assessment was carried out regarding the back-streaming positive ions (BSI+) that impinge on the back plates of the ion source and cause high, localized heat loads. This led to the redesign of the most heated components to increase cooling, and to different choices of plasma-facing materials to reduce the effects of sputtering. The design of the electric circuit, gas supply, and other auxiliary systems has been optimized. Integration with the other components of the beam source has been revised with regard to the interfaces with the supporting structure, the plasma grid, and the flexible connections. In the paper the design is presented in detail, as well as the results of the analyses performed for the thermo-mechanical verification of the components.

  3. Long-term storage life of light source modules by temperature cycling accelerated life test

    International Nuclear Information System (INIS)

    Sun Ningning; Tan Manqing; Li Ping; Jiao Jian; Guo Xiaofeng; Guo Wentao

    2014-01-01

    Light source modules are the most crucial and fragile devices affecting the life and reliability of the interferometric fiber optic gyroscope (IFOG). While the light-emitting chips are stable in most cases, the module packaging has proved less satisfactory. In long-term storage or the working environment, the ambient temperature changes constantly, so the packaging and coupling performance of light source modules is likely to degrade slowly because the materials at the bonding interfaces have different coefficients of thermal expansion. A constant-temperature accelerated life test cannot evaluate the impact of temperature variation on the performance of a module package, so a temperature-cycling accelerated life test was studied. The main failure mechanism affecting light source modules is package failure due to solder fatigue, including fiber coupling shift, loss of cooling efficiency, and thermal resistor degradation, so the Norris-Landzberg model was used to model solder fatigue life and determine the activation energy related to the solder fatigue failure mechanism. By analyzing the test data, the activation energy was determined and the mean life of light source modules in different storage environments with continuously changing temperature was simulated, providing direct reference data for the storage life prediction of the IFOG. (semiconductor devices)
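    The Norris-Landzberg model relates cycles-to-failure under a test profile to cycles under a use profile through an acceleration factor built from the temperature swing, cycling frequency, and peak temperature. A minimal sketch; the exponents n and m and the activation energy Ea below are typical literature values for SnPb solder, assumed for illustration rather than taken from this paper's fit:

```python
import math

K_EV = 8.617e-5  # Boltzmann constant, eV/K

def norris_landzberg_af(dT_test, dT_use, f_test, f_use,
                        Tmax_test_k, Tmax_use_k,
                        n=1.9, m=1.0 / 3.0, Ea=0.12):
    """Norris-Landzberg acceleration factor for solder thermal fatigue:
    AF = (dT_test/dT_use)^n * (f_use/f_test)^m
         * exp((Ea/k) * (1/Tmax_use - 1/Tmax_test)).
    Constants n, m, Ea are assumed typical SnPb values, not this paper's fit."""
    return ((dT_test / dT_use) ** n
            * (f_use / f_test) ** m
            * math.exp((Ea / K_EV) * (1.0 / Tmax_use_k - 1.0 / Tmax_test_k)))

# Chamber cycling -40..+85 C twice a day vs. a benign 15..35 C daily storage swing
af = norris_landzberg_af(dT_test=125, dT_use=20, f_test=2, f_use=1,
                         Tmax_test_k=358.15, Tmax_use_k=308.15)
print(f"AF = {af:.0f}")
```

    Mean storage life is then estimated as the measured mean cycles-to-failure in the chamber multiplied by AF and divided by the number of use-profile cycles per unit time.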

  4. Comparison of Video Head Impulse Test (vHIT) Gains Between Two Commercially Available Devices and by Different Gain Analytical Methods.

    Science.gov (United States)

    Lee, Sang Hun; Yoo, Myung Hoon; Park, Jun Woo; Kang, Byung Chul; Yang, Chan Joo; Kang, Woo Suk; Ahn, Joong Ho; Chung, Jong Woo; Park, Hong Ju

    2018-06-01

    To evaluate whether video head impulse test (vHIT) gains are dependent on the measuring device and method of analysis. Prospective study. vHIT was performed in 25 healthy subjects using two devices simultaneously. vHIT gains were compared between these instruments and using five different methods of comparing position and velocity gains during head movement intervals. The two devices produced different vHIT gain results with the same method of analysis. There were also significant differences in the vHIT gains measured using different analytical methods. The gain analytic method that compares the areas under the velocity curve (AUC) of the head and eye movements during head movements showed lower vHIT gains than a method that compared the peak velocities of the head and eye movements. The former method produced the vHIT gain with the smallest standard deviation among the five procedures tested in this study. vHIT gains differ in normal subjects depending on the device and method of analysis used, suggesting that it is advisable for each device to have its own normal values. Gain calculations that compare the AUC of the head and eye movements during the head movements show the smallest variance.
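    Two of the gain definitions the study contrasts (the ratio of areas under the velocity curves and the ratio of peak velocities) can be sketched as below. The half-sine head-velocity profile and the 80% eye tracking are synthetic stand-ins, not patient recordings:

```python
import numpy as np

def trapezoid(y, x):
    """Area under y(x) by the trapezoidal rule (kept local to avoid
    NumPy trapz/trapezoid naming differences across versions)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def vhit_gains(t, head_vel, eye_vel):
    """AUC gain (ratio of areas under the velocity curves over the head
    movement interval) and peak gain (ratio of peak velocities). Eye
    velocity is assumed already sign-corrected relative to the head."""
    auc_gain = trapezoid(eye_vel, t) / trapezoid(head_vel, t)
    peak_gain = float(eye_vel.max() / head_vel.max())
    return auc_gain, peak_gain

# Synthetic 150 ms impulse: half-sine head velocity, eye at 80% of head
t = np.linspace(0.0, 0.15, 151)
head = 250.0 * np.sin(np.pi * t / 0.15)  # deg/s
eye = 0.8 * head                         # perfectly scaled tracking
auc_gain, peak_gain = vhit_gains(t, head, eye)
print(round(auc_gain, 3), round(peak_gain, 3))
```

    For a perfectly scaled response the two definitions agree; they diverge when the eye trace is delayed or distorted relative to the head, which is why the study finds different variances between methods.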

  5. Distributed data networks: a blueprint for Big Data sharing and healthcare analytics.

    Science.gov (United States)

    Popovic, Jennifer R

    2017-01-01

    This paper defines the attributes of distributed data networks and outlines the data and analytic infrastructure needed to build and maintain a successful network. We use examples from one successful implementation of a large-scale, multisite, healthcare-related distributed data network, the U.S. Food and Drug Administration-sponsored Sentinel Initiative. Analytic infrastructure-development concepts are discussed from the perspective of promoting six pillars of analytic infrastructure: consistency, reusability, flexibility, scalability, transparency, and reproducibility. This paper also introduces one use case for machine learning algorithm development to fully utilize and advance the portfolio of population health analytics, particularly those using multisite administrative data sources. © 2016 New York Academy of Sciences.

  6. Deuterium results at the negative ion source test facility ELISE

    Science.gov (United States)

    Kraus, W.; Wünderlich, D.; Fantz, U.; Heinemann, B.; Bonomo, F.; Riedl, R.

    2018-05-01

    The ITER neutral beam system will be equipped with large radio frequency (RF) driven negative ion sources, with a cross section of 0.9 m × 1.9 m, which have to deliver extracted D- ion beams of 57 A at 1 MeV for 1 h. At the Extraction from a Large Ion Source Experiment (ELISE) test facility, a source of half this size has been operational since 2013. The goal of this experiment is to demonstrate high operational reliability and to achieve the extracted current densities and beam properties required for ITER. Technical improvements of the source design and the RF system were necessary to provide reliable operation in steady state at an RF power of up to 300 kW. While in short pulses the required D- current density has almost been reached, the performance in long pulses is limited, particularly in deuterium, by inhomogeneous and unstable currents of co-extracted electrons. By applying refined caesium evaporation and distribution procedures and by reducing and symmetrizing the electron currents, considerable progress has been made, and up to 190 A/m2 of D-, corresponding to 66% of the value required for ITER, has been extracted for 45 min.

  7. Plasma lenses for SLAC Final Focus Test facility

    International Nuclear Information System (INIS)

    Betz, D.; Cline, D.; Joshi, C.; Rajagopalan, S.; Rosenzweig, J.; Su, J.J.; Williams, R.; Chen, P.; Gundersen, M.; Katsouleas, T.; Norem, J.

    1991-01-01

    A collaborative group of accelerator and plasma physicists and engineers has formed with an interest in exploring the use of plasma lenses to meet the needs of future colliders. Analytic and computational models of plasma lenses are briefly reviewed, and several design examples for the SLAC Final Focus Test Beam are presented. The examples include discrete, thick, and adiabatic lenses. A potential plasma source with desirable lens characteristics is presented.

  8. Performance Test of the Microwave Ion Source with the Multi-layer DC Break

    International Nuclear Information System (INIS)

    Kim, Dae Il; Kwon, Hyeok Jung; Kim, Han Sung; Seol, Kyung Tae; Cho, Yong Sub

    2012-01-01

    A microwave proton source has been developed as a proton injector for the 100-MeV proton linac of the PEFP (Proton Engineering Frontier Project). In a microwave ion source, the high voltage for beam extraction is applied to the plasma chamber and also to the microwave components, such as the 2.45 GHz magnetron, the 3-stub tuner, and the waveguides. If the microwave components can instead be installed on the ground side, the ion source can be operated and maintained easily. For this purpose, a multi-layer DC break has been developed. The multi-layer insulation has the arrangement of conductors and insulators shown in Fig. 1. To verify stable operation of the multi-layer DC break, we checked the radiation of the insulator depending on the material used and performed a high-voltage test of a fabricated multi-layer insulation. In this report, the details of the performance test of the multi-layer DC break will be presented

  9. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  10. Data analytics in the ATLAS Distributed Computing

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2015-01-01

    The ATLAS data analytics effort is focused on creating systems that provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system, various monitoring services, etc.); providing a platform to execute arbitrary data mining and machine learning algorithms over aggregated data; satisfying a variety of use cases for different user roles; and hosting new third-party analytics services on a scalable compute platform. We describe the implemented system, in which: data sources are existing RDBMS (Oracle) and Flume collectors; a Hadoop cluster is used to store the data; native Hadoop and Apache Pig scripts are used for data aggregation; and R is used for in-depth analytics. Part of the data is indexed in ElasticSearch so both simpler investigations and complex dashboards can be made ...

  11. International Congress on Analytical Chemistry. Abstracts. V. 2

    International Nuclear Information System (INIS)

    1997-01-01

    The collection of materials from the international congress on analytical chemistry that took place in Moscow in June 1997 is presented. The main directions of investigation are covered in such areas of analytical chemistry as quantitative and qualitative chemical analysis, sample preparation, express test methods for environmental and biological materials, clinical analysis, and the analysis of food and agricultural products.

  12. International Congress on Analytical Chemistry. Abstracts. V. 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    The collection of materials from the international congress on analytical chemistry that took place in Moscow in June 1997 is presented. The main directions of investigation are covered in such areas of analytical chemistry as quantitative and qualitative chemical analysis, sample preparation, express test methods for environmental and biological materials, clinical analysis, and the analysis of food and agricultural products.

  13. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    Science.gov (United States)

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

    Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-analytical, analytical, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. We describe the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests was performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/589,510). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.

  14. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Heusen, M.; Shalchi, A., E-mail: husseinm@myumanitoba.ca, E-mail: andreasm4@yahoo.com [Department of Physics and Astronomy, University of Manitoba, Winnipeg, MB R3T 2N2 (Canada)

    2017-04-20

    In the literature, one can find various analytical theories for perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path asymptotically approaches the quasi-linear limit, as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling, likewise as predicted by UNLT theory. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but the agreement is inferior to that of UNLT theory. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  15. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    International Nuclear Information System (INIS)

    Heusen, M.; Shalchi, A.

    2017-01-01

    In the literature, one can find various analytical theories for perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path asymptotically approaches the quasi-linear limit, as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling, likewise as predicted by UNLT theory. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but the agreement is inferior to that of UNLT theory. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  16. Lifetime test on a high-performance dc microwave proton source

    International Nuclear Information System (INIS)

    Sherman, J.D.; Hodgkins, D.J.; Lara, P.D.; Schneider, J.D.; Stevens, R.R. Jr.

    1995-01-01

    Powerful CW proton linear accelerators (100 mA at 0.5--1 GeV) are being proposed for spallation neutron source applications. These production accelerators require high availability and reliability. A microwave proton source, which has already demonstrated several key beam requirements, was operated for one week (170 hours) in dc mode to test the reliability and lifetime of its plasma generator. The source was operated with 570 W of microwave (2.45 GHz) discharge power and a 47-kV extraction voltage. This choice of operating parameters gave a proton current density of 250 mA/cm² at 83% proton fraction, which is sufficient for a conservative dc injector design. The beam current was 60--65 mA over most of the week, and was sufficiently focused for RFQ injection. Total beam availability, defined as 47-keV beam-on time divided by elapsed time, was 96.2%. Sparking in the high-voltage column and a gas flow control problem caused all the downtime; no plasma generator failures were observed.
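The availability figure quoted above follows directly from its definition (beam-on time divided by elapsed time). A trivial sketch; the helper function is illustrative and the beam-on time is back-calculated from the reported 96.2% over 170 h:

```python
def availability(beam_on_hours: float, elapsed_hours: float) -> float:
    """Beam availability: fraction of elapsed time during which the beam was on."""
    return beam_on_hours / elapsed_hours

elapsed = 170.0            # one-week dc lifetime test
beam_on = 0.962 * elapsed  # beam-on time implied by 96.2% availability
print(f"beam-on time: {beam_on:.1f} h -> availability {availability(beam_on, elapsed):.1%}")
```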

  17. Analytical modeling of post-tensioned precast beam-to-column connections

    International Nuclear Information System (INIS)

    Kaya, Mustafa; Arslan, A. Samet

    2009-01-01

    In this study, post-tensioned precast beam-to-column connections are tested experimentally at different stress levels and modelled analytically using a 3D nonlinear finite element method. The ANSYS finite element software is used for this purpose. Nonlinear static analysis is used to determine the connection strength, behavior and stiffness when the connections are subjected to cyclic inelastic loads simulating ground excitation during an earthquake. The results obtained from the analytical studies are compared with the test results. In terms of stiffness, the initial stiffness of the analytical models was lower than that of the tested specimens. Overall, modelling these types of connection with 3D FEM can provide crucial advance information and overcome the disadvantages of time-consuming workmanship and the cost of experimental studies.

  18. Analytical magmatic source modelling from a joint inversion of ground deformation and focal mechanisms data

    Science.gov (United States)

    Cannavo', Flavio; Scandura, Danila; Palano, Mimmo; Musumeci, Carla

    2014-05-01

    Seismicity and ground deformation represent the principal geophysical methods for volcano monitoring and provide important constraints on subsurface magma movements. The occurrence of migrating seismic swarms, as observed at several volcanoes worldwide, is commonly associated with dike intrusions. In addition, on active volcanoes, (de)pressurization and/or intrusion of magmatic bodies stresses and deforms the surrounding crustal rocks, often causing earthquakes randomly distributed in time within a volume extending about 5-10 km from the walls of the magmatic bodies. Although advances in space-based geodetic and seismic networks have significantly improved volcano monitoring on an increasing number of volcanoes worldwide in recent decades, quantitative models relating deformation and seismicity are not common. The observation of several episodes of volcanic unrest throughout the world, where the movement of magma through the shallow crust was able to produce local rotation of the ambient stress field, provides an opportunity to improve the estimate of the parameters of a deformation source. In particular, during these episodes of volcanic unrest a radial pattern of the P-axes of the focal mechanism solutions, similar to that of the ground deformation, has been observed. Therefore, taking into account additional information from focal mechanism data, we propose a novel approach to volcanic source modeling based on the joint inversion of deformation and focal plane solutions, assuming that both observations are due to the same source. The methodology is first verified against a synthetic dataset of surface deformation and strain within the medium, and then applied to real data from an unrest episode that occurred before the May 13th 2008 eruption at Mt. Etna (Italy). The main results clearly indicate that the joint inversion improves the accuracy of the estimated source parameters by about 70%. The statistical tests indicate that the source depth is the parameter with the highest

  19. Conceptual and analytical modeling of fracture zone aquifers in hard rock. Implications of pumping tests in the Pohjukansalo well field, east-central Finland

    International Nuclear Information System (INIS)

    Leveinen, J.

    2001-01-01

    Fracture zones with an interconnected network of open fractures can conduct significant groundwater flow and, as in the case of the Pohjukansalo well field in Leppaevirta, can yield enough water for small-scale municipal supply. Glaciofluvial deposits comprising major aquifers commonly overlie fracture zones that can contribute to the water balance directly or indirectly by providing hydraulic interconnections between different formations. Fracture zones and fractures can also transport contaminants in a poorly predictable way. Consequently, hydrogeological research on fracture zones is important for the management and protection of soil aquifers in Finland. Hydraulic properties of aquifers are estimated in situ by well test analyses based on analytical models. Most analytical models rely on the concepts of radial flow and a horizontal slab aquifer. In Paper 1, pumping test responses of fracture zones in the Pohjukansalo well field were characterised using alternative analytical models developed for channelled flow cases. In Paper 2, the tests were analysed using the generalised radial flow (GRF) model and the concept of a fracture network possessing a fractional flow dimension due to limited connectivity compared with ideal 2- or 3-dimensional systems. The analysis provides estimates of hydraulic properties in terms of parameters that have no concrete meaning when the flow dimension of the aquifer takes fractional values. Concrete estimates of the hydraulic parameters were produced by making simplifying assumptions and by using the composite model developed in Paper 3. In addition to estimates of hydraulic parameters, analysis of hydraulic tests provides qualitative information that is useful when the hydraulic connections in the fracture system are not well known. However, attention should be paid to the frequency of drawdown measurements, particularly for the application of derivative curves.
    In groundwater studies, analytical models have also been used to estimate
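The radial-flow concept underlying most of the analytical well-test models mentioned above can be illustrated with the classical Theis solution, which is the flow-dimension-2 special case of the GRF family. The sketch below evaluates the Theis well function from its series expansion; the aquifer parameters are invented for illustration:

```python
import math

EULER_GAMMA = 0.5772156649015329

def theis_w(u: float, terms: int = 40) -> float:
    """Theis well function W(u) = -gamma - ln(u) + sum_{k>=1} (-1)^(k+1) u^k / (k * k!)."""
    total = -EULER_GAMMA - math.log(u)
    sign, factorial = 1.0, 1.0
    for k in range(1, terms + 1):
        factorial *= k
        total += sign * u**k / (k * factorial)
        sign = -sign
    return total

def theis_drawdown(Q: float, T: float, S: float, r: float, t: float) -> float:
    """Drawdown s = Q / (4 pi T) * W(u), with u = r^2 S / (4 T t)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * theis_w(u)

# Hypothetical confined aquifer: Q = 1e-3 m^3/s, T = 1e-4 m^2/s, S = 1e-4,
# observation well at r = 10 m, after one day of pumping.
print(f"s = {theis_drawdown(1e-3, 1e-4, 1e-4, 10.0, 86400.0):.2f} m")
```

The GRF model generalises this by replacing the exponential-integral type curve with an incomplete gamma function of fractional flow dimension; the series approach shown here only covers the integer-dimension radial case.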

  20. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    Energy Technology Data Exchange (ETDEWEB)

    Zaccaria, Pierluigi, E-mail: pierluigi.zaccaria@igi.cnr.it [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Masiello, Antonio [Fusion for Energy F4E, Barcelona (Spain); Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario [Ettore Zanon S.p.A., Schio (VI) (Italy)

    2015-10-15

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for the ITER injectors. • The large SPIDER Vacuum Vessel was built and is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full-size prototype of the negative ion source foreseen for MITICA (the full-size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both the SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (Italy), with financial support from the IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from the Japanese and Indian Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) was manufactured, assembled and tested during the years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made of AISI 304L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modification of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and the specific technological issues encountered and solved during production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and performance testing.

  1. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    International Nuclear Information System (INIS)

    Zaccaria, Pierluigi; Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco; Masiello, Antonio; Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario

    2015-01-01

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for the ITER injectors. • The large SPIDER Vacuum Vessel was built and is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full-size prototype of the negative ion source foreseen for MITICA (the full-size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both the SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (Italy), with financial support from the IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from the Japanese and Indian Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) was manufactured, assembled and tested during the years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made of AISI 304L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modification of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and the specific technological issues encountered and solved during production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and performance testing.

  2. Development of analytical methods for the separation of plutonium, americium, curium and neptunium from environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Salminen, S.

    2009-07-01

    In this work, separation methods have been developed for the analysis of the anthropogenic transuranium elements plutonium, americium, curium and neptunium in environmental samples contaminated by global nuclear weapons testing and the Chernobyl accident. The analytical methods used in this study are based on extraction chromatography. Highly variable atmospheric plutonium isotope concentrations and activity ratios were found both at Kurchatov (Kazakhstan), near the former Semipalatinsk test site, and at Sodankylae (Finland). The origin of the plutonium is almost impossible to identify at Kurchatov, since hundreds of nuclear tests were performed at the Semipalatinsk test site. In Sodankylae, plutonium in the surface air originated from nuclear weapons testing, conducted mostly by the USSR and the USA before the sampling year 1963. The variation in americium, curium and neptunium concentrations was also great in peat samples collected in southern and central Finland in 1986, immediately after the Chernobyl accident. The main source of transuranium contamination in the peats was global nuclear test fallout, although there are wide regional differences in the fraction of Chernobyl-derived activity (of the total activity) for americium, curium and neptunium. The separation methods developed in this study yielded good chemical recoveries for the elements investigated and adequately pure fractions for radiometric activity determination. The extraction chromatographic methods were faster than older methods based on ion exchange chromatography. In addition, extraction chromatography is a more environmentally friendly separation method than ion exchange, because less acidic waste solution is produced during the analytical procedures. (orig.)

  3. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  4. Analytical purpose electron backscattering system

    International Nuclear Information System (INIS)

    Desdin, L.; Padron, I.; Laria, J.

    1996-01-01

    In this work, an analytical-purpose electron backscattering system improved at the Center of Applied Studies for Nuclear Development is described. The system can be applied to fast, accurate and nondestructive testing of binary Al/Cu and Al/Ni alloys and to other applications.

  5. Data science and big data analytics discovering, analyzing, visualizing and presenting data

    CERN Document Server

    2014-01-01

    Data Science and Big Data Analytics is about harnessing the power of data for new insights. The book covers the breadth of activities, methods and tools that data scientists use. The content focuses on concepts, principles and practical applications that are applicable to any industry and technology environment, and the learning is supported and explained with examples that you can replicate using open-source software. This book will help you: become a contributor on a data science team; deploy a structured lifecycle approach to data analytics problems; apply appropriate analytic techniques and...

  6. Semi-analytic flux formulas for shielding calculations

    International Nuclear Information System (INIS)

    Wallace, O.J.

    1976-06-01

    A special coordinate system based on the work of H. Ono and A. Tsuro has been used to derive exact semi-analytic formulas for the flux from cylindrical, spherical, toroidal, rectangular, annular and truncated cone volume sources; from cylindrical, spherical, truncated cone, disk and rectangular surface sources; and from curved and tilted line sources. In most of the cases where the source is curved, shields of the same curvature are allowed in addition to the standard slab shields; cylindrical shields are also allowed in the rectangular volume source flux formula. An especially complete treatment of a cylindrical volume source is given, in which dose points may be arbitrarily located both within and outside the source, and a finite cylindrical shield may be considered. Detector points may also be specified as lying within spherical and annular source volumes. The integral functions encountered in these formulas require at most two-dimensional numeric integration in order to evaluate the flux values. The classic flux formulas involving only slab shields and slab, disk, line, sphere and truncated cone sources become some of the many special cases which are given in addition to the more general formulas mentioned above
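The simplest member of this family of flux formulas, an isotropic point source behind a slab shield, already shows the structure of the semi-analytic expressions (a geometric 1/4πr² factor times an exponential attenuation term). The sketch below computes only the uncollided flux, with an invented source strength, attenuation coefficient and geometry:

```python
import math

def point_source_flux(S: float, mu: float, t: float, r: float) -> float:
    """Uncollided flux from an isotropic point source of strength S (particles/s)
    behind a slab of thickness t (cm) with attenuation coefficient mu (1/cm),
    at distance r (cm): phi = S * exp(-mu * t) / (4 * pi * r^2)."""
    return S * math.exp(-mu * t) / (4.0 * math.pi * r * r)

# Hypothetical numbers: 1e6 particles/s, mu = 0.2 /cm, 5 cm slab, detector at 1 m.
phi = point_source_flux(S=1.0e6, mu=0.2, t=5.0, r=100.0)
print(f"phi = {phi:.3f} particles/cm^2/s")
```

The extended-source formulas in the record generalise this integrand over the source volume or surface, which is why they require up to two-dimensional numerical integration.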

  7. [Pre-analytical stability before centrifugation of 7 biochemical analytes in whole blood].

    Science.gov (United States)

    Perrier-Cornet, Andreas; Moineau, Marie-Pierre; Narbonne, Valérie; Plee-Gautier, Emmanuelle; Le Saos, Fabienne; Carre, Jean-Luc

    2015-01-01

    The pre-analytical stability of 7 biochemical parameters (parathyroid hormone -PTH-, vitamins A, C, E and D, 1,25-dihydroxyvitamin D and insulin) at +4°C was studied on whole blood samples before centrifugation. The impact of freezing at -20°C was also assessed for PTH and vitamin D. The differences between assay results for whole blood samples from 9 healthy adults, kept for different times between sampling and analysis, were compared using a Student t test. The 7 analytes investigated remained stable for up to 4 hours at +4°C in whole blood. This study showed that it is possible to accept uncentrifuged whole blood specimens kept at +4°C before analysis. PTH is affected by freezing whereas vitamin D is not.
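A paired Student t test of the kind used above compares assay results on the same specimens at two time points. A minimal sketch; the function and the PTH-like values below are made up for illustration:

```python
import math

def paired_t(x: list[float], y: list[float]) -> float:
    """Paired Student t statistic for two sets of measurements on the same samples."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((di - mean_d) ** 2 for di in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical results (pmol/L) at t = 0 and after 4 h at +4 degC.
t0 = [4.2, 5.1, 3.8, 6.0, 4.9]
t4 = [4.1, 5.2, 3.7, 5.9, 4.8]
print(f"t = {paired_t(t0, t4):.3f}")  # compare against the critical t for n-1 df
```

A |t| below the critical value for n-1 degrees of freedom supports the stability conclusion drawn in the abstract.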

  8. 40 CFR 425.03 - Sulfide analytical methods and applicability.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, vol. 29 (2010-07-01): Sulfide analytical methods and applicability. Section 425.03, Environmental Protection Agency (continued), Effluent Guidelines and Standards, Leather Tanning and Finishing Point Source Category, General Provisions...

  9. Validation of an analytical method based on the high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals in soil.

    Science.gov (United States)

    Frentiu, Tiberiu; Ponta, Michaela; Hategan, Raluca

    2013-03-01

    The aim of this paper was the validation of a new analytical method, based on high-resolution continuum source flame atomic absorption spectrometry, for the fast-sequential determination of several hazardous/priority hazardous metals (Ag, Cd, Co, Cr, Cu, Ni, Pb and Zn) in soil after microwave-assisted digestion in aqua regia. Determinations were performed on the ContrAA 300 (Analytik Jena) air-acetylene flame spectrometer equipped with a xenon short-arc lamp as a continuum radiation source for all elements, a double monochromator consisting of a prism pre-monochromator and an echelle grating monochromator, and a charge-coupled device as detector. For validation, a method-performance study was conducted involving the establishment of the analytical performance of the new method (limits of detection and quantification, precision and accuracy). Moreover, the Bland and Altman statistical method was used to analyze the agreement between the proposed assay and inductively coupled plasma optical emission spectrometry as the standardized method for multielemental determination in soil. The limits of detection in soil samples (3σ criterion) in the high-resolution continuum source flame atomic absorption spectrometry method were (mg/kg): 0.18 (Ag), 0.14 (Cd), 0.36 (Co), 0.25 (Cr), 0.09 (Cu), 1.0 (Ni), 1.4 (Pb) and 0.18 (Zn), close to those in inductively coupled plasma optical emission spectrometry: 0.12 (Ag), 0.05 (Cd), 0.15 (Co), 1.4 (Cr), 0.15 (Cu), 2.5 (Ni), 2.5 (Pb) and 0.04 (Zn). Accuracy was checked by analyzing 4 certified reference materials, and good agreement at the 95% confidence level was found for both methods, with recoveries in the range of 94-106% in atomic absorption and 97-103% in optical emission. Repeatability, found by analyzing real soil samples, was in the range 1.6-5.2% in atomic absorption, similar to the 1.9-6.1% in optical emission spectrometry.
    The Bland and Altman method showed no statistically significant difference between the two spectrometric
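The Bland and Altman comparison referred to above reduces to the bias (mean difference between the two methods on the same samples) and its 95% limits of agreement. A sketch with hypothetical paired concentrations (mg/kg); the variable names and numbers are invented, not the paper's data:

```python
import math

def bland_altman(method_a: list[float], method_b: list[float]):
    """Return the bias and the 95% limits of agreement (bias +/- 1.96 sd of the
    differences) for two methods measured on the same samples."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical Cu concentrations by the two spectrometric methods.
faas = [10.0, 12.0, 14.0, 16.0]
icp  = [10.5, 11.8, 14.2, 15.9]
bias, (lo, hi) = bland_altman(faas, icp)
print(f"bias = {bias:.2f}, 95% limits of agreement = ({lo:.2f}, {hi:.2f})")
```

Agreement is concluded when the bias is near zero and the limits of agreement are narrow relative to the analytically relevant concentration range.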

  10. From the Kirsch-Kress potential method via the range test to the singular sources method

    International Nuclear Information System (INIS)

    Potthast, R; Schulz, J

    2005-01-01

    We review three reconstruction methods for inverse obstacle scattering problems. We analyse the relation between the Kirsch-Kress potential method (1986), the range test of Kusiak, Potthast and Sylvester (2003) and the singular sources method of Potthast (2000). In particular, we show that the range test is a logical extension of the Kirsch-Kress method into the category of sampling methods employing the tool of domain sampling. We then show how a multi-wave version of the range test can be set up and work out its relation to the singular sources method. Numerical examples and demonstrations are provided.

  11. Testing a 1-D Analytical Salt Intrusion Model and the Predictive Equation in Malaysian Estuaries

    Science.gov (United States)

    Gisen, Jacqueline Isabella; Savenije, Hubert H. G.

    2013-04-01

    Little is known about salt intrusion behaviour in Malaysian estuaries. Studies of this topic often require large amounts of data, especially if 2-D or 3-D numerical models are used for the analysis. In poor data environments, 1-D analytical models are more appropriate. For this reason, a fully analytical 1-D salt intrusion model, based on the theory of Savenije (2005), was tested in three Malaysian estuaries (Bernam, Selangor and Muar), because it is simple and requires minimal data. Site surveys were conducted in these estuaries during the dry season (June-August) at spring tide using the moving-boat technique. Data on cross-sections, water levels and salinity were collected and then analysed with the salt intrusion model. This paper demonstrates a good fit between the simulated and observed salinity distributions for all three estuaries. Additionally, the calibrated Van der Burgh coefficient K, dispersion coefficient D0 and salt intrusion length L for the estuaries displayed reasonable correlation with those calculated from the predictive equations. This indicates that not only the salt intrusion model but also the predictive model is valid for the case studies in Malaysia. Furthermore, the results from this study describe the current state of the estuaries, with which the water authorities in Malaysia can make decisions on limiting water abstraction or dredging. Keywords: salt intrusion, Malaysian estuaries, discharge, predictive model, dispersion
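The class of 1-D analytical model referred to above ties the longitudinal salinity profile to a dispersion function via the Van der Burgh coefficient K. The sketch below follows the commonly cited form of these relations (dispersion decaying as D/D0 = 1 - beta(e^(x/a) - 1), salinity scaling as (D/D0)^(1/K), and intrusion length L = a ln(1/beta + 1)); the functional form should be checked against the cited theory, and every parameter value is invented:

```python
import math

def salinity(x: float, S0: float, K: float, beta: float, a: float,
             Sf: float = 0.0) -> float:
    """Tidal-average salinity at distance x (m) from the mouth, assuming
    D/D0 = 1 - beta*(exp(x/a) - 1) and S = Sf + (S0 - Sf)*(D/D0)**(1/K)."""
    d_ratio = 1.0 - beta * (math.exp(x / a) - 1.0)
    if d_ratio <= 0.0:
        return Sf  # beyond the intrusion limit only fresh water remains
    return Sf + (S0 - Sf) * d_ratio ** (1.0 / K)

def intrusion_length(beta: float, a: float) -> float:
    """Distance L at which the dispersion vanishes: L = a * ln(1/beta + 1)."""
    return a * math.log(1.0 / beta + 1.0)

# Invented calibration: S0 = 30 (mouth salinity), K = 0.4, beta = 0.15, a = 10 km.
S0, K, beta, a = 30.0, 0.4, 0.15, 10_000.0
L = intrusion_length(beta, a)
print(f"L = {L / 1000:.1f} km, S(L/2) = {salinity(L / 2, S0, K, beta, a):.2f}")
```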

  12. Field and analytical data relating to the 1972 and 1978 surveys of residual contamination of the Monte Bello Islands and Emu atomic weapons test sites

    International Nuclear Information System (INIS)

    Cooper, M.B.; Duggleby, J.C.

    1980-12-01

    Radiation surveys of the Monte Bello Islands test site in Western Australia and the Emu test site in South Australia were carried out in 1972 and 1978. The results have been published in ARL reports ARL/TR-010 and ARL/TR-012. The detailed field and analytical data that formed the basis of those publications are given here.

  13. AGN outflows as neutrino sources: an observational test

    Science.gov (United States)

    Padovani, P.; Turcati, A.; Resconi, E.

    2018-04-01

    We test the recently proposed idea that outflows associated with Active Galactic Nuclei (AGN) could be neutrino emitters in two complementary ways. First, we cross-correlate a list of 94 "bona fide" AGN outflows with the most complete and updated repository of IceCube neutrinos currently publicly available, assembled by us for this purpose. It turns out that AGN with outflows matched to an IceCube neutrino have outflow and kinetic energy rates, and bolometric powers larger than those of AGN with outflows not matched to neutrinos. Second, we carry out a statistical analysis on a catalogue of [O III] λ5007 line profiles using a sample of 23,264 AGN at z values (˜6 and 18 per cent respectively, pre-trial) for relatively high velocities and luminosities. Our results are consistent with a scenario where AGN outflows are neutrino emitters but at present do not provide a significant signal. This can be tested with better statistics and source stacking. A predominant role of AGN outflows in explaining the IceCube data appears in any case to be ruled out.

  14. Dependence of the source performance on plasma parameters at the BATMAN test facility

    Science.gov (United States)

    Wimmer, C.; Fantz, U.

    2015-04-01

    The investigation of the dependence of the source performance (high jH-, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H-, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H- density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa).

  15. Dependence of the source performance on plasma parameters at the BATMAN test facility

    International Nuclear Information System (INIS)

    Wimmer, C.; Fantz, U.

    2015-01-01

    The investigation of the dependence of the source performance (high jH−, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H−, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H− density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, strongly influencing the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa).

  16. Procedure for carrying out leakage tests on 90Sr/90Y sealed beta radiation sources

    International Nuclear Information System (INIS)

    Alvarez R, J. T.

    2010-09-01

    In the alpha-beta room of the Secondary Laboratory of Dosimetric Calibration of the Metrology Department of Ionizing Radiations, ophthalmic applicators are calibrated in terms of absorbed dose to water, D_w; these applicators are essentially sealed sources of pure beta radiation of 90Sr/90Y. The laboratory quality system prescribes the established procedure for the calibration of these sources, which requires a leakage test to be carried out before the source is calibrated. However, the Laboratory has received leakage test certificates, issued by companies specialized in radiological protection services, in which gamma spectrometry equipment was used for beta radiation leakage tests; detecting pure beta radiation with a NaI scintillation detector is not reliable, because the detector may instead register the bremsstrahlung produced within it. The Laboratory has therefore had to verify the results of the tests with a correct technique, in order to detect sources with compromised integrity and radioactive material leakage. The objective of this work is to describe a technique for beta activity measurement - from the standard ISO 7503, part 1 (1988) - and its application with a planar GM detector (pancake type) to leakage tests on sources of pure beta radiation, within the quality assurance framework indicated by report ICRU 76. (Author)

  17. Investigation of a precise static leach test for the testing of simulated nuclear waste materials

    International Nuclear Information System (INIS)

    Kingston, H.M.; Cronin, D.J.; Epstein, M.S.

    1984-01-01

    The precision of the nuclear waste static leach test was evaluated using controlled experimental conditions and homogeneous glass materials. The majority of the leachate components were subjected to simultaneous multielement DCP analysis. The overall precision of the static leach test is determined by the summation of random effects caused by: variance in the experimental conditions of the leaching procedure; inhomogeneity of the material to be leached; and variance of the analytical techniques used to determine elemental concentrations in the leachate. In this study, strict control of key experimental parameters was employed to reduce the first source of variance. In addition, special attention to the preparation of glass samples to be tested assured a high degree of homogeneity. Described here are the details of the reduction of these two sources of variance to a point where the overall test precision is limited by that of the analysis step. Of the elements determined - B, Ba, Ca, Cs, Mo, Na, Si, Sr, and Zn - only Ca and Zn exhibited replicate imprecision significantly greater than that observed in the analysis of the leachate solutions. The imprecision in the Zn was partially attributed to the non-reproducible adsorption onto the leach vessel walls during the 28 day test period. None of the other elements exhibited this behavior
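The variance decomposition described in this record (procedure, material inhomogeneity, and analysis each contributing an independent random effect) can be sketched numerically. The relative standard deviations below are hypothetical, chosen only to show how one identifies an analysis-limited test:

```python
import math

# Illustrative variance components for one element (hypothetical values,
# expressed as relative standard deviations in percent).
rsd_procedure = 0.8   # leaching-procedure variability
rsd_material = 0.5    # glass inhomogeneity
rsd_analysis = 1.2    # DCP analysis step

# Independent random effects add in quadrature (variances sum).
rsd_total = math.sqrt(rsd_procedure**2 + rsd_material**2 + rsd_analysis**2)
print(f"overall RSD = {rsd_total:.2f} %")

# Fraction of the total variance contributed by the analysis step alone;
# when this dominates, the overall test precision is analysis-limited.
frac_analysis = rsd_analysis**2 / rsd_total**2
print(f"analysis share of variance = {frac_analysis:.0%}")
```

With these numbers the analysis step carries roughly three fifths of the total variance, which is the regime the study reports after controlling the first two sources.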

  18. Analytical Subthreshold Current and Subthreshold Swing Models for a Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFET with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2017-08-01

    Two-dimensional (2D) analytical models for the subthreshold current and subthreshold swing of the back-gated fully depleted recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET) are presented. The surface potential is determined by solving the 2D Poisson equation in both channel and buried-oxide (BOX) regions, considering suitable boundary conditions. To derive closed-form expressions for the subthreshold characteristics, the virtual cathode potential expression has been derived in terms of the minimum of the front and back surface potentials. The effect of various device parameters such as gate oxide and Si film thicknesses, thickness of source/drain penetration into BOX, applied back-gate bias voltage, etc. on the subthreshold current and subthreshold swing has been analyzed. The validity of the proposed models is established using the Silvaco ATLAS™ 2D device simulator.
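As background to the quantity being modeled, the textbook long-channel expression for subthreshold swing (not the paper's 2-D closed-form model) can be evaluated directly; the capacitance ratio used below is an assumed illustrative value:

```python
import math

# Textbook long-channel relation SS = ln(10) * (kT/q) * (1 + C_dep/C_ox),
# shown only to illustrate the figure of merit the paper models analytically.
k_B = 1.380649e-23   # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C
T = 300.0            # temperature, K

def subthreshold_swing_mV_per_dec(c_dep_over_c_ox):
    """Subthreshold swing in mV/decade for a given capacitive divider ratio."""
    return math.log(10) * (k_B * T / q) * (1 + c_dep_over_c_ox) * 1e3

ideal = subthreshold_swing_mV_per_dec(0.0)    # thermal limit at 300 K
typical = subthreshold_swing_mV_per_dec(0.3)  # degraded by depletion capacitance
```

The thermal limit at 300 K is about 60 mV/decade; any nonzero depletion capacitance degrades it, which is why device parameters such as film and oxide thickness matter for the swing.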

  19. pyJac: Analytical Jacobian generator for chemical kinetics

    Science.gov (United States)

    Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen

    2017-06-01

    Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. In order to integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, finite differences numerically approximate these, but for larger chemical kinetic models this poses significant computational demands since the number of chemical source term evaluations scales with the square of species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (numbers of species ranging from 13 to 360) by showing agreement within 0.001% of matrices obtained via automatic differentiation. We then demonstrate the performance achievable on CPUs and GPUs using pyJac.
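The cost argument in this abstract (finite differences need one extra source-term evaluation per species, while an analytical Jacobian is evaluated directly) can be illustrated with a toy two-species mechanism. This sketch is not pyJac's generated code, and the rate constants are invented:

```python
import numpy as np

# Toy reversible reaction A <=> B with hypothetical rate constants.
k1, k2 = 2.0, 0.5

def source_terms(y):
    """Chemical source terms dy/dt for the two species."""
    yA, yB = y
    return np.array([-k1 * yA + k2 * yB,
                      k1 * yA - k2 * yB])

def analytical_jacobian(y):
    """Exact Jacobian d(dy/dt)/dy, derived by hand for this mechanism."""
    return np.array([[-k1,  k2],
                     [ k1, -k2]])

def finite_difference_jacobian(y, eps=1e-6):
    """One-sided FD approximation: one extra source-term evaluation per
    species, which is the cost that scales poorly for large mechanisms."""
    n = len(y)
    f0 = source_terms(y)
    J = np.empty((n, n))
    for j in range(n):
        yp = y.copy()
        yp[j] += eps
        J[:, j] = (source_terms(yp) - f0) / eps
    return J

y = np.array([1.0, 0.3])
J_exact = analytical_jacobian(y)
J_fd = finite_difference_jacobian(y)
max_rel_err = np.max(np.abs(J_fd - J_exact) / np.abs(J_exact))
```

For this linear toy system the two Jacobians agree to rounding error; for real mechanisms the analytical form additionally avoids the FD truncation-error tuning problem.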

  20. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    Science.gov (United States)

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-Aventis published a methodology which has since governed the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced into our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved at two levels in analytical decisions within transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend toward an increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
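Equivalence-based comparative testing of the kind the authors describe is conventionally implemented as a two one-sided tests (TOST) procedure, which amounts to checking that the 90% confidence interval of the inter-laboratory mean difference lies entirely inside pre-defined acceptance limits. A minimal sketch, with invented assay data and an assumed ±2% acceptance limit:

```python
import math
import statistics as stats

# Hypothetical assay results (% label claim) from sending and receiving labs.
sending   = [99.8, 100.1, 99.6, 100.3, 99.9, 100.0]
receiving = [99.2, 99.7, 99.5, 99.9, 99.4, 99.6]
theta = 2.0  # pre-defined equivalence acceptance limit, in % (assumed here)

n1, n2 = len(sending), len(receiving)
m1, m2 = stats.mean(sending), stats.mean(receiving)
s1, s2 = stats.stdev(sending), stats.stdev(receiving)

# Pooled standard deviation and standard error of the mean difference.
sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
se = sp * math.sqrt(1 / n1 + 1 / n2)

# TOST at alpha = 0.05 is equivalent to the two-sided 90% CI of the
# difference lying entirely inside [-theta, +theta].
t_crit = 1.812  # t(0.95, df = 10), from tables
diff = m1 - m2
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se
equivalent = (-theta < ci_low) and (ci_high < theta)
```

Note that this controls the consumer's risk directly: the receiving laboratory passes only when the data demonstrate the difference is within the limits, rather than merely failing to show a difference.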

  1. Prototype tests on the ion source power supplies of the TEXTOR NI-system

    International Nuclear Information System (INIS)

    Goll, O.; Braunsberger, U.; Schwarz, U.

    1987-01-01

    The PINI ion source for the TEXTOR neutral injector is fed by a new modular transistorized power supply. All modules are located in a high voltage cage at 55 kV dc against ground. Normal operation of the injectors includes frequent grid breakdowns causing transient high voltage stresses on the ion source power supplies. These stresses must not disturb the safe operation of the power supplies. The paper describes the setup for extensive testing of a supply prototype module under the expected operating conditions. The main features of this test program are reviewed and the measures taken for safe operation are discussed. As a result of the investigations, recommendations are given for the installation of the power supplies at the TEXTOR NI system

  2. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World’s Largest Open Source Data Sets

    Directory of Open Access Journals (Sweden)

    J. Piburn

    2017-10-01

    Full Text Available Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  3. Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2018-01-01

    A novel 2-D analytical drain current model of Dual Metal Gate Tunnel Field Effect Transistors based on MOSFETs (DMG-TFET) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region, thereby converting it into a P+ source region. The electric field is derived and utilized to extract an expression for the drain current by analytically integrating the band-to-band tunneling generation rate in the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and the electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. The present model is validated against the SILVACO ATLAS device simulator, and the analytical results show good agreement with the simulations.

  4. 77 FR 56176 - Analytical Methods Used in Periodic Reporting

    Science.gov (United States)

    2012-09-12

    ... informal rulemaking proceeding to consider changes in analytical principles (Proposals Six and Seven) used... (Proposals Six and Seven), September 4, 2012 (Petition). Proposal Six: Use of Foreign Postal Settlement System as Sole Source for Reporting of Inbound International Revenue, Pieces, and Weights. The Postal...

  5. Chemical Explosion Experiments to Improve Nuclear Test Monitoring - Developing a New Paradigm for Nuclear Test Monitoring with the Source Physics Experiments (SPE)

    International Nuclear Information System (INIS)

    Snelson, Catherine M.; Abbott, Robert E.; Broome, Scott T.; Mellors, Robert J.; Patton, Howard J.; Sussman, Aviva J.; Townsend, Margaret J.; Walter, William R.

    2013-01-01

    A series of chemical explosions, called the Source Physics Experiments (SPE), is being conducted under the auspices of the U.S. Department of Energy's National Nuclear Security Administration (NNSA) to develop a new, more physics-based paradigm for nuclear test monitoring. Currently, monitoring relies on semi-empirical models to discriminate explosions from earthquakes and to estimate key parameters such as yield. While these models have been highly successful in monitoring established test sites, there is concern that future tests could occur in media and at scaled depths of burial outside our empirical experience. This is highlighted by the North Korean tests, which exhibit poor performance of an otherwise reliable discriminant, mb:Ms (Selby et al., 2012), possibly due to source emplacement and differences in the seismic responses of nascent and established test sites. The goal of SPE is to replace these semi-empirical relationships with numerical techniques grounded in a physical basis and thus applicable to any geologic setting or depth

  6. A very high yield electron impact ion source for analytical mass spectrometry

    International Nuclear Information System (INIS)

    Koontz, S.L.; Bonner Denton, M.

    1981-01-01

    A novel ion source designed for use in mass spectrometric determination of organic compounds is described. The source is designed around a low pressure, large volume, hot cathode Penning discharge. The source operates in the 10^-4 to 10^-7 torr pressure domain and is capable of producing focusable current densities several orders of magnitude greater than those produced by conventional Nier-type sources. Mass spectra of n-butane and octafluoro-2-butene are presented. An improved signal-to-noise ratio is demonstrated with a General Electric Monopole 300 mass spectrometer. (orig.)

  7. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six-point framework that can be applied to a variety of library settings for effective system-based, data-driven management. Library Improvement Through Data Analytics includes: - the basics of statistical concepts - recommended data sources for various library functions and processes, and guidance for using census, university, or government data in analysis - techniques for cleaning data - matching data to appropriate data analysis methods - how to make descriptive statistics m...

  8. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    International Nuclear Information System (INIS)

    Chen, Ming; Yu, Hengyong

    2015-01-01

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphic processing units

  9. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. 

  10. Development and tests of molybdenum armored copper components for MITICA ion source

    Science.gov (United States)

    Pavei, Mauro; Böswirth, Bernd; Greuner, Henri; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo

    2016-02-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results.

  11. Development and tests of molybdenum armored copper components for MITICA ion source

    International Nuclear Information System (INIS)

    Pavei, Mauro; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo; Böswirth, Bernd; Greuner, Henri

    2016-01-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results

  12. Development and tests of molybdenum armored copper components for MITICA ion source

    Energy Technology Data Exchange (ETDEWEB)

    Pavei, Mauro, E-mail: mauro.pavei@igi.cnr.it; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo [Consorzio RFX, Corso Stati Uniti, 4, I-35127 Padova (Italy); Böswirth, Bernd; Greuner, Henri [Max-Planck-Institut für Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-02-15

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results.

  13. Determination of Cr, Mn, Si, and Ni in carbon steels by optical emission spectrometry with spark source

    International Nuclear Information System (INIS)

    Garcia Gonzalez, M.A.; Pomares Alfonso, M.; Mora Lopez, L.

    1995-01-01

    The elemental composition of steels determines some of their important characteristics and is moreover required for their quality certification. An analytical procedure was developed for the determination of Cr, Mn, Si and Ni in carbon steels by optical emission spectrometry with a spark source. The reproducibility of the results is 5-11%. Accuracy was verified against results obtained by internationally recognised methods

  14. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.

  15. Analytic posteriors for Pearson's correlation coefficient.

    Science.gov (United States)

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  16. Analytic posteriors for Pearson's correlation coefficient

    OpenAIRE

    Ly, A.; Marsman, M.; Wagenmakers, E.-J.

    2018-01-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  17. Krakow conference on low emissions sources: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, B.L.; Butcher, T.A. [eds.]

    1995-12-31

    The Krakow Conference on Low Emission Sources presented the information produced and analytical tools developed in the first phase of the Krakow Clean Fossil Fuels and Energy Efficiency Program. This phase included: field testing to provide quantitative data on emissions and efficiencies as well as on opportunities for building energy conservation; engineering analysis to determine the costs of implementing pollution control; and incentives analysis to identify actions required to create a market for equipment, fuels, and services needed to reduce pollution. Collectively, these Proceedings contain reports that summarize the above phase one information, present the status of energy system management in Krakow, provide information on financing pollution control projects in Krakow and elsewhere, and highlight the capabilities and technologies of Polish and American companies that are working to reduce pollution from low emission sources. It is intended that the US reader will find in these Proceedings useful results and plans for control of pollution from low emission sources that are representative of heating systems in central and Eastern Europe. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  18. Aplikasi Analytical Hierarchy Process Pada Pemilihan Metode Analisis Zat Organik Dalam Air

    Directory of Open Access Journals (Sweden)

    Dino Rimantho

    2016-07-01

    Full Text Available Water is one of the food products analyzed in water chemistry and environmental laboratories. One of the parameters analyzed is organic substances. A number of samples out of proportion to the available analytical capacity can cause delays in test results. The Analytical Hierarchy Process was applied to evaluate the analytical methods used. The alternative methods tested comprise titrimetry, spectrophotometry, and total organic carbon (TOC). Respondents consisted of the deputy technical manager, the laboratory coordinator, and two senior analysts. The highest-ranked alternative was the TOC method. Based on these results, the proposed improvement is to adopt the TOC method, with a 10-15 minute analysis time and the use of CRMs to ensure the validity of analytical results.
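The AHP ranking step this abstract describes can be sketched with a standard eigenvector-based implementation; the pairwise-comparison matrix below is illustrative, not the respondents' actual judgments:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three alternatives
# (titrimetry, spectrophotometry, TOC) on a single criterion,
# using Saaty's 1-9 scale; the entries are invented for illustration.
A = np.array([[1.0, 1/3, 1/5],
              [3.0, 1.0, 1/3],
              [5.0, 3.0, 1.0]])

# Priority weights are the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty's consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # random index for n = 3 (Saaty)
CR = CI / RI  # CR < 0.1 indicates acceptable judgment consistency
```

With this matrix the third alternative (standing in for TOC) receives the largest weight, and the consistency ratio confirms the judgments are usable.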

  19. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  20. A new H2+ source: Conceptual study and experimental test of an upgraded version of the VIS—Versatile ion source

    Science.gov (United States)

    Castro, G.; Torrisi, G.; Celona, L.; Mascali, D.; Neri, L.; Sorbello, G.; Leonardi, O.; Patti, G.; Castorina, G.; Gammino, S.

    2016-08-01

    The Versatile Ion Source (VIS) is an off-resonance microwave discharge ion source which produces a slightly overdense plasma at a pumping wave frequency of 2.45 GHz, extracting more than 60 mA proton beams and 50 mA He+ beams. The DAEδALUS and IsoDAR experiments require high-intensity H2+ beams to be accelerated by high-power cyclotrons for neutrino generation. In order to fulfill the new requirements, a new plasma chamber and injection system have been designed and manufactured to increase the H2+ beam intensity. In this paper, the studies on increasing the H2+/p ratio and on the design of the new plasma chamber and injection system are presented and discussed, together with the experimental tests carried out at Istituto Nazionale di Fisica Nucleare-Laboratori Nazionali del Sud (INFN-LNS) and at the Best Cyclotron Systems test-bench in Vancouver, Canada.

  1. GLOBAL OPTIMIZATION METHODS FOR GRAVITATIONAL LENS SYSTEMS WITH REGULARIZED SOURCES

    International Nuclear Information System (INIS)

    Rogers, Adam; Fiege, Jason D.

    2012-01-01

    Several approaches exist to model gravitational lens systems. In this study, we apply global optimization methods to find the optimal set of lens parameters using a genetic algorithm. We treat the full optimization procedure as a two-step process: an analytical description of the source plane intensity distribution is used to find an initial approximation to the optimal lens parameters; the second stage of the optimization uses a pixelated source plane with the semilinear method to determine an optimal source. Regularization is handled by means of an iterative method and the generalized cross validation (GCV) and unbiased predictive risk estimator (UPRE) functions that are commonly used in standard image deconvolution problems. This approach simultaneously estimates the optimal regularization parameter and the number of degrees of freedom in the source. Using the GCV and UPRE functions, we are able to justify an estimation of the number of source degrees of freedom found in previous work. We test our approach by applying our code to a subset of the lens systems included in the SLACS survey.
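The GCV criterion mentioned here is generic to regularized linear inversion. A minimal sketch on a stand-in Tikhonov problem (not the lens-modeling code) shows how it selects the regularization parameter and simultaneously yields an effective number of source degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-posed linear problem d = A s + noise, standing in for the
# semilinear lens inversion; A maps a pixelated "source" s to data d.
n, m = 60, 40
A = rng.standard_normal((n, m)) @ np.diag(1.0 / np.arange(1, m + 1))
s_true = np.sin(np.linspace(0, 3 * np.pi, m))
d = A @ s_true + 0.01 * rng.standard_normal(n)

def gcv(lmbda):
    """GCV score and effective degrees of freedom for Tikhonov
    regularization: GCV(l) = n * ||(I - A_l) d||^2 / tr(I - A_l)^2,
    where A_l is the influence matrix; tr(A_l) estimates the number of
    degrees of freedom in the regularized source."""
    # Influence matrix A_l = A (A^T A + l I)^-1 A^T
    H = A @ np.linalg.solve(A.T @ A + lmbda * np.eye(m), A.T)
    resid = d - H @ d
    return n * (resid @ resid) / np.trace(np.eye(n) - H) ** 2, np.trace(H)

lambdas = np.logspace(-8, 2, 50)
scores = [gcv(l)[0] for l in lambdas]
best = lambdas[int(np.argmin(scores))]
dof = gcv(best)[1]  # estimated source degrees of freedom at the optimum
```

The same machinery applies with UPRE by swapping the score function; both estimate the regularization strength without hand-tuning, which is the property exploited in the study.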

  2. Applications of nuclear analytical techniques to environmental studies

    International Nuclear Information System (INIS)

    Freitas, M.C.; Marques, A.P.; Reis, M.A.; Pacheco, A.M.G.; Barros, L.I.C.

    2001-01-01

    A few examples of the application of nuclear analytical techniques to biological monitors - native and transplanted - are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal - the Setubal peninsula, about 50 km south of Lisbon - where indigenous lichens are rare. The whole area was 10x15 km around an oil-fired power station, and a 2.5x2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50x50 km, using a 10x10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metropolitan area (Porto). All biomonitors were analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively.

  3. Analytical and Experimental Studies of Leak Location and Environment Characterization for the International Space Station

    Science.gov (United States)

    Woronowicz, Michael; Abel, Joshua; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin; hide

    2014-01-01

    The International Space Station program is developing a robotically operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer (RGA) for partial pressure measurements and a full-range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full-range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb-mass/yr to about 1 lb-mass/day. These data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
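
    The plume model itself is not reproduced in the abstract. As an illustrative stand-in, the sketch below uses the classic free-molecular cosine-law approximation for an effusive point leak; the NH3 leak-rate conversion and mean thermal speed are rough assumptions, not values from the paper:

```python
import math

def effusive_number_density(Q, r, theta, v_mean):
    """Number density (m^-3) at distance r (m) and off-axis angle theta (rad)
    from an effusive point leak emitting Q molecules/s into a half-space,
    assuming a cos(theta) angular flux distribution."""
    flux = Q * math.cos(theta) / (math.pi * r**2)  # molecules m^-2 s^-1
    return flux / v_mean  # free-molecular estimate: n = flux / mean speed

# Rough numbers: a 1 lb-mass/yr NH3 leak is ~5e17 molecules/s, and the
# NH3 mean thermal speed at ~300 K is ~610 m/s.
Q, v = 5.0e17, 610.0
for r in (0.1, 0.5, 1.0):
    n = effusive_number_density(Q, r, 0.0, v)
    print(f"r = {r:4.1f} m on-axis: n ~ {n:.2e} m^-3")
```

    The 1/r² falloff and cos θ directionality are what make off-axis partial-pressure measurements informative about leak location.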

  4. Inorganic Arsenic Determination in Food: A Review of Analytical Proposals and Quality Assessment Over the Last Six Years.

    Science.gov (United States)

    Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín

    2017-01-01

    Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed among others is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for the quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.

  5. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    Science.gov (United States)

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

    Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ~1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to LFA analytical performance, through both the degree of receptor interaction and the ultimate visual or thermal contrast signal. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and to expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  6. Improvement of Analytical Technique for Determination of Gold in ...

    African Journals Online (AJOL)

    This article elucidates the improvement of analytical technique for determination of gold in geological matrix. Samples suspected to have gold in them were subjected to neutron flux from the Nigeria Research Reactor (NRR-1), a Miniature Neutron Source Reactor (MNSR). Two geological samples – one sample was ...

  7. Interlaboratory analytical performance studies; a way to estimate measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Elżbieta Łysiak-Pastuszak

    2004-09-01

    Full Text Available Comparability of data collected within collaborative programmes became a key challenge of analytical chemistry in the 1990s, including the monitoring of the marine environment. To obtain relevant and reliable data, the analytical process has to proceed under a well-established Quality Assurance (QA) system with external analytical proficiency tests as an inherent component. A programme called Quality Assurance in Marine Monitoring in Europe (QUASIMEME) was established in 1993 and evolved over the years into the major provider of QA proficiency tests for nutrients, trace metals and chlorinated organic compounds in marine environment studies. The article presents an evaluation of the results obtained in QUASIMEME Laboratory Performance Studies by the monitoring laboratory of the Institute of Meteorology and Water Management (Gdynia, Poland) in exercises on nutrient determination in seawater. The measurement uncertainty estimated from routine internal quality control measurements and from the results of analytical performance exercises is also presented.
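
    The article's own uncertainty budget is not reproduced here, but one widely used recipe for combining the two information sources it names (internal QC and proficiency tests) is the Nordtest-style combination of within-lab reproducibility and a bias term; the numbers below are invented for illustration:

```python
import math

def combined_uncertainty(u_rw, pt_biases, u_ref):
    """Combined standard uncertainty (%) from within-lab reproducibility
    (u_rw, e.g. a control-chart RSD) and a bias term built from
    proficiency-test deviations plus the reference-value uncertainty."""
    rms_bias = math.sqrt(sum(b * b for b in pt_biases) / len(pt_biases))
    u_bias = math.sqrt(rms_bias**2 + u_ref**2)
    return math.sqrt(u_rw**2 + u_bias**2)

# Invented figures: 4% control-chart RSD, PT biases of +3%, -2%, +4%,
# and 1.5% uncertainty of the PT assigned values
u_c = combined_uncertainty(4.0, [3.0, -2.0, 4.0], 1.5)
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2
print(f"u_c ~ {u_c:.1f}%, U(k=2) ~ {U:.1f}%")
```

    The appeal of this recipe is exactly the point the abstract makes: routine control charts and external performance studies each capture a different slice of the error budget, and neither alone is sufficient.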

  8. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    Science.gov (United States)

    Jernström, J.; Eriksson, M.; Simon, R.; Tamborini, G.; Bildstein, O.; Marquez, R. Carlos; Kehl, S. R.; Hamilton, T. F.; Ranebo, Y.; Betti, M.

    2006-08-01

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. The composition and elemental distribution in the particles were studied with synchrotron-radiation-based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy-dispersive X-ray detector and a wavelength-dispersive system, as well as a secondary ion mass spectrometer, were used to examine the particle surfaces. Based on elemental composition, the particles were divided into two groups: particles with a pure Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and more heterogeneously distributed. All of the particles were identified as nuclear fuel fragments of exploded weapon components. As they contain plutonium with a low 240Pu/239Pu atomic ratio (less than 0.065), corresponding to weapons-grade plutonium or a detonation with low fission yield, the particles were identified as originating from the safety test and low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of 137Cs (239+240Pu/137Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average 241Am/239Pu atomic ratio in the six particles was (3.7 ± 0.2) × 10⁻³ (February 2006), which indicated that the plutonium in the different particles had a similar age.

  9. The Chandra Source Catalog: Source Variability

    Science.gov (United States)

    Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations spanning integration times from 1 ksec to 160 ksec, many of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
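
    Of the three single-observation tests named above, the Kolmogorov-Smirnov test is the simplest to sketch: photon arrival times from a constant source should be uniformly distributed over the exposure. The event lists below are simulated toy data, not CSC products:

```python
import numpy as np
from scipy import stats

def ks_variability(event_times, t_start, t_stop):
    """One-sample K-S test of photon arrival times against the
    constant-rate hypothesis (uniform arrivals over the exposure)."""
    u = (np.asarray(event_times) - t_start) / (t_stop - t_start)
    return stats.kstest(u, "uniform")

rng = np.random.default_rng(1)
constant = rng.uniform(0, 1000, 500)                   # steady source
flare = np.concatenate([rng.uniform(0, 1000, 300),
                        rng.uniform(400, 450, 200)])   # strong flare

print("constant source p-value:", ks_variability(constant, 0, 1000).pvalue)
print("flaring source p-value :", ks_variability(flare, 0, 1000).pvalue)
```

    The Kuiper variant replaces the one-sided supremum statistic with D⁺ + D⁻, giving more uniform sensitivity across the observation, which matters for deviations near the start or end of the exposure.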

  10. Sealed source and device design safety testing. Volume 4: Technical report on the findings of Task 4, Investigation of sealed source for paper mill digester

    International Nuclear Information System (INIS)

    Benac, D.J.; Iddings, F.A.

    1995-10-01

    This report covers the Task 4 activities for the Sealed Source and Device Safety Testing program. SwRI was contracted to investigate a suspected leaking radioactive source installed in a gauge on a paper mill digester. The actual source that was leaking was not available; therefore, SwRI examined another source. SwRI concluded that the encapsulated source it examined was not leaking. However, the presence of Cs-137 on the interior and exterior of the outer encapsulation and handling tube suggests that contamination probably occurred when the source was first manufactured and then installed in the handling tube.

  11. Analytic continuation by duality estimation of the S parameter

    International Nuclear Information System (INIS)

    Ignjatovic, S. R.; Wijewardhana, L. C. R.; Takeuchi, T.

    2000-01-01

    We investigate the reliability of the analytic continuation by duality (ACD) technique in estimating the electroweak S parameter for technicolor theories. The ACD technique, which is an application of finite energy sum rules, relates the S parameter for theories with unknown particle spectra to known OPE coefficients. We identify the sources of error inherent in the technique and evaluate them for several toy models to see if they can be controlled. The evaluation of errors is done analytically and all relevant formulas are provided in appendixes including analytical formulas for approximating the function 1/s with a polynomial in s. The use of analytical formulas protects us from introducing additional errors due to numerical integration. We find that it is very difficult to control the errors even when the momentum dependence of the OPE coefficients is known exactly. In realistic cases in which the momentum dependence of the OPE coefficients is only known perturbatively, it is impossible to obtain a reliable estimate. (c) 2000 The American Physical Society
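
    The appendixes referenced above give analytical formulas for approximating the function 1/s by a polynomial in s. The sketch below illustrates the same approximation task numerically with a least-squares fit on a grid (a stand-in for, not a reproduction of, the paper's analytical construction):

```python
import numpy as np

def fit_inverse(s_min, s_max, degree, n_pts=400):
    """Least-squares polynomial approximation of 1/s on [s_min, s_max],
    returning the coefficients and the maximum error on the grid."""
    s = np.linspace(s_min, s_max, n_pts)
    coeffs = np.polyfit(s, 1.0 / s, degree)
    max_err = float(np.max(np.abs(np.polyval(coeffs, s) - 1.0 / s)))
    return coeffs, max_err

for deg in (2, 4, 6):
    _, err = fit_inverse(1.0, 4.0, deg)
    print(f"degree {deg}: max |p(s) - 1/s| on [1, 4] = {err:.2e}")
```

    The error decays rapidly with degree as long as the interval stays away from the singularity at s = 0, which is one reason the error analysis in the paper hinges on where the polynomial must match 1/s.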

  12. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Full Text Available Background. The growing number of molecular tests performed on DNA extracted from various biological materials should not come without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of an internal control (IC) in standardizing the pre-analytical phase and the role of a cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: the IC, added to each extraction mix; the human gene HPRT1 (CC), with RT-PCR, to quantify sample cellularity; and the L1 region of HPV, with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac per PCR. CS and TP under 300,000 cells/sample showed a significant decrease in UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.

  13. Analytic properties of Feynman diagrams in quantum field theory

    CERN Document Server

    Todorov, I T

    1971-01-01

    Analytic Properties of Feynman Diagrams in Quantum Field Theory deals with quantum field theory, particularly the study of the analytic properties of Feynman graphs. The book is a self-contained, elementary exposition of the majorization method used in the study of these graphs. The author takes an intermediate position between Eden et al., who assume the physics of the analytic properties of the S-matrix, presenting physical ideas and test results without the proper mathematical methods, and Hwa and Teplitz, whose works are more mathematically inclined with a

  14. Test for arsenic speciation in waters based on a paper-based analytical device with scanometric detection.

    Science.gov (United States)

    Pena-Pereira, Francisco; Villar-Blanco, Lorena; Lavilla, Isela; Bendicho, Carlos

    2018-06-29

    A rapid, simple and affordable method for arsenic speciation analysis is described in this work. The proposed methodology involves in situ arsine generation, transfer of the volatile to the headspace, and its reaction with silver nitrate at the detection zone of a paper-based analytical device (PAD). Thus, silver nitrate acts as the recognition element for arsine in the paper-based sensor. The chemical reaction between the recognition element and the analyte derivative results in the formation of a colored product, which can be detected by scanning the detection zone and treating the data with an image processing and analysis program. Detection and injection zones were defined in the paper substrate by the formation of hydrophobic barriers, thus enabling formation of the volatile derivative without affecting the chemical stability of the recognition element present in the PAD. Experimental parameters influencing the analytical performance of the methodology, namely color mode detection, composition of the paper-based sensor, and hydride generation and mass transfer conditions, were evaluated. Under optimal conditions, the proposed method showed limits of detection and quantification of 1.1 and 3.6 ng mL⁻¹, respectively. Remarkably, the limit of detection of the method reported herein is much lower than the maximum contaminant levels set by both the World Health Organization and the US Environmental Protection Agency for arsenic in drinking water, unlike several commercially available arsenic test kits. The repeatability, expressed as relative standard deviation, was found to be 7.1% (n = 8). The method was validated against the European Reference Material ERM®-CA615 groundwater and successfully applied to the determination of As(III), As(V) and total inorganic As in different water samples. Furthermore, the method can be used for screening analysis of total arsenic in waters when a cut-off level of 7 ng mL⁻¹ is used.
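
    The abstract does not state how its detection limits were derived; a common convention is the 3σ/10σ criterion applied to replicate blank measurements. The sketch below uses invented blank intensities and an invented calibration slope, purely for illustration:

```python
import statistics

def lod_loq(blank_signals, slope):
    """Detection/quantification limits from replicate blank signals and a
    calibration slope, using the common 3*sd and 10*sd criteria."""
    sd = statistics.stdev(blank_signals)
    return 3 * sd / slope, 10 * sd / slope

# Invented scanometric blank intensities (a.u.) and slope (a.u. per ng/mL)
blanks = [12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9]
lod, loq = lod_loq(blanks, slope=0.85)
print(f"LOD ~ {lod:.2f} ng/mL, LOQ ~ {loq:.2f} ng/mL")
```

    By construction the LOQ is 10/3 of the LOD, which mirrors the roughly threefold gap between the 1.1 and 3.6 ng mL⁻¹ figures quoted in the abstract.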

  15. Rapid detection and E-test antimicrobial susceptibility testing of Vibrio parahaemolyticus isolated from seafood and environmental sources in Malaysia.

    Science.gov (United States)

    Al-Othrubi, Saleh M; Hanafiah, Alfizah; Radu, Son; Neoh, Humin; Jamal, Rahaman

    2011-04-01

    To determine the prevalence and antimicrobial susceptibility of Vibrio parahaemolyticus in seafood and environmental sources. The study was carried out at the Center of Excellence for Food Safety Research, University Putra Malaysia; Universiti Kebangsaan Malaysia; the Medical Molecular Biology Institute; and Universiti Kebangsaan Malaysia Hospital, Malaysia, between January 2006 and August 2008. One hundred and forty-four isolates from 400 samples of seafood (122 isolates) and seawater sources (22 isolates) were investigated for the presence of the thermostable direct hemolysin (tdh+) and TDH-related hemolysin (trh+) genes using standard methods. The E-test method was used to test antimicrobial susceptibility. The study indicates a low occurrence of tdh+ (0.69%) and trh+ (8.3%) isolates. None of the isolates tested possessed both virulence genes. High sensitivity was observed against tetracycline (98%). The mean minimum inhibitory concentration (MIC) of the isolates toward ampicillin increased from 4 µg/mL in 2004 to 24 µg/mL in 2007. The current study demonstrates a low occurrence of pathogenic Vibrio parahaemolyticus in the marine environment and seafood. Nonetheless, the potential risk of Vibrio infection due to consumption of V. parahaemolyticus-contaminated seafood in Malaysia should not be neglected.
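
    E-test MIC strips are read against interpretive breakpoints to classify each isolate as susceptible, intermediate, or resistant. The sketch below uses hypothetical breakpoints for illustration only; real CLSI breakpoints depend on the drug and organism:

```python
def classify_mic(mic, susceptible_bp, resistant_bp):
    """Classify an isolate from its E-test MIC (ug/mL): S if MIC is at or
    below the susceptible breakpoint, R if at or above the resistant
    breakpoint, I (intermediate) otherwise."""
    if mic <= susceptible_bp:
        return "S"
    if mic >= resistant_bp:
        return "R"
    return "I"

# Hypothetical ampicillin-style breakpoints (S <= 8, R >= 32), illustrative
mics = [4, 8, 24, 32]
print([classify_mic(m, 8, 32) for m in mics])  # ['S', 'S', 'I', 'R']
```

    Under such a scheme, the drift in mean ampicillin MIC reported above (4 to 24 µg/mL) is exactly the kind of shift that moves a population from susceptible toward the intermediate band.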

  16. Thermal hydraulic tests of a liquid hydrogen cold neutron source. NISTIR 5026

    International Nuclear Information System (INIS)

    Siegwarth, J.D.; Olson, D.A.; Lewis, M.A.; Rowe, J.M.; Williams, R.E.; Kopetka, P.

    1995-01-01

    The liquid hydrogen cold neutron source designed for the NBSR contains a neutron moderator chamber. Tests of the NBSR chamber using the electrically heated NIST-B glass moderator chamber showed the following results: stable operation is possible up to at least 2200 W with two-phase flow; the LH2 mass quickly reaches a new, stable value after a heat-load change; the void fraction remains well below 20% at the anticipated power and pressure; restart of the H2 flow was verified after extending the supply line; and visual inspection showed no dryout or unexpected voids.

  17. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    Energy Technology Data Exchange (ETDEWEB)

    Felix, Juliana S., E-mail: jfelix@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Alfaro, Pilar, E-mail: palfarot@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Nerin, Cristina, E-mail: cnerin@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain)

    2011-02-14

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and, b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by HS-SPME method was lower than expected according to information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease of limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates proceeding from the contaminated flakes was confirmed.

  18. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    International Nuclear Information System (INIS)

    Felix, Juliana S.; Alfaro, Pilar; Nerin, Cristina

    2011-01-01

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and, b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by HS-SPME method was lower than expected according to information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease of limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates proceeding from the contaminated flakes was confirmed.

  19. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. To identify and standardize the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor as environmental indicators, trace-element concentrations of samples collected monthly at urban and rural sites were determined; statistical calculations and factor analysis were then carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision, and capabilities of the analytical process. (author). 103 refs., 61 tabs., 19 figs

  20. BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests

    Science.gov (United States)

    Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.

    2018-01-01

    Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests. BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
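
    The simplest category above, an analytically verified RT test, can be illustrated with a toy analogue (this is not BARTTest code): a purely absorbing, non-scattering layer has the closed-form transmission e^(-τ), which any layer-by-layer numerical scheme must reproduce:

```python
import math

def transmission_analytic(tau):
    """Closed-form transmission of an absorbing, non-scattering layer."""
    return math.exp(-tau)

def transmission_layered(tau, n_sub=1000):
    """The same layer split into n_sub homogeneous sublayers, multiplying
    sublayer transmissions -- mimicking a layer-by-layer RT code."""
    t = 1.0
    for _ in range(n_sub):
        t *= math.exp(-tau / n_sub)
    return t

tau = 2.5
t_ana = transmission_analytic(tau)
t_num = transmission_layered(tau)
print(f"analytic {t_ana:.6f}  layered {t_num:.6f}  |diff| {abs(t_ana - t_num):.1e}")
```

    A real RT verification suite checks far more (line shapes, blends, saturation), but the principle is the same: compare numerical output against a case with a known closed-form answer.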

  1. Test stand for magnetron H negative ion source at IPP-Nagoya

    Energy Technology Data Exchange (ETDEWEB)

    Okamura, H; Kuroda, T; Miyahara, A

    1981-02-01

    The test facility for the development of the magnetron H⁻ ion source consists of the vacuum system, power supplies, diagnostic equipment, and their controlling electronics. Schematics are presented and relevant items described, including sequence control, optical links, the charged pulse-forming network, the extractor power supply, the magnet power supply, temperature control of the cesium feeder, and the pulsed valve driver. Noise problems and diagnostics are also considered.

  2. Strain accumulation in a prototypic LMFBR nozzle: Experimental and analytical correlation

    International Nuclear Information System (INIS)

    Woodward, W.S.; Dhalia, A.K.; Berton, P.A.

    1986-01-01

    At an early stage in the design of the primary inlet nozzle for the Intermediate Heat Exchanger (IHX) of the Fast Flux Test Facility (FFTF), it was predicted that the inelastic strain accumulation during elevated-temperature operation (1050 °F/566 °C) would exceed the ASME Code design allowables. Therefore, a proof test of a prototypic FFTF IHX nozzle was performed in the Westinghouse Creep Ratcheting Test Facility (CRTF) to measure the ratchet strain increments during the most severe postulated FFTF plant thermal transients. In addition, analytical procedures similar to those used in the plant design were used to predict strain accumulation in the CRTF nozzle. This paper describes how the proof test was successfully completed, and it shows that both the test measurements and the analytical predictions confirm that the FFTF IHX nozzle, subjected to the postulated thermal and mechanical loadings, complies with the ASME Code strain limits. These results also provide a measure of validation for the analytical procedures used in the design of the FFTF and demonstrate the structural adequacy of the FFTF IHX primary inlet nozzle.

  3. On analytical justification of phase synchronization in different chaotic systems

    International Nuclear Information System (INIS)

    Erjaee, G.H.

    2009-01-01

    In analytical and numerical studies of synchronization in coupled chaotic systems, phase synchronization has received comparatively little attention in the leading literature. This article is an attempt to find a sufficient analytical condition for the stability of phase synchronization in some coupled chaotic systems. The method of the nonlinear feedback function and the scheme of matrix measure have been used to justify this analytical stability, and the existence of phase synchronization in some coupled chaotic systems has been tested numerically.
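
    The article's nonlinear feedback function and matrix-measure analysis are not reproduced here. As a minimal numerical illustration of the phenomenon itself, two coupled Kuramoto phase oscillators (a toy model, not a chaotic system) phase-lock whenever the frequency mismatch is below twice the coupling strength:

```python
import math

def phase_locked(w1, w2, K, dt=0.002, steps=100000):
    """Euler-integrate two coupled Kuramoto phase oscillators and report
    whether the phase difference stays bounded (phase synchronization)."""
    th1, th2 = 0.0, 1.0
    lo = hi = None
    for i in range(steps):
        s = math.sin(th2 - th1)
        th1 += dt * (w1 + K * s)
        th2 += dt * (w2 - K * s)
        if i > steps // 2:  # ignore the initial transient
            d = th2 - th1
            lo = d if lo is None else min(lo, d)
            hi = d if hi is None else max(hi, d)
    return hi - lo < 0.1

print(phase_locked(1.0, 1.2, K=0.5))  # |w2 - w1| = 0.2 < 2K = 1.0: locked
print(phase_locked(1.0, 2.5, K=0.5))  # |w2 - w1| = 1.5 > 2K = 1.0: drifts
```

    For chaotic oscillators the analysis is harder precisely because the instantaneous frequency itself fluctuates, which is where an analytical stability condition such as the article's matrix-measure approach becomes valuable.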

  4. Tests of MVD prototype pad detector with a β- source

    International Nuclear Information System (INIS)

    Yeol Kim, Sang; Gook Kim, Young; Su Ryu, Sang; Hwan Kang, Ju; Simon-Gillo, Jehanne; Sullivan, John P.; Heck, Hubert W. van; Xu Guanghua

    1999-01-01

    The MVD group has been testing two versions of silicon pad detectors. One design uses a single metal layer for readout trace routing. The second type uses two layers of metal, allowing greatly simplified signal routing. However, because the readout traces for the pads pass over the other pads in the same column (separated by an oxide layer), the double-metal design introduces crosstalk into the system. A simple test stand using a 90Sr β⁻ source with scintillator triggers was built to estimate the crosstalk. The crosstalk between pads in the same column of the pad detector was 1.6-3.1%. The values measured between pads in different columns were very close to zero. The measured crosstalk was below our maximum allowed value of 7.8%.

  5. Experimental testing of constructivism and related theories.

    Science.gov (United States)

    Fidelman, U

    1991-10-01

    The purpose of this article is to show that experimental scientific methods can be applied to explain how the analytic mechanism of the left cerebral hemisphere and the synthetic mechanism of the right one create complex cognitive constructions like ontology and mathematics. Nominalism and ordinal mathematical concepts are related to the analytic left hemisphere, while Platonism and cardinal mathematical concepts are related to the synthetic right one. Thus persons with a dominant left hemisphere tend to prefer nominalist ontology and have more aptitude for ordinal mathematics than for cardinal mathematics, while persons with a dominant right hemisphere tend to prefer Platonist ontology and have more aptitude for cardinal mathematics than for ordinal mathematics. It is further explained how the Kantian temporal mode of perceiving experience can be related to the left hemisphere, while the Kantian spatial mode of perceiving experience can be related to the right hemisphere. This relation can be tested experimentally; thus the Kantian source of constructivism, and through it constructivism itself, can be tested experimentally.

  6. Benchmark Tests to Develop Analytical Time-Temperature Limit for HANA-6 Cladding for Compliance with New LOCA Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sung Yong; Jang, Hun; Lim, Jea Young; Kim, Dae Il; Kim, Yoon Ho; Mok, Yong Kyoon [KEPCO Nuclear Fuel Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    According to 10CFR50.46c, two analytical time and temperature limits, for breakaway oxidation and for post-quench ductility (PQD), should be determined by an approved experimental procedure as described in NRC Regulatory Guides (RG) 1.222 and 1.223. These guides impose rigorous qualification requirements on the test system, such as thermal and weight-gain benchmarks. In order to meet these requirements, KEPCO NF has developed a new special facility to evaluate the LOCA performance of zirconium alloy cladding. In this paper, the qualification results for the test facility and the high-temperature (HT) oxidation model for HANA-6 are summarized. The results of the thermal benchmark tests of the LOCA HT oxidation tester are summarized as follows. 1. A best-estimate HT oxidation model of HANA-6 was developed as the vendor-proprietary HT oxidation model. 2. In accordance with RG 1.222 and 1.223, benchmark tests were performed using the LOCA HT oxidation tester. 3. The maximum axial and circumferential temperature differences are ±9 °C and ±2 °C at 1200 °C, respectively; at the other temperature conditions, the temperature differences are smaller than those at 1200 °C. The thermal benchmark test results meet the requirements of NRC RG 1.222 and 1.223.
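High-temperature cladding oxidation of the kind benchmarked here is conventionally described by parabolic weight-gain kinetics with an Arrhenius rate constant. The sketch below illustrates only that generic functional form; the constants `k0` and `Q` are hypothetical placeholders, not the HANA-6 proprietary model or any licensed correlation:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def parabolic_weight_gain(t_s, T_K, k0=0.3, Q=170e3):
    """Parabolic oxidation kinetics w^2 = K * t with Arrhenius rate
    constant K = k0 * exp(-Q / (R * T)).  Returns weight gain w for
    time t_s (s) at temperature T_K (K).  k0 (g^2 cm^-4 s^-1) and
    Q (J/mol) are HYPOTHETICAL placeholder values."""
    K = k0 * math.exp(-Q / (R * T_K))
    return math.sqrt(K * t_s)

w1 = parabolic_weight_gain(100.0, 1473.15)   # 100 s at 1200 C
w4 = parabolic_weight_gain(400.0, 1473.15)   # 4x the exposure time
print(abs(w4 / w1 - 2.0) < 1e-9)             # parabolic kinetics: w ~ sqrt(t)
```

Quadrupling the time doubles the weight gain, which is the signature of parabolic (diffusion-limited) oxidation.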

  7. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that ''quality is achieved and maintained by those who have been assigned the responsibility for performing the work.'' The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be accomplished through several tools, which this program discusses. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA or commercial sources. Project-specific PE materials and evaluations are described in Section 9.0 and Appendix A.

  8. On the analytic continuation of functions defined by Legendre series

    International Nuclear Information System (INIS)

    Grinstein, F.F.

    1981-07-01

    An infinite diagonal sequence of Punctual Pade Approximants is considered for the approximate analytical continuation of a function defined by a formal Legendre series. The technique is tested in the case of two series with exactly known analytical sum: the generating function for Legendre polynomials and the Coulombian scattering amplitude. (author)
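The first test case named in this abstract can be illustrated with a small sketch. It builds a diagonal Padé approximant in t from the Legendre coefficients of the generating function, assuming SciPy's `pade` and `eval_legendre` helpers; this is a generic Padé construction, not a reproduction of the paper's Punctual Padé technique:

```python
import numpy as np
from scipy.special import eval_legendre
from scipy.interpolate import pade

# Generating function for Legendre polynomials:
#   sum_n P_n(x) t^n = 1 / sqrt(1 - 2*x*t + t^2),  |t| < 1
x, t, N = 0.9, 0.5, 10
coeffs = [eval_legendre(n, x) for n in range(N + 1)]  # series coefficients in t

p, q = pade(coeffs, N // 2)          # diagonal [5/5] Pade approximant
pade_val = p(t) / q(t)
exact = 1.0 / np.sqrt(1.0 - 2.0 * x * t + t * t)
print(abs(pade_val - exact))         # small: the approximant matches the series to O(t^10)
```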

  9. (U) An Analytic Examination of Piezoelectric Ejecta Mass Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tregillis, Ian Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-02

    Ongoing efforts to validate a Richtmyer-Meshkov instability (RMI) based ejecta source model [1, 2, 3] in LANL ASC codes use ejecta areal masses derived from piezoelectric sensor data [4, 5, 6]. However, the standard technique for inferring masses from sensor voltages implicitly assumes instantaneous ejecta creation [7], which is not a feature of the RMI source model. To investigate the impact of this discrepancy, we define separate “areal mass functions” (AMFs) at the source and sensor in terms of typically unknown distribution functions for the ejecta particles, and derive an analytic relationship between them. Then, for the case of single-shock ejection into vacuum, we use the AMFs to compare the analytic (or “true”) accumulated mass at the sensor with the value that would be inferred from piezoelectric voltage measurements. We confirm the inferred mass is correct when creation is instantaneous, and furthermore prove that when creation is not instantaneous, the inferred values will always overestimate the true mass. Finally, we derive an upper bound for the error imposed on a perfect system by the assumption of instantaneous ejecta creation. When applied to shots in the published literature, this bound is frequently less than several percent. Errors exceeding 15% may require velocities or timescales at odds with experimental observations.

  10. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    Energy Technology Data Exchange (ETDEWEB)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K. [St. Petersburg Mining Inst. (Russian Federation); Pozdniakov, S.P.; Shestakov, V.M. [Moscow State Univ. (Russian Federation); Roshal, A.A. [Geosoft-Eastlink, Moscow (Russian Federation)

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models is determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain in the laboratory (considering the fractured structure of the rock mass). Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of problems. The choice of appropriate problems is based on close feedback with subsequent field tests in the Lake Area. 63 refs.

  11. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    International Nuclear Information System (INIS)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K.; Pozdniakov, S.P.; Shestakov, V.M.; Roshal, A.A.

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models is determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain in the laboratory (considering the fractured structure of the rock mass). Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of problems. The choice of appropriate problems is based on close feedback with subsequent field tests in the Lake Area. 63 refs

  12. Development and performance test of a continuous source of nitrous acid (HONO)

    Energy Technology Data Exchange (ETDEWEB)

    Ammann, M.; Roessler, E.; Kalberer, M.; Bruetsch, S.; Schwikowski, M.; Baltensperger, U.; Zellweger, C.; Gaeggeler, H.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Laboratory investigations involving nitrous acid (HONO) require a stable, continuous source of HONO at ppb levels. A flow type generation system based on the reaction of sodium nitrite with sulfuric acid has been developed. Performance and speciation of gaseous products were tested with denuder and chemiluminescence techniques. (author) 2 figs., 2 refs.

  13. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has for a decade been participating in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in special journals, and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project ''Development of physico-chemical analytical methods''. In most cases, the techniques are intended for non-nuclear applications. Some can find wider applications, especially in analyses of environmental samples. Others have been developed for specific cases of sample analyses or require special instrumentation (mass spectrometer), which partly restricts their applicability by other institutions. (author)

  14. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The simulation of nuclear waste migration phenomena is governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay and chemical interaction. For some special problems (depending on the boundary conditions, and when the domain is considered infinite or semi-infinite) an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to solve diffusive-convective linear problems to obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of migration of radioactive waste in saturated flow porous media. A test case considering the ²⁴¹Am radionuclide is presented. (author)
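The kind of closed-form solution this abstract refers to can be illustrated with the classical van Genuchten-Alves solution of the 1D advection-dispersion-decay equation on a semi-infinite domain. This is a standard textbook result, not the paper's GITT solution, and the transport parameters below are hypothetical; only the ²⁴¹Am half-life (432.2 yr) is a physical constant:

```python
import numpy as np
from scipy.special import erfc

def adr_decay(x, t, v, D, lam, C0=1.0):
    """Analytical solution of  dC/dt = D d2C/dx2 - v dC/dx - lam*C
    on x >= 0 with C(0,t) = C0 and C(x,0) = 0 (van Genuchten & Alves, 1982)."""
    u = v * np.sqrt(1.0 + 4.0 * lam * D / v**2)
    a = np.exp((v - u) * x / (2.0 * D)) * erfc((x - u * t) / (2.0 * np.sqrt(D * t)))
    b = np.exp((v + u) * x / (2.0 * D)) * erfc((x + u * t) / (2.0 * np.sqrt(D * t)))
    return 0.5 * C0 * (a + b)

lam_am = np.log(2.0) / 432.2   # Am-241 decay constant, 1/yr
# Hypothetical transport parameters: v = 1 m/yr, D = 0.5 m^2/yr
C = adr_decay(x=10.0, t=50.0, v=1.0, D=0.5, lam=lam_am)
print(0.0 < C < 1.0)           # concentration stays between 0 and the inlet value
```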

  15. Radioactive particles in the environment: sources, particle characterization and analytical techniques

    International Nuclear Information System (INIS)

    2011-08-01

    Over the years, radioactive particles have been released to the environment from nuclear weapons testing and nuclear fuel cycle operations. However, measurements of environmental radioactivity and any associated assessments are often based on the average bulk mass or surface concentration, assuming that radionuclides are homogeneously distributed as simple ionic species. It has generally not been recognised that radioactive particles present in the environment often contain a significant fraction of the bulk sample activity, leading to sample heterogeneity problems and false and/or erratic measurement data. Moreover, the inherent differences in the transport and bioavailability of particle bound radionuclides compared with those existing as molecules or ions have largely been ignored in dose assessments. To date, most studies regarding radionuclide behaviour in the soil-plant system have dealt with soluble forms of radionuclides. When radionuclides are deposited in a less mobile form, or in case of a superposition of different physico-chemical forms, the behaviour of radionuclides becomes much more complicated and extra efforts are required to provide information about environmental status and behaviour of radioactive particles. There are currently no documents or international guides covering this aspect of environmental impact assessments. To fill this gap, between 2001 and 2008 the IAEA performed a Coordinated Research Programme (CRP G4.10.03) on the 'Radiochemical, Chemical and Physical Characterization of Radioactive Particles in the Environment' with the objective of development, adoption and application of standardized analytical techniques for the comprehensive study of radioactive particles. The CRP was in line with the IAEA project intended to assist the Member States in building capacity for improving environmental assessments and for management of sites contaminated with radioactive particles. This IAEA-TECDOC presents the findings and achievements of

  16. Gravitational wave generation from bubble collisions in first-order phase transitions: An analytic approach

    International Nuclear Information System (INIS)

    Caprini, Chiara; Durrer, Ruth; Servant, Geraldine

    2008-01-01

    Gravitational wave production from bubble collisions was calculated in the early 1990s using numerical simulations. In this paper, we present an alternative analytic estimate, relying on a different treatment of stochasticity. In our approach, we provide a model for the bubble velocity power spectrum, suitable for both detonations and deflagrations. From this, we derive the anisotropic stress and analytically solve the gravitational wave equation. We provide analytical formulas for the peak frequency and the shape of the spectrum which we compare with numerical estimates. In contrast to the previous analysis, we do not work in the envelope approximation. This paper focuses on a particular source of gravitational waves from phase transitions. In a companion article, we will add together the different sources of gravitational wave signals from phase transitions: bubble collisions, turbulence and magnetic fields and discuss the prospects for probing the electroweak phase transition at LISA

  17. Analytical and functional similarity of Amgen biosimilar ABP 215 to bevacizumab.

    Science.gov (United States)

    Seo, Neungseon; Polozova, Alla; Zhang, Mingxuan; Yates, Zachary; Cao, Shawn; Li, Huimin; Kuhns, Scott; Maher, Gwendolyn; McBride, Helen J; Liu, Jennifer

    ABP 215 is a biosimilar product to bevacizumab. Bevacizumab acts by binding to vascular endothelial growth factor A, inhibiting endothelial cell proliferation and new blood vessel formation, thereby leading to tumor vasculature normalization. The ABP 215 analytical similarity assessment was designed to assess the structural and functional similarity of ABP 215 and bevacizumab sourced from both the United States (US) and the European Union (EU). Similarity assessment was also made between the US- and EU-sourced bevacizumab to assess the similarity between the two products. The physicochemical properties and structural similarity of ABP 215 and bevacizumab were characterized using sensitive state-of-the-art analytical techniques capable of detecting small differences in product attributes. ABP 215 has the same amino acid sequence and exhibits similar post-translational modification profiles compared to bevacizumab. The functional similarity assessment employed orthogonal assays designed to interrogate all expected biological activities, including those known to affect the mechanisms of action for ABP 215 and bevacizumab. More than 20 batches of bevacizumab (US) and bevacizumab (EU), and 13 batches of ABP 215 representing unique drug substance lots were assessed for similarity. The large dataset allows meaningful comparisons and garners confidence in the overall conclusion for the analytical similarity assessment of ABP 215 to both US- and EU-sourced bevacizumab. The structural and purity attributes, and biological properties of ABP 215 are demonstrated to be highly similar to those of bevacizumab.

  18. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from Kawerau geothermal field 1998-2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is clear indication that the present-day heat source relates to young volcanism within the field. However, being at the margins of the explored reservoir, little is presently known of the characteristics of that heat source. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system. In the latter study the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data is accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  19. Simplified analytical modeling of the normal hole erosion test; Modelado analitico simplificado del ensayo normal de erosion de tubo

    Energy Technology Data Exchange (ETDEWEB)

    Khamlichi, A.; Bezzazi, M.; El Bakkali, L.; Jabbouri, A.; Kissi, B.; Yakhlef, F.; Parron Vera, M. A.; Rubio Cintas, M. D.; Castillo Lopez, O.

    2009-07-01

    The hole erosion test was developed in order to study the erosion phenomenon that occurs in cracks appearing in hydraulic infrastructures such as dams. This test enables the erosive characteristics of soils to be described experimentally by means of an index called the erosion rate and a critical shear stress that marks the threshold of surface erosion initiation. The objective of this work is to give a model of this experiment by means of a simplified analytical approach. The erosion law is derived by taking into account the flow regime. This law shows that the erosion occurring in the tube is governed by first-order dynamics in which only two parameters are involved: the characteristic time linked to the erosion rate, and the shear stress threshold at which erosion begins to develop. (Author) 5 refs.

  20. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  1. Application of an analytical method for solution of thermal hydraulic conservation equations

    Energy Technology Data Exchange (ETDEWEB)

    Fakory, M.R. [Simulation, Systems & Services Technologies Company (S3 Technologies), Columbia, MD (United States)

    1995-09-01

    An analytical method has been developed and applied for the solution of the two-phase flow conservation equations. The test results for application of the model to the simulation of BWR transients are presented and compared with the results obtained from application of the explicit method for integration of the conservation equations. The test results show that with application of the analytical method, the Courant limitation associated with the explicit Euler method of integration was eliminated. The results obtained from the analytical method (with large time steps) agreed well with the results obtained from the explicit method of integration (with time steps smaller than the size imposed by the Courant limitation). The results demonstrate that application of the analytical approach significantly improves numerical stability and computational efficiency.
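The stability point made in this abstract can be illustrated on a scalar model problem dy/dt = -λy: explicit Euler diverges once the step exceeds its stability limit Δt < 2/λ, while exact per-step (analytical) integration has no step-size restriction. This is a sketch of the general idea only, not the S3 Technologies two-phase flow implementation:

```python
import math

def explicit_euler(y0, lam, dt, steps):
    """Explicit Euler for dy/dt = -lam*y: stable only when dt < 2/lam."""
    y = y0
    for _ in range(steps):
        y += dt * (-lam * y)
    return y

def analytical_step(y0, lam, dt, steps):
    """Exact integration over each step: no Courant-like step-size limit."""
    y = y0
    for _ in range(steps):
        y *= math.exp(-lam * dt)
    return y

lam, y0 = 10.0, 1.0
dt_big = 0.5                                     # violates dt < 2/lam = 0.2
print(abs(explicit_euler(y0, lam, dt_big, 20)))  # blows up: |(1 - 5)|^20 = 4^20
print(analytical_step(y0, lam, dt_big, 20))      # decays toward 0, as it should
```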

  2. Analysis of the Variability of Classified and Unclassified Radiological Source Term Inventories in the Frenchman Flat Area, Nevada Test Site

    International Nuclear Information System (INIS)

    Zhao, P.; Zavarin, M.

    2008-01-01

    It has been proposed that unclassified source terms used in the reactive transport modeling investigations at NTS CAUs should be based on yield-weighted source terms calculated using the average source term from Bowen et al. (2001) and the unclassified announced yields reported in DOE/NV-209. This unclassified inventory is likely to be used in unclassified contaminant boundary calculations and is thus relevant to compare to the classified inventory. We have examined the classified radionuclide inventory produced by 10 underground nuclear tests conducted in the Frenchman Flat (FF) area of the Nevada Test Site. The goals were to (1) evaluate the variability in classified radiological source terms among the 10 tests and (2) compare that variability and the inventory uncertainties to an average unclassified inventory (e.g., Bowen et al., 2001). To evaluate source term variability among the 10 tests, radiological inventories were compared on two relative scales: geometric mean and yield-weighted geometric mean. Furthermore, radiological inventories were decay-corrected either to a common date (9/23/1992) or to the time zero (t₀) of each test. Thus, a total of four data sets were produced. The date of 9/23/1992 was chosen based on the date of the last underground nuclear test at the Nevada Test Site.
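The decay correction described above, whether to a common date or to each test's t₀, follows the standard law A(t) = A₀·e^(−λt) with λ = ln 2 / T½. A minimal sketch, using half-lives from standard nuclear data tables (the nuclides are chosen for illustration and have no connection to the classified inventory):

```python
import math

# Half-lives in years, from standard nuclear data tables (ENSDF)
HALF_LIFE_YR = {"Cs-137": 30.08, "Sr-90": 28.79, "H-3": 12.32}

def decay_correct(activity, nuclide, years_elapsed):
    """Decay-correct an activity from its reference date forward by
    years_elapsed:  A(t) = A0 * exp(-ln(2) * t / T_half)."""
    lam = math.log(2.0) / HALF_LIFE_YR[nuclide]
    return activity * math.exp(-lam * years_elapsed)

# An inventory referenced to its t0, corrected forward by exactly one
# Cs-137 half-life: half the activity remains.
print(decay_correct(1.0, "Cs-137", 30.08))  # ~0.5
```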

  3. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of studies: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profiles and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation p_unc gives the best choice for the model performance evaluation when a conservative approach is adopted.

  4. Numerical and analytical investigation of steel beam subjected to four-point bending

    Science.gov (United States)

    Farida, F. M.; Surahman, A.; Sofwan, A.

    2018-03-01

    One type of bending test is the four-point bending test. The aim of this test is to investigate the properties and behavior of materials with structural applications. This study uses numerical and analytical approaches, whose results help to improve the experimental work. The purpose of this study is to predict the behavior of a steel beam subjected to a four-point bending test, by analyzing the flexural beam prior to the experimental work. The main result of this research is the placement of the strain gauges and the LVDT on the steel beam, based on the numerical study, manual calculation, and an analytical study using the linear elasticity theory of solid objects. The strain gauges are located between the two concentrated loads, at the top and bottom of the beam, and the LVDT is likewise located between the two concentrated loads.
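The instrument placement between the loads follows from the statics of four-point bending: the bending moment, and hence the outer-fiber strain ε = Mc/(EI), is constant between the two load points. A minimal sketch with a hypothetical span, load, and load positions (not the study's actual test setup):

```python
def bending_moment(x, P, L, a):
    """Bending moment at position x in a simply supported beam under
    four-point bending: total load P applied as P/2 at x = a and x = L - a."""
    R = P / 2.0                      # symmetric support reactions
    if x <= a:
        return R * x                 # rises linearly from the left support
    if x <= L - a:
        return R * a                 # constant (pure bending) between the loads
    return R * (L - x)               # falls linearly toward the right support

# Hypothetical geometry: 3 m span, loads 1 m from each support, P = 10 kN.
L, a, P = 3.0, 1.0, 10.0
mid = bending_moment(1.5, P, L, a)
print(bending_moment(1.2, P, L, a) == mid == bending_moment(1.8, P, L, a))  # True
```

The constant-moment region is why gauges there read a uniform, easily interpreted strain.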

  5. Humidity Effects on Fragmentation in Plasma-Based Ambient Ionization Sources.

    Science.gov (United States)

    Newsome, G Asher; Ackerman, Luke K; Johnson, Kevin J

    2016-01-01

    Post-plasma ambient desorption/ionization (ADI) sources are fundamentally dependent on surrounding water vapor to produce protonated analyte ions. There are two reports of humidity effects on ADI spectra. However, it is unclear whether humidity will affect all ADI sources and analytes, and by what mechanism humidity affects spectra. Flowing atmospheric pressure afterglow (FAPA) ionization and direct analysis in real time (DART) mass spectra of various surface-deposited and gas-phase analytes were acquired at ambient temperature and pressure across a range of observed humidity values. A controlled humidity enclosure around the ion source and mass spectrometer inlet was used to create programmed humidity and temperatures. The relative abundance and fragmentation of molecular adduct ions for several compounds consistently varied with changing ambient humidity and also were controlled with the humidity enclosure. For several compounds, increasing humidity decreased protonated molecule and other molecular adduct ion fragmentation in both FAPA and DART spectra. For others, humidity increased fragment ion ratios. The effects of humidity on molecular adduct ion fragmentation were caused by changes in the relative abundances of different reagent protonated water clusters and, thus, a change in the average difference in proton affinity between an analyte and the population of water clusters. Control of humidity in ambient post-plasma ion sources is needed to create spectral stability and reproducibility.

  6. A procedure for merging land cover/use data from LANDSAT, aerial photography, and map sources: Compatibility, accuracy, and cost. Remote Sensing Project

    Science.gov (United States)

    Enslin, W. R.; Tilmann, S. E.; Hill-Rowley, R.; Rogers, R. H.

    1977-01-01

    Regional planning agencies are currently expressing a need for detailed land cover/use information to effectively meet the requirements of various federal programs. Individual data sources have advantages and limitations in fulfilling this need, both in terms of time/cost and technological capability. A methodology has been developed to merge land cover/use data from LANDSAT, aerial photography and map sources to maximize the effective use of a variety of data sources in the provision of an integrated information system for regional analysis. A test of the proposed inventory method is currently under way in four central Michigan townships. This test will evaluate the compatibility, accuracy and cost of the integrated method with reference to inventories developed from a single data source, and determine both the technological feasibility and analytical potential of such a system.

  7. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology

  8. Evaluated Rayleigh integrals for pulsed planar expanding ring sources

    International Nuclear Information System (INIS)

    Warshaw, S.I.

    1985-01-01

    Time-domain analytic and semianalytic pressure fields acoustically radiated from expanding pulsed ring sources imbedded in a planar rigid baffle have been calculated. The source functions are radially symmetric delta-function distributions whose amplitude and argument have simple functional dependencies on radius and time. Certain cases yield closed analytic results, while others result in elliptic integrals, which are evaluated to high accuracy by Gauss-Chebyshev and modified Gauss-Legendre quadrature. These results are of value for calibrating computer simulations and convolution procedures, and estimating fields from more complex planar radiators. 3 refs., 4 figs
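The Gauss-Chebyshev rule mentioned in this abstract has a particularly simple form: for integrals weighted by 1/√(1−x²) on [−1, 1], all weights equal π/n and the nodes are the Chebyshev points. A minimal sketch checked against a known integral (illustrating the quadrature rule only, not the ring-source Rayleigh integrals themselves):

```python
import math

def gauss_chebyshev(f, n):
    """n-point Gauss-Chebyshev rule for the integral of f(x)/sqrt(1-x^2)
    over [-1, 1]: nodes x_k = cos((2k-1)*pi/(2n)), equal weights pi/n.
    Exact for f any polynomial of degree <= 2n - 1."""
    nodes = (math.cos((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1))
    return (math.pi / n) * sum(f(x) for x in nodes)

# Known value: integral of x^2 / sqrt(1 - x^2) dx over [-1, 1] equals pi/2
approx = gauss_chebyshev(lambda x: x * x, 8)
print(abs(approx - math.pi / 2) < 1e-9)
```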

  9. Redshift anisotropy among test-particle sources inside a black hole

    International Nuclear Information System (INIS)

    Debney, G.

    1976-01-01

    An elementary (mass-normalized) model of observers and emitters of light in free-fall within a black hole's radius is investigated in terms of the redshift spectrum induced. All observers and emitters follow the same kinds of trajectories, radially inward and starting from rest at spatial infinity. The major results are concerned with demonstrating the types of redshifts possible in all directions on a typical observer's celestial sphere. These are simulated by considering all equatorial light paths inside and generalizing to three dimensions by symmetry. Under certain assumptions a direction for maximum redshift and one for minimum redshift are obtained; these lie on antipodal points on the observer's celestial sphere. No multiple imaging or focusing is possible from isotropic sources inside r = 2m, however. At this stage no luminosity distances or intensity results are developed; these more complicated relationships would be required to simulate the actual picture getting through to an observer. Some of the redshift results are applied to a black hole whose scale is cosmological. This extreme example is included mainly as a curiosity to illustrate the impact of a simple change of scale and to reemphasize the importance of the microwave isotropy to theoretical models. A careful analytical formulation of general relativistic redshifts as seen in local Lorentz frames provides the tools for this investigation. (author)

  10. Redshift anisotropy among test-particle sources inside a black hole

    Energy Technology Data Exchange (ETDEWEB)

    Debney, G [Virginia Polytechnic Inst. and State Univ., Blacksburg (USA)]

    1976-09-01

    An elementary (mass-normalized) model of observers and emitters of light in free-fall within a black hole's radius is investigated in terms of the redshift spectrum induced. All observers and emitters follow the same kinds of trajectories, radially inward and starting from rest at spatial infinity. The major results are concerned with demonstrating the types of redshifts possible in all directions on a typical observer's celestial sphere. These are simulated by considering all equatorial light paths inside and generalizing to three dimensions by symmetry. Under certain assumptions a direction for maximum redshift and one for minimum redshift are obtained; these lie on antipodal points on the observer's celestial sphere. No multiple imaging or focusing is possible from isotropic sources inside r = 2m, however. At this stage no luminosity distances or intensity results are developed; these more complicated relationships would be required to simulate the actual picture getting through to an observer. Some of the redshift results are applied to a black hole whose scale is cosmological. This extreme example is included mainly as a curiosity to illustrate the impact of a simple change of scale and to reemphasize the importance of the microwave isotropy to theoretical models. A careful analytical formulation of general relativistic redshifts as seen in local Lorentz frames provides the tools for this investigation.
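    The simplest special case of the redshift calculation described in the two records above can be written down in closed form: two observers in radial free fall from rest at spatial infinity in a Schwarzschild field (geometrized units, G = c = 1), exchanging a radially ingoing photon. Each such free-faller measures the photon frequency omega = E_gamma / (1 + sqrt(2m/r)), which remains meaningful inside r = 2m. This is only the radial special case, a hedged illustration rather than the paper's full celestial-sphere spectrum:

```python
import math

def infall_speed(r, m):
    """s = sqrt(2m/r), the parameter governing free fall from rest
    at spatial infinity (Schwarzschild mass m, geometrized units)."""
    return math.sqrt(2.0 * m / r)

def redshift_ingoing(r_emit, r_obs, m):
    """1 + z for a radially ingoing photon exchanged between two such
    free-fallers: each measures omega = E_gamma / (1 + s), hence
    1 + z = omega_emit / omega_obs = (1 + s_obs) / (1 + s_emit)."""
    return (1.0 + infall_speed(r_obs, m)) / (1.0 + infall_speed(r_emit, m))

# An inner observer (r_obs < r_emit) sees the outer emitter redshifted
# (z > 0), even with r_obs inside the horizon r = 2m:
one_plus_z = redshift_ingoing(r_emit=4.0, r_obs=1.0, m=1.0)
```

    The frequency formula follows from contracting the photon's conserved 4-momentum with the infaller's 4-velocity u = ((1 - 2m/r)^{-1}, -sqrt(2m/r), 0, 0); the anisotropy results of the paper arise when non-radial photon directions are admitted.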

  11. A test of unification towards the radio source PKS1413+135

    International Nuclear Information System (INIS)

    Ferreira, M.C.; Julião, M.D.; Martins, C.J.A.P.; Monteiro, A.M.R.V.L.

    2013-01-01

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept, as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.

  12. A test of unification towards the radio source PKS1413+135

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, M.C., E-mail: up200802537@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Julião, M.D., E-mail: meinf12013@fe.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Engenharia, Universidade do Porto, Rua Dr Roberto Frias, 4200-465 Porto (Portugal); Martins, C.J.A.P., E-mail: Carlos.Martins@astro.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Monteiro, A.M.R.V.L., E-mail: mmonteiro@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands)

    2013-07-09

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept, as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.
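    The inversion described in the two records above is linear at leading order: each measured combination Q of the couplings satisfies dQ/Q = k_alpha (dα/α) + k_mu (dμ/μ) + k_g (dg_p/g_p), so three independent combinations determine the three fractional shifts. The sketch below illustrates only the algebra; the sensitivity coefficients and the measured values are made-up placeholders, not the actual combinations probed towards PKS1413+135:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with
    partial pivoting (sufficient for this illustration)."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical sensitivity rows (k_alpha, k_mu, k_g); e.g. the first row
# would correspond to a combination Q = alpha^2 * g_p / mu.
A = [[2.0, -1.0, 1.0],
     [2.0,  0.0, 1.0],
     [1.0,  1.0, 0.0]]
b = [1.0e-6, 2.0e-6, -0.5e-6]   # made-up fractional shifts dQ/Q
da, dmu, dgp = solve3(A, b)     # individual constraints on the couplings
```

    The same structure explains why a single measured combination cannot constrain any coupling individually: the system is underdetermined until enough independent combinations are available.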

  13. ANALYTICAL SOLUTIONS OF SINGULAR ISOTHERMAL QUADRUPOLE LENS

    International Nuclear Information System (INIS)

    Chu Zhe; Lin, W. P.; Yang Xiaofeng

    2013-01-01

    Using an analytical method, we study the singular isothermal quadrupole (SIQ) lens system, the simplest lens model that can produce four images. In this case, the radial mass distribution follows the profile of the singular isothermal sphere lens, and the tangential distribution is given by adding a quadrupole to the monopole component. The basic properties of the SIQ lens are studied in this Letter, including the deflection potential, deflection angle, magnification, critical curve, caustic, pseudo-caustic, and transition locus. Analytical solutions of the image positions and magnifications for a source on the axes are derived. We find that naked cusps appear when the relative intensity k of the quadrupole to the monopole is larger than 0.6. According to the magnification invariant theory of the SIQ lens, the sum of the signed magnifications of the four images should equal unity, as found by Dalal. However, if a source lies in the naked cusp, the summed magnification of the remaining three images is smaller than the invariant value of 1. With this simple lens system, we study the situations where a point source approaches infinitely close to a cusp or a fold. The sum of the magnifications of the cusp image triplet is usually not equal to 0; it is usually positive for major cusps and negative for minor cusps. Similarly, the sum of the magnifications of the fold image pair is usually not equal to 0 either. Nevertheless, the cusp and fold relations still equal 0, because by definition the sums are divided by absolute magnifications that become infinite.
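    The relations the abstract appeals to can be stated compactly in the standard notation for signed magnifications mu_i; these are the usual definitions from strong-lensing theory, consistent with the abstract's point that the raw sums stay finite while the normalized ratios vanish:

```latex
% Magnification invariant for the four-image configurations of the SIQ lens
% (Dalal's relation, as cited in the abstract):
\sum_{i=1}^{4} \mu_i = 1 .

% Near a cusp (image triplet A, B, C) and a fold (image pair 1, 2), the
% asymptotic relations hold for the normalized sums, which tend to zero
% because the denominators diverge even though the numerators do not:
R_{\mathrm{cusp}} = \frac{\mu_A + \mu_B + \mu_C}{|\mu_A| + |\mu_B| + |\mu_C|}
\;\longrightarrow\; 0 ,
\qquad
R_{\mathrm{fold}} = \frac{\mu_1 + \mu_2}{|\mu_1| + |\mu_2|}
\;\longrightarrow\; 0 .
```

    In this notation, the abstract's observation is that the numerators of R_cusp and R_fold are generically nonzero (positive for major cusps, negative for minor cusps), yet both ratios still vanish in the limit.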

  14. New neutron-based isotopic analytical methods; An explorative study of resonance capture and incoherent scattering

    NARCIS (Netherlands)

    Perego, R.C.

    2004-01-01

    Two novel neutron-based analytical techniques have been treated in this thesis, Neutron Resonance Capture Analysis (NRCA), employing a pulsed neutron source, and Neutron Incoherent Scattering (NIS), making use of a cold neutron source. With the NRCA method isotopes are identified by the

  15. Pre-analytical factors influencing the stability of cerebrospinal fluid proteins

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Bahl, Justyna M C; Danborg, Pia B

    2013-01-01

    Cerebrospinal fluid (CSF) is a potential source for new biomarkers due to its proximity to the brain. This study aimed to clarify the stability of the CSF proteome when undergoing pre-analytical factors. We investigated the effects of repeated freeze/thaw cycles, protease inhibitors and delayed s...

  16. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan

    OpenAIRE

    Najat, Dereen

    2017-01-01

    Background Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical ph...

  17. Process analytical chemistry applied to actinide waste streams

    International Nuclear Information System (INIS)

    Day, R.S.

    1994-01-01

    The Department of Energy is being called upon to clean up its legacy of waste from the nuclear complex generated during the cold war period. Los Alamos National Laboratory is actively involved in the waste minimization and waste stream polishing activities associated with this cleanup. The Advanced Testing Line for Actinide Separations (ATLAS) at Los Alamos serves as a developmental test bed for integrating flow sheet development of nitric acid waste streams with process analytical chemistry and process control techniques. The wastes must be processed in glove boxes because of their radioactive components, which adds to the difficulty of making analytical measurements. Process analytical chemistry methods provide real-time chemical analysis in support of existing waste stream operations and enhance the development of new waste stream polishing initiatives. The instrumentation and methods being developed on ATLAS are designed to supply near-real-time analyses of virtually all of the chemical parameters found in nitric acid processing of actinide waste. These measurements supply information on important processing parameters, including actinide oxidation states, free acid concentration, interfering anions and metal impurities.

  18. JRR-3 cold neutron source facility H2-O2 explosion safety proof testing

    International Nuclear Information System (INIS)

    Hibi, T.; Fuse, H.; Takahashi, H.; Akutsu, C.; Kumai, T.; Kawabata, Y.

    1990-01-01

    A Cold Neutron Source (CNS) will be installed in the Japan Research Reactor-3 (JRR-3) at the Japan Atomic Energy Research Institute (JAERI) during its remodeling project. The CNS holds liquid hydrogen at a temperature of about 20 K as a cold neutron moderator in the heavy water region of the reactor, moderating thermal neutrons from the reactor to cold neutrons of about 5 meV energy. In the hydrogen circuit of the CNS, safety measures are taken to prevent the oxygen/hydrogen reaction (H2-O2 explosion). The circuit is also designed so that, should an H2-O2 explosion take place, the soundness of all components is maintained and reactor safety is not compromised. A test hydrogen circuit identical to that of the CNS (real components designed by TECHNICATOME of France) was manufactured for the H2-O2 explosion test. In this test, a detonation, the severest form of the oxygen/hydrogen reaction, was initiated in the test hydrogen circuit to measure the pressure exerted on the components and their strain, deformation, leakage, cracking, etc. Based on these measurements, the structural strength of the test hydrogen circuit was analyzed. The results show that the hydrogen circuit components have sufficient structural strength to withstand an oxygen/hydrogen reaction.

  19. Analytical Modeling of Triple-Metal Hetero-Dielectric DG SON TFET

    Science.gov (United States)

    Mahajan, Aman; Dash, Dinesh Kumar; Banerjee, Pritha; Sarkar, Subir Kumar

    2018-02-01

    In this paper, a 2-D analytical model of a triple-metal hetero-dielectric DG TFET is presented by combining the concepts of triple-material gate engineering and hetero-dielectric engineering. Three metals with different work functions are used as both front- and back-gate electrodes to modulate the barrier at the source/channel and channel/drain interfaces. In addition, the front-gate dielectric consists of high-K HfO2 at the source end and low-K SiO2 at the drain side, whereas the back-gate dielectric is replaced by air to further improve the ON current of the device. The surface potential and electric field of the proposed device are formulated by solving the 2-D Poisson equation with Young's approximation. Based on the resulting electric field expression, the tunneling current is obtained using Kane's model. Several device parameters are varied to examine the behavior of the proposed device. The analytical model is validated against TCAD simulation results, confirming the accuracy of the proposed model.
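    The modeling chain described in the abstract (surface potential, then electric field, then tunneling current) terminates in Kane's band-to-band tunneling model. A commonly quoted, textbook form of that model is sketched below; the prefactor A and exponent B are material-dependent fitting parameters, and the exact field exponent depends on whether tunneling is direct or phonon-assisted, so this should be read as the generic form rather than the paper's specific parameterization:

```latex
% Kane's band-to-band tunneling generation rate (direct-gap form;
% the field exponent 2 is commonly replaced by 5/2 for indirect,
% phonon-assisted tunneling):
G_{\mathrm{BTBT}} = A \, \frac{|E|^{2}}{\sqrt{E_g}}
\exp\!\left( -B \, \frac{E_g^{3/2}}{|E|} \right),

% with the drain current obtained by integrating the generation rate
% over the tunneling volume:
I_{D} = q \int_{V} G_{\mathrm{BTBT}} \, \mathrm{d}V .
```

    This is why the analytical electric-field expression matters: the exponential sensitivity of G_BTBT to |E| means small errors in the field model propagate strongly into the predicted ON current.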

  20. Assessment of the gas dynamic trap mirror facility as intense neutron source for fusion material test irradiations

    International Nuclear Information System (INIS)

    Fischer, U.; Moeslang, A.; Ivanov, A.A.

    2000-01-01

    The gas dynamic trap (GDT) mirror machine has been proposed by the Budker Institute of Nuclear Physics, Novosibirsk, as a volumetric neutron source for fusion material test irradiations. On the basis of the GDT plasma confinement concept, 14 MeV neutrons are generated at high production rates in the two end sections of the axially symmetric central mirror cell, which serve as suitable irradiation test regions. In this paper, we present an assessment of the GDT as an intense neutron source for fusion material test irradiations. This includes comparisons with irradiation conditions in fusion reactor systems (ITER, Demo) and the International Fusion Materials Irradiation Facility (IFMIF), as well as a conceptual design for a helium-cooled tubular test assembly elaborated for the larger of the two test zones, taking proper account of neutronics, thermal-hydraulic and mechanical aspects. This tubular test assembly incorporates ten rigs of about 200 cm length used for inserting instrumented test capsules with miniaturized specimens, taking advantage of the 'small specimen test technology'. The proposed design allows individual temperatures in each of the rigs, and active heating systems inside the capsules ensure specimen temperature stability even during beam-off periods. The major concern is the maximum achievable dose accumulation of less than 15 dpa per full-power year on the basis of the present design parameters of the GDT neutron source. A design upgrade is proposed to allow for higher neutron wall loadings in the material test regions.