WorldWideScience

Sample records for sources specific analytical

  1. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
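
    As a minimal sketch of this kind of pipeline (illustrative only: the file name, term lists and sentence-level proximity window are assumptions, and the project's actual stack also couples Tika with Solr, DeepDive and D3):

        # Parse a PDF with Apache Tika, then flag sentences where a measurement
        # term and a subject term co-occur, mimicking the proximity-based
        # relationship extraction described above. Requires: pip install tika
        # (the Tika server needs a Java runtime).
        import re
        from tika import parser

        MEASUREMENTS = ["spatial resolution", "spectral coverage"]  # assumed terms
        SUBJECTS = ["invasive species", "cutleaf teasel"]           # assumed terms

        text = parser.from_file("example_abstract.pdf")["content"] or ""
        sentences = re.split(r"(?<=[.!?])\s+", text)

        for sent in sentences:
            lowered = sent.lower()
            if any(m in lowered for m in MEASUREMENTS) and any(s in lowered for s in SUBJECTS):
                print(sent.strip())  # candidate measurement-subject relationship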

  2. Compound-specific radiocarbon analysis - Analytical challenges and applications

    Science.gov (United States)

    Mollenhauer, G.; Rethemeyer, J.

    2009-01-01

    Within the last decades, techniques have become available that allow measurement of isotopic compositions of individual organic compounds (compound-specific isotope measurements). Most often the carbon isotopic composition of these compounds is studied, including stable carbon (δ13C) and radiocarbon (Δ14C) measurements. While compound-specific stable carbon isotope measurements are fairly simple, and well-established techniques are widely available, radiocarbon analysis of specific organic compounds is a more challenging method. Analytical challenges include difficulty obtaining adequate quantities of sample, tedious and complicated laboratory separations, the lack of authentic standards for measuring realistic processing blanks, and large uncertainties in values of Δ14C at small sample sizes. The challenges associated with sample preparation for compound-specific Δ14C measurements will be discussed in this contribution. Several years of compound-specific radiocarbon analysis have revealed that in most natural samples, purified organic compounds consist of heterogeneous mixtures of the same compound. These mixtures could derive from multiple sources, each having a different initial reservoir age but mixed in the same terminal reservoir, from a single source but mixed after deposition, or from a prokaryotic organism using variable carbon sources including mobilization of ancient carbon. These processes not only represent challenges to the interpretation of compound-specific radiocarbon data, but provide unique tools for the understanding of biogeochemical and sedimentological processes influencing the preserved organic geochemical records in marine sediments. We will discuss some examples where compound-specific radiocarbon analysis has provided new insights for the understanding of carbon source utilization and carbon cycling.

  3. 21 CFR 864.4020 - Analyte specific reagents.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs, Volume 8, revised as of 2010-04-01. Section 864.4020, Analyte specific reagents. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED); MEDICAL DEVICES; HEMATOLOGY AND PATHOLOGY DEVICES; Specimen Preparation Reagents; § 864.4020 Analyte specific...

  4. Electrospray ion source with reduced analyte electrochemistry

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-08-23

    An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and the counter electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  5. Analytic Approximation to Radiation Fields from Line Source Geometry

    International Nuclear Information System (INIS)

    Michieli, I.

    2000-01-01

    Line sources with slab shields represent a typical source-shield configuration in gamma-ray attenuation problems. Such shielding problems often lead to generalized Secant integrals of a specific form. Besides the numerical integration approach, various expansions and rational approximations with limited applicability are in use for computing the value of such integral functions. Lately, the author developed a rapidly convergent infinite series representation of generalized Secant integrals involving incomplete Gamma functions. Validity of that representation was established for zero and positive values of the integral parameter a (a ≥ 0). In this paper, recurrence relations for generalized Secant integrals are derived, allowing simple approximate analytic calculation of the integral for arbitrary a values. It is demonstrated how the truncated series representation can be used as the basis for such calculations when possibly negative a values are encountered. (author)
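
    For orientation, a common way of writing such integrals (notation assumed here, not taken from the paper) is

        I_a(\theta, b) = \int_0^{\theta} \sec^{a}\varphi \, e^{-b \sec\varphi} \, d\varphi, \qquad 0 \le \theta \le \pi/2, \; b > 0,

    where b is the slab thickness in mean free paths; the case a = 0 recovers the classical Sievert integral of line-source/slab-shield attenuation theory, consistent with the a ≥ 0 validity range mentioned above.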

  6. Light Source Estimation with Analytical Path-tracing

    OpenAIRE

    Kasper, Mike; Keivan, Nima; Sibley, Gabe; Heckman, Christoffer

    2017-01-01

    We present a novel algorithm for light source estimation in scenes reconstructed with an RGB-D camera, based on an analytically-derived formulation of path-tracing. Our algorithm traces the reconstructed scene with a custom path-tracer and computes the analytical derivatives of the light transport equation from principles in optics. These derivatives are then used to perform gradient descent, minimizing the photometric error between one or more captured reference images and renders of our curre...

  7. Pentaho Business Analytics: a Business Intelligence Open Source Alternative

    Directory of Open Access Journals (Sweden)

    Diana TÂRNĂVEANU

    2012-10-01

    Most organizations strive to obtain fast, interactive and insightful analytics in order to support the most effective and profitable decisions. They need to incorporate huge amounts of data in order to run analyses based on queries and reports with collaborative capabilities. The large variety of Business Intelligence solutions on the market makes it very difficult for organizations to select one and to evaluate the impact of the selected solution on the organization. A strategy is therefore needed to help organizations choose the best solution for their investment. In the past, the Business Intelligence (BI) market was dominated by closed source and commercial tools, but in recent years open source solutions have appeared everywhere. An Open Source Business Intelligence solution can be an option in the face of time-sensitive, sprawling requirements and tightening budgets. This paper presents a practical solution implemented in a suite of Open Source Business Intelligence products called Pentaho Business Analytics, which provides data integration, OLAP services, reporting, dashboarding, data mining and ETL capabilities. The study conducted in this paper suggests that the open source phenomenon could become a valid alternative to commercial platforms within the BI context.

  8. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
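
    As a rough illustration of the NORMINV-style calculation (a minimal sketch, not the authors' spreadsheet; the z-limit and example numbers are assumptions): Excel's NORMINV(p, m, sd) corresponds to scipy.stats.norm.ppf(p, loc=m, scale=sd), and the fraction of a Gaussian reference population pushed outside common reference limits by a given combination of normalized bias and imprecision can be computed directly.

        # Fraction of a Gaussian reference distribution falling outside the
        # central 95% reference limits when results carry a normalized
        # analytical bias and imprecision (both in units of the
        # between-individual SD). Illustrative sketch only.
        from scipy.stats import norm

        def fraction_outside(bias, imprecision, z=1.96):
            total_sd = (1.0 + imprecision**2) ** 0.5  # biological + analytical spread
            return norm.cdf(-z, loc=bias, scale=total_sd) + norm.sf(z, loc=bias, scale=total_sd)

        print(fraction_outside(0.0, 0.0))  # 0.05: no analytical error, nominal 5% outside
        print(fraction_outside(0.0, 0.5))  # ~0.08: imprecision alone widens the tails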

  9. An Analytical Method of Auxiliary Sources Solution for Plane Wave Scattering by Impedance Cylinders

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2004-01-01

    Analytical Method of Auxiliary Sources solutions for plane wave scattering by circular impedance cylinders are derived by transformation of the exact eigenfunction series solutions employing the Hankel function wave transformation. The analytical Method of Auxiliary Sources solution thus obtained...

  10. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  11. Setting analytical performance specifications based on outcome studies - is it possible?

    NARCIS (Netherlands)

    Horvath, Andrea Rita; Bossuyt, Patrick M. M.; Sandberg, Sverre; John, Andrew St; Monaghan, Phillip J.; Verhagen-Kamerbeek, Wilma D. J.; Lennartz, Lieselotte; Cobbaert, Christa M.; Ebert, Christoph; Lord, Sarah J.

    2015-01-01

    The 1st Strategic Conference of the European Federation of Clinical Chemistry and Laboratory Medicine proposed a simplified hierarchy for setting analytical performance specifications (APS). The top two levels of the 1999 Stockholm hierarchy, i.e., evaluation of the effect of analytical performance

  12. SPANDOM - source projection analytic nodal discrete ordinates method

    International Nuclear Information System (INIS)

    Kim, Tae Hyeong; Cho, Nam Zin

    1994-01-01

    We describe a new discrete ordinates nodal method for the two-dimensional transport equation. We solve the discrete ordinates equation analytically after the source term is projected onto and represented by polynomials. The method is applied to two fast reactor benchmark problems and compared with the TWOHEX code. The results indicate that the present method accurately predicts not only the multiplication factor but also the flux distribution.

  13. Analytical performance, reference values and decision limits. A need to differentiate between reference intervals and decision limits and to define analytical quality specifications

    DEFF Research Database (Denmark)

    Petersen, Per Hyltoft; Jensen, Esther A; Brandslund, Ivan

    2012-01-01

    ... of the values of analytical components measured on reference samples from reference individuals. Decision limits are based on guidelines from national and international expert groups defining specific concentrations of certain components as limits for decisions about diagnosis or well-defined specific actions. Analytical quality specifications for reference intervals have been defined for bias since the 1990s, but in the recommendations specified in the clinical guidelines analytical quality specifications are only scarcely defined. The demands for negligible biases are, however, even more essential for decision limits, as the choice is no longer left to the clinician, but emerges directly from the concentration. Even a small bias will change the number of diseased individuals, so the demands for negligible biases are obvious. A view over the analytical quality as published gives a variable picture of bias...

  14. Specification of brachytherapy sources

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

    BCRU recommends that the following specification of gamma-ray brachytherapy sources be adopted. Unless otherwise stated, the output of a cylindrical source should be specified in air kerma rate at a point in free space at a distance of 1 m from the source on the radial plane of symmetry, i.e. the plane bisecting the active length and perpendicular to the cylindrical axis of the source. For a wire source the output should be specified for a 1 cm length. For any other construction of source, the point at which the output is specified should be stated. It is also recommended that the units in which the air kerma rate is expressed should be micrograys per hour (μGy/h).

  15. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends and spotting anomalies. Although there are a number of open-source spatial analysis libraries like geopandas and shapely available today, most of them have been restricted to manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces the network overhead, as the complete data need not be replicated onto the user's local system; only a subset of the entire dataset is fetched into memory in a single instance. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API, namely pypyodbc or jaydebeapi, to establish the database connection via
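
    A minimal sketch of that connection pattern using the ibmdbpy base package (the DSN string and table name are illustrative assumptions, not values from the abstract):

        # IdaDataBase wraps the dashDB connection (ODBC via pypyodbc or JDBC
        # via jaydebeapi); an IdaDataFrame is a lazy pointer to a table, so
        # data stays in the database until explicitly fetched.
        from ibmdbpy import IdaDataBase, IdaDataFrame

        idadb = IdaDataBase(dsn="DASHDB")            # named ODBC data source (assumed)
        tab = IdaDataFrame(idadb, "GOSALES.COUNTY")  # assumed schema.table name

        print(tab.shape)    # metadata query executed inside the database
        print(tab.head(5))  # only these few rows are pulled into local memory

        idadb.close()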

  16. Analytical modeling of Schottky tunneling source impact ionization MOSFET with reduced breakdown voltage

    Directory of Open Access Journals (Sweden)

    Sangeeta Singh

    2016-03-01

    In this paper, we investigate a novel Schottky tunneling source impact ionization MOSFET (STS-IMOS) to lower the breakdown voltage of the conventional impact ionization MOS (IMOS) and develop an analytical model for it. In STS-IMOS there is an accumulative effect of both impact ionization and source-induced barrier tunneling. The silicide source offers very low parasitic resistance, the outcome of which is an increment in the voltage drop across the intrinsic region for the same applied bias. This reduces the operating voltage and hence the device exhibits a significant reduction in both breakdown and threshold voltage. STS-IMOS shows high immunity against hot-electron damage, and as a result device reliability increases substantially. The analytical model for the impact ionization current (Iii) is developed based on the integration of the ionization integral (M). Similarly, to obtain the Schottky tunneling current (ITun) expression, the Wentzel-Kramers-Brillouin (WKB) approximation is employed. Analytical models for threshold voltage and subthreshold slope are optimized against Schottky barrier height (ϕB) variation. The expression for the drain current is computed as a function of gate-to-drain bias via an integral expression. It is validated by comparing it with technology computer-aided design (TCAD) simulation results as well. In essence, this analytical framework provides the physical background for better understanding of STS-IMOS and estimation of its performance.
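
    For context, the two ingredients named above are usually built from the following textbook forms (schematic notation, not the paper's exact expressions): the avalanche multiplication factor obtained from the ionization integral, and the WKB tunneling probability through the Schottky barrier,

        M = \frac{1}{1 - \int_0^{W} \alpha(E(x))\,dx}, \qquad \text{breakdown as } \int_0^{W} \alpha\,dx \to 1,

        T_{\mathrm{WKB}} \approx \exp\left( -2 \int_{x_1}^{x_2} \frac{\sqrt{2 m^{*} [U(x) - E]}}{\hbar}\, dx \right),

    where α is the field-dependent impact ionization coefficient, W the width of the high-field region, and U(x) the barrier potential between the classical turning points x₁ and x₂.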

  17. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States); Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States); Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia); White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2016-08-15

    Purpose: A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model reported previously, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms, and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so that variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data

  18. Review and evaluation of spark source mass spectrometry as an analytical method

    International Nuclear Information System (INIS)

    Beske, H.E.

    1981-01-01

    The analytical features and most important fields of application of spark source mass spectrometry are described with respect to the trace analysis of high-purity materials and the multielement analysis of technical alloys, geochemical and cosmochemical, biological and radioactive materials, as well as in environmental analysis. Comparisons are made to other analytical methods. The distribution of the method as well as opportunities for contract analysis are indicated and developmental tendencies discussed. (orig.)

  19. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

    There are two NPPs in operation in the Czech Republic, and both have already implemented EOPs developed in collaboration with WESE. A project on SAMG development has started, following the previous one for EOPs, again with WESE as the leading organization. Plant-specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. Analytical support for plant-specific SAMG development is provided by NRI Rez within the validation process. The basic requirements, and how NRI Rez meets them, concern the analysts, the analytical tools and their applications. A more detailed description is given of the approach to preparing the MELCOR code application for evaluating hydrogen risk, validating the recent set of passive autocatalytic hydrogen recombiners, and defining proposals to amend the hydrogen removal system. Such parametric calculations require a very wide set of runs. This is not feasible with the whole-plant model; the only practical way is to decouple the calculation by storing the mass and energy sources to the containment. An example of this decoupling for a LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission products blowdown through the cold leg break, fluid blowdown through a break in the reactor pressure vessel bottom head, fission products through a break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without fission products taken into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that, although some problematic features appeared. The stand-alone test with fission products was possible only after changes in the source code

  20. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    Science.gov (United States)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. The objectives are to provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; to bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; and to perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine the gaps between needs and sources and between needs and community expertise, document the specific data analytics expertise needed to perform Earth science data analytics, and seek graduate Data Science student internship opportunities in data analytics.

  1. Analytical performance specifications for external quality assessment - definitions and descriptions.

    Science.gov (United States)

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

    External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand, which a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 proposed three models for setting APS, and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) on clear terminology for EQA APS. The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.

  2. Analytic solution of magnetic induction distribution of ideal hollow spherical field sources

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-12-01

    The Halbach-type hollow spherical permanent magnet arrays (HSPMA) are volume-compact, energy-efficient field sources capable of producing a multi-Tesla field in the cavity of the array, which has attracted intense interest for many practical applications. Here, we present analytical solutions of the magnetic induction of the ideal HSPMA in entire space: outside the array, within the cavity of the array, and in the interior of the magnet. We obtain the solutions using the concept of magnetic charge to solve the Poisson and Laplace equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing an optimized one.
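
    In outline, the magnetic-charge formalism mentioned above reduces the magnetostatics to potential theory (textbook form, not the paper's specific boundary-value solution):

        \mathbf{H} = -\nabla \Phi_M, \qquad \nabla^2 \Phi_M = -\rho_m, \qquad \rho_m = -\nabla \cdot \mathbf{M}, \qquad \sigma_m = \mathbf{M} \cdot \hat{\mathbf{n}},

    with Poisson's equation holding inside the magnetized material and Laplace's equation (ρ_m = 0) in the cavity and the exterior, the regions being matched through the surface charge density σ_m on the magnet boundaries.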

  3. Heat-source specification 500 watt(e) RTG

    International Nuclear Information System (INIS)

    1983-02-01

    This specification establishes the requirements for a ⁹⁰SrF₂ heat source and its fuel capsule for application in a 500 W(e) thermoelectric generator. The specification covers: fuel composition and quantity; the Hastelloy S fuel capsule material and fabrication; and the quality assurance requirements for the assembled heat source.

  4. Analytical and semi-analytical formalism for the voltage and the current sources of a superconducting cavity under dynamic detuning

    CERN Document Server

    Doleans, M

    2003-01-01

    Elliptical superconducting radio frequency (SRF) cavities are sensitive to frequency detuning because they have a high Q value in comparison with normal-conducting cavities, and because of their weak mechanical properties. Radiation pressure on the cavity walls, microphonics, and the tuning system are possible sources of dynamic detuning during pulsed operation of SRF cavities. A general analytic relation between the cavity voltage, the dynamic detuning function, and the RF control function is developed. This expression for the voltage envelope in a cavity under dynamic detuning and dynamic RF control is expressed analytically through an integral formulation. A semi-analytical scheme is derived to calculate the voltage behavior in any practical case. Examples of voltage envelope behavior for different cases of dynamic detuning and RF control functions are shown. The RF control function for a cavity under dynamic detuning is also investigated and, as an application, various filling schemes are presented.
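
    A schematic baseband form consistent with this description (notation assumed here: ω_{1/2} = ω/(2Q_L) is the cavity half-bandwidth, Δω(t) the dynamic detuning, V_g(t) the generator/RF control term) is the first-order envelope equation and its integral solution:

        \frac{dV}{dt} + \left[ \omega_{1/2} - i\,\Delta\omega(t) \right] V = \omega_{1/2} V_g(t),

        V(t) = e^{-\int_0^{t} [\omega_{1/2} - i\Delta\omega(\tau)]\,d\tau} \left\{ V(0) + \omega_{1/2} \int_0^{t} V_g(t')\, e^{\int_0^{t'} [\omega_{1/2} - i\Delta\omega(\tau)]\,d\tau}\, dt' \right\},

    the kind of integral formulation the abstract refers to; for constant detuning and drive it reduces to the familiar exponential cavity-filling curve.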

  5. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.

  6. Reconstruction of sound source signal by analytical passive TR in the environment with airflow

    Science.gov (United States)

    Wei, Long; Li, Min; Yang, Debin; Niu, Feng; Zeng, Wu

    2017-03-01

    In the acoustic design of air vehicles, the time-domain signals of noise sources on the surface of air vehicles can serve as supporting data to reveal the noise source generation mechanism, analyze acoustic fatigue, and take measures for noise insulation and reduction. To rapidly reconstruct the time-domain sound source signals in an environment with flow, a method combining the analytical passive time reversal mirror (AP-TR) with a shear flow correction is proposed. In this method, the negative influence of flow on sound wave propagation is suppressed by the shear flow correction, obtaining the corrected acoustic propagation time delay and path. The corrected time delay and path, together with the microphone array signals, are then submitted to the AP-TR, reconstructing more accurate sound source signals in the environment with airflow. As an analytical method, AP-TR offers a supplementary way in 3D space to reconstruct the signal of a sound source in an environment with airflow, instead of the numerical TR. Experiments on the reconstruction of the sound source signals of a pair of loudspeakers are conducted in an anechoic wind tunnel with subsonic airflow to validate the effectiveness and advantages of the proposed method. Moreover, a comparison, both theoretical and experimental, between AP-TR and time-domain beamforming in reconstructing the sound source signal is also discussed.

  7. Subsurface Shielding Source Term Specification Calculation

    International Nuclear Information System (INIS)

    S.Su

    2001-01-01

    The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M and O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M and O 2001. Development and performance of this calculation conforms to the procedure, AP-3.12Q, Calculations

  8. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

    The Halbach-type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-efficient field source, which has attracted intense interest for many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
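
    As a point of reference (a standard result for ideal Halbach cylinders, not a result of this paper), the ideal dipolar Halbach cylinder with remanence B_r and inner and outer radii r_i, r_o produces a uniform interior flux density

        B = B_r \ln\left( \frac{r_o}{r_i} \right),

    which shows why such arrays can exceed the remanence of the magnet material itself once r_o / r_i > e.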

  9. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    Science.gov (United States)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean
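
    To make the decomposition idea concrete, here is a minimal numerical sketch of the costly step that the paper's hypergeometric solutions replace (the σ_z power-law coefficients, wind speed and emission numbers are placeholder assumptions):

        # An area source is split into crosswind strips, each treated as an
        # infinite ground-level line source with the classical Gaussian
        # solution (including ground reflection).
        import numpy as np

        U = 5.0           # mean wind speed, m/s (assumed)
        A, B = 0.08, 0.9  # assumed power law sigma_z(x) = A * x**B

        def sigma_z(x):
            return A * np.power(x, B)

        def line_source_conc(q_line, x):
            """Ground-level concentration (g/m^3) at downwind distance x (m)
            from an infinite crosswind line source of strength q_line (g/m/s)."""
            return np.sqrt(2.0 / np.pi) * q_line / (U * sigma_z(x))

        def area_source_conc(q_area, x_receptor, x_near, x_far, n=200):
            """Area source (q_area in g/m^2/s) spanning [x_near, x_far] along
            the wind axis, decomposed into n crosswind strips."""
            edges = np.linspace(x_near, x_far, n + 1)
            mids = 0.5 * (edges[:-1] + edges[1:])
            widths = np.diff(edges)
            return np.sum(line_source_conc(q_area * widths, x_receptor - mids))

        print(area_source_conc(1e-4, 0.0, -500.0, -10.0))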

  10. The analytical benchmark solution of spatial diffusion kinetics in source driven systems for homogeneous media

    International Nuclear Information System (INIS)

    Oliveira, F.L. de; Maiorino, J.R.; Santos, R.S.

    2007-01-01

    This paper describes a closed-form solution, obtained by the expansion method, for the general time-dependent diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. First, an analytical solution of the one-group model without precursors was obtained, followed by the case of one precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used for such a problem. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was also solved, and the results were intercompared with results obtained by the previous one-group models for a given fast homogeneous medium and different types of source transients. The results are compared with those obtained by numerical methods. (author)
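
    Schematically, for the simplest case treated above (one group, no precursors) the expansion method proceeds as follows (standard notation, assumed rather than copied from the paper). The flux in

        \frac{1}{v} \frac{\partial \phi}{\partial t} = D \nabla^2 \phi - \Sigma_a \phi + S(\mathbf{r}, t)

    is expanded as \phi = \sum_n c_n(t)\, \psi_n(\mathbf{r}) in the spatial eigenfunctions \nabla^2 \psi_n = -B_n^2 \psi_n, and projecting the source onto each mode, S_n(t), decouples the problem into first-order ODEs with closed-form solutions:

        \frac{dc_n}{dt} = -\lambda_n c_n + v\, S_n(t), \qquad \lambda_n = v\,(D B_n^2 + \Sigma_a),

        c_n(t) = c_n(0)\, e^{-\lambda_n t} + v \int_0^{t} e^{-\lambda_n (t - t')}\, S_n(t')\, dt'.

    With delayed precursors each mode instead satisfies a small coupled linear system, whose eigenvalues in the general G-group, R-family case have no explicit formulae, which is why numerical methods are needed there.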

  11. Analytical solution of spatial kinetics of the diffusion model for subcritical homogeneous systems driven by external source

    International Nuclear Information System (INIS)

    Oliveira, Fernando Luiz de

    2008-01-01

    This work describes an analytical solution, obtained by the expansion method, for the spatial kinetics of the diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. An analytical solution of the one-group model without precursors was obtained first, followed by the case of one precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used for such a problem. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was solved and the numerical results of a finite difference code were compared with the exact results for different transients. (author)

  12. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches

  13. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  14. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    Science.gov (United States)

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.
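
    For reference, the equation being solved (standard form in Gaussian units; notation assumed) is the linearized Poisson-Boltzmann equation

        \nabla \cdot \left[ \epsilon(\mathbf{r})\, \nabla \phi(\mathbf{r}) \right] - \bar{\kappa}^2(\mathbf{r})\, \phi(\mathbf{r}) = -4\pi \sum_i q_i\, \delta(\mathbf{r} - \mathbf{r}_i),

    with the fixed charges q_i inside low-dielectric spherical cavities and the screening parameter κ̄ nonzero only in the solvent; PB-AM solves this analytically by expansions about each sphere rather than on a grid.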

  15. Improvement of spatial discretization error on the semi-analytic nodal method using the scattered source subtraction method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro

    2006-01-01

    In this paper, the scattered source subtraction (SSS) method is newly proposed to improve the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. In the SSS method, the scattered source is subtracted from both sides of the diffusion or transport equation to make the spatial variation of the source term small. The same neutron balance equation is still used in the SSS method. Since the SSS method just modifies the coefficients of the node coupling equations (those used to evaluate the response of partial currents), its implementation is easy. The validity of the present method is verified through test calculations carried out in PWR multi-assembly configurations. The calculation results show that the SSS method can significantly improve the spatial discretization error. Since the SSS method has no negative impact on execution time, convergence behavior or memory requirements, it is useful for reducing the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. (author)
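
    A one-group schematic of the manipulation (an illustration of the idea, not the paper's multigroup nodal formulation): writing the within-group diffusion equation with the scattering source on the right-hand side and subtracting it from both sides,

        -D \nabla^2 \phi + \Sigma_t\, \phi = \Sigma_s\, \phi + q \quad \Longrightarrow \quad -D \nabla^2 \phi + (\Sigma_t - \Sigma_s)\, \phi = q,

    so the flat-source approximation now acts only on the external/fission source q, which varies far less across a node than the full source Σ_s φ + q; the neutron balance is unchanged and only the node-coupling coefficients differ.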

  16. Sealed radionuclide sources - new technical specifications and current practice

    Energy Technology Data Exchange (ETDEWEB)

    Brabec, D

    1987-03-01

    Basic technical specifications valid in Czechoslovakia for sealed radionuclide sources, based on international ISO and CMEA standards, are discussed. Described are the standardization of terminology, the relationships between tests, testing methods, types of sealed sources and their applications, and relations to Czechoslovak regulations on radiation protection and to IAEA specifications for radioactive material shipment, etc. The practical impact of introducing the new standards governing sealed sources on the national economy is shown, and the purpose of the various documents issued with sealed sources is explained. (author). 2 figs., 45 refs.

  17. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  18. Minimum analytical quality specifications of inter-laboratory comparisons: agreement among Spanish EQAP organizers.

    Science.gov (United States)

    Ricós, Carmen; Ramón, Francisco; Salas, Angel; Buño, Antonio; Calafell, Rafael; Morancho, Jorge; Gutiérrez-Bassini, Gabriella; Jou, Josep M

    2011-11-18

    Four Spanish scientific societies organizing external quality assessment programs (EQAP) formed a working group to promote the use of common minimum quality specifications for clinical tests. Laboratories that do not meet the minimum specifications are encouraged to review the affected analytical procedure immediately and to implement corrective actions if necessary. The philosophy was to use the 95th percentile of results sent to the EQAP (expressed in terms of percentage deviation from the target value), computed over all results (except outliers) during a one-year cycle. The target value for a number of analytes of the basic biochemistry program was established as the overall mean. However, because of the substantial discrepancies between routine methods for basic hematology, hormones, proteins, therapeutic drugs and tumor markers, the target in these cases was the peer-group mean. The resulting specifications were quite similar to those established in the US (CLIA) and Germany (Richtlinie). The proposed specifications represent the minimum level of quality to be attained by laboratories to assure harmonized service performance. They are distinct from specifications that satisfy clinical requirements, which are the final level of quality to be reached and which our organizations strongly promote by means of documents, courses, symposia and all types of educational activities.

  19. Source specific risk assessment of indoor aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    Koivisto, A.J.

    2013-05-15

    In the urban environment, atmospheric aerosols consist mainly of pollutants from anthropogenic sources, the majority of which originate from traffic and other combustion processes. A fraction of these pollutants penetrates indoors via ventilation. However, indoor air concentrations are usually dominated by indoor sources because of the small amount of dilution air. In modern societies, people spend most of their time indoors; thus, their exposure is controlled mainly by indoor concentrations from indoor sources. During the last decades, the engineering of nanosized structures has created a new field of material science. Some of these materials have been shown to be potentially toxic to human health. The greatest potential for exposure to engineered nanomaterials (ENMs) occurs in the workplace during production and handling of ENMs. In an exposure assessment, both gaseous and particulate matter pollutants need to be considered. The toxicity of particles usually depends on their source and age. With time, particle morphology and composition change owing to their tendency to undergo coagulation, condensation and evaporation. The PM exposure risk is related to source-specific emissions, and thus, in risk assessment, one needs to define source-specific exposures. This thesis describes methods for source-specific risk assessment of airborne particulate matter. It consists of studies related to workers' ENM exposures during the synthesis of nanoparticles, packing of agglomerated TiO₂ nanoparticles, and handling of nanodiamonds. Background particles were distinguished from the ENM concentrations by using different measurement techniques and indoor aerosol modeling. Risk characterization was performed by using source-specific exposure and calculated dose levels in units of particle number and mass. The exposure risk was estimated by using non-health-based occupational exposure limits for ENMs. For nanosized TiO₂, the risk was also assessed from dose

  20. Application of californium-252 neutron sources for analytical chemistry

    International Nuclear Information System (INIS)

    Ishii, Daido

    1976-01-01

    The research carried out on the application of Cf-252 neutron sources to analytical chemistry during the period from 1970 to 1974, and partly 1975, is reviewed. The first part is the introduction. The second part gives a general review of symposia, publications and the like; attention is directed to the ERDA periodical "Californium-252 Progress" and to a study group on Cf-252 utilization held by the Japanese Radioisotope Association in 1974. The third part deals with application to radioactivation analysis. The automated absolute activation analysis (AAAA) at Savannah River is briefly explained, and the joint experiment of the Savannah River operations office with the New Brunswick laboratory is mentioned. A Cf-252 radiation source was used for the non-destructive analysis of elements in river water, and fast neutrons from Cf-252 were used for the quantitative analysis of lead in paints. Many applications to industrial control processes have been reported. Attention is drawn to the application of Cf-252 neutron sources to the field search for natural resources; for example, a logging sonde for searching for uranium resources was developed. The fourth part deals with the application of analysis using gamma rays from neutron capture; for example, a borehole sonde and the process control analysis of sulfur in fuel utilize capture gamma rays. Prompt gamma rays from neutron capture may also be used for the non-destructive analysis of the environment. (Iwakiri, K.)

  1. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    Science.gov (United States)

    Plebani, Mario

    2017-07-01

    An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet no data are available in the literature on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also provide guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. Pulsed voltage electrospray ion source and method for preventing analyte electrolysis

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-12-27

    An electrospray ion source and method of operation include the application of a pulsed voltage to prevent electrolysis of analytes with a low electrochemical potential. The electrospray ion source can include an emitter, a counter electrode, and a power supply. The emitter can include a liquid conduit, a primary working electrode having a liquid-contacting surface, and a spray tip, where the liquid conduit and the working electrode are in liquid communication. The counter electrode can be proximate to, but separated from, the spray tip. The power supply can deliver voltage to the working electrode in the form of a pulse wave, where the pulse wave oscillates between at least an energized voltage and a relaxation voltage. The relaxation duration of the relaxation voltage can range from 1 millisecond to 35 milliseconds. The pulse duration of the energized voltage can be less than 1 millisecond, and the frequency of the pulse wave can range from 30 to 800 Hz.
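
    To make the timing concrete, here is a minimal sketch of such a pulse wave. The voltage levels are invented for illustration; the record specifies only the ranges for pulse duration, relaxation duration and frequency:

```python
import numpy as np

def pulsed_voltage(t_ms, v_energized=4000.0, v_relax=500.0,
                   pulse_ms=0.5, relax_ms=19.5):
    """Idealized emitter voltage vs. time (ms) for the pulse wave
    described in the record. The voltage values are illustrative only;
    the record specifies a pulse duration < 1 ms, a relaxation duration
    of 1-35 ms and a repetition frequency of 30-800 Hz."""
    period = pulse_ms + relax_ms            # 20 ms -> 50 Hz, inside 30-800 Hz
    phase = np.mod(t_ms, period)
    return np.where(phase < pulse_ms, v_energized, v_relax)

t = np.linspace(0.0, 100.0, 10001)          # five periods, 0-100 ms
v = pulsed_voltage(t)
print(f"frequency = {1000.0 / 20.0:.0f} Hz, duty cycle = {0.5 / 20.0:.1%}")
```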

  3. Sources of variability in fatty acid (FA) biomarkers in the application of compound-specific stable isotopes (CSSIs) to soil and sediment fingerprinting and tracing: A review

    Energy Technology Data Exchange (ETDEWEB)

    Reiffarth, D.G., E-mail: Dominic.Reiffarth@unbc.ca [Natural Resources and Environmental Studies Program, University of Northern British Columbia, 3333 University Way, Prince George, BC V2N 4Z9 (Canada); Petticrew, E.L., E-mail: Ellen.Petticrew@unbc.ca [Geography Program and Quesnel River Research Centre, University of Northern British Columbia, 3333 University Way, Prince George, BC V2N 4Z9 (Canada); Owens, P.N., E-mail: Philip.Owens@unbc.ca [Environmental Science Program and Quesnel River Research Centre, University of Northern British Columbia, 3333 University Way, Prince George, BC, V2N 4Z9 (Canada); Lobb, D.A., E-mail: David.Lobb@umanitoba.ca [Watershed Systems Research Program, University of Manitoba, 13 Freedman Crescent, Winnipeg, MB R3T 2N2 (Canada)

    2016-09-15

    Determining soil redistribution and sediment budgets in watersheds is often challenging. One of the methods for making such determinations employs soil and sediment fingerprinting techniques, using sediment properties such as geochemistry, fallout radionuclides, and mineral magnetism. These methods greatly improve the estimation of erosion and deposition within a watershed, but are limited when determining land use-based soil and sediment movement. Recently, compound-specific stable isotopes (CSSIs), which employ fatty acids naturally occurring in the vegetative cover of soils, offer the possibility of refining fingerprinting techniques based on land use, complementing other methods that are currently in use. The CSSI method has been met with some success; however, challenges still remain with respect to scale and resolution due to a potentially large degree of biological, environmental and analytical uncertainty. By better understanding the source of tracers used in CSSI work and the inherent biochemical variability in those tracers, improvement in sample design and tracer selection is possible. Furthermore, an understanding of environmental and analytical factors affecting the CSSI signal will lead to refinement of the approach and the ability to generate more robust data. This review focuses on sources of biological, environmental and analytical variability in applying CSSI to soil and sediment fingerprinting, and presents recommendations based on past work and current research in this area for improving the CSSI technique. A recommendation, based on current information available in the literature, is to use very-long chain saturated fatty acids and to avoid the use of the ubiquitous saturated fatty acids, C{sub 16} and C{sub 18}. - Highlights: • Compound-specific stable isotopes (CSSIs) of carbon may be used as soil tracers. • The variables affecting CSSI data are: biological, environmental and analytical. • Understanding sources of variability will lead

  4. Sources of variability in fatty acid (FA) biomarkers in the application of compound-specific stable isotopes (CSSIs) to soil and sediment fingerprinting and tracing: A review

    International Nuclear Information System (INIS)

    Reiffarth, D.G.; Petticrew, E.L.; Owens, P.N.; Lobb, D.A.

    2016-01-01

    Determining soil redistribution and sediment budgets in watersheds is often challenging. One of the methods for making such determinations employs soil and sediment fingerprinting techniques, using sediment properties such as geochemistry, fallout radionuclides, and mineral magnetism. These methods greatly improve the estimation of erosion and deposition within a watershed, but are limited when determining land use-based soil and sediment movement. Recently, compound-specific stable isotopes (CSSIs), which employ fatty acids naturally occurring in the vegetative cover of soils, offer the possibility of refining fingerprinting techniques based on land use, complementing other methods that are currently in use. The CSSI method has been met with some success; however, challenges still remain with respect to scale and resolution due to a potentially large degree of biological, environmental and analytical uncertainty. By better understanding the source of tracers used in CSSI work and the inherent biochemical variability in those tracers, improvement in sample design and tracer selection is possible. Furthermore, an understanding of environmental and analytical factors affecting the CSSI signal will lead to refinement of the approach and the ability to generate more robust data. This review focuses on sources of biological, environmental and analytical variability in applying CSSI to soil and sediment fingerprinting, and presents recommendations based on past work and current research in this area for improving the CSSI technique. A recommendation, based on current information available in the literature, is to use very-long chain saturated fatty acids and to avoid the use of the ubiquitous saturated fatty acids, C 16 and C 18 . - Highlights: • Compound-specific stable isotopes (CSSIs) of carbon may be used as soil tracers. • The variables affecting CSSI data are: biological, environmental and analytical. • Understanding sources of variability will lead to more

  5. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement with the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test-tube chemistry). Its main field of application at the moment is endocrinology; further possibilities of application lie in pharmaceutical research, environmental protection, forensic medicine, and general analytic work. Radioactive sources are used only in vitro and in the nanocurie range, i.e. radiation exposure is negligible.

  6. An Analytical Study of Prostate-Specific Antigen Dynamics.

    Science.gov (United States)

    Esteban, Ernesto P; Deliz, Giovanni; Rivera-Rodriguez, Jaileen; Laureano, Stephanie M

    2016-01-01

    The purpose of this research is to carry out a quantitative study of prostate-specific antigen (PSA) dynamics for patients with prostatic diseases such as benign prostatic hyperplasia (BPH) and localized prostate cancer (LPC). The proposed PSA mathematical model was implemented using clinical data of 218 Japanese patients with histologically proven BPH and 147 Japanese patients with LPC (stages T2a and T2b). For prostatic diseases (BPH and LPC), a nonlinear equation was obtained and solved in closed form to predict PSA progression with patients' age. The general solution describes PSA dynamics for patients with both diseases, LPC and BPH; particular solutions allow the study of PSA dynamics for patients with BPH or LPC alone. The closed-form analytical solutions have been used to develop nomograms for a better understanding of PSA dynamics in patients with BPH and LPC. This study may be useful for improving the diagnosis and prognosis of prostatic diseases.

  7. Generalized Analytical Treatment Of The Source Strength In The Solution Of The Diffusion Equation

    International Nuclear Information System (INIS)

    Essa, Kh.S.M.; EI-Otaify, M.S.

    2007-01-01

    The source release strength (which is an integral part of the mathematical formulation of the diffusion equation), together with the boundary conditions, leads to three different forms of the diffusion equation. The obtained forms have been solved analytically under different boundary conditions, using transformation of axes, cosine transforms, and Fourier transforms. Three equivalent alternative mathematical formulations of the problem have been obtained. The estimated ground-level concentrations for a ground source have been compared with observed concentration data from SF6 tracer experiments under low-wind, unstable conditions at the IIT Delhi sports ground. Good agreement between estimated and observed concentrations is found.
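
    The record does not reproduce the derived solutions themselves, so purely as a hedged illustration, the sketch below evaluates the standard textbook Gaussian plume form, which shows how a source strength Q enters this family of analytical dispersion solutions; it is not the paper's derivation:

```python
import numpy as np

def ground_level_concentration(Q, u, y, sigma_y, sigma_z):
    """Textbook Gaussian plume concentration at ground level for a
    continuous ground-level point source: Q source strength (g/s),
    u wind speed (m/s), y crosswind offset (m); sigma_y, sigma_z (m)
    are dispersion parameters evaluated at the downwind distance of
    interest. Illustrative only -- not the paper's three solutions."""
    return (Q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2.0 * sigma_y**2)))

# hypothetical numbers: 1 g/s source, 2 m/s wind, plume centreline
print(ground_level_concentration(Q=1.0, u=2.0, y=0.0, sigma_y=8.0, sigma_z=4.0))
```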

  8. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

    We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of emerging trajectory data. It offers data management capability and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  9. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
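
    A minimal sketch of the monthly patient-median check described above, assuming a pandas time series of patient results and a purely illustrative allowable-bias limit (check the biological-variation specification for your own analyte):

```python
import pandas as pd

def monthly_median_bias(results: pd.Series, target: float,
                        allowable_bias_pct: float) -> pd.DataFrame:
    """Flag months whose patient-result median drifts beyond an
    allowable analytical bias derived from biological variation.
    `results` is a datetime-indexed series for one analyte and
    `target` its long-term median."""
    med = results.resample("M").median()
    bias_pct = 100.0 * (med - target) / target
    return pd.DataFrame({"median": med,
                         "bias_pct": bias_pct,
                         "flag": bias_pct.abs() > allowable_bias_pct})

# hypothetical sodium results with a slow upward drift
idx = pd.date_range("2024-01-01", periods=180, freq="D")
sodium = pd.Series(140.0, index=idx) + pd.Series(range(180), index=idx) * 0.005
print(monthly_median_bias(sodium, target=140.0, allowable_bias_pct=0.3))
```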

  10. A two-dimensional transient analytical solution for a ponded ditch drainage system under the influence of source/sink

    Science.gov (United States)

    Sarmah, Ratan; Tiwari, Shubham

    2018-03-01

    An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniformly ponded, steady field at the soil surface, under the influence of a source/sink in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic, with finite extents in the horizontal and vertical directions. The drains are assumed to be vertical and to penetrate down to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked against a purpose-built numerical code and against an existing analytical solution for a simplified case. The study highlights the significance of the source/sink influence on subsurface flow. When the source and sink terms are imposed on the flow domain, the pathlines and travel times of water particles deviate from their original values, and the side and top discharges to the drains are also strongly influenced by the source/sink terms. The travel times and pathlines of water particles are further observed to depend on the height of water in the ditches and on the location of the source/sink activation area.

  11. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  12. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
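
    The following sketch illustrates steps (i) and (ii) in miniature on synthetic data; it is not the PPMI pipeline, and up-sampling the minority class is only one of several possible rebalancing methods:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

# hypothetical stand-in for an imbalanced case-vs-control cohort
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = (rng.random(300) < 0.15).astype(int)        # roughly 15% cases

# (i) rebalance by up-sampling the minority class ...
X_min, X_maj = X[y == 1], X[y == 0]
X_up = resample(X_min, n_samples=len(X_maj), random_state=0)
X_bal = np.vstack([X_maj, X_up])
y_bal = np.array([0] * len(X_maj) + [1] * len(X_up))

# (ii) ... then fit one of many possible classifiers with cross-validation
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X_bal, y_bal, cv=5).mean())
```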

  13. An analytical threshold voltage model for a short-channel dual-metal-gate (DMG) recessed-source/drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Saramekala, G. K.; Santra, Abirmoya; Dubey, Sarvesh; Jit, Satyabrata; Tiwari, Pramod Kumar

    2013-08-01

    In this paper, an analytical short-channel threshold voltage model is presented for a dual-metal-gate (DMG) fully depleted recessed-source/drain (Re-S/D) SOI MOSFET. For the first time, the advantages of the recessed source/drain (Re-S/D) and of the dual-metal-gate structure are incorporated simultaneously in a fully depleted SOI MOSFET. Analytical surface-potential models at the Si-channel/SiO2 interface and the Si-channel/buried-oxide (BOX) interface have been developed by solving the 2-D Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. A threshold voltage model is then derived from the minimum surface potential in the channel. The developed model is analyzed extensively for a variety of device parameters, such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the BOX, and the control-to-screen gate length ratio. The validity of the present 2D analytical model is verified against ATLAS™, a 2D device simulator from SILVACO Inc.

  14. Analytical formulae to calculate the solid angle subtended at an arbitrarily positioned point source by an elliptical radiation detector

    International Nuclear Information System (INIS)

    Abbas, Mahmoud I.; Hammoud, Sami; Ibrahim, Tarek; Sakr, Mohamed

    2015-01-01

    In this article, we introduce a direct analytical mathematical method for calculating the solid angle, Ω, subtended at a point by closed elliptical contours. The solid angle is required in many areas of optical and nuclear physics to estimate the flux of a beam of radiation particles and to determine the activity of a radioactive source. The validity of the derived analytical expressions was successfully confirmed by comparison with published data (numerical method).
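
    The derived expressions themselves are not given in the record, but the geometry is easy to cross-check numerically: the sketch below Monte Carlo integrates dΩ = cosθ dA / r² over an ellipse in the z = 0 plane and compares the on-axis case with the known closed form for a circular disc. All dimensions are assumed values:

```python
import numpy as np

def solid_angle_ellipse(a, b, point, n=200_000, seed=1):
    """Monte Carlo estimate of the solid angle subtended at `point`
    by an ellipse with semi-axes a, b lying in the z = 0 plane and
    centred on the origin, using dOmega = cos(theta) dA / r**2.
    Useful as a numerical cross-check of analytical formulae."""
    rng = np.random.default_rng(seed)
    # uniform sampling inside the ellipse via the unit disc
    u, phi = rng.random(n), rng.random(n) * 2.0 * np.pi
    x = a * np.sqrt(u) * np.cos(phi)
    y = b * np.sqrt(u) * np.sin(phi)
    px, py, pz = point
    r = np.sqrt((x - px)**2 + (y - py)**2 + pz**2)
    return np.pi * a * b * np.mean(np.abs(pz) / r**3)

# on-axis check: for a disc of radius a, Omega = 2*pi*(1 - h/sqrt(h^2 + a^2))
h, a = 5.0, 2.0
print(solid_angle_ellipse(a, a, (0.0, 0.0, h)),
      2.0 * np.pi * (1.0 - h / np.hypot(h, a)))
```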

  15. Shielding Characteristics Using an Ultrasonic Configurable Fan Artificial Noise Source to Generate Modes - Experimental Measurements and Analytical Predictions

    Science.gov (United States)

    Sutliff, Daniel L.; Walker, Bruce E.

    2014-01-01

    An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.

  16. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Schuemann, J; Grassberger, C; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Dowdell, S [Illawarra Shoalhaven Local Health District, Wollongong (Australia)

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), the average range difference (ARD), and the average distal dose degradation (ADD), i.e. the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to be generally around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than the currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone, based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for the more homogeneous patient sites (liver, prostate, whole brain). However, we recommend

  17. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    International Nuclear Information System (INIS)

    Schuemann, J; Grassberger, C; Paganetti, H; Dowdell, S

    2014-01-01

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), the average range difference (ARD), and the average distal dose degradation (ADD), i.e. the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to be generally around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than the currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone, based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for the more homogeneous patient sites (liver, prostate, whole brain). However, we recommend
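
    As an illustration of the range metrics compared above, the sketch below extracts R90 and R50 from a one-dimensional depth-dose profile by linear interpolation on the distal edge; the study itself worked with full two-dimensional distal dose surfaces, and the profile here is a toy Bragg-like curve with invented numbers:

```python
import numpy as np

def distal_range(depth_cm, dose, level=0.90):
    """Distal depth at which the dose falls to `level` of its maximum
    (e.g. R90, R50), found by linear interpolation on the falling edge."""
    dose = np.asarray(dose, dtype=float)
    target = level * dose.max()
    i_max = int(np.argmax(dose))
    for i in range(len(dose) - 1, i_max, -1):      # scan from the deep end
        if dose[i - 1] >= target > dose[i]:
            f = (dose[i - 1] - target) / (dose[i - 1] - dose[i])
            return depth_cm[i - 1] + f * (depth_cm[i] - depth_cm[i - 1])
    return float("nan")

# toy Bragg-like profile (hypothetical numbers)
z = np.linspace(0.0, 20.0, 401)
d = np.exp(-((z - 15.0) / 1.2)**2) + 0.3 * np.exp(-z / 10.0)
print(distal_range(z, d, 0.90), distal_range(z, d, 0.50))   # R90, R50
```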

  18. Waste minimization methods for treating analytical instrumentation effluents at the source

    International Nuclear Information System (INIS)

    Ritter, J.A.; Barnhart, C.

    1995-01-01

    The primary goal of this project was to reduce the amount of hazardous waste being generated by the Savannah River Site Defense Waste Processing Technology-Analytical Laboratory (DWPT-AL). A detailed characterization study was performed on 12 of the liquid effluent streams generated within the DWPT-AL. Two of the streams were not hazardous and are now being collected separately from the 10 hazardous streams. A secondary goal of the project was to develop in-line methods, based primarily on adsorption/ion-exchange columns, to treat liquid effluent as it emerges from the analytical instrument as a slow, dripping flow. Samples from the 10 hazardous streams were treated by adsorption in an experimental apparatus that resembled an in-line or at-source column apparatus. The layered adsorbent bed contained activated carbon and ion-exchange resin. The column technique did not work on the first three samples of the spectroscopy waste stream, but worked well on the next three samples, which were treated in a different column; it was determined that an unusual form of mercury was present in the first three samples. Similarly, two samples of a combined waste stream were rendered non-hazardous, but the last two samples contained acetonitrile that prevented analysis. The characteristics of these streams changed from the initial characterization study; therefore, continual, in-depth stream characterization is the key to making this project successful.

  19. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytics capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community, including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytics workflows but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR . reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  20. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  1. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    International Nuclear Information System (INIS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.; Roberts, Matthew

    2014-01-01

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLEDs) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of the interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that the extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and that the spatial broadening is related to the attenuation of the image-source interference prominence through an averaging effect. The method is applied successfully both to simulated emission patterns and to experimental data, exhibiting very good agreement with the results obtained by numerical techniques. We investigate the method's performance in detail, showing that it is capable of producing accurate estimates for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to the numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimates, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining device performance.

  2. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, Ariel, E-mail: ariel.epstein@utoronto.ca; Tessler, Nir, E-mail: nir@ee.technion.ac.il; Einziger, Pinchas D. [Department of Electrical Engineering, Technion-Israel Institute of Technology, Haifa 32000 (Israel); Roberts, Matthew, E-mail: mroberts@cdtltd.co.uk [Cambridge Display Technology Ltd, Building 2020, Cambourne Business Park, Cambourne, Cambridgeshire CB23 6DW (United Kingdom)

    2014-06-14

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLEDs) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of the interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that the extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and that the spatial broadening is related to the attenuation of the image-source interference prominence through an averaging effect. The method is applied successfully both to simulated emission patterns and to experimental data, exhibiting very good agreement with the results obtained by numerical techniques. We investigate the method's performance in detail, showing that it is capable of producing accurate estimates for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to the numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimates, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining device performance.
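
    As a heavily simplified illustration of the Bragg-condition step, the sketch below inverts 2 n d cosθ = m λ for the mean emitter-cathode distance. It ignores the cathode reflection phase shift that the paper's rigorous derivation includes, and the refractive index, order and angle are assumed values:

```python
import numpy as np

def emitter_cathode_distance(m, wavelength_nm, theta_ext_deg, n_org=1.8):
    """Crude estimate of the mean emitter-to-cathode distance d from
    the external angle of an interference maximum of order m, using
    2 * n * d * cos(theta_int) = m * lambda. Snell's law converts the
    external viewing angle to the internal propagation angle. The
    metal reflection phase shift is deliberately ignored here."""
    theta_int = np.arcsin(np.sin(np.radians(theta_ext_deg)) / n_org)
    return m * wavelength_nm / (2.0 * n_org * np.cos(theta_int))

# hypothetical numbers: first-order maximum at 30 deg, 520 nm emission
print(f"d ~ {emitter_cathode_distance(1, 520.0, 30.0):.0f} nm")
```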

  3. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  4. A comparison of average wages with age-specific wages for assessing indirect productivity losses: analytic simplicity versus analytic precision.

    Science.gov (United States)

    Connolly, Mark P; Tashjian, Cole; Kotsopoulos, Nikolaos; Bhatt, Aomesh; Postma, Maarten J

    2017-07-01

    Numerous approaches are used to estimate indirect productivity losses using various wage estimates applied to poor health in working aged adults. Considering the different wage estimation approaches observed in the published literature, we sought to assess variation in productivity loss estimates when using average wages compared with age-specific wages. Published estimates for average and age-specific wages for combined male/female wages were obtained from the UK Office of National Statistics. A polynomial interpolation was used to convert 5-year age-banded wage data into annual age-specific wages estimates. To compare indirect cost estimates, average wages and age-specific wages were used to project productivity losses at various stages of life based on the human capital approach. Discount rates of 0, 3, and 6 % were applied to projected age-specific and average wage losses. Using average wages was found to overestimate lifetime wages in conditions afflicting those aged 1-27 and 57-67, while underestimating lifetime wages in those aged 27-57. The difference was most significant for children where average wage overestimated wages by 15 % and for 40-year-olds where it underestimated wages by 14 %. Large differences in projecting productivity losses exist when using the average wage applied over a lifetime. Specifically, use of average wages overestimates productivity losses between 8 and 15 % for childhood illnesses. Furthermore, during prime working years, use of average wages will underestimate productivity losses by 14 %. We suggest that to achieve more precise estimates of productivity losses, age-specific wages should become the standard analytic approach.
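
    A stylized sketch of the comparison, assuming a hypothetical concave age-wage curve, a flat average wage derived from it, and the human capital approach with discounting; the numbers are invented for illustration and are not the UK ONS data used in the study:

```python
def lifetime_loss(wages_by_age, onset_age, retirement_age=67, rate=0.03):
    """Present value (at onset) of annual wages lost from onset_age up
    to retirement: the human capital approach with discounting."""
    return sum(wages_by_age[a] / (1.0 + rate) ** (a - onset_age)
               for a in range(onset_age, retirement_age))

# hypothetical concave age-wage profile vs. its flat average
age_specific = {a: 18_000 + 900 * (a - 18) - 14 * (a - 18) ** 2
                for a in range(18, 67)}
flat = dict.fromkeys(age_specific, sum(age_specific.values()) / len(age_specific))

for onset in (18, 30, 50):
    age_based = lifetime_loss(age_specific, onset)
    averaged = lifetime_loss(flat, onset)
    print(f"onset {onset}: average-wage estimate differs by "
          f"{100.0 * (averaged - age_based) / age_based:+.1f}%")
```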

  5. Intense neutron source: high-voltage power supply specifications

    International Nuclear Information System (INIS)

    Riedel, A.A.

    1980-08-01

    This report explains the need for and sets forth the electrical, mechanical and safety specifications for a high-voltage power supply to be used with the intense neutron source. It contains sufficient information for a supplier to bid on such a power supply

  6. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with a focus on the assessment of both the analytical and the pre-analytical sampling variation. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
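
    A minimal sketch of the variance-subtraction idea described above, assuming paired duplicate hair bundles and the standard duplicate formula for the total CV; the concentrations and the 10% analytical CV are invented for illustration:

```python
import numpy as np

def pre_analytical_cv(dup_a, dup_b, cv_analytical):
    """Estimate the pre-analytical (sampling/segmentation) CV from
    genuine duplicate measurements via variance subtraction:
    CV_pre^2 = CV_total^2 - CV_analytical^2, where the total CV comes
    from the duplicate formula CV_total = sqrt(mean((d/m)^2) / 2)."""
    a, b = np.asarray(dup_a, float), np.asarray(dup_b, float)
    rel_diff = (a - b) / ((a + b) / 2.0)
    cv_total = np.sqrt(np.mean(rel_diff**2) / 2.0)
    return np.sqrt(max(cv_total**2 - cv_analytical**2, 0.0)), cv_total

# hypothetical duplicate concentrations (ng/mg) and a 10% analytical CV
a = [0.80, 1.50, 2.10, 0.45, 3.20]
b = [1.10, 1.20, 3.00, 0.60, 2.40]
cv_pre, cv_tot = pre_analytical_cv(a, b, cv_analytical=0.10)
print(f"CV_total = {cv_tot:.0%}, CV_pre = {cv_pre:.0%}")
```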

  7. A new DG nanoscale TFET based on MOSFETs by using source gate electrode: 2D simulation and an analytical potential model

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2017-08-01

    This paper proposes and investigates a double-gate (DG) MOSFET that emulates a tunnel field-effect transistor (M-TFET): through work-function engineering, the double-gate MOSFET is made to behave as a tunneling field-effect transistor. In the proposed structure, in addition to the main gate, we place another gate over the source region, with zero applied voltage and a suitable work function, to convert the source region from N+ to P+. We examine the impact of varying the source-gate work function and the source doping on the device parameters. The simulation results indicate that the M-TFET is well suited to switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; the electric field is then obtained by differentiating the potential profile. To validate the model, the analytical results are compared with the SILVACO ATLAS device simulator.

  8. Kinetic calculations for miniature neutron source reactor using analytical and numerical techniques

    International Nuclear Information System (INIS)

    Ampomah-Amoako, E.

    2008-06-01

    The analytical methods (step change in reactivity and ramp change in reactivity) as well as numerical methods (fixed-point iteration and Runge-Kutta-Gill) were used to simulate the initial build-up of neutrons in a miniature neutron source reactor with and without the temperature feedback effect. The methods were modified to include the photoneutron concentration. PARET 7.3 was used to simulate the transient behaviour of the Ghana Research Reactor-1. The PARET code was capable of simulating the transients for 2.1 mk and 4 mk insertions of reactivity, with peak powers of 49.87 kW and 92.34 kW, respectively. The PARET code, however, failed to simulate the 6.71 mk insertion of reactivity predicted by Akaho et al. through TEMPFED. (au)
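
    For orientation, the sketch below integrates one-group point kinetics for a step insertion of 2.1 mk with no temperature feedback and no photoneutron group, so it is far simpler than the thesis models; the kinetic parameters are generic assumptions, not those of the Ghana Research Reactor-1:

```python
import numpy as np
from scipy.integrate import solve_ivp

# one-group delayed-neutron point kinetics, step reactivity insertion
BETA, LAM, GEN_TIME = 0.0065, 0.08, 1.0e-4   # beta, decay const (1/s), Lambda (s)
RHO = 0.0021                                 # 2.1 mk step insertion

def kinetics(t, y):
    n, c = y                                 # neutron density, precursor conc.
    dn = (RHO - BETA) / GEN_TIME * n + LAM * c
    dc = BETA / GEN_TIME * n - LAM * c
    return [dn, dc]

# start from equilibrium: c0 = beta * n0 / (lambda * Lambda)
n0 = 1.0
c0 = BETA * n0 / (LAM * GEN_TIME)
sol = solve_ivp(kinetics, (0.0, 60.0), [n0, c0], max_step=0.01)
print(f"relative power after 60 s: {sol.y[0, -1]:.1f}")
```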

  9. Evaluation and analytical validation of a handheld digital refractometer for urine specific gravity measurement

    Directory of Open Access Journals (Sweden)

    Sara P. Wyness

    2016-08-01

    Objectives: Refractometers are commonly used to determine urine specific gravity (SG) in the assessment of hydration status and urine specimen validity testing. Few comprehensive performance evaluations are available demonstrating refractometer capability from a clinical laboratory perspective. The objective of this study was therefore to conduct an analytical validation of a handheld digital refractometer used for human urine SG testing. Design and methods: A MISCO Palm Abbe™ refractometer was used for all experiments, including device familiarization, carryover, precision, accuracy, linearity, analytical sensitivity, evaluation of potential substances which contribute to SG (i.e. “interference”), and reference interval evaluation. A manual refractometer, a urine osmometer, and a solute score (the sum of urine chloride, creatinine, glucose, potassium, sodium, total protein, and urea nitrogen, all in mg/dL) were used as comparative methods for accuracy assessment. Results: Significant carryover was not observed; a wash step was still included as good laboratory practice. Low imprecision (%CV <0.01) was demonstrated using low and high QC material. Accuracy studies showed strong correlation with manual refractometry. Linear correlation was also demonstrated between SG, osmolality, and the solute score. Linearity of Palm Abbe performance was verified, with observed error of ≤0.1%. Increases in SG were observed with increasing concentrations of albumin, creatinine, glucose, hemoglobin, sodium chloride, and urea. Transference of a previously published urine SG reference interval of 1.0020–1.0300 was validated. Conclusions: The Palm Abbe digital refractometer was a fast, simple, and accurate way to measure urine SG. Analytical validity was confirmed by the present experiments. Keywords: Specific gravity, Osmolality, Digital refractometry, Hydration, Sports medicine, Urine drug testing, Urine adulteration

  10. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform. With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords and AdSense

  11. Analytic and Unambiguous Phase-Based Algorithm for 3-D Localization of a Single Source with Uniform Circular Array

    Directory of Open Access Journals (Sweden)

    Le Zuo

    2018-02-01

    This paper presents an analytic algorithm for estimating the three-dimensional (3-D) localization of a single source with uniform circular array (UCA) interferometers. Fourier transforms are exploited to expand the phase distribution of a single source, and the localization problem is reformulated as an equivalent spectrum manipulation problem. The 3-D parameters are decoupled into different spectra in the Fourier domain, and algebraic relations are established between the 3-D localization parameters and the Fourier spectra. The Fourier sampling theorem ensures that the minimum number of elements for 3-D localization of a single source with a UCA is five. Accuracy analysis provides mathematical insight into the 3-D localization algorithm, showing that a larger number of elements gives higher estimation accuracy. In addition, the phase-based high-order difference invariance (HODI) property of a UCA is found and exploited to realize phase range compression; following phase range compression, ambiguity resolution is addressed via the HODI of the UCA. A major advantage of the algorithm is that the ambiguity resolution and the 3-D localization estimation are both analytic and processed simultaneously, hence computationally efficient. Numerical simulations and experimental results are provided to verify the effectiveness of the proposed 3-D localization algorithm.
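
    A noise-free sketch of the Fourier idea for the direction part of the problem: for a plane wave, the element phases of a UCA form a single-harmonic sequence whose first DFT coefficient yields the azimuth and, via k·r·sinθ, the elevation. Phase wrapping and the paper's HODI-based ambiguity resolution are ignored here, and all parameters are assumed values:

```python
import numpy as np

N = 8                          # number of UCA elements (>= 5 per the paper)
r, wavelength = 0.05, 0.30     # array radius and wavelength (m), assumed
k = 2.0 * np.pi / wavelength

theta_true, phi_true = np.radians(25.0), np.radians(140.0)  # elevation, azimuth
gamma = 2.0 * np.pi * np.arange(N) / N                      # element angles

# ideal (unwrapped) phase of the incident plane wave at each element
phase = k * r * np.sin(theta_true) * np.cos(gamma - phi_true)

# the first Fourier coefficient of the phase sequence isolates the
# amplitude a = k*r*sin(theta) and the azimuth phi
X1 = np.fft.fft(phase)[1]
a = 2.0 * np.abs(X1) / N
phi_est = (-np.angle(X1)) % (2.0 * np.pi)
theta_est = np.arcsin(a / (k * r))
print(np.degrees(theta_est), np.degrees(phi_est))   # recovers 25, 140 deg
```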

  12. A 2D semi-analytical model for Faraday shield in ICP source

    International Nuclear Information System (INIS)

    Zhang, L.G.; Chen, D.Z.; Li, D.; Liu, K.F.; Li, X.F.; Pan, R.M.; Fan, M.W.

    2016-01-01

    Highlights: • In this paper, a 2D model of an ICP source with a Faraday shield is proposed, taking into account the complex structure of the Faraday shield. • An analytical solution is found to evaluate the electromagnetic field in the ICP source with the Faraday shield. • The collision-free motion of electrons in the source is investigated, and the results show that the electrons oscillate along the radial direction, which brings insight into how the RF power couples to the plasma. - Abstract: A Faraday shield is a thin copper structure with a large number of slits, usually used in inductively coupled plasma (ICP) sources. RF power is coupled into the plasma through these slits; therefore the Faraday shield plays an important role in ICP discharge. However, due to the complex structure of the Faraday shield, the resulting electromagnetic field is quite hard to evaluate. In this paper, a 2D model is proposed on the assumptions that the Faraday shield is sufficiently long, that the RF coil is uniformly distributed, and that the copper can be treated as an ideal conductor. Under these conditions, the magnetic field inside the source is uniform, with only the axial component, while the electric field can be decomposed into a vortex field generated by the changing magnetic field together with a gradient field generated by the electric charge accumulated on the Faraday shield surface, which can easily be found by solving Laplace's equation. The motion of electrons in the electromagnetic field is investigated, and the results show that the electrons oscillate along the radial direction when collisions are neglected. This interesting result brings insight into how the RF power couples into the plasma.

  13. Generalizing Source Geometry of Site Contamination by Simulating and Analyzing Analytical Solution of Three-Dimensional Solute Transport Model

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2014-01-01

    Due to the uneven distribution of pollutants and the blurred edges of polluted areas, uncertainty exists in the source-term shape of advection-diffusion models of contaminant transport. How to generalize such irregular source terms and deal with these uncertainties is critical, but it has rarely been studied in previous research. In this study, the fate and transport of contaminant from rectangular and elliptic source geometries were simulated with a three-dimensional analytical solute transport model, and a source geometry generalization guideline was developed by comparing the migration of the contaminant. The results indicated that variation in the source area size had no effect on pollution plume migration once the plume had migrated as far as five times the source side length. The migration of the pollution plume became slower with increasing aquifer thickness, and the contaminant concentration decreased as the scale factor rose, the differences among scale factors becoming smaller with increasing distance from the field.

  14. Visual Analytics for Heterogeneous Geoscience Data

    Science.gov (United States)

    Pan, Y.; Yu, L.; Zhu, F.; Rilee, M. L.; Kuo, K. S.; Jiang, H.; Yu, H.

    2017-12-01

    Geoscience data obtained from diverse sources have been routinely leveraged by scientists to study various phenomena. The principal data sources include observations and model simulation outputs. These data are characterized by spatiotemporal heterogeneity originating from the different instrument design specifications and/or computational model requirements used in the data generation processes. Such inherent heterogeneity poses several challenges in exploring and analyzing geoscience data. First, scientists often wish to identify features or patterns co-located among multiple data sources to derive and validate certain hypotheses; heterogeneous data make it a tedious task to search for such features in dissimilar datasets. Second, features of geoscience data are typically multivariate, and it is challenging to tackle the high dimensionality of geoscience data and explore the relations among multiple variables in a scalable fashion. Third, there is a lack of transparency in traditional automated approaches, such as feature detection or clustering, in that scientists cannot intuitively interact with their analysis processes and interpret results. To address these issues, we present a new scalable approach that can assist scientists in analyzing voluminous and diverse geoscience data. We expose a high-level query interface that allows users to easily express customized queries to search for features of interest across multiple heterogeneous datasets. For identified features, we develop a visualization interface that enables interactive exploration and analytics in a linked-view manner. Visualization techniques ranging from scatter plots to parallel coordinates are employed in each view to allow users to explore various aspects of the features. The different views are linked and refreshed according to user interactions in any individual view; in this manner, a user can interactively and iteratively gain understanding of the data through a variety of visual analytics operations. We

  15. Two-dimensional analytical solutions for chemical transport in aquifers. Part 1. Simplified solutions for sources with constant concentration. Part 2. Exact solutions for sources with constant flux rate

    International Nuclear Information System (INIS)

    Shan, C.; Javandel, I.

    1996-05-01

    Analytical solutions are developed for modeling solute transport in a vertical section of a homogeneous aquifer. Part 1 of the series presents a simplified analytical solution for cases in which a constant-concentration source is located at the top (or the bottom) of the aquifer. The following transport mechanisms have been considered: advection (in the horizontal direction), transverse dispersion (in the vertical direction), adsorption, and biodegradation. In the simplified solution, however, longitudinal dispersion is assumed to be relatively insignificant with respect to advection, and has been neglected. Example calculations are given to show the movement of the contamination front, the development of concentration profiles, the mass transfer rate, and an application to determine the vertical dispersivity. The analytical solution developed in this study can be a useful tool in designing an appropriate monitoring system and an effective groundwater remediation method
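
    A reduced form of the Part 1 solution is easy to state: with longitudinal dispersion neglected, and adsorption and biodegradation switched off, treating x/v as time turns the governing equation into the one-dimensional heat equation, giving an erfc profile below the constant-concentration source. The sketch below evaluates that reduced form with illustrative parameters; the paper's full solution additionally includes retardation and decay:

```python
import numpy as np
from scipy.special import erfc

def c_over_c0(x, z, v=1.0, Dz=0.01):
    """Steady-state concentration ratio for a constant-concentration
    source along the top of the aquifer (z = 0), with advection at
    seepage velocity v (m/d) in x and transverse dispersion Dz (m^2/d)
    in z. Reduced sketch: no adsorption, no biodegradation."""
    x = np.maximum(np.asarray(x, float), 1e-12)   # avoid division by zero
    return erfc(z / (2.0 * np.sqrt(Dz * x / v)))

# concentration profile 100 m downgradient, at depths 0-5 m (illustrative)
for z in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(z, round(float(c_over_c0(100.0, z)), 3))
```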

  16. Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement.

    Science.gov (United States)

    Paisley, Suzy

    2016-06-01

    This paper proposes recommendations for a minimum level of searching for data for key parameters in decision-analytic models of cost effectiveness and describes sources of evidence relevant to each parameter type. Key parameters are defined as treatment effects, adverse effects, costs, resource use, health state utility values (HSUVs) and baseline risk of events. The recommended minimum requirement for treatment effects is comprehensive searching according to available methodological guidance. For other parameter types, the minimum is the searching of one bibliographic database plus, where appropriate, specialist sources and non-research-based and non-standard format sources. The recommendations draw on the search methods literature and on existing analyses of how evidence is used to support decision-analytic models. They take account of the range of research and non-research-based sources of evidence used in cost-effectiveness models and of the need for efficient searching. Consideration is given to what constitutes best evidence for the different parameter types in terms of design and scientific quality and to making transparent the judgments that underpin the selection of evidence from the options available. Methodological issues are discussed, including the differences between decision-analytic models of cost effectiveness and systematic reviews when searching and selecting evidence and comprehensive versus sufficient searching. Areas are highlighted where further methodological research is required.

  17. Formative assessment and learning analytics

    NARCIS (Netherlands)

    Tempelaar, D.T.; Heck, A.; Cuypers, H.; van der Kooij, H.; van de Vrie, E.; Suthers, D.; Verbert, K.; Duval, E.; Ochoa, X.

    2013-01-01

    Learning analytics seeks to enhance the learning process through systematic measurements of learning related data, and informing learners and teachers of the results of these measurements, so as to support the control of the learning process. Learning analytics has various sources of information,

  18. Determining the analytical specificity of PCR-based assays for the diagnosis of IA: What is Aspergillus?

    Science.gov (United States)

    Morton, C Oliver; White, P Lewis; Barnes, Rosemary A; Klingspor, Lena; Cuenca-Estrella, Manuel; Lagrou, Katrien; Bretagne, Stéphane; Melchers, Willem; Mengoli, Carlo; Caliendo, Angela M; Cogliati, Massimo; Debets-Ossenkopp, Yvette; Gorton, Rebecca; Hagen, Ferry; Halliday, Catriona; Hamal, Petr; Harvey-Wood, Kathleen; Jaton, Katia; Johnson, Gemma; Kidd, Sarah; Lengerova, Martina; Lass-Florl, Cornelia; Linton, Chris; Millon, Laurence; Morrissey, C Orla; Paholcsek, Melinda; Talento, Alida Fe; Ruhnke, Markus; Willinger, Birgit; Donnelly, J Peter; Loeffler, Juergen

    2017-06-01

    A wide array of PCR tests has been developed to aid the diagnosis of invasive aspergillosis (IA), providing technical diversity but limiting standardisation and acceptance. Methodological recommendations for testing blood samples using PCR exist, based on achieving optimal assay sensitivity to help exclude IA. Conversely, when testing more invasive samples (BAL, biopsy, CSF), emphasis is placed on confirming disease, so analytical specificity is paramount. This multicenter study examined the analytical specificity of PCR methods for detecting IA by blind testing a panel of DNA extracted from various fungal species, to explore the range of Aspergillus species that could be detected and also potential cross-reactivity with other fungal species. Positivity rates were calculated and regression analysis was performed to determine any associations between technical specifications and performance. The accuracy of Aspergillus genus-specific assays (71.8%) was significantly greater than that of assays targeting individual Aspergillus species (47.2%). For genus-specific assays the most often missed species were A. lentulus (25.0%), A. versicolor (24.1%), A. terreus (16.1%), A. flavus (15.2%), A. niger (13.4%), and A. fumigatus (6.2%). There was a significant positive association between accuracy and using an Aspergillus genus PCR assay targeting the rRNA genes (P = .0011). Conversely, there was a significant association between rRNA PCR targets and false positivity (P = .0032). To conclude, current Aspergillus PCR assays are better suited for detecting A. fumigatus, with inferior detection of most other Aspergillus species. The use of an Aspergillus genus-specific PCR assay targeting the rRNA genes is preferential. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    Science.gov (United States)

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.
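
    To make the distributed-processing point concrete, here is a minimal map/reduce sketch in Python, with a local process pool standing in for a cluster framework such as Hadoop; the data and field names are invented for illustration.

```python
# Minimal map/reduce sketch: count diagnosis codes in routine-care records,
# with chunks processed in parallel (a stand-in for cluster nodes).
from multiprocessing import Pool
from collections import Counter

def map_events(record_chunk):
    """Map step: tally diagnosis codes within one chunk of records."""
    counts = Counter()
    for record in record_chunk:
        counts[record["diagnosis_code"]] += 1
    return counts

def reduce_counts(partial_counts):
    """Reduce step: merge per-chunk counters into a global tally."""
    total = Counter()
    for c in partial_counts:
        total.update(c)
    return total

if __name__ == "__main__":
    records = [{"diagnosis_code": code} for code in ["I10", "E11", "I10", "J45"]]
    chunks = [records[:2], records[2:]]          # pretend these live on two nodes
    with Pool(2) as pool:
        partials = pool.map(map_events, chunks)  # distributed "map"
    print(reduce_counts(partials))               # centralized "reduce"
```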

  20. An analytical calculation of the peak efficiency for cylindrical sources perpendicular to the detector axis in gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Julio C. [Autoridad Regulatoria Nuclear, Laboratorio de Espectrometria Gamma-CTBTO, Av. Del Libertador 8250, C1429BNP Buenos Aires (Argentina)], E-mail: jaguiar@sede.arn.gov.ar

    2008-08-15

    An analytical expression for the so-called full-energy peak efficiency ε(E) for a cylindrical source whose axis is perpendicular to that of an HPGe detector is derived, using point-source measurements. The formula covers different measuring distances, matrix compositions, densities and gamma-ray energies; the only assumption is that the radioactivity is homogeneously distributed within the source. The term for photon self-attenuation is included in the calculation. Measurements were made using three cylindrical sources of different sizes containing 241Am, 57Co, 137Cs, 54Mn, and 60Co, with corresponding peaks at 59.5, 122, 662, 835, 1173, and 1332 keV, respectively, and one measurement of a radioactive waste drum for 662, 1173, and 1332 keV.
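
    The record does not reproduce the formula itself; a plausible shape for such a calculation, averaging the point-source peak efficiency over the source volume with a self-attenuation term, is sketched below.

```latex
% Hypothetical sketch of the construction described above: average the
% point-source full-energy peak efficiency \varepsilon_p over the source
% volume V, weighting each volume element by photon self-attenuation along
% its in-source path l(\vec{r}):
\varepsilon(E) \;=\; \frac{1}{V}\int_{V}
  \varepsilon_{p}\!\left(E,\vec{r}\,\right)\,
  e^{-\mu(E)\,l(\vec{r}\,)}\;\mathrm{d}V
% \mu(E): linear attenuation coefficient of the source matrix at energy E.
```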

  1. Partial and specific source memory for faces associated to other- and self-relevant negative contexts.

    Science.gov (United States)

    Bell, Raoul; Giang, Trang; Buchner, Axel

    2012-01-01

    Previous research has shown a source memory advantage for faces presented in negative contexts. As yet it remains unclear whether participants remember the specific type of context in which the faces were presented or whether they can only remember that the face was associated with negative valence. In the present study, participants saw faces together with descriptions of two different types of negative behaviour and neutral behaviour. In Experiment 1, we examined whether the participants were able to discriminate between two types of other-relevant negative context information (cheating and disgusting behaviour) in a source memory test. In Experiment 2, we assessed source memory for other-relevant negative (threatening) context information (other-aggressive behaviour) and self-relevant negative context information (self-aggressive behaviour). A multinomial source memory model was used to separately assess partial source memory for the negative valence of the behaviour and specific source memory for the particular type of negative context the face was associated with. In Experiment 1, source memory was specific for the particular type of negative context presented (i.e., cheating or disgusting behaviour). Experiment 2 showed that source memory for other-relevant negative information was more specific than source memory for self-relevant information. Thus, emotional source memory may vary in specificity depending on the degree to which the negative emotional context is perceived as threatening.

  2. Prolonged activated partial thromboplastin time and breed specific variation in haemostatic analytes in healthy adult Bernese Mountain dogs

    DEFF Research Database (Denmark)

    Nielsen, Lise; Wiinberg, Bo; Kjelgaard-Hansen, Mads

    2011-01-01

    Coagulation tests are often performed in dogs suspected of haemostatic dysfunction and are interpreted according to validated laboratory reference intervals (RIs). Breed-specific RIs for haematological and biochemical analytes have previously been identified in Bernese Mountain dogs, but it remains ... to be determined if breed-specific RIs are necessary for haemostasis tests. Activated partial thromboplastin time (aPTT), prothrombin time (PT), selected coagulation factors, D-dimers, fibrinogen, von Willebrand factor and thromboelastography (TEG) were analyzed in healthy Bernese Mountain dogs using the CLSI model...

  3. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The requirement of the ISO/IEC 17025 standard for testing and calibration laboratories to quantify the uncertainty of measurement results, and the fact that this standard is used as a basis for the development and implementation of quality management systems in many laboratories performing nuclear analytical measurements, triggered the demand for specific guidance to cover uncertainty issues of nuclear analytical methods. The demand was recognized by the IAEA and a series of examples was worked out by a group of consultants in 1998. The diversity and complexity of the topics addressed delayed the publication of a technical guidance report, but the exchange of views among the experts was also beneficial and led to numerous improvements and additions with respect to the initial version. This publication is intended to assist scientists using nuclear analytical methods in assessing and quantifying the sources of uncertainty of their measurements. The numerous examples provide a tool for applying the principles elaborated in the GUM and EURACHEM/CITAC publications to their specific fields of interest and for complying with the requirements of current quality management standards for testing and calibration laboratories. It also provides a means for the worldwide harmonization of approaches to uncertainty quantification and thereby contributes to enhanced comparability and competitiveness of nuclear analytical measurements
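
    As background for the principles referred to above, the GUM law of propagation of uncertainty for uncorrelated input quantities is (a standard result, not specific to this publication):

```latex
% Law of propagation of uncertainty (GUM): for y = f(x_1, ..., x_N) with
% uncorrelated input estimates x_i carrying standard uncertainties u(x_i),
u_c^{2}(y) \;=\; \sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^{\!2} u^{2}(x_i),
\qquad U \;=\; k\,u_c(y)
% where U is the expanded uncertainty and k the coverage factor
% (k = 2 for approximately 95% coverage).
```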

  4. Analytical investigation of low temperature lift energy conversion systems with renewable energy source

    International Nuclear Information System (INIS)

    Lee, Hoseong; Hwang, Yunho; Radermacher, Reinhard

    2014-01-01

    The efficiency of renewable-energy-powered energy conversion systems is typically low due to moderate heat source temperatures, so improving their energy efficiency is essential. In this study, the performance of an energy conversion system with a renewable energy source was theoretically investigated in order to explore its design aspects. For this purpose, a computer model of an n-stage low temperature lift energy conversion (LTLEC) system was developed. The results showed that under given operating conditions, such as temperatures and mass flow rates of the heat source and heat sink fluids, the unit power generation of the system increased with the number of stages and became saturated when the number of stages reached four. Investigation of several possible working fluids for the optimum-stage LTLEC system revealed that ethanol could be an alternative to ammonia. Heat exchanger effectiveness is a critical factor in system performance: power generation increased by 7.83% for the evaporator and 9.94% for the condenser with a 10% increase in heat exchanger effectiveness. When such low temperature source fluids are applied to the LTLEC system, heat exchanger performance is critical, and the exchangers must be designed accordingly. - Highlights: •Energy conversion system with renewable energy is analytically investigated. •A model of multi-stage low temperature lift energy conversion systems was developed. •The system performance increases as the stage number is increased. •The unit power generation is increased with increase of HX effectiveness. •Ethanol is found to be a good alternative to ammonia

  5. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-10-01

    This paper presents an analytical threshold voltage model for back-gated fully depleted (FD), recessed-source/drain (Re-S/D) silicon-on-insulator metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed at the front and back surfaces of the channel by solving the two-dimensional (2-D) Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. The strong-inversion criterion is applied to the front surface potential as well as the back one in order to find two separate threshold voltages for the front and back channels of the device, respectively. The device threshold voltage is taken to be that of the surface offering the lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters, such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2-D device simulator from SILVACO.
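
    A sketch of the standard parabolic-potential approach the abstract describes, in generic notation (coefficient details vary with the specific device model):

```latex
% 2-D Poisson equation in the fully depleted channel (uniform doping N_A):
\frac{\partial^{2}\phi(x,y)}{\partial x^{2}}
 + \frac{\partial^{2}\phi(x,y)}{\partial y^{2}}
 = \frac{q\,N_A}{\varepsilon_{\mathrm{Si}}},
% with the transverse profile assumed parabolic,
\qquad
\phi(x,y) \;\approx\; \phi_{f}(x) + c_{1}(x)\,y + c_{2}(x)\,y^{2}
% where \phi_f is the front-surface potential and c_1, c_2 follow from the
% front/back-gate oxide boundary conditions; the threshold voltage is then
% read off from the strong-inversion criterion applied at whichever surface
% inverts first.
```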

  6. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    Science.gov (United States)

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples, respectively, as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than the other films. However, within subjects, lesbian women showed significantly different arousal responses suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.
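
    A hedged illustration of the methodological contrast described above, using synthetic data; scipy's UnivariateSpline stands in for the study's smoothing regression splines, and all parameter choices are hypothetical.

```python
# Contrast a collapsed-average analysis with a smoothing-spline fit of one
# subject's arousal time course (synthetic data, illustrative only).
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 180, 361)                  # seconds into the erotic film
arousal = np.sin(t / 30) + 0.3 * rng.standard_normal(t.size)

# Traditional approach: one number per film, time structure discarded.
mean_response = arousal.mean()

# Time-course approach: smoothing regression spline; s controls the
# smoothing penalty.
spline = UnivariateSpline(t, arousal, s=len(t) * 0.1)
fitted = spline(t)

print(f"collapsed mean: {mean_response:.3f}")
print(f"spline range:   {fitted.min():.3f} to {fitted.max():.3f}")
```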

  7. Analytical performance specifications for changes in assay bias (Δbias) for data with logarithmic distributions as assessed by effects on reference change values

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2016-01-01

    BACKGROUND: The distributions of within-subject biological variation are usually described as coefficients of variation, as are analytical performance specifications for bias, imprecision and other characteristics. Estimation of specifications required for reference change values is traditionally ... done using the relationship between the batch-related changes during routine performance, described as Δbias, and the coefficients of variation for analytical imprecision (CVA): the original theory is based on standard deviations or coefficients of variation calculated as if distributions were Gaussian. ... METHODS: The distribution of between-subject biological variation can generally be described as log-Gaussian. Moreover, recent analyses of within-subject biological variation suggest that many measurands have log-Gaussian distributions. In consequence, we generated a model for the estimation of analytical...
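
    For context, the textbook reference change value (RCV) formulas underlying this discussion are shown below; the log-Gaussian version is one common form and may differ in detail from the model the paper derives.

```latex
% Gaussian case (two-sided probability factor Z, e.g. Z = 1.96 for 95%):
RCV \;=\; \sqrt{2}\,Z\,\sqrt{CV_A^{2} + CV_I^{2}}
% Log-Gaussian case: the change limits become asymmetric,
RCV_{\pm} \;=\; \exp\!\left(\pm\sqrt{2}\,Z\,\sigma\right) - 1,
\qquad \sigma^{2} \;=\; \ln\!\left(CV_A^{2} + CV_I^{2} + 1\right)
% with CV_A, CV_I the analytical and within-subject coefficients of
% variation expressed as fractions.
```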

  8. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites around the world where the environment is still affected by contamination from past uranium production. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of the safety assessment procedures and priority-action analyses needed for remediation planning. In recent decades, the analytical laboratories of many of the enterprises currently responsible for establishing site-specific environmental monitoring programs have significantly improved their technical sampling and analytical capacities. However, limited experience in planning optimal site-specific sampling strategies, and in applying the required analytical techniques (modern alpha-beta radiometry, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment), does not allow these laboratories to develop and conduct the monitoring programs efficiently as a basis for further safety assessment in decision-making procedures. This paper presents conclusions gained from the experience of establishing monitoring programs in Ukraine and proposes practical steps for optimizing sampling strategy planning and the analytical procedures to be applied to areas requiring safety assessment and justification of their potential remediation and safe management. (authors)

  9. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and run these models in real time. The geospatial data platform Physical Analytics Integrated Repository and Services (PAIRS) is developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and regridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that combines location and time stamp. The indexing allows quick access to data sets that are part of global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
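
    A hypothetical sketch of the combined location/time indexing idea described above (not IBM PAIRS' actual scheme):

```python
# Nested grids whose resolution doubles per layer, with a combined
# location + timestamp key per pixel. All names are illustrative.
from datetime import datetime, timezone

def grid_cell(lat: float, lon: float, layer: int) -> tuple[int, int]:
    """Cell coordinates on layer `layer`; resolution doubles per layer."""
    cells = 2 ** layer                       # cells per 180 deg of latitude
    row = int((lat + 90.0) / 180.0 * cells)
    col = int((lon + 180.0) / 360.0 * (2 * cells))
    return row, col

def pixel_key(lat: float, lon: float, layer: int, ts: datetime) -> str:
    """Combined location + timestamp index, as the abstract describes."""
    row, col = grid_cell(lat, lon, layer)
    return f"L{layer:02d}:R{row}:C{col}:{int(ts.timestamp())}"

print(pixel_key(41.2, -73.8, 12, datetime(2015, 7, 1, tzinfo=timezone.utc)))
```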

  10. Temperature field due to time-dependent heat sources in a large rectangular grid - Derivation of analytical solution

    International Nuclear Information System (INIS)

    Claesson, J.; Probert, T.

    1996-01-01

    The temperature field in rock due to a large rectangular grid of heat-releasing canisters containing nuclear waste is studied. By superposition, the solution is divided into different parts. There is a global temperature field due to the large rectangular canister area, while a local field accounts for the remaining heat-source problem. The global field is reduced to a single integral. The local field is also solved analytically using solutions for a finite line heat source and for an infinite grid of point sources. The local solution is reduced to three parts, each of which depends on two spatial coordinates only. The temperatures at the envelope of a canister are given by a single thermal resistance, which is given by an explicit formula. The results are illustrated by a few numerical examples dealing with the KBS-3 concept for storage of nuclear waste. 8 refs
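
    A standard building block for such superpositions, stated here for orientation (a textbook result; the paper's own expressions are more elaborate): the temperature rise from a continuous point source in an infinite medium is

```latex
% Continuous point source of strength Q in an infinite medium with thermal
% conductivity \lambda and diffusivity a, at distance r and time t:
\Delta T(r,t) \;=\; \frac{Q}{4\pi\lambda r}\,
\operatorname{erfc}\!\left(\frac{r}{2\sqrt{a t}}\right)
% Line-source and grid solutions follow by integrating/summing this kernel.
```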

  11. Optimizing RDF Data Cubes for Efficient Processing of Analytical Queries

    DEFF Research Database (Denmark)

    Jakobsen, Kim Ahlstrøm; Andersen, Alex B.; Hose, Katja

    2015-01-01

    data warehouses and data cubes. Today, external data sources are essential for analytics and, as the Semantic Web gains popularity, more and more external sources are available in native RDF. With the recent SPARQL 1.1 standard, performing analytical queries over RDF data sources has finally become...
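
    A minimal sketch of an analytical (aggregate) SPARQL 1.1 query over RDF, using Python's rdflib; the vocabulary and data are invented, and the paper's actual cube layout will differ.

```python
# Build a tiny RDF "cube" of observations, then run a SPARQL 1.1
# aggregate query (GROUP BY / SUM) over it.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/cube#")
g = Graph()
for i, (store, amount) in enumerate([("s1", 10), ("s1", 20), ("s2", 5)]):
    obs = URIRef(f"http://example.org/obs/{i}")
    g.add((obs, EX.store, Literal(store)))
    g.add((obs, EX.sales, Literal(amount, datatype=XSD.integer)))

results = g.query("""
    PREFIX ex: <http://example.org/cube#>
    SELECT ?store (SUM(?sales) AS ?total)
    WHERE { ?obs ex:store ?store ; ex:sales ?sales . }
    GROUP BY ?store
""")
for row in results:
    print(row.store, row.total)
```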

  12. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady-state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of the analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.

  13. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that ''quality is achieved and maintained by those who have been assigned the responsibility for performing the work.'' The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be accomplished using several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA or commercial sources. Discussion of project-specific PE materials and evaluations will be described in section 9.0 and Appendix A

  14. Analytical description of photon beam phase spaces in inverse Compton scattering sources

    Directory of Open Access Journals (Sweden)

    C. Curatolo

    2017-08-01

    Full Text Available We revisit the description of inverse Compton scattering sources and the photon beams generated therein, emphasizing the behavior of their phase space density distributions and how they depend upon those of the two colliding beams of electrons and photons. The main objective is to provide practical formulas for bandwidth, spectral density, and brilliance which are valid in general for any value of the recoil factor, i.e. both in the Thomson regime of negligible electron recoil and in the deep recoil-dominated Compton region, which is of interest for gamma-gamma colliders and Compton sources for the production of multi-GeV photon beams. We adopt a description based on the center-of-mass reference system of the electron-photon collision, in order to underline the role of the electron recoil and how it controls the relativistic Doppler/boost effect in various regimes. Using the center-of-mass reference frame greatly simplifies the treatment, allowing us to derive simple formulas expressed in terms of the rms momenta of the two colliding beams (emittance, energy spread, etc.) and the collimation angle in the laboratory system. Comparisons with Monte Carlo simulations of inverse Compton scattering in various scenarios are presented, showing very good agreement with the analytical formulas: in particular, we find that the bandwidth dependence on the electron beam emittance, of paramount importance in the Thomson regime, as it limits the amount of focusing imparted to the electron beam, becomes much less sensitive in the deep Compton regime, allowing stronger focusing of the electron beam to enhance luminosity without loss of mono-chromaticity. A similar effect occurs concerning the bandwidth dependence on the frequency spread of the incident photons: in the deep recoil regime the bandwidth turns out to be much less dependent on the frequency spread. The set of formulas derived here is very helpful in designing inverse Compton sources in diverse regimes, giving a
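
    For orientation, the standard inverse-Compton kinematic relation (a textbook result, not one of the paper's new formulas):

```latex
% Scattered photon energy for electron Lorentz factor \gamma, incident
% photon energy E_L, observation angle \theta, recoil parameter X:
E_{\gamma} \;\approx\; \frac{4\gamma^{2} E_L}{1 + X + \gamma^{2}\theta^{2}},
\qquad X \;=\; \frac{4\gamma E_L}{m_e c^{2}}
% X << 1 is the Thomson regime; X of order unity or larger is the deep
% recoil (Compton) regime discussed above.
```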

  15. Využití Google Analytics v eshopu [Use of Google Analytics in an e-shop]

    OpenAIRE

    Zahradník, Jan

    2012-01-01

    The present thesis focuses on a specific e-shop operating mainly within the Czech market and the ways it uses one of the most significant tools of web analysis -- Google Analytics. The aim of the thesis is an analysis of the key areas, i.e. visitor analysis, visitor sourcing analysis and content analysis. The problematic areas are based on these as well as recommendations and suggestions that should help, once these have been applied, improve the service quality leading to increased revenue a...

  16. DCODE: A Distributed Column-Oriented Database Engine for Big Data Analytics

    OpenAIRE

    Liu, Yanchen; Cao, Fang; Mortazavi, Masood; Chen, Mengmeng; Yan, Ning; Ku, Chi; Adnaik, Aniket; Morgan, Stephen; Shi, Guangyu; Wang, Yuhu; Fang, Fan

    2015-01-01

    We propose a novel Distributed Column-Oriented Database Engine (DCODE) for efficient analytic query processing that combines advantages of both column storage and parallel processing. In DCODE, we enhance an existing open-source columnar database engine by adding the capability for handling queries over a cluster. Specifically, we studied parallel query execution and optimization techniques such as horizontal partitioning, exchange op...

  17. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
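
    For reference, the two-parameter Box-Cox power transformation commonly used in such parametric derivations is shown below in a standard form; the study's specific modification may differ.

```latex
% Two-parameter Box-Cox transformation (standard form):
z \;=\;
\begin{cases}
\dfrac{(x + c)^{\lambda} - 1}{\lambda}, & \lambda \neq 0,\\[1.5ex]
\ln(x + c), & \lambda = 0,
\end{cases}
% \lambda and c are fitted so that z is approximately Gaussian; reference
% limits are taken as the transformed-scale mean \pm 1.96 SD and then
% back-transformed to the original scale.
```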

  18. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    Energy Technology Data Exchange (ETDEWEB)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J [Pusan National University Yangsan Hospital, Yangsan, Gyeongsangnam-do (Korea, Republic of); Kim, J; Kim, H [Pusan National University, Busan (Korea, Republic of); Cho, M; Yun, S [Samsung electronics Co., Suwon, Gyeonggi-do (Korea, Republic of); Park, D; Kim, W; Ki, Y; Kim, D [Pusan National University Hospital, Busan (Korea, Republic of)

    2016-06-15

    Purpose: To investigate the feasibility of an analytic framework to estimate patients’ absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. We then calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer’s law. In sequence, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicates that our framework can be an effective tool for estimating a patient’s exposure owing to cone-beam CT scans for image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by use of our framework.
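
    An illustrative sketch (hypothetical, not the authors' code) of the Beer's-law primary-dose step described above:

```python
# Attenuate a ray through a row of voxels via Beer's law and deposit the
# removed energy locally. For simplicity every attenuated photon is treated
# as absorbed on the spot; in the framework above, the scattered part is
# re-emitted from each voxel in the second (scattered-dose) step.
import numpy as np

mu = np.array([0.02, 0.02, 0.15, 0.02])  # linear attenuation per voxel (1/mm)
dl = 2.0                                 # ray path length per voxel (mm)
I = 1.0                                  # incident fluence (arbitrary units)

primary_dose = np.zeros_like(mu)
for i, mu_i in enumerate(mu):
    removed = I * (1.0 - np.exp(-mu_i * dl))  # Beer's law over one voxel
    primary_dose[i] = removed                  # deposited locally (simplification)
    I -= removed                               # fluence entering the next voxel

print("per-voxel primary dose:", primary_dose, "exit fluence:", I)
```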

  19. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    International Nuclear Information System (INIS)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J; Kim, J; Kim, H; Cho, M; Yun, S; Park, D; Kim, W; Ki, Y; Kim, D

    2016-01-01

    Purpose: To investigate the feasibility of an analytic framework to estimate patients’ absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. We then calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer’s law. In sequence, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicates that our framework can be an effective tool for estimating a patient’s exposure owing to cone-beam CT scans for image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by use of our framework.

  20. Medial temporal lobe reinstatement of content-specific details predicts source memory

    Science.gov (United States)

    Liang, Jackson C.; Preston, Alison R.

    2016-01-01

    Leading theories propose that when remembering past events, medial temporal lobe (MTL) structures reinstate the neural patterns that were active when those events were initially encoded. Accurate reinstatement is hypothesized to support detailed recollection of memories, including their source. While several studies have linked cortical reinstatement to successful retrieval, indexing reinstatement within the MTL network and its relationship to memory performance has proved challenging. Here, we addressed this gap in knowledge by having participants perform an incidental encoding task, during which they visualized people, places, and objects in response to adjective cues. During a surprise memory test, participants saw studied and novel adjectives and indicated the imagery task they performed for each adjective. A multivariate pattern classifier was trained to discriminate the imagery tasks based on functional magnetic resonance imaging (fMRI) responses from hippocampus and MTL cortex at encoding. The classifier was then tested on MTL patterns during the source memory task. We found that MTL encoding patterns were reinstated during successful source retrieval. Moreover, when participants made source misattributions, errors were predicted by reinstatement of incorrect source content in MTL cortex. We further observed a gradient of content-specific reinstatement along the anterior-posterior axis of hippocampus and MTL cortex. Within anterior hippocampus, we found that reinstatement of person content was related to source memory accuracy, whereas reinstatement of place information across the entire hippocampal axis predicted correct source judgments. Content-specific reinstatement was also graded across MTL cortex, with PRc patterns evincing reactivation of people, and more posterior regions, including PHc, showing evidence for reinstatement of places and objects. Collectively, these findings provide key evidence that source recollection relies on reinstatement of past

  1. Source-specific pollution exposure and associations with pulmonary response in the Atlanta Commuters Exposure Studies.

    Science.gov (United States)

    Krall, Jenna R; Ladva, Chandresh N; Russell, Armistead G; Golan, Rachel; Peng, Xing; Shi, Guoliang; Greenwald, Roby; Raysoni, Amit U; Waller, Lance A; Sarnat, Jeremy A

    2018-01-03

    Concentrations of traffic-related air pollutants are frequently higher within commuting vehicles than in ambient air. Pollutants found within vehicles may include those generated by tailpipe exhaust, brake wear, and road dust sources, as well as pollutants from in-cabin sources. Source-specific pollution, compared to total pollution, may represent regulation targets that can better protect human health. We estimated source-specific pollution exposures and corresponding pulmonary response in a panel study of commuters. We used constrained positive matrix factorization to estimate source-specific pollution factors and, subsequently, mixed effects models to estimate associations between source-specific pollution and pulmonary response. We identified four pollution factors that we named: crustal, primary tailpipe traffic, non-tailpipe traffic, and secondary. Among asthmatic subjects (N = 48), interquartile range increases in crustal and secondary pollution were associated with changes in lung function of -1.33% (95% confidence interval (CI): -2.45, -0.22) and -2.19% (95% CI: -3.46, -0.92) relative to baseline, respectively. Among non-asthmatic subjects (N = 51), non-tailpipe pollution was associated with pulmonary response only at 2.5 h post-commute. We found no significant associations between pulmonary response and primary tailpipe pollution. Health effects associated with traffic-related pollution may vary by source, and therefore some traffic pollution sources may require targeted interventions to protect health.
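
    A hedged illustration of the factor-analytic step: scikit-learn's NMF stands in for the constrained positive matrix factorization (PMF) the study used (true PMF additionally weights residuals by measurement uncertainty); the data here are synthetic.

```python
# Decompose a samples x species concentration matrix into nonnegative
# factor contributions (G) and factor profiles (F), the core idea behind
# PMF-style source apportionment.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Rows: commute samples; columns: measured species (e.g., EC, OC, Fe, Cu...).
X = rng.gamma(shape=2.0, scale=1.0, size=(60, 8))

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # factor contributions per sample (60 x 4)
F = model.components_        # factor profiles over species (4 x 8)

# Factors would then be named (crustal, tailpipe, ...) by inspecting F, and
# the columns of G used as source-specific exposures in mixed-effects models.
print(G.shape, F.shape)
```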

  2. Reassessment of the technical bases for estimating source terms. Draft report for comment

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Pasedag, W.F.; Ryder, C.P.; Peabody, C.A.; Jankowski, M.W.

    1985-07-01

    NUREG-0956 describes the NRC staff and contractor efforts to reassess and update the agency's analytical procedures for estimating accident source terms for nuclear power plants. The effort included development of a new source term analytical procedure - a set of computer codes - that is intended to replace the methodology of the Reactor Safety Study (WASH-1400) and to be used in reassessing the use of TID-14844 assumptions (10 CFR 100). NUREG-0956 describes the development of these codes, the demonstration of the codes to calculate source terms for specific cases, the peer review of this work, some perspectives on the overall impact of new source terms on plant risks, the plans for related research projects, and the conclusions and recommendations resulting from the effort

  3. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    Full Text Available For 50 years, philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source for surpassing Cartesian epistemology and for developing virtue ethics. Recently, J. Haldane has inaugurated a program of “analytical thomism” whose main result to date has been his “theory of mind/world identity”. Nevertheless, none of Thomas's admirers has yet found the means of assimilating his metaphysics of being.

  4. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).

  5. Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital.

    Science.gov (United States)

    Campion, Michael C; Ployhart, Robert E; Campion, Michael A

    2017-05-01

    [Correction Notice: An Erratum for this article was reported in Vol 102(5) of Journal of Applied Psychology (see record 2017-14296-001). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected.] This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Mobility and Sector-specific Effects of Changes in Multiple Sources ...

    African Journals Online (AJOL)

    Using the second and third Cameroon household consumption surveys, this study examined mobility and sector-specific effects of changes in multiple sources of deprivation in Cameroon. Results indicated that between 2001 and 2007, deprivations associated with human capital and labour capital reduced, while ...

  7. FEASIBILITY OF INVESTMENT IN BUSINESS ANALYTICS

    Directory of Open Access Journals (Sweden)

    Mladen Varga

    2007-12-01

    Full Text Available Trends in data processing for decision support show that business users need business analytics, i.e. analytical applications which incorporate a variety of business-oriented data analysis techniques and task-specific knowledge. The paper discusses the feasibility of investment in two models of implementing business analytics: custom development and packaged analytical applications. The consequences of both models are illustrated by two examples of business analytics implementation in Croatia.

  8. A Model To Estimate the Sources of Tobacco-Specific Nitrosamines in Cigarette Smoke.

    Science.gov (United States)

    Lipowicz, Peter J; Seeman, Jeffrey I

    2017-08-21

    Tobacco-specific nitrosamines (TSNAs) are one of the most extensively and continually studied classes of compounds found in tobacco and cigarette smoke [1-5]. The TSNAs N-nitrosonornicotine (NNN) and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) have been characterized by the US Food and Drug Administration (FDA) as harmful and potentially harmful constituents in tobacco products [6], and cigarette manufacturers report their levels in cigarette tobacco filler and cigarette smoke to the FDA. NNN and NNK are classified by IARC as carcinogenic to humans [7]. TSNAs transfer from tobacco to smoke by evaporation driven by heat and the flow of gases down the cigarette rod. Other TSNA sources in smoke include pyrorelease, where room-temperature-unextractable TSNAs are released by smoking, and pyrosynthesis, where TSNAs are formed by reactions during smoking. We propose the first model that quantifies these three sources of TSNA in smoke. In our model, the evaporative transfer efficiency of a TSNA is equated to the evaporative transfer efficiency of nicotine. Smoke TSNA measured in excess of what is transferred by evaporation is termed "pyrogeneration," which is the net sum of pyrorelease and pyrosynthesis minus pyrodegradation. This model requires no internal standard, is applicable to commercial cigarettes "as is," and uses existing analytical methods. The model was applied to archived Philip Morris USA data. For commercial blended cigarettes, NNN pyrogeneration appears to be unimportant, but NNK pyrogeneration contributes roughly 30-70% of NNK in smoke, with the greater contribution at lower tobacco NNK levels. This means there is an opportunity to reduce smoke NNK by up to 70% if pyrogeneration can be decreased or eliminated, perhaps by finding a way to grow and cure tobacco with reduced matrix-bound NNK. For burley research cigarettes, pyrogeneration may account for 90% or more of both NNN and NNK in smoke.
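
    A compact restatement of the model as described above, in generic notation (not the authors' own):

```latex
% With T = level in tobacco filler and S = yield in smoke, the evaporative
% transfer efficiency is equated to that of nicotine:
E \;=\; \frac{S_{\mathrm{nicotine}}}{T_{\mathrm{nicotine}}},
\qquad
\mathrm{pyrogeneration} \;=\; S_{\mathrm{TSNA}} \;-\; E\,T_{\mathrm{TSNA}}
% i.e., smoke TSNA in excess of evaporative transfer; a negative value
% would indicate net pyrodegradation.
```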

  9. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification.

  10. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    International Nuclear Information System (INIS)

    1995-01-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification

  11. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  12. An analytical approach for a nodal formulation of a two-dimensional fixed-source neutron transport problem in heterogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Basso Barichello, Liliane; Dias da Cunha, Rudnei [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Inst. de Matematica; Becker Picoloto, Camila [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Tres, Anderson [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Matematica Aplicada

    2015-05-15

    A nodal formulation of a fixed-source two-dimensional neutron transport problem, in Cartesian geometry, defined in a heterogeneous medium, is solved by an analytical approach. Explicit expressions, in terms of the spatial variables, are derived for averaged fluxes in each region in which the domain is subdivided. The procedure is an extension of an analytical discrete ordinates method, the ADO method, for the solution of the two-dimensional homogeneous medium case. The scheme is developed from the discrete ordinates version of the two-dimensional transport equation along with the level symmetric quadrature scheme. As usual for nodal schemes, relations between the averaged fluxes and the unknown angular fluxes at the contours are introduced as auxiliary equations. Numerical results are in agreement with results available in the literature.
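
    For orientation, a generic form of the two-dimensional discrete-ordinates (S_N) equations from which such nodal/ADO formulations start (standard notation; quadrature weight normalization follows the chosen level-symmetric set):

```latex
% Discrete-ordinates transport equation with isotropic scattering:
\mu_{m}\,\frac{\partial \psi_{m}}{\partial x}
 + \eta_{m}\,\frac{\partial \psi_{m}}{\partial y}
 + \sigma_{t}\,\psi_{m}(x,y)
 \;=\; \frac{\sigma_{s}}{4}\sum_{n=1}^{M} w_{n}\,\psi_{n}(x,y) + Q(x,y),
\qquad m = 1,\dots,M
% (\mu_m, \eta_m, w_m): directions and weights of the level-symmetric
% quadrature. Nodal schemes integrate this over each region, producing
% balance equations for region-averaged fluxes that are closed with
% auxiliary relations to the unknown contour fluxes.
```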

  13. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  14. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 main frames as well as on a Micro VAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  15. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    Science.gov (United States)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.

  16. Analytical solution for the transient wave propagation of a buried cylindrical P-wave line source in a semi-infinite elastic medium with a fluid surface layer

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng

    2018-02-01

    This article develops an analytical solution for the transient wave propagation of a cylindrical P-wave line source in a semi-infinite elastic solid with a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined; the Scholte wave is the wave that propagates along the interface between the fluid and the solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power series expansion method. Each term of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features of the fluid layer, the interface and the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.
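
    For readers who want to locate the Scholte root numerically, the sketch below solves one common form of the Scholte secular equation for a fluid half-space over an elastic half-space. Sign conventions vary by author, and this generic two-half-space form, with illustrative water-over-steel parameters, is an assumption rather than the paper's exact layered geometry.

        import math
        from scipy.optimize import brentq

        # Illustrative material parameters (water over steel), SI units.
        rho_f, c_f = 1000.0, 1480.0               # fluid density, sound speed
        rho_s, c_l, c_t = 7800.0, 5900.0, 3200.0  # solid density, P and S speeds

        def scholte(c):
            """One common form of the Scholte secular function; its root below
            min(fluid speed, Rayleigh speed) is the Scholte wave velocity."""
            x = (c / c_t) ** 2
            a = math.sqrt(1.0 - (c / c_l) ** 2)
            b = math.sqrt(1.0 - x)
            rayleigh = (2.0 - x) ** 2 - 4.0 * a * b
            loading = (rho_f / rho_s) * x ** 2 * a / math.sqrt(1.0 - (c / c_f) ** 2)
            return rayleigh + loading

        # For these parameters the root sits just below the fluid sound speed.
        c_scholte = brentq(scholte, 100.0, 0.9999 * c_f)
        print(f"Scholte wave velocity ~ {c_scholte:.1f} m/s")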

  17. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses); Schedule Analytics (Jennifer...). The research comprised the following high-level steps: identify and review primary data sources... However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date...

  18. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  19. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of masses of information and to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, thereby safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster, better-informed decision making. (author)

  20. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all of these fields, and it is concluded that these methods should be used more widely.

  1. Exact analytical solution of time-independent neutron transport equation, and its applications to systems with a point source

    International Nuclear Information System (INIS)

    Mikata, Y.

    2014-01-01

    Highlights: • An exact solution for the one-speed neutron transport equation is obtained. • This solution as well as its derivation are believed to be new. • Neutron flux for a purely absorbing material with a point neutron source off the origin is obtained. • Spherically as well as cylindrically piecewise constant cross sections are studied. • Neutron flux expressions for a point neutron source off the origin are believed to be new. - Abstract: An exact analytical solution of the time-independent monoenergetic neutron transport equation is obtained in this paper. The solution is applied to systems with a point source. Systematic analysis of the solution of the time-independent neutron transport equation, and its applications represent the primary goal of this paper. To the best of the author’s knowledge, certain key results on the scalar neutron flux as well as their derivations are new. As an application of these results, a scalar neutron flux for a purely absorbing medium with a spherically piecewise constant cross section and an isotropic point neutron source off the origin as well as that for a cylindrically piecewise constant cross section with a point neutron source off the origin are obtained. Both of these results are believed to be new
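
    As a concrete reference point, the uncollided scalar flux from an isotropic point source in an infinite, purely absorbing medium has the classical closed form φ(r) = S·e^(−Σr)/(4πr²); the sketch below evaluates it for a source placed off the origin. The parameter values are illustrative, and the paper's piecewise-constant cross-section cases reduce to products of such exponentials along the source-to-detector ray.

        import math

        def uncollided_flux(field_point, source_point, sigma_t, strength=1.0):
            """Classical uncollided scalar flux from an isotropic point source
            in a purely absorbing infinite medium (per cm^2 per s)."""
            r = math.dist(field_point, source_point)  # source-to-field distance
            return strength * math.exp(-sigma_t * r) / (4.0 * math.pi * r ** 2)

        # Illustrative numbers: source off the origin, Sigma_t = 0.1 cm^-1.
        phi = uncollided_flux((10.0, 0.0, 0.0), (2.0, 1.0, 0.0), sigma_t=0.1)
        print(f"phi = {phi:.3e}")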

  2. Potential sources of analytical bias and error in selected trace element data-quality analyses

    Science.gov (United States)

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements where concentrations were greater in filtered samples than in paired unfiltered samples were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS). Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues have not only been attributed to data generated by the USGS NWQL but have been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries from six States, matrix spike recoveries, and standard reference materials. Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results between filtered samples treated with HCl from those samples left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated

  3. Analytical admittance characterization of high mobility channel

    Energy Technology Data Exchange (ETDEWEB)

    Mammeri, A. M.; Mahi, F. Z., E-mail: fati-zo-mahi2002@yahoo.fr [Institute of Science and Technology, University of Bechar (Algeria); Varani, L. [Institute of Electronics of the South (IES - CNRS UMR 5214), University of Montpellier (France)

    2015-03-30

    In this contribution, we investigate the small-signal admittance of high electron mobility transistor field-effect channels when the current branches between channel and gate, using an analytical model. The analytical approach takes into account the linearization of the 2D Poisson equation and the drift current along the channel. The analytical equations describe the dependence of the admittance at the source and drain terminals on frequency and on the geometrical parameters of the transistor.

  4. Analytic model of the stress waves propagation in thin wall tubes, seeking the location of a harmonic point source in its surface

    International Nuclear Information System (INIS)

    Boaratti, Mario Francisco Guerra

    2006-01-01

    Leaks in pressurized tubes generate acoustic waves that propagate through the tube walls and can be captured by accelerometers or acoustic emission sensors. Knowing how these walls vibrate, or equivalently how acoustic waves propagate in the material, is fundamental to detecting and locating the leak source. In this work an analytic model based on the equations of motion of a cylindrical shell was implemented with the objective of understanding the behavior of the tube surface excited by a point source. Since the cylindrical surface is closed in the circumferential direction, waves beginning their trajectory meet waves that have already traveled around the shell, in both the clockwise and counterclockwise directions, generating constructive and destructive interference. After sufficient propagation time, peaks and valleys form on the shell surface, which can be visualized through a graphic representation of the analytic solution. The theoretical results were verified by measurements on an experimental setup consisting of a steel tube terminated in sand boxes, simulating the condition of an infinite tube. To determine the location of the point source on the surface, an inverse-solution process was adopted: given the signals from the sensors placed on the tube surface, the theoretical model is used to determine where the source that generated those signals must be. (author)

  5. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The data obtained were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte measurements was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on co-oximetry results was found. On the contrary, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use.
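
    The total error estimate mentioned above is commonly computed, Westgard-style, as the absolute bias plus 1.96 times the imprecision; a minimal sketch with hypothetical numbers:

        def total_error(bias, sd, k=1.96):
            """Common total-error estimate: |bias| + k * SD
            (k = 1.96 covers ~95% of results under a normal error model)."""
            return abs(bias) + k * sd

        # Hypothetical example for glucose in mmol/L.
        te = total_error(bias=0.10, sd=0.15)
        print(f"estimated total error = {te:.2f} mmol/L")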

  6. Analytical Solution of the Hyperbolic Heat Conduction Equation for Moving Semi-Infinite Medium under the Effect of Time-Dependent Laser Heat Source

    Directory of Open Access Journals (Sweden)

    R. T. Al-Khairy

    2009-01-01

    source, whose capacity is given by g(x, t) = I(t)(1 − R)μe^(−μx), while the semi-infinite body has an insulated boundary. The solution is obtained by the Laplace transform method, and solutions for different time characteristics of the heat source capacity (constant, instantaneous, and exponential) are discussed. The effect of absorption coefficients on the temperature profiles is examined in detail. It is found that the closed form solution derived in the present study reduces to the previously obtained analytical solution when the medium velocity is set to zero in the closed form solution.
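
    For orientation, the governing hyperbolic (Cattaneo-Vernotte) heat conduction equation for a stationary medium with a volumetric source g(x, t) is commonly written as below; this generic form is stated as background, not as the paper's exact moving-medium equation.

        \tau\,\frac{\partial^2 T}{\partial t^2} + \frac{\partial T}{\partial t}
          = \alpha\,\frac{\partial^2 T}{\partial x^2}
          + \frac{1}{\rho c}\left( g + \tau\,\frac{\partial g}{\partial t} \right)

    Here τ is the thermal relaxation time and α the thermal diffusivity; for a medium moving with constant velocity u, the time derivative is replaced by the material derivative ∂/∂t + u ∂/∂x, which introduces the convective terms treated in the paper.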

  7. Analytical solution for the transient response of a fluid/saturated porous medium halfspace system subjected to an impulsive line source

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng; Jing, Liping; Li, Yongqiang

    2018-05-01

    In this paper, transient wave propagation is investigated within a fluid/saturated porous medium halfspace system with a planar interface that is subjected to a cylindrical P-wave line source. Assuming the permeability coefficient is sufficiently large, analytical solutions for the transient response of the fluid/saturated porous medium halfspace system are developed. Moreover, the analytical solutions are presented in simple closed forms wherein each term represents a transient physical wave, especially the expressions for head waves. The methodology utilised to determine where the head wave can emerge within the system is also given. The wave fields within the fluid and porous medium are first defined considering the behaviour of two compressional waves and one tangential wave in the saturated porous medium and one compressional wave in the fluid. Substituting these wave fields into the interface continuity conditions, the analytical solutions in the Laplace domain are then derived. To transform the solutions into the time domain, a suitable distortion of the contour is provided to change the integration path of the solution, after which the analytical solutions in the Laplace domain are transformed into the time domain by employing Cagniard's method. Numerical examples are provided to illustrate some interesting features of the fluid/saturated porous medium halfspace system. In particular, the interface wave and head waves that propagate along the interface between the fluid and saturated porous medium can be observed.

  8. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Science.gov (United States)

    Zhang, Sida; Liu, Wei; Zhang, Xiaohe; Duan, Yixiang

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements.
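
    The core CRDS relation converts ringdown times measured with and without the absorber into an absolute absorption coefficient, independent of light source intensity, which is the "absolute measurement" advantage noted above; a minimal sketch with illustrative values:

        C = 2.99792458e10  # speed of light, cm/s

        def absorption_coefficient(tau, tau0, c=C):
            """Standard CRDS relation: alpha = (1/c) * (1/tau - 1/tau0),
            where tau and tau0 are ringdown times (s) with and without
            the analyte; returns alpha in cm^-1."""
            return (1.0 / tau - 1.0 / tau0) / c

        # Illustrative ringdown times: 8.0 us empty cavity, 6.5 us with analyte.
        alpha = absorption_coefficient(6.5e-6, 8.0e-6)
        print(f"alpha = {alpha:.3e} cm^-1")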

  9. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures.

  10. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  11. A two dimensional analytical modeling of surface potential in triple metal gate (TMG) fully-depleted Recessed-Source/Drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Priya, Anjali; Mishra, Ram Awadh

    2016-04-01

    In this paper, an analytical model of surface potential is proposed for a new Triple Metal Gate (TMG) fully depleted Recessed-Source/Drain (Re-S/D) Silicon On Insulator (SOI) Metal Oxide Semiconductor Field Effect Transistor (MOSFET). The metal with the highest work function is arranged near the source region and the one with the lowest work function near the drain. Recessed-Source/Drain SOI MOSFETs exhibit higher drain current than conventional SOI MOSFETs because of their larger source and drain regions. The surface potential model, developed from the 2D Poisson equation, is verified by comparison with simulation results from the 2-dimensional ATLAS simulator. The model is compared with DMG and SMG devices and analysed for different device parameters. The ratio of the metal gate lengths is varied to optimize the result.
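
    Surface-potential models of this type conventionally reduce the 2D Poisson equation using Young's parabolic potential approximation across the silicon film; the generic form below (with channel doping N_A) is a sketch of that standard technique, not a restatement of the authors' exact derivation.

        \frac{\partial^2 \phi(x,y)}{\partial x^2}
          + \frac{\partial^2 \phi(x,y)}{\partial y^2}
          = \frac{q\,N_A}{\varepsilon_{Si}},
        \qquad
        \phi(x,y) \approx \phi_s(x) + c_1(x)\,y + c_2(x)\,y^2

    Substituting the parabolic profile and the front/back oxide boundary conditions into Poisson's equation yields a second-order ODE in the surface potential φs(x), solved piecewise under each of the three gate metals with continuity of potential and electric flux at the metal boundaries.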

  12. SU-E-T-120: Analytic Dose Verification for Patient-Specific Proton Pencil Beam Scanning Plans

    International Nuclear Information System (INIS)

    Chang, C; Mah, D

    2015-01-01

    Purpose: To independently verify the QA dose of proton pencil beam scanning (PBS) plans using an analytic dose calculation model. Methods: An independent proton dose calculation engine was created using the same commissioning measurements as those employed to build our commercially available treatment planning system (TPS). Each proton PBS plan is exported from the TPS in DICOM format and calculated by this independent dose engine in a standard 40 x 40 x 40 cm water tank. This three-dimensional dose grid is then compared with the QA dose calculated by the commercial TPS, using a standard gamma criterion. A total of 18 measured pristine Bragg peaks, ranging from 100 to 226 MeV, are used in the model; intermediate proton energies are interpolated. Similarly, optical properties of the spots are measured in air over 15 cm upstream and downstream, and fitted to a second-order polynomial. Multiple Coulomb scattering in water is approximated analytically using the Preston and Kohler formula for faster calculation. The effect of range shifters on spot size is modeled with the generalized Highland formula. Note that the above formulation approximates multiple Coulomb scattering in water, and we therefore chose not to use the full Moliere/Hanson form. Results: Initial examination of 3 patient-specific prostate PBS plans shows good agreement between the 3D dose distributions calculated by the TPS and the independent proton PBS dose calculation engine. Both calculated dose distributions are compared with actual measurements at three different depths per beam, and good agreement is again observed. Conclusion: Results here showed that 3D dose distributions calculated by this independent proton PBS dose engine are in good agreement with both TPS calculations and actual measurements. This tool can potentially be used to reduce the number of different measurement depths required for patient-specific proton PBS QA.
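
    Highland's parameterization of the characteristic multiple-Coulomb-scattering angle, referenced above, is straightforward to evaluate; the sketch below implements the common 14.1 MeV form. Whether the authors used this constant or a generalized variant is an assumption here, and the example numbers are illustrative.

        import math

        def highland_theta0(pv_mev, thickness, rad_length, charge=1.0):
            """Highland's characteristic multiple-scattering angle (radians):
            theta0 = (14.1 MeV / pv) * z * sqrt(L/LR) * (1 + (1/9) log10(L/LR)),
            with pv in MeV and thickness/rad_length in the same length units."""
            ratio = thickness / rad_length
            return ((14.1 / pv_mev) * charge * math.sqrt(ratio)
                    * (1.0 + math.log10(ratio) / 9.0))

        # Illustrative: ~150 MeV protons (pv ~ 250 MeV) crossing 10 cm of water
        # (radiation length of water ~ 36 cm).
        theta0 = highland_theta0(pv_mev=250.0, thickness=10.0, rad_length=36.0)
        print(f"theta0 ~ {1e3 * theta0:.1f} mrad")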

  13. Breed-specific variation of hematologic and biochemical analytes in healthy adult Bernese Mountain dogs

    DEFF Research Database (Denmark)

    Nielsen, Lise; Kjelgaard-Hansen, Mads; Jensen, Asger Lundorff

    2010-01-01

    Background: Hematology and serum biochemistry reference intervals in dogs may be affected by internal factors, such as breed and age, and external factors, such as the environment, diet, and lifestyle. In humans, it is well established that geographic origin and age may have an impact on reference...... reference intervals were rejected. Methods: The procedure was performed using the human Clinical and Laboratory Standards Institute-approved model modified for veterinary use. Thirty-two dogs were included in the study using a direct a priori method, as recommended. Results: While 23 of the standard...... intervals and, therefore, more specific reference intervals are sought for subpopulations. Objective: The objective of this study was to validate and transfer standard laboratory reference intervals for healthy Bernese Mountain dogs and to create new intervals for analytes where the established laboratory...

  14. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations.

  15. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  16. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been carried out. The optimum mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer Central Processing Unit time. To test the computations, a typical wall source was reduced to a point, a line, and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that, under optimum conditions, the errors remain less than 6% for fluxes calculated with this approach when compared with the analytical values for the point and line source approximations. When the wall is simulated as an infinite volume source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions; however, the errors become less than 10% when the wall dimensions are greater than ten mean free paths for 3 MeV gamma rays. Specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
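
    The point-source consistency check described above can be reproduced with a simple uncollided-flux mesh sum over the wall; the sketch below compares a meshed square source against the equivalent point source at a distant detector. The geometry, source strength, and attenuation coefficient are illustrative, and buildup is ignored.

        import math

        MU = 0.03       # illustrative linear attenuation coefficient, cm^-1
        S_AREAL = 1.0   # illustrative areal source strength, photons/(cm^2 s)

        def mesh_flux(side, n, detector, mu=MU):
            """Uncollided flux at `detector` from a square wall source
            (side x side cm, lying in the z = 0 plane) meshed n x n."""
            d = side / n
            total = 0.0
            for i in range(n):
                for j in range(n):
                    x = -side / 2 + (i + 0.5) * d   # patch-center coordinates
                    y = -side / 2 + (j + 0.5) * d
                    r = math.dist((x, y, 0.0), detector)
                    total += S_AREAL * d * d * math.exp(-mu * r) / (4 * math.pi * r ** 2)
            return total

        # Far from a small wall, the mesh sum approaches the point-source flux.
        det = (0.0, 0.0, 500.0)
        point = S_AREAL * 10.0 ** 2 * math.exp(-MU * 500.0) / (4 * math.pi * 500.0 ** 2)
        print(mesh_flux(10.0, 20, det), point)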

  17. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  18. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Sida; Liu, Wei [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China); Zhang, Xiaohe [College of Water Resources and Hydropower, Sichuan University, Chengdu (China); Duan, Yixiang, E-mail: yduan@scu.edu.cn [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China)

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations of the cavity ringdown technique with use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, while some further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements. - Highlights: • Plasma-based cavity ringdown spectroscopy • High sensitivity and high resolution • Elemental and isotopic measurements.

  19. Planning for Low End Analytics Disruptions in Business School Curricula

    Science.gov (United States)

    Rienzo, Thomas; Chen, Kuanchin

    2018-01-01

    Analytics is getting a great deal of attention in both industrial and academic venues. Organizations of all types are becoming more serious about transforming data from a variety of sources into insight, and analytics is the key to that transformation. Academic institutions are rapidly responding to the demand for analytics talent, with hundreds…

  20. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  1. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  2. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  3. Two-step source tracing strategy of Yersinia pestis and its historical epidemiology in a specific region.

    Directory of Open Access Journals (Sweden)

    Yanfeng Yan

    Source tracing of pathogens is critical for the control and prevention of infectious diseases. Genome sequencing by high throughput technologies is currently feasible and popular, leading to the burst of deciphered bacterial genome sequences. Utilizing the flooding genomic data for source tracing of pathogens in outbreaks is promising, and challenging as well. Here, we employed Yersinia pestis genomes from a plague outbreak at Xinghai county of China in 2009 as an example, to develop a simple two-step strategy for rapid source tracing of the outbreak. The first step was to define the phylogenetic position of the outbreak strains in a whole species tree, and the next step was to provide a detailed relationship across the outbreak strains and their suspected relatives. Through this strategy, we observed that the Xinghai plague outbreak was caused by Y. pestis that circulated in the local plague focus, where the majority of historical plague epidemics in the Qinghai-Tibet Plateau may originate from. The analytical strategy developed here will be of great help in fighting against the outbreaks of emerging infectious diseases, by pinpointing the source of pathogens rapidly with genomic epidemiological data and microbial forensics information.

  4. Travel-time source-specific station correction improves location accuracy

    Science.gov (United States)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may shift computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events in the context of CTBT verification are particularly critical for triggering a possible On Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km², and its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network and using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
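
    A minimal version of such source-specific station corrections can be built from the travel-time residuals of well-located calibration events: the per-station median residual is subtracted from subsequent observations. The sketch below assumes simple parallel arrays of residuals; the real method involves regionalized, source-specific corrections.

        from collections import defaultdict
        from statistics import median

        def station_corrections(stations, residuals):
            """Per-station correction = median(observed - predicted travel time)
            over a set of well-located calibration events."""
            by_station = defaultdict(list)
            for sta, res in zip(stations, residuals):
                by_station[sta].append(res)
            return {sta: median(r) for sta, r in by_station.items()}

        # Hypothetical residuals (s) from calibration events.
        corr = station_corrections(
            ["ABC", "ABC", "XYZ", "XYZ", "XYZ"],
            [0.8, 1.0, -0.4, -0.6, -0.5])
        # Apply when locating a new event: corrected = observed - corr[station]
        print(corr)   # {'ABC': 0.9, 'XYZ': -0.5}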

  5. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR) spectroscopy and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for newer techniques such as microprobe XRF and XAS, FTIR microscopy and HAXPS. The talk will cover mainly two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in advancing the technique in the areas of microprobe XRF imaging and trace-level compositional characterisation of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following advances in manipulating and detecting these X-rays. Two important features contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  6. Analytical studies related to Indian PHWR containment system performance

    International Nuclear Information System (INIS)

    Haware, S.K.; Markandeya, S.G.; Ghosh, A.K.; Kushwaha, H.S.; Venkat Raj, V.

    1998-01-01

    Build-up of pressure in a multi-compartment containment after a postulated accident, and the growth, transportation and removal of aerosols in the containment, are complex processes of vital importance in deciding the source term. The release of hydrogen and its combustion increases the overpressure. In order to analyze these complex processes and to enable proper estimation of the source term, well-tested analytical tools are necessary. This paper gives a detailed account of the analytical tools developed/adapted for PSA level 2 studies. (author)

  7. In Situ Near Infrared Spectroscopy for Analyte-Specific Monitoring of Glucose and Ammonium in Streptomyces coelicolor Fermentations

    DEFF Research Database (Denmark)

    Petersen, Nanna; Ödman, Peter; Cervera Padrell, Albert Emili

    2010-01-01

    was used as a model process. Partial least squares (PLS) regression models were calibrated for glucose and ammonium based on NIR spectra collected in situ. To ensure that the models were calibrated based on analyte-specific information, semisynthetic samples were used for model calibration in addition...... resulting in a RMSEP of 1.1 g/L. The prediction of ammonium based on NIR spectra collected in situ was not satisfactory. A comparison with models calibrated based on NIR spectra collected off line suggested that this is caused by signal attenuation in the optical fibers in the region above 2,000 nm...
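
    The PLS calibration-and-prediction workflow described here is easy to prototype; the sketch below fits a PLS model on NIR-like spectra and reports the root mean square error of prediction (RMSEP). The arrays are synthetic placeholders standing in for real spectra, and all shapes and values are assumptions.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # Placeholder data: 60 spectra x 500 wavelengths, with a glucose-like
        # response (g/L) linearly encoded in a few bands plus noise.
        X = rng.normal(size=(60, 500))
        y = 5.0 + X[:, 100] + 0.5 * X[:, 250] + 0.1 * rng.normal(size=60)

        pls = PLSRegression(n_components=5)
        pls.fit(X[:40], y[:40])                      # calibration set

        y_pred = pls.predict(X[40:]).ravel()         # independent test set
        rmsep = np.sqrt(np.mean((y[40:] - y_pred) ** 2))
        print(f"RMSEP = {rmsep:.2f} g/L")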

  8. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  10. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify the strategies to overcome those challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. The articles on Big Data analytics in healthcare published in English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds its application in clinical decision support, optimization of clinical operations and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  11. Role of analytical chemistry in environmental monitoring

    International Nuclear Information System (INIS)

    Kayasth, S.; Swain, K.

    2004-01-01

    Basic aspects of pollution and the role of analytical chemistry in environmental monitoring are highlighted and exemplified, with emphasis on trace elements. Sources and pathways of natural and especially man-made polluting substances, as well as their physico-chemical characteristics, are given. Attention is paid to adequate sampling in various compartments of the environment comprising both the lithosphere and the biosphere. Trace analysis is dealt with using a variety of analytical techniques, including criteria for the choice of suitable techniques, as well as aspects of analytical quality assurance and control. Finally, some data on trace element levels in soil and water samples from India are presented. (author)

  12. Cause-specific stillbirth and exposure to chemical constituents and sources of fine particulate matter.

    Science.gov (United States)

    Ebisu, Keita; Malig, Brian; Hasheminassab, Sina; Sioutas, Constantinos; Basu, Rupa

    2018-01-01

    The stillbirth rate in the United States is relatively high, but limited evidence is available linking stillbirth with fine particulate matter (PM2.5), its chemical constituents and sources. In this study, we explored associations between cause-specific stillbirth and prenatal exposures to those pollutants using live birth and stillbirth records from eight California locations during 2002-2009. ICD-10 codes were used to identify cause of stillbirth from stillbirth records. PM2.5 total mass and chemical constituents were collected from ambient monitors, and PM2.5 sources were quantified using Positive Matrix Factorization. Conditional logistic regression was applied using a nested case-control study design (N = 32,262). We found that different causes of stillbirth were associated with different PM2.5 sources and/or chemical constituents. For stillbirths due to fetal growth, the odds ratio (OR) per interquartile range increase in gestational age-adjusted exposure to PM2.5 total mass was 1.23 (95% confidence interval (CI): 1.06, 1.44). Similar associations were found with resuspended soil (OR = 1.25, 95% CI: 1.10, 1.42) and secondary ammonium sulfate (OR = 1.45, 95% CI: 1.18, 1.78). No associations were found between any pollutants and stillbirths caused by maternal complications. This study highlighted the importance of investigating cause-specific stillbirth and the differential toxicity levels of specific PM2.5 sources and chemical constituents.
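
    Odds ratios reported per interquartile-range increase, as above, come from scaling the fitted log-odds coefficient by the IQR before exponentiating; a minimal sketch, with the hypothetical coefficient, standard error, and IQR chosen to reproduce the OR quoted above:

        import math

        def or_per_iqr(beta, se, iqr):
            """Odds ratio (and 95% CI) per IQR increase in exposure,
            given a fitted log-odds coefficient per unit exposure."""
            point = math.exp(beta * iqr)
            low = math.exp((beta - 1.96 * se) * iqr)
            high = math.exp((beta + 1.96 * se) * iqr)
            return point, low, high

        # Hypothetical: beta = 0.035 per ug/m^3, SE = 0.013, IQR = 6 ug/m^3.
        print(or_per_iqr(0.035, 0.013, 6.0))  # ~ (1.23, 1.06, 1.44)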

  13. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices.
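
    The biological-variation route to performance specifications mentioned above is conventionally computed with the Fraser formulas; a minimal sketch, where the within- and between-subject CVs are illustrative inputs rather than authoritative values for glucose:

        import math

        def desirable_specs(cv_within, cv_between):
            """Fraser's desirable analytical performance specifications from
            within-subject (CVi) and between-subject (CVg) biological variation:
            imprecision <= 0.5*CVi, bias <= 0.25*sqrt(CVi^2 + CVg^2),
            total error <= 1.65*imprecision + bias."""
            imprecision = 0.5 * cv_within
            bias = 0.25 * math.sqrt(cv_within ** 2 + cv_between ** 2)
            total_error = 1.65 * imprecision + bias
            return imprecision, bias, total_error

        # Illustrative biological-variation inputs for plasma glucose (%).
        cva, bias, tea = desirable_specs(cv_within=5.6, cv_between=7.5)
        print(f"CVa <= {cva:.1f}%, bias <= {bias:.1f}%, TEa <= {tea:.1f}%")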

  14. Designing a Marketing Analytics Course for the Digital Age

    Science.gov (United States)

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  15. Analytical model of nanoscale junctionless transistors towards controlling of short channel effects through source/drain underlap and channel thickness engineering

    Science.gov (United States)

    Roy, Debapriya; Biswas, Abhijit

    2018-01-01

    We develop a 2D analytical subthreshold model for nanoscale double-gate junctionless transistors (DGJLTs) with gate-source/drain underlap. The model is validated using a well-calibrated TCAD simulation deck obtained by comparison with experimental data in the literature. To analyze and control short-channel effects, we calculate the threshold voltage, drain-induced barrier lowering (DIBL) and subthreshold swing of DGJLTs using our model and compare them with the corresponding simulated values at a channel length of 20 nm, with channel thickness tSi ranging from 5 to 10 nm, gate-source/drain underlap (LSD) values of 0-7 nm and source/drain doping concentrations (NSD) ranging from 5 to 12 × 10¹⁸ cm⁻³. As tSi is reduced from 10 to 5 nm, DIBL drops from 42.5 to 0.42 mV/V at NSD = 10¹⁹ cm⁻³ and LSD = 5 nm, in contrast to a decrease from 71 to 4.57 mV/V without underlap. For a lower tSi, DIBL increases marginally with increasing NSD. The subthreshold swing reduces more rapidly with thinning of the channel than with increasing LSD or decreasing NSD.
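
    The two short-channel metrics quoted above are typically extracted from transfer curves as follows; the sketch below uses hypothetical threshold voltages and subthreshold bias points, and all numbers are illustrative rather than taken from the paper.

        import math

        def dibl(vth_low, vth_high, vd_low=0.05, vd_high=1.0):
            """DIBL in mV/V: threshold-voltage shift per volt of drain bias."""
            return 1e3 * (vth_low - vth_high) / (vd_high - vd_low)

        def subthreshold_swing(vg1, id1, vg2, id2):
            """Subthreshold swing in mV/decade from two subthreshold points."""
            return 1e3 * (vg2 - vg1) / (math.log10(id2) - math.log10(id1))

        # Hypothetical extraction: Vth = 0.42 V at Vd = 50 mV, 0.40 V at Vd = 1 V.
        print(f"DIBL = {dibl(0.42, 0.40):.1f} mV/V")
        # Hypothetical subthreshold points: (0.10 V, 1 pA) and (0.25 V, 100 pA).
        print(f"SS = {subthreshold_swing(0.10, 1e-12, 0.25, 1e-10):.0f} mV/dec")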

  16. Guiding health promotion efforts with urban Inuit: a community-specific perspective on health information sources and dissemination strategies.

    Science.gov (United States)

    McShane, Kelly E; Smylie, Janet K; Hastings, Paul D; Martin, Carmel M

    2006-01-01

    To develop a community-specific perspective of health information sources and dissemination strategies of urban Inuit to better guide health promotion efforts. Through a collaborative partnership with the Tungasuvvingat Inuit Family Resource Centre, a series of key informant interviews and focus groups were conducted to gather information on specific sources of health information, strategies of health information dissemination, and overall themes in health information processes. Distinct patterns of health information sources and dissemination strategies emerged from the data. Major themes included: the importance of visual learning, community Elders, and cultural interpreters; community cohesion; and the Inuit and non-Inuit distinction. The core sources of health information are family members and sources from within the Inuit community. The principal dissemination strategy for health information was direct communication, either through one-on-one interactions or in groups. This community-specific perspective of health information sources and dissemination strategies shows substantial differences from current mainstream models of health promotion and knowledge translation. Health promotion efforts need to acknowledge the distinct health information processes of this community, and should strive to integrate existing health information sources and strategies of dissemination with those of the community.

  17. What are Segments in Google Analytics

    Science.gov (United States)

    Segments find all sessions that meet a specific condition. You can then apply this segment to any report in Google Analytics (GA). Segments are a way of identifying sessions and users while filters identify specific events, like pageviews.

  18. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  20. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    Science.gov (United States)

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of applying holistic and analytic rubrics on their own, and of applying analytic rubrics in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods-holistic rubric, analytic rubric, and task-specific checklist-for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by the Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in the evaluation of objective structured clinical examinations, and history taking and physical examination to be major factors in the evaluation of clinical performance examinations. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient assessment.

  1. Analytical determination of specific 4,4'-methylene diphenyl diisocyanate hemoglobin adducts in human blood.

    Science.gov (United States)

    Gries, Wolfgang; Leng, Gabriele

    2013-09-01

    4,4'-Methylene diphenyl diisocyanate (MDI) is one of the most important isocyanates in the industrial production of polyurethane and other MDI-based synthetics. Because of its high reactivity, it is a known sensitizing agent, an effect attributed to protein adduct formation. Exposure to MDI is routinely assessed by determination of the nonspecific marker 4,4'-methylenedianiline in urine and blood. Since several publications have reported specific adducts of MDI with albumin or hemoglobin, more information about their occurrence in humans is necessary. Specific adducts of MDI and hemoglobin had previously been reported only in rats after high-dose MDI inhalation. The aim of this investigation was to detect the hemoglobin adduct 5-isopropyl-3-[4-(4-aminobenzyl)phenyl]hydantoin (ABP-Val-Hyd) in human blood for the first time. We found values up to 5.2 ng ABP-Val-Hyd/g globin (16 pmol/g) in blood samples of workers exposed to MDI. Because there was no information available about the possible amounts of this specific MDI marker, the analytical method focused on optimal sensitivity and selectivity. Using gas chromatography-high-resolution mass spectrometry with negative chemical ionization, we achieved a detection limit of 0.02 ng ABP-Val-Hyd/g globin (0.062 pmol/g). The robustness of the method was confirmed by relative standard deviations between 3.0 and 9.8%. Combined with a linear detection range up to 10 ng ABP-Val-Hyd/g globin (31 pmol/g), these precision parameters demonstrate that the method described is optimized for screening studies of the human population.

  2. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Full Text Available Analytical chemistry, like other areas of science, has experienced great change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Nanotechnology is increasingly proving to be a powerful ally of analytical chemistry in achieving its objectives and simplifying analytical processes. Moreover, the information needs arising from growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly in sample preparation techniques (magnetic solid-phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotubes, graphene, polymers, octadecylsilane) and its automation, in microextraction techniques, in enantioseparation, and in chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation, and chemosensors, and discusses selected articles published between 2010 and 2016.

  3. Analytical theory relating the depth of the sulfate-methane transition to gas hydrate distribution and saturation

    Science.gov (United States)

    Bhatnagar, Gaurav; Chatterjee, Sayantan; Chapman, Walter G.; Dugan, Brandon; Dickens, Gerald R.; Hirasaki, George J.

    2011-03-01

    We develop a theory that relates gas hydrate saturation in marine sediments to the depth of the sulfate-methane transition (SMT) zone below the seafloor using steady state, analytical expressions. These expressions are valid for systems in which all methane transported into the gas hydrate stability zone (GHSZ) comes from deeper external sources (i.e., advective systems). This advective constraint causes anaerobic oxidation of methane to be the only sulfate sink, allowing us to link SMT depth to net methane flux. We also develop analytical expressions that define the gas hydrate saturation profile based on SMT depth and site-specific parameters such as sedimentation rate, methane solubility, and porosity. We evaluate our analytical model at four drill sites along the Cascadia Margin where methane sources from depth dominate. With our model, we calculate average gas hydrate saturations across the GHSZ and the top occurrence of gas hydrate at these sites as 0.4% and 120 mbsf (Site 889), 1.9% and 70 mbsf (Site U1325), 4.7% and 40 mbsf (Site U1326), and 0% (Site U1329), mbsf being meters below seafloor. These values compare favorably with average saturations and top occurrences computed from resistivity log and chloride data. The analytical expressions thus provide a fast and convenient method to calculate gas hydrate saturation and first-order occurrence at a given geologic setting where vertically upward advection dominates the methane flux.

  4. 25 CFR 115.702 - What specific sources of money will be accepted for deposit into a trust account?

    Science.gov (United States)

    2010-04-01

    Title 25 (Indians), Bureau of Indian Affairs, Department of the Interior, § 115.702 (revised as of 2010-04-01): What specific sources of money will be accepted for deposit into a trust account?

  5. Locating gamma radiation source by self collimating BGO detector system

    Energy Technology Data Exchange (ETDEWEB)

    Orion, I; Pernick, A; Ilzycer, D; Zafrir, H [Israel Atomic Energy Commission, Yavne (Israel). Soreq Nuclear Research Center; Shani, G [Ben-Gurion Univ. of the Negev, Beersheba (Israel)

    1996-12-01

    The need for an airborne collimated gamma detector system to estimate the radiation released from a nuclear accident has been established. A BGO detector system has been developed as an array of seven separate cylindrical bismuth germanate scintillators: one central detector symmetrically surrounded by six others. In such an arrangement, each detector reduces the exposure of the other detectors in the array to radiation incident from a specific spatial angle around the array. This shielding property, defined as 'self-collimation', differentiates the point-source response functions of the individual detectors. BGO has high density and atomic number, and therefore provides efficient self-collimation. Using the response functions of the separate detectors enables locating point sources, as well as the direction of a radioactive plume, with a satisfactory angular resolution of about 10 degrees. The detector's point-source response, as a function of the source direction in a horizontal plane, was predicted by analytical calculation and verified by Monte Carlo simulation using the EGS4 code. The detector's response was also tested in a laboratory-scale experiment for several gamma-ray energies, and the experimental results validated the theoretical (analytical and Monte Carlo) results. (authors)

  6. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks that provide highly accurate standards for assessing routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP), where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis for determining cloud absorption and scattering properties. (authors)

  7. Research prioritization using the Analytic Hierarchy Process: basic methods. Volume 1

    International Nuclear Information System (INIS)

    Vesely, W.E.; Shafaghi, A.; Gary, I. Jr.; Rasmuson, D.M.

    1983-08-01

    This report describes a systematic approach for prioritizing research needs and research programs. The approach is formally called the Analytic Hierarchy Process which was developed by T.L. Saaty and is described in several of his texts referenced in the report. The Analytic Hierarchy Process, or AHP for short, has been applied to a wide variety of prioritization problems and has a good record of success as documented in Saaty's texts. The report develops specific guidelines for constructing the hierarchy and for prioritizing the research programs. Specific examples are given to illustrate the steps in the AHP. As part of the work, a computer code has been developed and the use of the code is described. The code allows the prioritizations to be done in a codified and efficient manner; sensitivity and parametric studies can also be straightforwardly performed to gain a better understanding of the prioritization results. Finally, as an important part of the work, an approach is developed which utilizes probabilistic risk analyses (PRAs) to systematically identify and prioritize research needs and research programs. When utilized in an AHP framework, the PRAs which have been performed to date provide a powerful information source for focusing research on those areas most impacting risk and risk uncertainty.
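
    Since the prioritization rests on Saaty's eigenvector method, a short numeric sketch may help; the pairwise comparison matrix below is hypothetical, and this shows the standard AHP calculation rather than the report's own code.

        import numpy as np

        # Pairwise comparisons of three hypothetical research programs on
        # Saaty's 1-9 scale (A[i, j] = importance of i relative to j).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # Priorities = principal right eigenvector, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Saaty's consistency check: CI = (lambda_max - n) / (n - 1) and
        # CR = CI / RI, where RI is the random index (0.58 for n = 3).
        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)
        CR = CI / 0.58

        print("priorities:", np.round(w, 3))        # ~[0.648, 0.230, 0.122]
        print("consistency ratio:", round(CR, 3))   # < 0.1 is acceptable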

  8. Beyond ‘Moneyball’ to Analytics Leadership in Sports

    DEFF Research Database (Denmark)

    Tan, Felix Ter Chian; Hedman, Jonas; Xiao, Xiao

    2017-01-01

    analytics leadership. More specifically, we strive to understand 1) what analytics leadership entails for professional sports teams; and 2) what type of IT-enabled capabilities are required to realize such leadership position within a sporting and entertainment ecosystem. Our initial analysis identified...

  9. A two-dimensional analytical well model with applications to groundwater flow and convective transport modelling in the geosphere

    International Nuclear Information System (INIS)

    Chan, T.; Nakka, B.W.

    1994-12-01

    A two-dimensional analytical well model has been developed to describe steady groundwater flow in an idealized, confined aquifer intersected by a withdrawal well. The aquifer comprises a low-dipping fracture zone. The model is useful for making simple quantitative estimates of the transport of contaminants along groundwater pathways in the fracture zone to the well from an underground source that intercepts the fracture zone. This report documents the mathematical development of the analytical well model. It outlines the assumptions and method used to derive an exact analytical solution, which is verified by two other methods. It presents expressions for calculating quantities such as streamlines (groundwater flow paths), fractional volumetric flow rates, contaminant concentration in well water and minimum convective travel time to the well. In addition, this report presents the results of applying the analytical model to a site-specific conceptual model of the Whiteshell Research Area in southeastern Manitoba, Canada. This hydrogeological model includes the presence of a 20-m-thick, low-dipping (18 deg) fracture zone (LD1) that intercepts the horizon of a hypothetical disposal vault located at a depth of 500 m. A withdrawal well intercepts LD1 between the vault level and the ground surface. Predictions based on parameters and boundary conditions specific to LD1 are presented graphically. The analytical model has specific applications in the SYVAC geosphere model (GEONET) to calculate the fraction of a plume of contaminants moving up the fracture zone that is captured by the well, and to describe the drawdown in the hydraulic head in the fracture zone caused by the withdrawal well. (author). 16 refs., 6 tabs., 35 figs
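
    The report's exact fracture-zone solution is not reproduced in this record, but the flavor of such a well model can be sketched with the classic superposition of uniform regional flow and a single withdrawal well; every parameter value below is an assumption for illustration.

        import numpy as np

        # Uniform regional (Darcy) flow plus a withdrawal well at the origin.
        q0 = 1e-7      # regional specific discharge [m/s] (assumed)
        b  = 20.0      # fracture-zone thickness [m] (assumed)
        Q  = 1e-4      # well withdrawal rate [m^3/s] (assumed)

        # Far upstream the capture zone spans |y| < y_max, and the
        # stagnation point sits x_s downstream of the well.
        y_max = Q / (2.0 * q0 * b)                 # 25.0 m
        x_s   = Q / (2.0 * np.pi * q0 * b)         # ~8.0 m

        def captured_fraction(y1, y2):
            """Fraction of a plume entering far upstream between y1 and y2
            that is captured by the well (overlap with the capture zone)."""
            lo, hi = max(y1, -y_max), min(y2, y_max)
            return max(0.0, hi - lo) / (y2 - y1)

        print(f"capture half-width: {y_max:.1f} m")
        print(f"stagnation point:   {x_s:.1f} m")
        print(f"captured fraction:  {captured_fraction(-10.0, 60.0):.2f}")  # 0.50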

  10. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining and Business Analytics with R…

  11. The sources of the specificity of nuclear law and environmental law

    International Nuclear Information System (INIS)

    Rainaud, J.M.; Cristini, R.

    1983-01-01

    This paper analyses the sources of the specificity of nuclear law and its relationship with environmental law as well as with ordinary law. The characteristics of nuclear law are summarized thus: recent discovery of the atom's uses and mandatory protection against its effects; internationalization of its use, leading to a limitation of national authorities competence. Several international treaties are cited (Antarctic Treaty, NPT, London Dumping Convention etc.) showing the link between radiation protection and the environment. (NEA) [fr

  12. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    International Nuclear Information System (INIS)

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

    The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO2 matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO2 matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO2 matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO2 matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs

  13. CheapStat: an open-source, "do-it-yourself" potentiostat for analytical and educational applications.

    Directory of Open Access Journals (Sweden)

    Aaron A Rowe

    Full Text Available Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource-poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.
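
    To give a flavor of the potential waveforms such a device must generate, here is a minimal sketch of the triangular (staircase) sweep used in cyclic voltammetry; the parameters are arbitrary and this is not the CheapStat firmware.

        import numpy as np

        def cv_waveform(e_start, e_vertex, scan_rate, step_v):
            """Staircase potential sweep e_start -> e_vertex -> e_start.
            scan_rate in V/s, step_v is the DAC step size in V."""
            sign = np.sign(e_vertex - e_start)
            up = np.arange(e_start, e_vertex, sign * step_v)
            e = np.concatenate([up, [e_vertex], up[::-1]])
            t = np.arange(e.size) * (step_v / scan_rate)   # time per step
            return t, e

        # Example: -0.5 V to +0.5 V at 100 mV/s with 5 mV steps.
        t, e = cv_waveform(-0.5, 0.5, 0.1, 0.005)
        print(f"{e.size} steps over {t[-1]:.1f} s")   # 401 steps over 20.0 s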

  14. Analytic solutions of a class of nonlinearly dynamic systems

    International Nuclear Information System (INIS)

    Wang, M-C; Zhao, X-S; Liu, X

    2008-01-01

    In this paper, the homotopy perturbation method (HPM) is applied to solve a coupled system of two nonlinear differential equations: a first-order model similar to the Lotka-Volterra equations, and a Bratu-type equation with a source term. The analytic approximate solutions are derived. Furthermore, comparison of the analytic approximate solutions obtained by the HPM with the exact solutions reveals that the present method works efficiently.
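
    The paper's coupled system is not given in this record, so the sketch below applies the HPM recipe to a textbook scalar problem instead, u' = u^2 with u(0) = 1, whose exact solution is 1/(1 - t); the order-by-order construction is the standard HPM embedding, implemented with sympy.

        import sympy as sp

        t, p = sp.symbols("t p")
        N = 5  # truncation order

        # Homotopy for u' = u^2: embed the nonlinearity with p,
        #   u' = p*u^2,   u = u0 + p*u1 + p^2*u2 + ...
        u_funcs = [sp.Function(f"u{k}")(t) for k in range(N)]
        u = sum(p**k * u_funcs[k] for k in range(N))
        expr = sp.expand(sp.diff(u, t) - p * u**2)

        sol = []
        for k in range(N):
            eq = expr.coeff(p, k)            # order-k equation
            for f, s in zip(u_funcs[:k], sol):
                eq = eq.subs(f, s).doit()    # insert lower orders
            ic = 1 if k == 0 else 0          # u0(0)=1, higher orders vanish
            s_k = sp.dsolve(sp.Eq(eq, 0), u_funcs[k],
                            ics={u_funcs[k].subs(t, 0): ic}).rhs
            sol.append(sp.simplify(s_k))

        print(sum(sol))                        # t**4 + t**3 + t**2 + t + 1
        print(sp.series(1/(1 - t), t, 0, N))   # matches the exact solution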

  15. Getting Open Source Right for Big Data Analytics: Software Sharing, Governance, Collaboration and Most of All, Fun!

    Science.gov (United States)

    Mattmann, C. A.

    2013-12-01

    A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies, offerings from commercial companies building products around these tools e.g., Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities including low latency support/in-memory, versus record oriented file I/O, high availability, support for the Map Reduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products - they are all released under an open source license e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache, or Berkeley AMPLab; all are developed collaboratively, and all technologies provide plug in architecture models and methodologies for allowing others to contribute, and participate via various community models. This talk will cover the open source aspects and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration including DARPA's XDATA program; NASA's CMAC program; NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.

  16. Source and specificity of chemical cues mediating shelter preference of Caribbean spiny lobsters (Panulirus argus).

    Science.gov (United States)

    Horner, Amy J; Nickles, Scott P; Weissburg, Marc J; Derby, Charles D

    2006-10-01

    Caribbean spiny lobsters display a diversity of social behaviors, one of the most prevalent of which is gregarious diurnal sheltering. Previous research has demonstrated that shelter selection is chemically mediated, but the source of release and the identity of the aggregation signal are unknown. In this study, we investigated the source and specificity of the aggregation signal in Caribbean spiny lobsters, Panulirus argus. We developed a relatively rapid test of shelter choice in a 5000-l laboratory flume that simulated flow conditions in the spiny lobster's natural environment, and used it to examine the shelter preference of the animals in response to a variety of odorants. We found that both males and females associated preferentially with shelters emanating conspecific urine of either sex, but not with shelters emanating seawater, food odors, or the scent of a predatory octopus. These results demonstrate specificity in the cues mediating sheltering behavior and show that urine is at least one source of the aggregation signal.

  17. Practical web analytics for user experience how analytics can help you understand your users

    CERN Document Server

    Beasley, Michael

    2013-01-01

    Practical Web Analytics for User Experience teaches you how to use web analytics to help answer the complicated questions facing UX professionals. Within this book, you'll find a quantitative approach for measuring a website's effectiveness and the methods for posing and answering specific questions about how users navigate a website. The book is organized according to the concerns UX practitioners face. Chapters are devoted to traffic, clickpath, and content use analysis, measuring the effectiveness of design changes, including A/B testing, building user profiles based on search habits…

  18. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  19. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kaštelan-Macan, M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology in instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  1. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. METHODS: Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance…
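
    A minimal sketch of the monthly-median idea, on synthetic data and with an assumed bias specification, might look as follows; the target, specification and drift values are purely illustrative.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)

        # Hypothetical daily patient results for one analyte, with a small
        # instrument drift introduced from October onwards.
        dates = pd.date_range("2024-01-01", "2024-12-31", freq="D")
        values = rng.normal(140.0, 2.0, dates.size)   # e.g. sodium [mmol/L]
        values[dates.month >= 10] += 1.5              # simulated bias

        df = pd.DataFrame({"result": values}, index=dates)

        target = 140.0         # long-term median from a stable period
        allowable_bias = 0.5   # % of target (assumed desirable spec)

        monthly = df["result"].groupby(df.index.to_period("M")).median()
        bias_pct = 100.0 * (monthly - target) / target
        flagged = bias_pct[bias_pct.abs() > allowable_bias]

        print(bias_pct.round(2))
        print("months exceeding the bias specification:")
        print(flagged.round(2))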

  2. Diagnostic Air Quality Model Evaluation of Source-Specific Primary and Secondary Fine Particulate Carbon

    Science.gov (United States)

    Ambient measurements of 78 source-specific tracers of primary and secondary carbonaceous fine particulate matter collected at four midwestern United States locations over a full year (March 2004–February 2005) provided an unprecedented opportunity to diagnostically evaluate...

  3. Electron capture detector based on a non-radioactive electron source: operating parameters vs. analytical performance

    Directory of Open Access Journals (Sweden)

    E. Bunert

    2017-12-01

    Full Text Available Gas chromatographs with electron capture detectors are widely used for the analysis of electron affine substances such as pesticides or chlorofluorocarbons. With detection limits in the low pptv range, electron capture detectors are the most sensitive detectors available for such compounds. Based on their operating principle, they require free electrons at atmospheric pressure, which are usually generated by a β− decay. However, the use of radioactive materials leads to regulatory restrictions regarding purchase, operation, and disposal. Here, we present a novel electron capture detector based on a non-radioactive electron source that shows similar detection limits compared to radioactive detectors but that is not subject to these limitations and offers further advantages such as adjustable electron densities and energies. In this work we show first experimental results using 1,1,2-trichloroethane and sevoflurane, and investigate the effect of several operating parameters on the analytical performance of this new non-radioactive electron capture detector (ECD).

  4. A Simple Analytical Model for Predicting the Detectable Ion Current in Ion Mobility Spectrometry Using Corona Discharge Ionization Sources

    Science.gov (United States)

    Kirk, Ansgar Thomas; Kobelt, Tim; Spehlbrink, Hauke; Zimmermann, Stefan

    2018-05-01

    Corona discharge ionization sources are often used in ion mobility spectrometers (IMS) when a non-radioactive ion source with high ion currents is required. Typically, the corona discharge is followed by a reaction region where analyte ions are formed from the reactant ions. In this work, we present a simple yet sufficiently accurate model for predicting the ion current available at the end of this reaction region when operating at reduced pressure, as in High Kinetic Energy Ion Mobility Spectrometers (HiKE-IMS) or most IMS-MS instruments. It yields excellent qualitative agreement with measurement results and is even able to calculate the ion current within an error of 15%. Additional interesting findings of this model are that the ion current at the end of the reaction region is independent of the ion current generated by the corona discharge, and that the ion current in HiKE-IMS grows quadratically when the length of the reaction region is scaled down.

  5. Mixed analytical-stochastic simulation method for the recovery of a Brownian gradient source from probability fluxes to small windows.

    Science.gov (United States)

    Dobramysl, U; Holcman, D

    2018-02-15

    Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
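
    The paper's asymptotic expressions are not reproduced in this record, but the reconstruction idea can be illustrated in the simplest geometry: for 2D Brownian motion released at (x0, y0) above an absorbing line, the first-passage position on the line is Cauchy-distributed, so the fluxes to small boundary windows are proportional to the Cauchy density there, and the source can be recovered from flux ratios. The window positions and source location below are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        # Hitting density of 2D Brownian motion from (x0, y0 > 0) on the
        # absorbing line y = 0:  p(x) = (1/pi) * y0 / ((x - x0)^2 + y0^2).
        def window_fluxes(x0, y0, centers, width):
            return width * y0 / (np.pi * ((centers - x0) ** 2 + y0 ** 2))

        centers = np.array([-2.0, 0.0, 3.0])   # three small windows
        width = 0.1
        true_source = (1.2, 2.5)
        measured = window_fluxes(*true_source, centers, width)

        # Recover the source from relative fluxes (the unknown emission
        # rate cancels in the ratios).
        def residuals(params):
            m = window_fluxes(params[0], params[1], centers, width)
            return m / m.sum() - measured / measured.sum()

        fit = least_squares(residuals, x0=[0.0, 1.0],
                            bounds=([-10.0, 0.01], [10.0, 10.0]))
        print("recovered source:", np.round(fit.x, 3))   # ~[1.2, 2.5]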

  6. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    Science.gov (United States)

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

    An analytical method for studying the light distribution on the output end of a hollow n-sided polygonal light pipe fed by a light source with a Gaussian distribution is developed. Mirror transformation matrices and a special algorithm for removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing. At the same time, four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and of the Gaussian light source. The analytical approach will be useful for designing and choosing hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.
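
    In the simplest (two-wall, one-dimensional) analogue, the virtual-image construction reduces to folding the free-space pattern back into the pipe cross-section; the sketch below sums the mirror images of a Gaussian source for an idealized slab pipe, with all dimensions assumed.

        import numpy as np

        a = 10.0             # pipe width [mm] (assumed)
        L = 200.0            # pipe length [mm] (assumed)
        x0 = 5.0             # source position at the entrance [mm]
        sigma_theta = 0.05   # Gaussian angular spread [rad] (assumed)

        def free_space(x):
            """Free-space intensity at distance L from a point source with
            a Gaussian angular distribution (small-angle approximation)."""
            s = sigma_theta * L
            return np.exp(-0.5 * ((x - x0) / s) ** 2)

        # Perfect mirrors at x = 0 and x = a: the guided intensity is the
        # free-space pattern folded into [0, a] by the virtual images.
        x = np.linspace(0.0, a, 201)
        intensity = np.zeros_like(x)
        for k in range(-50, 51):
            intensity += free_space(2 * k * a + x)   # even images
            intensity += free_space(2 * k * a - x)   # mirrored images

        intensity /= intensity.max()
        print("min/max uniformity at the output:", round(intensity.min(), 3))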

  7. Nuclear analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  9. A global multicenter study on reference values: 2. Exploration of sources of variation across the countries.

    Science.gov (United States)

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Shimizu, Yoshihisa; Xia, Liangyu; Hoffmann, Mariza; Shah, Swarup; Matsha, Tandi; Wassung, Janette; Smit, Francois; Ruzhanskaya, Anna; Straseski, Joely; Bustos, Daniel N; Kimura, Shogo; Takahashi, Aki

    2017-04-01

    The intent of this study, based on a global multicenter study of reference values (RVs) for serum analytes, was to explore biological sources of variation (SVs) of the RVs among 12 countries around the world. As described in the first part of this paper, RVs of 50 major serum analytes from 13,396 healthy individuals living in 12 countries were obtained. Analyzed in this study were 23 clinical chemistry analytes and 8 analytes measured by immunoturbidimetry. Multiple regression analysis was performed for each gender, country by country and analyte by analyte, setting four major SVs (age, BMI, and levels of drinking and smoking) as a fixed set of explanatory variables. For analytes with skewed distributions, log-transformation was applied. The association of each source of variation with RVs was expressed as the partial correlation coefficient (r_p). Obvious gender and age-related changes in the RVs were observed for many analytes, almost consistently between countries. Compilation of age-related variations of RVs after adjusting for between-country differences revealed peculiar patterns specific to each analyte. Judged from the r_p, BMI-related changes were observed for many nutritional and inflammatory markers in almost all countries. However, the slope of the linear regression of BMI vs. RV differed greatly among countries for some analytes. Alcohol- and smoking-related changes were observed less conspicuously, in a limited number of analytes. The features of sex, age, alcohol, and smoking-related changes in RVs of the analytes were largely comparable worldwide. The finding of differences in BMI-related changes among countries for some analytes is quite relevant to understanding ethnic differences in susceptibility to nutritionally related diseases. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
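
    A sketch of the per-country, per-analyte analysis described (multiple regression on the fixed explanatory set, with associations read off as partial correlations) might look as follows on synthetic data; the effect sizes are invented and statsmodels is assumed to be available.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500

        # Synthetic reference individuals for one country / one analyte.
        df = pd.DataFrame({
            "age":      rng.uniform(20, 65, n),
            "bmi":      rng.normal(25, 4, n),
            "drinking": rng.integers(0, 4, n),   # ordinal intake level
            "smoking":  rng.integers(0, 4, n),
        })
        # A nutritional marker that rises with BMI (assumed effects).
        df["analyte"] = (1.0 + 0.08 * df["bmi"] + 0.002 * df["age"]
                         + rng.normal(0, 0.3, n))

        X = sm.add_constant(df[["age", "bmi", "drinking", "smoking"]])
        fit = sm.OLS(np.log(df["analyte"]), X).fit()  # log for skewed RVs

        # Partial correlation of each SV with the RVs, from t-statistics:
        # r_p = t / sqrt(t^2 + dof).
        tvals = fit.tvalues.drop("const")
        r_p = tvals / np.sqrt(tvals ** 2 + fit.df_resid)
        print(r_p.round(3))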

  10. Requirements Specification for Open Source Software Selection

    OpenAIRE

    YANG, YING

    2008-01-01

    Open source software has been widely used. The software world is enjoying the advantages of collaboration and cooperation in software development and use with the advent of the open source movement. However, little research is concerned with practical guidelines for OSS selection. It is hard for an organization to decide whether it should use OSS or not, and to select an appropriate product from a number of OSS candidates. This thesis studies how to select an open source software…

  11. LC-MS/MS Identification of Species-Specific Muscle Peptides in Processed Animal Proteins.

    Science.gov (United States)

    Marchis, Daniela; Altomare, Alessandra; Gili, Marilena; Ostorero, Federica; Khadjavi, Amina; Corona, Cristiano; Ru, Giuseppe; Cappelletti, Benedetta; Gianelli, Silvia; Amadeo, Francesca; Rumio, Cristiano; Carini, Marina; Aldini, Giancarlo; Casalone, Cristina

    2017-12-06

    An innovative analytical strategy has been applied to identify signature peptides able to distinguish among processed animal proteins (PAPs) derived from bovine, pig, fish, and milk products. Proteomics was first used to elucidate the proteome of each source. Starting from the identified proteins and using a funnel-based approach, a set of abundant and well-characterized peptides with suitable physical-chemical properties (signature peptides), specific for each source, was selected. An on-target LC-ESI-MS/MS method (MRM mode) was set up using standard peptides and was then applied to selectively identify the PAP source and also to distinguish proteins from bovine carcass and milk proteins. We believe that the method described meets the request of the European Commission, which has developed a strategy for gradually lifting the "total ban" toward a "species to species ban", therefore requiring official methods for species-specific discrimination in feed.
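
    The record does not list the actual signature peptides, but the funnel idea (digest each proteome in silico, then filter by simple physicochemical and uniqueness rules) can be sketched as follows; the sequences and thresholds are toy assumptions.

        import re

        def tryptic_digest(protein):
            """In-silico trypsin digestion: cleave after K or R, not before P."""
            return re.split(r"(?<=[KR])(?!P)", protein)

        def signature_candidates(protein, other_proteomes,
                                 min_len=7, max_len=20):
            peptides = set()
            for pep in tryptic_digest(protein):
                if not (min_len <= len(pep) <= max_len):
                    continue                  # suitable size for LC-MS/MS (MRM)
                if "M" in pep or "C" in pep:
                    continue                  # avoid easily modified residues
                if any(pep in prot for prot in other_proteomes):
                    continue                  # keep species-unique peptides only
                peptides.add(pep)
            return peptides

        # Toy sequences, for illustration only.
        bovine = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGEEHFK"
        porcine = ["MKWVTFISLLLLFSSAYSRGVFRR"]
        print(signature_candidates(bovine, porcine))   # {'DLGEEHFK'}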

  12. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  13. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    …for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation…
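
    Excel's NORMINV(p, mu, sigma) corresponds to scipy's norm.ppf(p, mu, sigma); the computation behind such specifications (the fraction of reference individuals outside common limits for a given combined bias and imprecision) can be sketched as below, where the 95% limits and the example bias/imprecision pairs are assumptions rather than the paper's figures.

        from scipy.stats import norm

        # Common 95% reference limits in units of the reference SD.
        lower, upper = norm.ppf(0.025), norm.ppf(0.975)

        def fraction_outside(b, s):
            """Fraction outside the limits for normalized bias b and SD
            ratio s (total SD including imprecision / reference SD)."""
            return norm.cdf((lower - b) / s) + 1.0 - norm.cdf((upper - b) / s)

        for b, s in [(0.0, 1.0), (0.25, 1.0), (0.0, 1.1), (0.25, 1.1)]:
            print(f"bias={b:4.2f}  sd-ratio={s:3.1f}  "
                  f"outside={100 * fraction_outside(b, s):.2f}%")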

  14. Proactive Supply Chain Performance Management with Predictive Analytics

    Directory of Open Access Journals (Sweden)

    Nenad Stefanovic

    2014-01-01

    Full Text Available Today’s business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to future business environment.
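
    The paper's models are built on its BI platform and are not reproduced here; in the same spirit, a minimal KPI-prediction sketch on synthetic supply-chain measures could look like this (features, dependences and noise are invented).

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 1000

        # Hypothetical weekly supply-chain measures.
        lead_time = rng.normal(5.0, 1.0, n)      # supplier lead time [days]
        backlog   = rng.poisson(20, n)           # open order backlog
        inventory = rng.normal(100.0, 15.0, n)   # stock level

        # KPI to project: on-time delivery rate (assumed dependence).
        kpi = (0.98 - 0.02 * (lead_time - 5) - 0.002 * backlog
               + 0.0005 * (inventory - 100) + rng.normal(0, 0.01, n))

        X = np.column_stack([lead_time, backlog, inventory])
        X_tr, X_te, y_tr, y_te = train_test_split(X, kpi, random_state=0)

        model = GradientBoostingRegressor().fit(X_tr, y_tr)
        print("held-out R^2:", round(r2_score(y_te, model.predict(X_te)), 3))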

  17. (U) An Analytic Study of Piezoelectric Ejecta Mass Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tregillis, Ian Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-16

    We consider the piezoelectric measurement of the areal mass of an ejecta cloud, for the specific case where ejecta are created by a single shock at the free surface and fly ballistically through vacuum to the sensor. To do so, we define time- and velocity-dependent ejecta “areal mass functions” at the source and sensor in terms of typically unknown distribution functions for the ejecta particles. Next, we derive an equation governing the relationship between the areal mass function at the source (which resides in the rest frame of the free surface) and at the sensor (which resides in the laboratory frame). We also derive expressions for the analytic (“true”) accumulated ejecta mass at the sensor and the measured (“inferred”) value obtained via the standard method for analyzing piezoelectric voltage traces. This approach enables us to derive an exact expression for the error imposed upon a piezoelectric ejecta mass measurement (in a perfect system) by the assumption of instantaneous creation. We verify that when the ejecta are created instantaneously (i.e., when the time dependence is a delta function), the piezoelectric inference method exactly reproduces the correct result. When creation is not instantaneous, the standard piezo analysis will always overestimate the true mass. However, the error is generally quite small (less than several percent) for most reasonable velocity and time dependences. In some cases, errors exceeding 10-15% may require velocity distributions or ejecta production timescales inconsistent with experimental observations. These results are demonstrated rigorously with numerous analytic test problems.
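
    The report's distribution functions are not given in this record, but its qualitative conclusion can be checked with a toy model: a single ejecta velocity v0 and an exponential creation-time distribution, with the standard analysis inferring velocity d/t from arrival time as if creation were instantaneous. All parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(7)

        d   = 0.05      # free surface to sensor distance [m] (assumed)
        v0  = 2000.0    # single ejecta velocity [m/s] (assumed)
        tau = 2.0e-6    # exponential creation timescale [s] (assumed)

        n = 1_000_000
        t_create = rng.exponential(tau, n)
        t_arrive = t_create + d / v0

        # Each particle of true areal mass dm delivers momentum dp = v0*dm.
        # Assuming creation at t = 0 infers v' = d/t_arrive, hence
        # dm' = dp/v' = dm * v0 * t_arrive / d >= dm: an overestimate.
        dm = 1.0 / n                       # unit total true areal mass
        inferred = np.sum(dm * v0 * t_arrive / d)

        print("true mass:      1.0")
        print(f"inferred mass:  {inferred:.4f}")          # ~1.08
        print(f"analytic ratio: {1 + v0 * tau / d:.4f}")  # 1 + v0*tau/d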

  18. Development of cold source moderator structure

    International Nuclear Information System (INIS)

    Aso, Tomokaze; Ishikura, Syuichi; Terada, Atsuhiko; Teshigawara, Makoto; Watanabe, Noboru; HIno, Ryutaro

    1999-01-01

    The cold and thermal neutrons generated at the target (which works as a spallation neutron source under a 5 MW proton beam condition) are filtered with cold source moderators using supercritical hydrogen. Preliminary structural analysis was carried out to clarify technical problems in the concept of the thin-walled structure for the cold source moderator. Structural analysis showed that a maximum stress of 112 MPa occurred on the moderator surface, which exceeds the allowable design stresses of ordinary aluminum alloys. Flow patterns measured in water flow experiments agreed well with hydraulic analysis results, which showed that an impinging jet flow from an inner pipe of the moderator caused a large-scale recirculation flow. Based on these analytical and experimental results, new moderator structures with minute frames, blowing flow holes, etc. were proposed to maintain strength and to suppress the recirculation flow. (author)

  19. Low frequency interference between short synchrotron radiation sources

    Directory of Open Access Journals (Sweden)

    F. Méot

    2001-06-01

    Full Text Available A recently developed analytical formalism describing low frequency far-field synchrotron radiation (SR) is applied to the calculation of spectral angular radiation densities from interfering short sources (edge, short magnet). This is illustrated by analytical calculation of synchrotron radiation from various assemblies of short dipoles, including an “isolated” highest density infrared SR source.

  20. Application of characterization, modelling, and analytics towards understanding process-structure linkages in metallic 3D printing

    Science.gov (United States)

    Groeber, M. A.; Schwalbach, E.; Donegan, S.; Chaput, K.; Butler, T.; Miller, J.

    2017-07-01

    This paper presents methods for combining process monitoring, thermal modelling and microstructure characterization together to draw process-to-structure relationships in metal additive manufacturing. The paper discusses heterogeneities in the local processing conditions within additively manufactured components and how they affect the resulting material structure. Methods for registering and fusing disparate data sources are presented, and some effort is made to discuss the utility of different data sources for specific microstructural features of interest. It is the intent that this paper will highlight the need for improved understanding of metallic additive manufacturing processes and show that combining experimental data with modelling and advanced data processing and analytics methods will accelerate that understanding.

  1. Selection of site specific vibration equation by using analytic hierarchy process in a quarry

    Energy Technology Data Exchange (ETDEWEB)

    Kalayci, Ulku, E-mail: ukalayci@istanbul.edu.tr; Ozer, Umit, E-mail: uozer@istanbul.edu.tr

    2016-01-15

    This paper presents a new approach for the selection of the most accurate SSVA (Site Specific Vibration Attenuation) equation for blasting processes in a quarry located near settlements in Istanbul, Turkey. In this context, the SSVA equations obtained from the same study area in the literature were considered in terms of the distance between the shot points and buildings and the amount of explosive charge. For this purpose, the forecasting capabilities of 11 different SSVA equations obtained from the study area over the past 12 years were investigated under newly designated conditions, using 102 vibration records from the study area as test data. In this study, AHP (Analytic Hierarchy Process) was selected as the analysis method for determining the most accurate equation among the 11 SSVA equations, and parameters such as year, distance, charge, and r² of the equations were used as criteria for the AHP. Finally, the most appropriate equation was selected among the existing ones, and the process of selecting according to different target criteria was presented. Furthermore, it was noted that the forecasting results of the selected equation are more accurate than those formed using the test results. - Highlights: • The optimum Site Specific Vibration Attenuation equation for blasting in a quarry located near settlements was determined. • It is indicated that SSVA equations changing over the years do not always give accurate estimates under changing conditions. • Selection of the blast-induced SSVA equation was made using AHP. • The equation selection method was highlighted based on parameters such as charge, distance, and quarry geometry changes (year).

  2. Selection of site specific vibration equation by using analytic hierarchy process in a quarry

    International Nuclear Information System (INIS)

    Kalayci, Ulku; Ozer, Umit

    2016-01-01

This paper presents a new approach for selecting the most accurate SSVA (Site Specific Vibration Attenuation) equation for blasting processes in a quarry located near settlements in Istanbul, Turkey. In this context, the SSVA equations obtained from the same study area in the literature were considered in terms of the distance between the shot points and buildings and the amount of explosive charge. For this purpose, the forecasting capabilities of 11 different SSVA equations obtained from the study area over the past 12 years were investigated under newly designated conditions, using 102 vibration records from the study area as test data. AHP (Analytic Hierarchy Process) was selected as the analysis method for determining the most accurate equation among the 11 SSVA equations, and parameters such as year, distance, charge, and r² of the equations were used as criteria. Finally, the most appropriate equation was selected among the existing ones, and the process of selecting according to different target criteria was presented. Furthermore, it was noted that the forecasting results of the selected equation are more accurate than those obtained using the test results. - Highlights: • The optimum Site Specific Vibration Attenuation equation for blasting in a quarry located near settlements was determined. • It is shown that SSVA equations changing over the years do not always give accurate estimates under changing conditions. • Selection of the blast-induced SSVA equation was made using AHP. • The equation selection method was highlighted based on parameters such as charge, distance, and quarry geometry changes (year).

  3. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  4. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually, most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching that of a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable-wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  5. An analytical model of the HINT performance metric

    Energy Technology Data Exchange (ETDEWEB)

    Snell, Q.O.; Gustafson, J.L. [Scalable Computing Lab., Ames, IA (United States)

    1996-10-01

    The HINT benchmark was developed to provide a broad-spectrum metric for computers and to measure performance over the full range of memory sizes and time scales. We have extended our understanding of why HINT performance curves look the way they do and can now predict the curves using an analytical model based on simple hardware specifications as input parameters. Conversely, by fitting the experimental curves with the analytical model, hardware specifications such as memory performance can be inferred to provide insight into the nature of a given computer system.
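The abstract does not reproduce the model's equations. As a rough illustration of the idea, predicting a performance-versus-memory curve from simple hardware specifications, here is a toy roofline-style predictor; the level capacities, bandwidths, and peak rate are hypothetical, and this is not the actual HINT model.

```python
# Illustrative only: a toy predictor in which performance is limited either by
# peak compute or by the bandwidth of the memory level the working set falls
# into. All hardware specifications below are hypothetical.

levels = [
    (32 * 1024, 400e9),     # L1: capacity (bytes), bandwidth (bytes/s)
    (1 * 1024**2, 150e9),   # L2
    (16 * 1024**2, 60e9),   # L3
    (64 * 1024**3, 20e9),   # DRAM
]
peak_ops = 50e9             # peak operation rate (ops/s)
bytes_per_op = 8

def predicted_perf(working_set_bytes):
    """Operation rate achievable for a given working-set size."""
    for capacity, bandwidth in levels:
        if working_set_bytes <= capacity:
            return min(peak_ops, bandwidth / bytes_per_op)
    return min(peak_ops, levels[-1][1] / bytes_per_op)

for size in [16e3, 512e3, 8e6, 1e9]:
    print(f"{size:>12.0f} B -> {predicted_perf(size) / 1e9:.1f} Gops/s")
```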

  6. Interpretation of the source-specific substantive control measures of the Minamata Convention on Mercury.

    Science.gov (United States)

    You, Mingqing

    2015-02-01

Being persistent, toxic, and bio-accumulative, mercury (Hg) seriously affects the environment and human health. Because of Hg's long-range environmental transport across national borders, especially through the atmosphere, no country can fully protect its environment and human health through its own efforts, without global cooperation. The Minamata Convention on Mercury, which was formally adopted and opened for signature in October 2013, is the only global environmental regime on the control of Hg pollution. Its main substantive control measures are source-specific: its phasing-out, phasing-down, and other main substantive requirements are all directed at specific categories of pollution sources through the regulation of specific sectors of the economy and social life. The Convention does not take a national quota approach to quantify the Parties' nationwide total allowable consumption or discharge of Hg or Hg compounds, nor does it quantify their nationwide total reduction requirements. This paper attempts to find the underlying reasons for this source-specific approach and offers two interpretations. One possible interpretation is that Hg might be a non-threshold pollutant, i.e., a pollutant without a risk-free concentration. The existence of a reference dose (RfD), reference concentration (RfC), provisional tolerable weekly intake (PTWI), minimal risk level (MRL) or other similar reference values for Hg does not necessarily mean that Hg cannot be regarded as non-threshold, because such reference values carry scientific uncertainties and may also involve policy considerations. Another interpretation is that Hg lacks a feasibly determinable total allowable quantity. There is evidence that negotiators might have treated Hg as non-threshold, or at least accepted that Hg lacks a feasibly determinable total allowable quantity: (1) The negotiators were informed about the serious situations of the current emissions, releases, and legacy deposition; (2

  7. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  8. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    Science.gov (United States)

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance with European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
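As a flavor of the underlying technique, the sketch below shows additive secret sharing, the simplest building block of secure multi-party computation: three parties jointly compute a sum without revealing their inputs. It illustrates the principle only and is not the protocol used in the paper's pilots.

```python
import secrets

# Additive secret sharing over a prime field: each private value is split into
# random shares that sum to the value modulo P. Parties sum their shares
# locally; only the combined partial sums reveal the total, never the inputs.

P = 2**61 - 1  # a prime modulus

def share(value, n=3):
    """Split `value` into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

inputs = [120, 340, 215]                 # each party's private count
all_shares = [share(v) for v in inputs]
# Party i locally sums the i-th share of every input...
partials = [sum(s[i] for s in all_shares) % P for i in range(3)]
# ...and the partials together yield the total.
print(reconstruct(partials) == sum(inputs))  # True
```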

  9. 78 FR 60700 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Science.gov (United States)

    2013-10-02

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 49 [EPA-R09-OAR-2013-0489; FRL-9901-58-Region 9] Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four... Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional Haze...

  10. 78 FR 41731 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Science.gov (United States)

    2013-07-11

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 49 [EPA-R09-OAR-2013-0489; FRL-9830-5] Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four Corners Power... Implementation Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional...

  11. Theoretical and Numerical Modeling of Transport of Land Use-Specific Fecal Source Identifiers

    Science.gov (United States)

    Bombardelli, F. A.; Sirikanchana, K. J.; Bae, S.; Wuertz, S.

    2008-12-01

Microbial contamination in coastal and estuarine waters is of particular concern to public health officials. In this work, we advocate that well-formulated and developed mathematical and numerical transport models can be combined with modern molecular techniques in order to predict continuous concentrations of microbial indicators under diverse scenarios of interest, and that they can help in source identification of fecal pollution. As a proof of concept, we present initially the theory, numerical implementation and validation of one- and two-dimensional numerical models aimed at computing the distribution of fecal source identifiers in water bodies (based on Bacteroidales marker DNA sequences) coming from different land uses such as wildlife, livestock, humans, dogs or cats. These models have been developed to allow for source identification of fecal contamination in large bodies of water. We test the model predictions using diverse velocity fields and boundary conditions. Then, we present some preliminary results of an application of a three-dimensional water quality model to address the source of fecal contamination in the San Pablo Bay (SPB), United States, which constitutes an important sub-embayment of the San Francisco Bay. The transport equations for Bacteroidales include the processes of advection, diffusion, and decay of Bacteroidales. We discuss the validation of the developed models through comparisons of numerical results with field campaigns conducted in the SPB. We determine the extent and importance of the contamination in the bay for two decay rates obtained from field observations, corresponding to total host-specific Bacteroidales DNA and host-specific viable Bacteroidales cells, respectively. Finally, we infer transport conditions in the SPB based on the numerical results, characterizing the fate of outflows coming from the Napa, Petaluma and Sonoma rivers.
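In standard form, the transport balance described in the abstract reads (a generic sketch; the papers' exact terms and coefficients are specified therein):

```latex
\frac{\partial C}{\partial t} + \nabla \cdot (\mathbf{u}\, C)
  = \nabla \cdot \left( D\, \nabla C \right) - k\, C ,
```

where C is the Bacteroidales marker concentration, u the velocity field, D the diffusivity, and k a first-order decay rate, with the two values of k mentioned in the abstract corresponding to total host-specific DNA and viable cells.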

  12. Domain-specific impairment of source memory following a right posterior medial temporal lobe lesion.

    Science.gov (United States)

    Peters, Jan; Koch, Benno; Schwarz, Michael; Daum, Irene

    2007-01-01

    This single case analysis of memory performance in a patient with an ischemic lesion affecting posterior but not anterior right medial temporal lobe (MTL) indicates that source memory can be disrupted in a domain-specific manner. The patient showed normal recognition memory for gray-scale photos of objects (visual condition) and spoken words (auditory condition). While memory for visual source (texture/color of the background against which pictures appeared) was within the normal range, auditory source memory (male/female speaker voice) was at chance level, a performance pattern significantly different from the control group. This dissociation is consistent with recent fMRI evidence of anterior/posterior MTL dissociations depending upon the nature of source information (visual texture/color vs. auditory speaker voice). The findings are in good agreement with the view of dissociable memory processing by the perirhinal cortex (anterior MTL) and parahippocampal cortex (posterior MTL), depending upon the neocortical input that these regions receive. (c) 2007 Wiley-Liss, Inc.

  13. Nodewise analytical calculation of the transfer function

    International Nuclear Information System (INIS)

    Makai, Mihaly

    1994-01-01

The space dependence of neutron noise has so far been mostly investigated in homogeneous core models. Application of core diagnostic methods to locate a malfunction requires, however, that the transfer function be calculated for real, inhomogeneous cores. A code suitable for such a purpose must be able to handle complex arithmetic and delta-function sources. Further requirements are analytical dependence in one spatial variable and fast execution. The present work describes the TIDE program written to fulfil the above requirements. The core is subdivided into homogeneous, square assemblies. An analytical solution is given, which is a generalisation of the inhomogeneous response matrix method. (author)

14. Service Line Analytics in the New Era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for the collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions, with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.

  15. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health records. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  16. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  17. Civil Society In Tanzania: An Analytical Review Of Sources Of ...

    African Journals Online (AJOL)

Sixty percent of civil societies deal with social development programmes. Additionally, results show that most civil societies had disproportionate staffing problems; sixty-six percent depended on international sources of funding, while 46% reported that they secured funds from both local and foreign sources of financing.

  18. Source-specific speciation profiles of PM2.5 for heavy metals and their anthropogenic emissions in China.

    Science.gov (United States)

    Liu, Yayong; Xing, Jia; Wang, Shuxiao; Fu, Xiao; Zheng, Haotian

    2018-08-01

Heavy metals are of concern for their adverse effects on human health and their long-term burden on biogeochemical cycling in the ecosystem. In this study, a provincial-level emission inventory of 13 heavy metals, including V, Cr, Mn, Co, Ni, Cu, Zn, As, Cd, Sn, Sb, Ba and Pb, from 10 anthropogenic sources was developed for China, based on the 2015 national emission inventory of primary particulate matter and source category-specific speciation profiles collected from 50 previous studies measured in China. Uncertainties associated with the speciation profiles were also evaluated. Our results suggest that total emissions of the 13 heavy metals in China were about 58,000 tons for the year 2015. Iron production is the dominant source of heavy metals, contributing 42% of total emissions. The emissions of heavy metals vary significantly at the regional scale, with the largest amounts concentrated in northern and eastern China. In particular, high emissions of Cr, Co, Ni, As and Sb (contributing 8%-18% of the national emissions) are found in Shandong, which has a large industrial production capacity. Uncertainty analysis suggested that the implementation of province-specific source profiles in this study significantly reduced the emission uncertainties from (-89%, 289%) to (-99%, 91%), particularly for coal combustion. However, source profiles for industry sectors such as non-metallic mineral manufacturing are quite limited, resulting in relatively high uncertainty. High-resolution emission inventories of heavy metals are essential not only for studies of their distribution, deposition, and transport, but also for the design of policies to redress critical atmospheric environmental hazards at local and regional scales. Detailed investigation of source-specific profiles in China is still needed to achieve more accurate estimations of heavy metals in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
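The inventory arithmetic implied by the abstract is a speciation step applied to a PM2.5 inventory: metal emissions are PM2.5 emissions multiplied by source-specific mass fractions, summed over sources. All figures in the sketch below are invented for illustration.

```python
# Sketch of a speciation-based heavy-metal inventory: apply source-specific
# PM2.5 speciation profiles to PM2.5 emissions and sum across sources.
# All numbers are hypothetical, not the paper's data.

pm25_emissions = {          # tons of primary PM2.5 per source
    "iron_production": 1000.0,
    "coal_combustion": 2500.0,
}
speciation = {              # mass fraction of each metal in PM2.5, per source
    "iron_production": {"Pb": 0.004, "Zn": 0.010, "Mn": 0.006},
    "coal_combustion": {"Pb": 0.001, "Zn": 0.002, "Mn": 0.001},
}

metal_emissions = {}
for source, pm in pm25_emissions.items():
    for metal, fraction in speciation[source].items():
        metal_emissions[metal] = metal_emissions.get(metal, 0.0) + pm * fraction

print(metal_emissions)  # tons of each metal
```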

  19. Design specific joint optimization of masks and sources on a very large scale

    Science.gov (United States)

    Lai, K.; Gabrani, M.; Demaris, D.; Casati, N.; Torres, A.; Sarkar, S.; Strenski, P.; Bagheri, S.; Scarpazza, D.; Rosenbluth, A. E.; Melville, D. O.; Wächter, A.; Lee, J.; Austel, V.; Szeto-Millstone, M.; Tian, K.; Barahona, F.; Inoue, T.; Sakamoto, M.

    2011-04-01

Joint optimization (JO) of source and mask together is known to produce better SMO solutions than sequential optimization of the source and the mask. However, large-scale JO problems are very difficult to solve because the global impact of the source variables causes an enormous number of mask variables to be coupled together. This work presents innovations that minimize this runtime bottleneck. The proposed SMO parallelization algorithm allows separate mask regions to be processed efficiently across multiple CPUs in a high performance computing (HPC) environment, despite the fact that a truly joint optimization is being carried out with source variables that interact across the entire mask. Building on this engine, a progressive deletion (PD) method was developed that can directly compute "binding constructs" for the optimization, i.e., our method can essentially determine the particular feature content which limits the process window attainable by the optimum source. This method allows us to minimize the uncertainty, inherent to different clustering/ranking methods, in seeking an overall optimum source that results from the use of heuristic metrics. An objective benchmarking of the effectiveness of different pattern sampling methods was performed during post-optimization analysis. The PD serves as a gold standard for developing optimum pattern clustering/ranking algorithms. With this work, it is shown that it is not necessary to exhaustively optimize the entire mask together with the source in order to identify these binding clips. If the number of clips to be optimized exceeds the practical limit of the parallel SMO engine, one can start with a pattern selection step to achieve high clip-count compression before SMO. With this LSSO capability one can address the challenging problem of layout-specific design, or improve the technology source as cell layouts and sample layouts replace lithography test structures in the development cycle.

  20. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)
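As a minimal illustration of the propagation-of-errors point, the sketch below combines Poisson counting statistics with the non-counting uncertainties of a radiochemical determination (chemical yield, detector efficiency, sample mass); all values are invented, not taken from the report.

```python
from math import sqrt

# Propagation of errors for a net count rate: the relative uncertainty of the
# result combines counting statistics with the relative uncertainties of the
# other factors, added in quadrature. Illustrative values only.

gross, t_g = 480, 1000.0   # gross counts, counting time (s)
bkg, t_b = 120, 1000.0     # background counts, counting time (s)

net_rate = gross / t_g - bkg / t_b
sigma_net = sqrt(gross / t_g**2 + bkg / t_b**2)   # Poisson counting statistics

rel_yield, rel_eff, rel_mass = 0.05, 0.03, 0.01   # other relative uncertainties
rel_total = sqrt((sigma_net / net_rate)**2
                 + rel_yield**2 + rel_eff**2 + rel_mass**2)
print(f"net rate {net_rate:.3f} c/s, "
      f"total relative uncertainty {100 * rel_total:.1f}%")
```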

  1. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    Science.gov (United States)

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences, and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, the UK, and Spain. Results for 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of enzyme results toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met, and manufacturers should improve their performance for these analytes. Standardization of enzyme results requires ongoing efforts.
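Performance specifications of this kind are conventionally derived from within-subject (CVi) and between-subject (CVg) biological variation using the Fraser relations, with "minimum" specifications set at 1.5 times the desirable level. The sketch below shows that conventional derivation with illustrative CV values; the study's exact computation may differ in detail.

```python
from math import sqrt

# Conventional BV-derived specifications (Fraser): desirable imprecision,
# bias, and allowable total error; "minimum" quality is 1.5x desirable.
# CV values are illustrative.

def specs(cv_i, cv_g, level=1.0):      # level: 1.0 desirable, 1.5 minimum
    imprecision = 0.5 * cv_i * level
    bias = 0.25 * sqrt(cv_i**2 + cv_g**2) * level
    total_error = 1.65 * imprecision + bias
    return imprecision, bias, total_error

cv_i, cv_g = 4.8, 32.0                 # within-/between-subject CVs (%)
for name, lvl in [("desirable", 1.0), ("minimum", 1.5)]:
    i, b, te = specs(cv_i, cv_g, lvl)
    print(f"{name}: CV <= {i:.2f}%, bias <= {b:.2f}%, TEa <= {te:.2f}%")
```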

2. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Geographic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL; Piburn, Jesse O [ORNL; Sorokine, Alexandre [ORNL; Myers, Aaron T [ORNL; White, Devin A [ORNL

    2015-01-01

The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings. Acknowledgment: Prepared by Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6285, managed by UT-Battelle, LLC for the U.S. Department of Energy under contract no. DE-AC05-00OR22725. Copyright: This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or

  3. "Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital": Correction to Campion, Ployhart, and Campion (2017).

    Science.gov (United States)

    2017-05-01

    Reports an error in "Using Recruitment Source Timing and Diagnosticity to Enhance Applicants' Occupation-Specific Human Capital" by Michael C. Campion, Robert E. Ployhart and Michael A. Campion ( Journal of Applied Psychology , Advanced Online Publication, Feb 02, 2017, np). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2017-04566-001.) This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Analytical representation for solution of the neutron point kinetics equation with time-dependent reactivity and free of the stiffness character

    International Nuclear Information System (INIS)

    Silva, Milena Wollmann da

    2013-01-01

In this work, we report a genuine analytical representation for the solution of the neutron point kinetics equation, free of the stiffness character, assuming that the reactivity is a continuous or sectionally continuous function of time. To this end, we initially cast the point kinetics equation as a first-order linear differential equation. Next, we split the corresponding matrix into the sum of a diagonal matrix and a matrix whose components contain the off-diagonal elements. Expanding the neutron density and the delayed neutron precursor concentrations in a truncated series, and replacing these expansions in the matrix equation, we arrive at an equation that allows us to construct a recursive system: a first-order matrix differential equation with source. The fundamental characteristic of this system relies on the fact that the corresponding matrix is diagonal, while the source term is written in terms of the matrix with the off-diagonal components. Further, the first equation of the recursive system has no source and satisfies the initial conditions, while the remaining equations satisfy null initial conditions. Due to the diagonal feature of the matrix, we attain analytical solutions for these recursive equations. We also note that we can evaluate the results for any time value, without analytical continuation, because the proposed solution is free of the stiffness character. Finally, we present numerical simulations and comparisons against literature results, considering specific applications for the following reactivity functions: constant, step, ramp, and sine. (author)
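In symbols, the decomposition described in the abstract can be sketched as follows, with D the diagonal part of the point-kinetics matrix and M the matrix of off-diagonal elements (a sketch consistent with the abstract, not a reproduction of the paper's derivation):

```latex
\frac{d\mathbf{Y}}{dt} = \left(\mathbf{D} + \mathbf{M}\right)\mathbf{Y},
\qquad
\mathbf{Y} = \sum_{i=0}^{\infty} \mathbf{Y}_i,
\qquad
\frac{d\mathbf{Y}_0}{dt} = \mathbf{D}\,\mathbf{Y}_0,\quad \mathbf{Y}_0(0) = \mathbf{Y}(0),
\qquad
\frac{d\mathbf{Y}_i}{dt} = \mathbf{D}\,\mathbf{Y}_i + \mathbf{M}\,\mathbf{Y}_{i-1},\quad \mathbf{Y}_i(0) = \mathbf{0},\ i \ge 1 .
```

Each recursive equation thus has a diagonal system matrix and a source built from the previous term, and can be solved analytically.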

  5. A genetic algorithm-based job scheduling model for big data analytics.

    Science.gov (United States)

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
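A toy version of the approach is sketched below: job orders are encoded as permutations, fitness is the makespan from a stand-in runtime estimator, and standard crossover/mutation operators evolve the schedule. The job costs and GA settings are invented; the paper's model instead relies on a cluster performance estimation module.

```python
import random

# Toy genetic algorithm for job scheduling (permutation encoding), not the
# paper's model. Lower makespan = better fitness.

JOB_COST = [7, 3, 9, 4, 6, 2]          # hypothetical per-job runtimes

def makespan(order, slots=2):
    """Greedily assign jobs, in the given order, to the earliest-free slot."""
    finish = [0.0] * slots
    for j in order:
        s = finish.index(min(finish))
        finish[s] += JOB_COST[j]
    return max(finish)

def crossover(a, b):
    """Order crossover: keep a prefix of a, fill the rest in b's order."""
    cut = random.randrange(1, len(a))
    head = a[:cut]
    return head + [j for j in b if j not in head]

def mutate(order, p=0.2):
    if random.random() < p:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

pop = [random.sample(range(len(JOB_COST)), len(JOB_COST)) for _ in range(20)]
for _ in range(50):
    pop.sort(key=makespan)             # elitist selection
    parents = pop[:10]
    children = [mutate(crossover(*random.sample(parents, 2))) for _ in range(10)]
    pop = parents + children

best = min(pop, key=makespan)
print("best order:", best, "makespan:", makespan(best))
```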

  6. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  7. Application of radioactive sources in analytical instruments for planetary exploration

    International Nuclear Information System (INIS)

    Economou, T.E.

    2008-01-01

Full text: In the past 50 years or so, many types of radioactive sources have been used in space exploration. 238Pu is often used in space missions in Radioisotope Heater Units (RHU) and Radioisotope Thermoelectric Generators (RTG) for heat and power generation, respectively. In the 1960s, 242Cm alpha radioactive sources were used for the first time in space applications, on three Surveyor spacecraft, to obtain the chemical composition of the lunar surface with an instrument based on the Rutherford backscattering of alpha particles from nuclei in the analyzed sample. 242Cm is an emitter of 6.1 MeV alpha particles. Its half-life, 163 days, is short enough to allow sources to be prepared with the necessary high intensity per unit area (up to 470 mCi and FWHM of about 1.5% in the lunar instruments), which results in a narrow energy distribution, yet long enough that the sources have adequate lifetimes for short-duration missions. 242Cm is readily prepared in curie quantities by irradiation of 241Am with neutrons in nuclear reactors, followed by chemical separation of the curium from the americium and fission products. For long-duration missions, for example missions to Mars, comets, and asteroids, the isotope 244Cm (T1/2 = 18.1 y, Eα = 5.8 MeV) is a better source because of its much longer half-life. Both of these isotopes are also excellent x-ray excitation sources and have been used for that purpose on several planetary missions. For the light elements the excitation is caused mainly by the alpha particles, while for the heavier elements (>Ca) the excitation is mainly due to the x-rays from the Pu L-lines (Ex = 14-18 keV). 244Cm has been used in several variations of the Alpha Proton X-ray Spectrometer (APXS): PHOBOS 1 and 2, Pathfinder, the Russian Mars-96 mission, Mars Exploration Rover (MER) and Rosetta. Other sources used in x-ray fluorescence instruments in space are 55Fe and 109Cd (Viking 1 and 2, Beagle 2), and 57Co is used in Moessbauer

  8. Integrated Array/Metadata Analytics

    Science.gov (United States)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive, or non-existent altogether, in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman, that already implements SQL/MDA.

  9. Argon analytical procedures for potassium-argon dating

    International Nuclear Information System (INIS)

    Gabites, J.E.; Adams, C.J.

    1981-01-01

    A manual for the argon analytical methods involved in potassium-argon geochronology, including: i) operating procedures for the ultra-high vacuum argon extraction/purification equipment for the analysis of nanolitre quantities of radiogenic argon in rocks, minerals and gases; ii) operating procedures for the AEI-MS10 gas source mass spectrometer

  10. Family of Quantum Sources for Improving Near Field Accuracy in Transducer Modeling by the Distributed Point Source Method

    Directory of Open Access Journals (Sweden)

    Dominique Placko

    2016-10-01

Full Text Available The distributed point source method, or DPSM, developed in the last decade has been used for solving various engineering problems, such as elastic and electromagnetic wave propagation, electrostatic, and fluid flow problems. Based on a semi-analytical formulation, the DPSM solution is generally built by superimposing the point source solutions or Green's functions. However, the DPSM solution can also be obtained by superimposing elemental solutions of volume sources having some source density called the equivalent source density (ESD). In earlier works mostly point sources were used. In this paper the DPSM formulation is modified to introduce a new kind of ESD, replacing the classical single point source by a family of point sources that are referred to as quantum sources. The proposed formulation with these quantum sources does not change the dimension of the global matrix to be inverted to solve the problem when compared with the classical point source-based DPSM formulation. To assess the performance of this new formulation, the ultrasonic field generated by a circular planar transducer was compared with the classical DPSM formulation and the analytical solution. The results show a significant improvement in the near-field computation.
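The core DPSM idea, superimposing Green's-function point sources, can be sketched briefly: below, the ultrasonic field of a baffled circular piston is approximated by summing free-space Helmholtz point-source solutions exp(ikr)/r over the transducer face. Geometry and frequency are illustrative, and the paper's quantum-source ESD refinement is not reproduced.

```python
import numpy as np

# DPSM-style sketch: field of a finite transducer built by superimposing
# point-source (Green's function) solutions. Parameters are illustrative.

k = 2 * np.pi * 1e6 / 1500.0          # wavenumber: 1 MHz in water (c = 1500 m/s)
a = 5e-3                               # piston radius (m)

# distribute point sources over the piston face
xs = np.linspace(-a, a, 21)
X, Y = np.meshgrid(xs, xs)
mask = X**2 + Y**2 <= a**2
src = np.stack([X[mask], Y[mask], np.zeros(mask.sum())], axis=1)

def field(obs):
    """Superpose free-space Helmholtz Green's functions exp(ikr)/r."""
    r = np.linalg.norm(obs - src, axis=1)
    return np.sum(np.exp(1j * k * r) / r)

# on-axis pressure profile
for z in [5e-3, 10e-3, 20e-3, 40e-3]:
    p = field(np.array([0.0, 0.0, z]))
    print(f"z = {z * 1e3:4.0f} mm  |p| ~ {abs(p):.1f} (arbitrary units)")
```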

  11. An Analysis of Earth Science Data Analytics Use Cases

    Science.gov (United States)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; https://earthdata.nasa.gov/about/system-performance). In addition, other US agency and international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  12. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  13. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  14. Investigation of rf plasma light sources for dye laser excitation

    International Nuclear Information System (INIS)

    Kendall, J.S.; Jaminet, J.F.

    1975-06-01

    Analytical and experimental studies were performed to assess the applicability of radio frequency (rf) induction heated plasma light sources for potential excitation of continuous dye lasers. Experimental efforts were directed toward development of a continuous light source having spectral flux and emission characteristics approaching that required for pumping organic dye lasers. Analytical studies were performed to investigate (1) methods of pulsing the light source to obtain higher radiant intensity and (2) methods of integrating the source with a reflective cavity for pumping a dye cell. (TFD)

  15. Source apportionment of elevated wintertime PAHs by compound-specific radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    R. J. Sheesley

    2009-05-01

Full Text Available Natural abundance radiocarbon analysis facilitates distinct source apportionment between contemporary biomass/biofuel (14C "alive") and fossil fuel (14C "dead") combustion. Here, the first compound-specific radiocarbon analysis (CSRA) of atmospheric polycyclic aromatic hydrocarbons (PAHs) was demonstrated for a set of samples collected in Lycksele, Sweden, a small town with frequent episodes of severe atmospheric pollution in the winter. Renewed interest in residential wood combustion (RWC) means that this type of seasonal pollution is of increasing concern in many areas. Five individual/paired PAH isolates from three pooled fortnight-long filter collections were analyzed by CSRA: phenanthrene, fluoranthene, pyrene, benzo[b+k]fluoranthene, and indeno[cd]pyrene plus benzo[ghi]perylene; phenanthrene was the only compound also analyzed in the gas phase. The measured Δ14C for PAHs spanned from −138.3‰ to 58.0‰. A simple isotopic mass balance model was applied to estimate the fraction biomass (fbiomass) contribution, which was constrained to 71-87% for the individual PAHs. Indeno[cd]pyrene plus benzo[ghi]perylene had an fbiomass of 71%, while fluoranthene and phenanthrene (gas phase) had the highest biomass contribution at 87%. The fbiomass of total organic carbon (TOC, defined as carbon remaining after removal of inorganic carbon) was estimated to be 77%, which falls within the range for PAHs. This CSRA data on atmospheric PAHs established that RWC is the dominating source of atmospheric PAHs in this region of the boreal zone, with some variation among RWC contributions to specific PAHs.
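The isotopic mass balance is a two-endmember mix: the fossil endmember is −1000‰ by definition (radiocarbon-dead), and the biomass endmember below is an assumed value typical for wood-combustion studies of that period, not stated in the abstract. With that assumption, the measured span reproduces roughly the 71-87% range quoted above.

```python
# Two-endmember isotopic mass balance for fraction biomass. The fossil
# endmember is -1000 permil by definition; the biomass endmember value is an
# assumption for illustration, not the paper's exact choice.

D14C_FOSSIL = -1000.0     # permil, 14C-dead fossil carbon
D14C_BIOMASS = 225.0      # permil, assumed contemporary wood endmember

def f_biomass(d14c_sample):
    return (d14c_sample - D14C_FOSSIL) / (D14C_BIOMASS - D14C_FOSSIL)

for d in [-138.3, 58.0]:  # the span of measured PAH values in the abstract
    print(f"Delta14C = {d:7.1f} permil -> f_biomass = {f_biomass(d):.2f}")
```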

  16. Analytic Reflected Lightcurves for Exoplanets

    Science.gov (United States)

    Haggard, Hal M.; Cowan, Nicolas B.

    2018-04-01

The disk-integrated reflected brightness of an exoplanet changes as a function of time due to orbital and rotational motion coupled with an inhomogeneous albedo map. We have previously derived analytic reflected lightcurves for spherical harmonic albedo maps in the special case of a synchronously-rotating planet on an edge-on orbit (Cowan, Fuentes & Haggard 2013). In this letter, we present analytic reflected lightcurves for the general case of a planet on an inclined orbit, with arbitrary spin period and non-zero obliquity. We do so for two different albedo basis maps: bright points (δ-maps), and spherical harmonics (Y_l^m-maps). In particular, we use Wigner D-matrices to express a harmonic lightcurve for an arbitrary viewing geometry as a non-linear combination of harmonic lightcurves for the simpler edge-on, synchronously rotating geometry. These solutions will enable future exploration of the degeneracies and information content of reflected lightcurves, as well as fast calculation of lightcurves for mapping exoplanets based on time-resolved photometry. To these ends we make available Exoplanet Analytic Reflected Lightcurves (EARL), a simple open-source code that allows rapid computation of reflected lightcurves.

  17. Online Learner Engagement: Opportunities and Challenges with Using Data Analytics

    Science.gov (United States)

    Bodily, Robert; Graham, Charles R.; Bush, Michael D.

    2017-01-01

    This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…

  18. Analytical method for determining colour intensities based on Cherenkov radiation colour quenching

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Gomez, C; Lopez-Gonzalez, J deD; Ferro-Garcia, M A [Univ. of Granada, Granada (Spain). Faculty of Sciences, Dept. of Inorganic Chemistry. Radiochemistry Section; Consejo Superior de Investigaciones Cientificas, Granada (Spain). Dept. of Chemical Research Coordinated Centre)

    1983-01-01

A study was made of a method for determining color intensities using, as a luminous non-monochromatic source, the Cherenkov emission produced in the walls of a glass capillary, which acts as a luminous source itself inside the colored solution to be evaluated. The reproducibility of this method has been compared with a spectrophotometric assay; the relative errors of both analytical methods have been calculated for different concentrations of congo red solution in the range of minimal error, according to Ringbom's criterion. The sensitivity of this analytical method has been studied for the two β-emitters employed: 90Sr/90Y and 204Tl.

  19. Technical specifications for the provision of heat and steam sources for INPP and Visaginas. Final report

    International Nuclear Information System (INIS)

    2003-01-01

In October 1999, the National Energy Strategy was approved by the Lithuanian Parliament. The National Energy Strategy included the decision to close Unit 1 of INPP before 2005. Later it was decided to close Unit 2 before the end of 2009 as well. The closure and decommissioning will have a heavy impact on the heat supply of the city of Visaginas. Units 1 and 2 of INPP supply hot water and steam to INPP for process purposes and for space heating of residential and commercial buildings. When Unit 1 is permanently shut down, reliable heat and steam sources independent of the power plant's own heat and steam generation facilities are required for safety reasons, in the event of shutdown of the remaining unit for maintenance or in an emergency. These steam and heat sources must be operational before single-unit operation is envisaged. Provision of a reliable independent heat and steam source is therefore urgent. After both reactors are shut down permanently, a steam source will be needed at the plant for radioactive waste storage and disposal. INPP and DEA have performed a feasibility study for the provision of a reliable heat source for Ignalina Nuclear Power Plant and Visaginas, and for the modernisation of the Visaginas district heating system. The objective of this project is to prepare technical specifications for the provision of new heat and steam sources for INPP and Visaginas, and for the rehabilitation of the heat transmission pipeline between INPP, the back-up boiler station, and Visaginas City. The results of the study are presented in detail in the reports and technical specifications: 1. Transient analysis for the Visaginas DH system; 2. Non-destructive testing of boiler stations, pump stations and transmission lines; 3. Conceptual design; 4. Technical specifications, Packages 1 to 6. The study has suggested: 1. Construction of a new steam boiler station; 2. Construction of a new heat-only boiler station; 3. Renovation of the existing back-up heat-only boiler station; 4

  20. Reference materials for micro-analytical nuclear techniques

    International Nuclear Information System (INIS)

    Valkovic, V.; Zeisler, R.; Bernasconi, G.; Danesi, P.R.

    1994-01-01

Direct application of many existing reference materials in micro-analytical procedures such as energy-dispersive x-ray fluorescence (EDXRF), particle-induced x-ray emission spectroscopy (PIXE) and ion probe techniques for the determination of trace elements is often impossible or difficult because: 1) other constituents present in large amounts interfere with the determination; 2) trace components are not sufficiently homogeneously distributed in the sample. Therefore, specific natural-matrix reference materials containing very low levels of trace elements and having a high degree of homogeneity are required for many micro-analytical procedures. In this report, the selection of the types of environmental and biological materials which are suitable for micro-analytical techniques is discussed. (author)

  1. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  2. System optimization for continuous on-stream elemental analysis using low-output isotopic neutron sources

    International Nuclear Information System (INIS)

    Rizk, R.A.M.

    1989-01-01

In continuous on-stream neutron activation analysis, the material to be analyzed may be continuously recirculated in a closed-loop system between an activation source and a shielded detector. In this paper an analytical formulation of the detector response for such a system is presented. This formulation should be useful in optimizing the system design parameters for specific applications. A study has been made of all parameters that influence the detector response during on-stream analysis. Feasibility applications of the method to solutions of manganese and vanadium using a 5 μg 252Cf neutron source are demonstrated. (author)
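The abstract does not reproduce the formulation; for orientation, the standard single-pass activation building block from which such loop-response models are assembled is (a generic sketch, not the paper's specific result):

```latex
A = N \,\sigma\, \phi \left(1 - e^{-\lambda t_a}\right) e^{-\lambda t_d} ,
```

where N is the number of target nuclei in the irradiated volume, σ the activation cross section, φ the neutron flux, λ the decay constant, t_a the residence time in the activation zone, and t_d the transit time from source to detector; the loop response then follows by summing such contributions over successive recirculation cycles.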

  3. FINANCIAL REPORTING AND SOURCE DOCUMENTS OF UKRAINIAN ENTERPRISES WHEN APPLYING THE IFRS

    Directory of Open Access Journals (Sweden)

    G. Golubnicha

    2013-08-01

Full Text Available The theoretical, methodological and practical aspects of changes in financial reporting and source documents specific to Ukrainian enterprises under the new conditions resulting from the application of International Financial Reporting Standards have been analyzed. A conceptual approach to defining the patterns of change in financial reporting and in the elements of the accounting method has also been proposed. The issue of internal quality control of analytical accounting information at various stages of its formation has been researched.

  4. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  5. Research in atomic and applied physics using a 6-GeV synchrotron source

    International Nuclear Information System (INIS)

    Jones, K.W.

    1985-12-01

    The Division of Atomic and Applied Physics in the Department of Applied Science at Brookhaven National Laboratory conducts a broad program of research using ion beams and synchrotron radiation for experiments in atomic physics and nuclear analytical techniques and applications. Many of the experiments would benefit greatly from the use of high energy, high intensity photon beams from a 6-GeV synchrotron source. A survey of some of the specific scientific possibilities is presented

  6. Thermal modeling of multi-shape heating sources on n-layer electronic board

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2017-01-01

Full Text Available The present work completes the toolbox of analytical solutions that deal with resolving the steady-state temperatures of a multi-layered structure heated by one or many heat sources. The problem of heating sources having non-rectangular shapes is addressed, to enlarge the capability of analytical approaches. Moreover, the various heating sources can be located on the external surfaces of the sandwiched layers as well as embedded at the interfaces of its constitutive layers. To demonstrate its relevance, the updated analytical solution has been compared with numerical simulations for the case of a multi-layered electronic board submitted to a set of heating source configurations. The comparison shows high agreement between analytical and numerical calculations in predicting the centroid and average temperatures. The promoted analytical approach establishes a kit of practical expressions, easy to implement, which can be combined, using the superposition principle, to help electronic designers detect early any component or board temperatures beyond the manufacturer's limit. The ability to eliminate bad concept candidates with a minimum of set-up, relevant assumptions, and low computation time can easily be achieved.
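The superposition idea can be illustrated with the simplest analytic kernel, a point source on a semi-infinite homogeneous solid, T = q/(2πkr): discretize an arbitrary source footprint into point sources and sum their contributions. The sketch below does this for an L-shaped source; it is a single-half-space illustration, not the paper's multi-layer solution, and all material values are invented.

```python
import numpy as np

# Superposition sketch: steady temperature rise of an arbitrarily shaped
# source built by summing point-source solutions T = q / (2*pi*k*r) on a
# semi-infinite solid. Illustrative values only.

k = 0.5                     # thermal conductivity (W/m.K), hypothetical
Q = 1.0                     # total source power (W)

# discretize an L-shaped source footprint into point sources (10 mm grid)
pts = [(x * 1e-3, y * 1e-3) for x in range(10) for y in range(10)
       if x < 4 or y < 4]
q = Q / len(pts)            # power per point source

def temp_rise(xo, yo, zo=0.2e-3):
    # small z-offset avoids the singularity when evaluating on the surface
    r = np.sqrt([(xo - x)**2 + (yo - y)**2 + zo**2 for x, y in pts])
    return float(np.sum(q / (2 * np.pi * k * r)))

print(f"temperature rise near the source corner ~ {temp_rise(2e-3, 2e-3):.1f} K")
```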

  7. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    Science.gov (United States)

    2016-06-01

The Defense Advanced Research Projects Agency (DARPA), Air Force Research Laboratory (AFRL), Charles River Analytics Inc., and TopCoder, Inc. will be holding a contest to reward... (Crowd Sourced Formal Verification - Augmentation (CSFV-A), Charles River Analytics, Inc., final technical report, June 2016, approved for public release.)

  8. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    The base oils were characterized by advanced analytical techniques, including Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated.

  9. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma optical emission spectrometry), ICP/MS (inductively coupled plasma mass spectrometry), TIMS (thermal ionization mass spectrometry) and GD/OES (glow discharge optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the excellent range of analytical tools at our disposal. (authors)

  10. Case Study : Visual Analytics in Software Product Assessments

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Lanza, M; Storey, M; Muller, H

    2009-01-01

    We present how a combination of static source code analysis, repository analysis, and visualization techniques has been used to effectively gain and communicate insight into the development and project management problems of a large industrial code base. This study is an example of how visual analytics can be applied in software product assessments.

  11. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    Science.gov (United States)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  12. YOUTH STUDIES – A SPECIFIC GENRE OF THE EMPIRICAL PARADIGM IN SOCIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    Agnė Dorelaitienė

    2017-09-01

    Full Text Available The article presents the situation of youth in contemporary society. The neoliberal economy, an ageing society, rapid globalisation, technological change, and increased social risk have prompted specific, historically unfamiliar, and fairly difficult to forecast social change. Social adaptation and the construction of one's own identity are becoming challenging for youth as a specific social group in this period of great uncertainty, risk, and opportunity. Youth studies are regarded as one of the means to help understand the youth phenomenon and form the respective policy. The aim of the article is to reveal the role of youth studies as a specific interdisciplinary genre of the empirical-analytic paradigm in social sciences. Research objectives: (1) to identify the traditions of youth studies and the differences between them; (2) to reveal the specific character of youth studies as an empirical paradigm in the contemporary context. Analysis of scientific sources and document analysis are used to achieve the aim and objectives. Since the 20th century, youth studies have developed as an independent research discipline and tradition. The perception of the notion of a young person has changed along with the development of paradigmatic and methodological research traditions. Modernity has doubtlessly contributed to young people finding their place among other age groups and to the emphasis on the importance of youth as a specific social group. Recently, youth has been viewed as both a risk group and an opportunity group. Although qualitative research prevails in the contemporary research tradition, particularly where youth emancipation is the aspiration, the empirical-analytic paradigm has not lost its relevance. The research demonstrates that the empirical-analytic paradigm is a specific genre of youth studies characterised by a quantitative approach and a strong link to policy and to the practical situation of the phenomenon.

  13. Service Quality of Online Shopping Platforms: A Case-Based Empirical and Analytical Study

    Directory of Open Access Journals (Sweden)

    Tsan-Ming Choi

    2013-01-01

    Full Text Available Customer service is crucially important for online shopping platforms (OSPs) such as eBay and Taobao. Based on well-established service quality instruments and the scenario of a specific case on Taobao, this paper explores the service quality of an OSP with the aim of revealing customer perceptions of the service quality associated with the provided functions and investigating their impacts on customer loyalty. Through an empirical study, this paper finds that the "fulfillment and responsiveness" function is significantly related to customer loyalty. A further analytical study reveals that the optimal service level on the "fulfillment and responsiveness" function for the risk-averse OSP uniquely exists. Moreover, the analytical results prove that (i) if customer loyalty is more positively correlated to the service level, it will lead to a larger optimal service level, and (ii) the optimal service level is independent of the profit target, the source of uncertainty, and the risk preference of the OSP.

  14. Distributed data networks: a blueprint for Big Data sharing and healthcare analytics.

    Science.gov (United States)

    Popovic, Jennifer R

    2017-01-01

    This paper defines the attributes of distributed data networks and outlines the data and analytic infrastructure needed to build and maintain a successful network. We use examples from one successful implementation of a large-scale, multisite, healthcare-related distributed data network, the U.S. Food and Drug Administration-sponsored Sentinel Initiative. Analytic infrastructure-development concepts are discussed from the perspective of promoting six pillars of analytic infrastructure: consistency, reusability, flexibility, scalability, transparency, and reproducibility. This paper also introduces one use case for machine learning algorithm development to fully utilize and advance the portfolio of population health analytics, particularly those using multisite administrative data sources. © 2016 New York Academy of Sciences.

  15. Analytical approximation of neutron physics data

    International Nuclear Information System (INIS)

    Badikov, S.A.; Vinogradov, V.A.; Gaj, E.V.; Rabotnov, N.S.

    1984-01-01

    A method for the analytical approximation of experimental neutron-physics data by rational functions, based on the Padé approximation, is suggested. It is shown that the behaviour of the Padé approximant near poles is an extremely favourable analytical property, essentially extending the convergence range and increasing the convergence rate compared with polynomial approximation. The Padé approximation is a particularly natural instrument for resonance curve processing, as the resonances correspond to the complex poles of the approximant. But even in the general case, analytical representation of the data in this form is convenient and compact. Thus, representation of the data on neutron threshold reaction cross sections (the BOSPOR constant library) in the form of rational functions led to an approximately twentyfold reduction of the stored numerical information compared with point-by-point tabulation at the same accuracy.
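    As a hedged illustration of the idea (using exp(x) as a stand-in for neutron data, not the BOSPOR library itself), SciPy's `pade` builds a rational approximant from the same Taylor coefficients a polynomial would use, typically remaining accurate over a wider range:

```python
import math
import numpy as np
from scipy.interpolate import pade

# Sketch: rational (Pade) vs. polynomial approximation built from the same
# Taylor coefficients; exp(x) is an illustrative stand-in function.
coeffs = [1.0 / math.factorial(k) for k in range(8)]   # Taylor coeffs of e^x
p, q = pade(coeffs, 3)                                 # [4/3] Pade approximant
poly = np.polynomial.Polynomial(coeffs)                # degree-7 polynomial

for x in (1.0, 3.0, 5.0):
    print(f"x={x}: exact={math.exp(x):.3f}  "
          f"pade={p(x) / q(x):.3f}  poly={poly(x):.3f}")
```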

  16. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    Directory of Open Access Journals (Sweden)

    Brian H Shirts

    2015-01-01

    Full Text Available The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with its annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to have an open forum of leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.

  17. Big Data Analytics with Datalog Queries on Spark.

    Science.gov (United States)

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
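    As a rough operational sketch of what such recursion means (plain PySpark, not BigDatalog's actual compiler output), the classic transitive-closure program tc(X,Y) <- edge(X,Y); tc(X,Y) <- tc(X,Z), edge(Z,Y) can be evaluated by naive fixpoint iteration of joins:

```python
from pyspark.sql import SparkSession

# Sketch: what recursive Datalog (transitive closure) boils down to on
# Spark -- iterated joins until a fixpoint. Data and names are illustrative.
spark = SparkSession.builder.appName("tc-sketch").getOrCreate()
edges = spark.createDataFrame([(1, 2), (2, 3), (3, 4)], ["src", "dst"])

tc = edges
while True:
    # tc(X, Y) <- tc(X, Z), edge(Z, Y): join on the shared variable Z ("mid")
    step = (tc.withColumnRenamed("dst", "mid")
              .join(edges.withColumnRenamed("src", "mid"), "mid")
              .select("src", "dst"))
    new_tc = tc.union(step).distinct()
    if new_tc.count() == tc.count():   # no new facts derived: fixpoint
        break
    tc = new_tc

tc.show()   # all reachable (src, dst) pairs
```

    BigDatalog's contribution, per the abstract, is to make this kind of recursion both declarative and efficient (e.g., via specialized compilation and optimization) rather than a hand-coded loop like the one above.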

  18. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    Science.gov (United States)

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting ...

  19. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemically measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of the source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty in the data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved.
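    A minimal sketch of the general idea of folding measured uncertainty into factor analysis (illustrative only, not Duewer's exact procedure): scaling each variable by its analytical uncertainty makes the noise variance roughly unity, so eigenvalues well above 1 flag factors that are insensitive to analytical uncertainty.

```python
import numpy as np

# Illustrative sketch: uncertainty-scaled principal component / factor
# extraction. Data and uncertainties are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))                    # 50 samples, 6 elements
sigma = np.array([0.10, 0.20, 0.05, 0.30, 0.15, 0.10])  # assumed uncertainties

Xw = (X - X.mean(axis=0)) / sigma               # scale by uncertainty
eigvals = np.linalg.eigvalsh(np.cov(Xw, rowvar=False))[::-1]
n_factors = int(np.sum(eigvals > 1.0))          # factors above the noise floor
print("eigenvalues:", np.round(eigvals, 2), "-> retained factors:", n_factors)
```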

  20. Data analytics in the ATLAS Distributed Computing

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2015-01-01

    The ATLAS data analytics effort is focused on creating systems which provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system, various monitoring services, etc.); providing a platform to execute arbitrary data mining and machine learning algorithms over aggregated data; satisfying a variety of use cases for different user roles; and hosting new third-party analytics services on a scalable compute platform. We describe the implemented system, where: data sources are existing RDBMS (Oracle) and Flume collectors; a Hadoop cluster is used to store the data; native Hadoop and Apache Pig scripts are used for data aggregation; and R for in-depth analytics. Part of the data is indexed in ElasticSearch so both simpler investigations and complex dashboards can be made ...

  1. Gender-partitioned patient medians of serum albumin requested by general practitioners for the assessment of analytical stability

    DEFF Research Database (Denmark)

    Hansen, Steen Ingemann; Petersen, Per Hyltoft; Lund, Flemming

    2017-01-01

    BACKGROUND: Recently, the use of separate gender-partitioned patient medians of serum sodium has revealed potential for monitoring analytical stability within the optimum analytical performance specifications for laboratory medicine. The serum albumin concentration depends on whether a patient...... patients were closely related despite considerable variation due to the current analytical variation. This relationship was confirmed by the calculated half-range for the monthly ratio between the genders of 0.44%, which surpasses the optimum analytical performance specification for bias of serum albumin...... (0.72%). The weekly ratio had a half-range of 1.83%, which surpasses the minimum analytical performance specifications of 2.15%. CONCLUSIONS: Monthly gender-partitioned patient medians of serum albumin are useful for monitoring of long-term analytical stability, where the gender medians are two...
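    A minimal pandas sketch of the kind of monitoring described, assuming a hypothetical extract with 'date', 'sex' and 'albumin' columns (this is not the authors' code, and the half-range convention is inferred from the abstract):

```python
import pandas as pd

# Hypothetical input: one albumin request per row, columns date/sex/albumin.
df = pd.read_csv("albumin_requests.csv", parse_dates=["date"])

# Gender-partitioned monthly medians of serum albumin.
monthly = (df.set_index("date")
             .groupby([pd.Grouper(freq="M"), "sex"])["albumin"]
             .median()
             .unstack("sex"))

ratio = monthly["F"] / monthly["M"]                  # ratio between genders
half_range = 100 * (ratio.max() - ratio.min()) / 2   # in percent
print(f"half-range of monthly F/M median ratio: {half_range:.2f}%")
```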

  2. On an analytical representation of the solution of the one-dimensional transport equation for a multi-group model in planar geometry

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Julio C.L.; Vilhena, Marco T.; Bodmann, Bardo E.J., E-mail: julio.lombaldo@ufrgs.br, E-mail: mtmbvilhena@gmail.com, E-mail: bardo.bodmann@ufrgs.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Dept. de Matematica Pura e Aplicada; Dulla, Sandra; Ravetto, Piero, E-mail: sandra.dulla@polito.it, E-mail: piero.ravetto@polito.it [Dipartimento di Energia, Politecnico di Torino, Piemonte (Italy)

    2015-07-01

    In this work we generalize the solution of the one-dimensional neutron transport equation to a multi-group approach in planar geometry. The basic idea of this work consists in considering the hierarchical construction of a solution for a generic number G of energy groups, starting from a mono-energetic solution. The hierarchical method follows the reasoning of the decomposition method. More specifically, the additional terms that arise from adding energy groups are incorporated into the recursive scheme as source terms. This procedure leads to an analytical representation for the solution with G energy groups. The recursion depth is related to the accuracy of the solution, which may be evaluated after each recursion step. The authors present a heuristic analysis of the stability of the results. Numerical simulations are shown for a specific example with four energy groups and a localized pulsed source. (author)
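    In generic notation (assumed here; not necessarily the authors' exact operators), the hierarchical scheme can be sketched as follows.

```latex
% Group-g planar transport equation with inter-group coupling:
\mu \frac{\partial \psi_g}{\partial x} + \Sigma_{t,g}\,\psi_g
   = \sum_{g'=1}^{G} \Sigma_{s,\,g'\to g}\,\phi_{g'} + Q_g .
% Recursion step k: coupling to the other groups enters as a known source
% built from the previous step, so each step is a mono-energetic problem:
\mu \frac{\partial \psi_g^{(k)}}{\partial x} + \Sigma_{t,g}\,\psi_g^{(k)}
   = \Sigma_{s,\,g\to g}\,\phi_g^{(k)}
   + \underbrace{\sum_{g'\neq g} \Sigma_{s,\,g'\to g}\,\phi_{g'}^{(k-1)}
                 + Q_g}_{\text{known source}} ,
% with the recursion depth k controlling the accuracy of the representation.
```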

  3. Inorganic Arsenic Determination in Food: A Review of Analytical Proposals and Quality Assessment Over the Last Six Years.

    Science.gov (United States)

    Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín

    2017-01-01

    Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed among others is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for the quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.

  4. Source-specific fine particulate air pollution and systemic inflammation in ischaemic heart disease patients

    Science.gov (United States)

    Siponen, Taina; Yli-Tuomi, Tarja; Aurela, Minna; Dufva, Hilkka; Hillamo, Risto; Hirvonen, Maija-Riitta; Huttunen, Kati; Pekkanen, Juha; Pennanen, Arto; Salonen, Iiris; Tiittanen, Pekka; Salonen, Raimo O; Lanki, Timo

    2015-01-01

    Objective To compare short-term effects of fine particles (PM2.5; aerodynamic diameter <2.5 µm) from different sources on the blood levels of markers of systemic inflammation. Methods We followed a panel of 52 ischaemic heart disease patients from 15 November 2005 to 21 April 2006, with clinic visits every second week, in the city of Kotka, Finland, and determined nine inflammatory markers from blood samples. In addition, we monitored outdoor air pollution at a fixed site during the study period and conducted a source apportionment of PM2.5 using the Environmental Protection Agency's model EPA PMF 3.0. We then analysed associations between levels of source-specific PM2.5 and markers of systemic inflammation using linear mixed models. Results We identified five source categories: regional and long-range transport (LRT), traffic, biomass combustion, sea salt, and pulp industry. We found the most evidence for a relation between air pollution and inflammation for LRT, traffic and biomass combustion; the most relevant inflammation markers were C-reactive protein, interleukin-12 and myeloperoxidase. Sea salt was not positively associated with any of the inflammatory markers. Conclusions The results suggest that PM2.5 from several sources, such as biomass combustion and traffic, promotes systemic inflammation, a risk factor for cardiovascular diseases. PMID:25479755
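    A hedged sketch of this type of analysis in Python's statsmodels (the file name and covariate set are hypothetical; the authors' actual model specification may differ):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per subject-visit, with an inflammation marker
# (CRP) and source-specific PM2.5 exposures from the apportionment step.
df = pd.read_csv("panel_visits.csv")

model = smf.mixedlm(
    "crp ~ pm25_lrt + pm25_traffic + pm25_biomass + temperature",
    data=df,
    groups=df["subject_id"],     # random intercept for repeated visits
)
result = model.fit()
print(result.summary())
```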

  5. Analytical framework for borehole heat exchanger (BHE) simulation influenced by horizontal groundwater flow and complex top boundary conditions

    Science.gov (United States)

    Rivera, Jaime; Blum, Philipp; Bayer, Peter

    2015-04-01

    Borehole heat exchangers (BHEs) are the most widely used technology for tapping low-enthalpy energy resources in the shallow subsurface. Analysis of these systems requires a proper simulation of the relevant processes controlling the transfer of heat between the BHE and the ground. Among the available simulation approaches, analytical methods are broadly accepted, especially when low computational cost and comprehensive analyses are demanded. Moreover, these methods constitute the benchmark solutions used to evaluate the performance of more complex numerical models. Within the spectrum of existing (semi-)analytical models, those based on the superposition of problem-specific Green's functions are particularly appealing. Green's functions can be derived, for instance, for point or line sources with constant or transient strengths. In the same manner, functional forms can be obtained for scenarios with complex top boundary conditions whose temperature may vary in space and time. Other relevant processes, such as advective heat transport, mechanical dispersion and heat transfer through the unsaturated zone, can be incorporated as well. A keystone of the methodology is that individual solutions can be added up by invoking the superposition principle. This leads to a flexible and robust framework for studying the interaction of multiple processes on the thermal plumes of BHEs. In this contribution, we present a new analytical framework and its verification via comparison with a numerical model. It simulates a BHE as a line source, and it integrates both horizontal groundwater flow and top boundary effects due to variable land use. All these effects may be implemented as spatially and temporally variable. For validation, the analytical framework is successfully applied to study cases where highly resolved temperature data are available.
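    The simplest Green's-function building block in such frameworks is the classical infinite line source (ILS). A minimal sketch with assumed parameters follows; the paper's framework superposes further terms for groundwater advection and transient top-boundary effects on top of kernels like this.

```python
import numpy as np
from scipy.special import exp1

# Classical infinite-line-source (ILS) kernel; parameter values are assumed.
q = 30.0        # heat extraction rate per unit BHE length (W/m)
lam = 2.0       # ground thermal conductivity (W/m-K)
alpha = 1e-6    # ground thermal diffusivity (m^2/s)

def delta_T(r, t):
    """Magnitude of the temperature change at radius r (m) after time t (s)."""
    return q / (4.0 * np.pi * lam) * exp1(r**2 / (4.0 * alpha * t))

year = 365.25 * 24 * 3600.0
print(f"Thermal drawdown at r = 1 m after 1 year: {delta_T(1.0, year):.2f} K")
```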

  6. 3D material cytometry (3DMaC): a very high-replicate, high-throughput analytical method using microfabricated, shape-specific, cell-material niches.

    Science.gov (United States)

    Parratt, Kirsten; Jeong, Jenny; Qiu, Peng; Roy, Krishnendu

    2017-08-08

    Studying cell behavior within 3D material niches is key to understanding cell biology in health and diseases, and developing biomaterials for regenerative medicine applications. Current approaches to studying these cell-material niches have low throughput and can only analyze a few replicates per experiment, resulting in reduced measurement assurance and analytical power. Here, we report 3D material cytometry (3DMaC), a novel high-throughput method based on microfabricated, shape-specific 3D cell-material niches and imaging cytometry. 3DMaC achieves rapid and highly multiplexed analyses of very high replicate numbers ("n" of 10^4-10^6) of 3D biomaterial constructs. 3DMaC overcomes current limitations of low "n", low-throughput, and "noisy" assays, to provide rapid and simultaneous analyses of potentially hundreds of parameters in 3D biomaterial cultures. The method is demonstrated here for a set of 85,000 events containing twelve distinct cell-biomaterial micro-niches, along with robust, customized computational methods for high-throughput analytics with potentially unprecedented statistical power.

  7. Data science and big data analytics discovering, analyzing, visualizing and presenting data

    CERN Document Server

    2014-01-01

    Data Science and Big Data Analytics is about harnessing the power of data for new insights. The book covers the breadth of activities, methods and tools that data scientists use. The content focuses on concepts, principles and practical applications that are applicable to any industry and technology environment, and the learning is supported and explained with examples that you can replicate using open-source software. This book will help you: become a contributor on a data science team; deploy a structured lifecycle approach to data analytics problems; apply appropriate analytic techniques and ...

  8. Assessment of Analytic Water hammer Pressure Model of FAI/08-70

    International Nuclear Information System (INIS)

    Park, Ju Yeop; Yoo, Seung Hun; Seul, Kwang-Won

    2016-01-01

    In evaluating water hammer effects on safety-related systems, methods developed by US utilities are likely to be adopted in Korea. For example, the US utility industry developed specific methods, documented in FAI/08-70, to evaluate pressure and loading transients on piping due to water hammer. The methods of FAI/08-70 would be applied in Korea in response to any regulatory request to evaluate water hammer effects due to non-condensable gas accumulation in safety-related systems. Specifically, FAI/08-70 gives an analytic model which can be used to analyze the maximum transient pressure and maximum transient loading on the piping of safety-related systems due to the non-condensable-gas-induced water hammer effect. Therefore, it is meaningful to review the FAI/08-70 methods and to apply them to a specific case to see whether they give reasonable estimates, before applying them to domestic nuclear power plants. In the present study, the analytic water hammer pressure model of FAI/08-70 is reviewed in detail and applied to a specific experiment from FAI/08-70 to see if the model gives a reasonable estimate of the peak water hammer pressure. Specifically, we assess experiment 52A of FAI/08-70, which adopts a flushed initial condition with a short rising piping length and a high-level piping length of 51 inches. The calculated analytic water hammer pressure peak shows close agreement with the measured experimental data of 52A. However, the theoretical value is slightly lower than the experimental value, which implies that the analytic model of FAI/08-70 is not conservative.
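    For orientation only, the classical Joukowsky relation Δp = ρaΔv gives a first-cut peak pressure for sudden arrest of a liquid column; FAI/08-70's model for gas-pocket-induced water hammer is more detailed, so this is merely a hedged illustration with assumed numbers.

```python
# Joukowsky first-cut estimate; all values below are assumed, and FAI/08-70's
# analytic model accounts for effects this simple relation ignores.
rho = 1000.0   # water density (kg/m^3)
a = 1200.0     # pressure-wave speed in the filled pipe (m/s)
dv = 3.0       # arrested change in flow velocity (m/s)

dP = rho * a * dv          # peak pressure rise (Pa)
print(f"Joukowsky peak pressure rise: {dP / 1e5:.1f} bar")
```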

  9. Assessment of Analytic Water hammer Pressure Model of FAI/08-70

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ju Yeop; Yoo, Seung Hun; Seul, Kwang-Won [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In evaluating water hammer effects on safety-related systems, methods developed by US utilities are likely to be adopted in Korea. For example, the US utility industry developed specific methods, documented in FAI/08-70, to evaluate pressure and loading transients on piping due to water hammer. The methods of FAI/08-70 would be applied in Korea in response to any regulatory request to evaluate water hammer effects due to non-condensable gas accumulation in safety-related systems. Specifically, FAI/08-70 gives an analytic model which can be used to analyze the maximum transient pressure and maximum transient loading on the piping of safety-related systems due to the non-condensable-gas-induced water hammer effect. Therefore, it is meaningful to review the FAI/08-70 methods and to apply them to a specific case to see whether they give reasonable estimates, before applying them to domestic nuclear power plants. In the present study, the analytic water hammer pressure model of FAI/08-70 is reviewed in detail and applied to a specific experiment from FAI/08-70 to see if the model gives a reasonable estimate of the peak water hammer pressure. Specifically, we assess experiment 52A of FAI/08-70, which adopts a flushed initial condition with a short rising piping length and a high-level piping length of 51 inches. The calculated analytic water hammer pressure peak shows close agreement with the measured experimental data of 52A. However, the theoretical value is slightly lower than the experimental value, which implies that the analytic model of FAI/08-70 is not conservative.

  10. Semi-analytic flux formulas for shielding calculations

    International Nuclear Information System (INIS)

    Wallace, O.J.

    1976-06-01

    A special coordinate system based on the work of H. Ono and A. Tsuro has been used to derive exact semi-analytic formulas for the flux from cylindrical, spherical, toroidal, rectangular, annular and truncated cone volume sources; from cylindrical, spherical, truncated cone, disk and rectangular surface sources; and from curved and tilted line sources. In most of the cases where the source is curved, shields of the same curvature are allowed in addition to the standard slab shields; cylindrical shields are also allowed in the rectangular volume source flux formula. An especially complete treatment of a cylindrical volume source is given, in which dose points may be arbitrarily located both within and outside the source, and a finite cylindrical shield may be considered. Detector points may also be specified as lying within spherical and annular source volumes. The integral functions encountered in these formulas require at most two-dimensional numeric integration in order to evaluate the flux values. The classic flux formulas involving only slab shields and slab, disk, line, sphere and truncated cone sources become some of the many special cases which are given in addition to the more general formulas mentioned above.
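    As a small worked instance of the point-kernel approach behind such formulas: the uncollided flux on the axis of an unshielded disk surface source reduces to a one-dimensional integral with the closed form phi = (S_A/4) ln(1 + R^2/h^2). The sketch below (assumed values) checks the numeric integral against the closed form.

```python
import numpy as np
from scipy.integrate import quad

# Uncollided on-axis flux from an isotropic disk surface source, no shield.
S_A = 1.0e6   # surface source strength (particles/cm^2-s), assumed
R = 50.0      # disk radius (cm)
h = 100.0     # axial detector distance (cm)

# Ring of radius r contributes S_A * 2*pi*r*dr / (4*pi*(r^2 + h^2)).
integrand = lambda r: S_A * r / (2.0 * (r**2 + h**2))
numeric, _ = quad(integrand, 0.0, R)
closed = S_A / 4.0 * np.log(1.0 + R**2 / h**2)
print(f"numeric = {numeric:.6g}, closed form = {closed:.6g}")
```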

  11. Holistic versus Analytic Evaluation of EFL Writing: A Case Study

    Science.gov (United States)

    Ghalib, Thikra K.; Al-Hattami, Abdulghani A.

    2015-01-01

    This paper investigates the performance of holistic and analytic scoring rubrics in the context of EFL writing. Specifically, the paper compares EFL students' scores on a writing task using holistic and analytic scoring rubrics. The data for the study was collected from 30 participants attending an English undergraduate program in a Yemeni…

  12. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    Science.gov (United States)

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites reflect the water quality status of rivers, surface water quality management based on such sections or sites can be effective. To improve river water quality, it is necessary to quantify the contribution ratios of pollutant sources to a specific section. Because the physical and chemical processes of nutrient pollutants in water bodies are complex, it is difficult to compute these contribution ratios quantitatively. However, water quality models have proved to be effective tools for estimating surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed to obtain water quality information along the river and to assess the contribution ratio of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Contribution ratios were then analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with the pollutant load ratios of different sources in the watershed, an analysis of the contribution ratios of pollutant sources for a specific section, which takes the localized chemical and physical processes into consideration, is more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.

  13. 40 CFR 425.03 - Sulfide analytical methods and applicability.

    Science.gov (United States)

    2010-07-01

    40 CFR 425.03 (2010-07-01 edition) - Sulfide analytical methods and applicability. Protection of Environment; Environmental Protection Agency (continued); Effluent Guidelines and Standards; Leather Tanning and Finishing Point Source Category; General Provisions ...

  14. Validation of an analytical method based on the high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals in soil.

    Science.gov (United States)

    Frentiu, Tiberiu; Ponta, Michaela; Hategan, Raluca

    2013-03-01

    The aim of this paper was the validation of a new analytical method based on high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals (Ag, Cd, Co, Cr, Cu, Ni, Pb and Zn) in soil after microwave-assisted digestion in aqua regia. Determinations were performed on the ContrAA 300 (Analytik Jena) air-acetylene flame spectrometer equipped with a xenon short-arc lamp as a continuum radiation source for all elements, a double monochromator consisting of a prism pre-monochromator and an echelle grating monochromator, and a charge-coupled device as detector. For validation, a method-performance study was conducted involving the establishment of the analytical performance of the new method (limits of detection and quantification, precision and accuracy). Moreover, the Bland and Altman statistical method was used to analyze the agreement between the proposed assay and inductively coupled plasma optical emission spectrometry as the standardized method for multielemental determination in soil. The limits of detection in soil samples (3σ criterion) in the high-resolution continuum source flame atomic absorption spectrometry method were (mg/kg): 0.18 (Ag), 0.14 (Cd), 0.36 (Co), 0.25 (Cr), 0.09 (Cu), 1.0 (Ni), 1.4 (Pb) and 0.18 (Zn), close to those in inductively coupled plasma optical emission spectrometry: 0.12 (Ag), 0.05 (Cd), 0.15 (Co), 1.4 (Cr), 0.15 (Cu), 2.5 (Ni), 2.5 (Pb) and 0.04 (Zn). Accuracy was checked by analyzing 4 certified reference materials, and good agreement at the 95% confidence level was found for both methods, with recoveries in the range of 94-106% in atomic absorption and 97-103% in optical emission. Repeatability, found by analyzing real soil samples, was in the range 1.6-5.2% in atomic absorption, similar to the 1.9-6.1% in optical emission spectrometry. The Bland and Altman method showed no statistically significant difference between the two spectrometric methods.
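    A minimal sketch of the Bland-Altman computation used for method agreement (placeholder numbers, not the paper's data): the bias is the mean of the paired differences, and the 95% limits of agreement are bias ± 1.96 times their standard deviation.

```python
import numpy as np

# Placeholder paired results for the same soil samples by the two methods.
faas = np.array([12.1, 30.4, 55.2, 8.7, 21.9])   # HR-CS FAAS, mg/kg (assumed)
icp  = np.array([11.8, 31.0, 54.1, 9.0, 22.4])   # ICP OES,    mg/kg (assumed)

diff = faas - icp
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                    # half-width of agreement band
print(f"bias = {bias:.2f} mg/kg, 95% limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}] mg/kg")
```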

  15. In search of integrated specificity: comment on Denson, Spanovic, and Miller (2009).

    Science.gov (United States)

    Miller, Gregory E

    2009-11-01

    Psychologists have long been interested in the integrated specificity hypothesis, which maintains that stressors elicit fairly distinct behavioral, emotional, and biological responses that are molded by selective pressures to meet specific demands from the environment. This issue of Psychological Bulletin features a meta-analytic review of the evidence for this proposition by T. F. Denson, M. Spanovic, and N. Miller. Their review concluded that the meta-analytic findings support the "core concept behind the integrated specificity model" (p. 845) and reveal that "within the context of a stressful event, organisms produce an integrated and coordinated response at multiple levels (i.e., cognitive, emotional, physiological)" (p. 845). I argue that conclusions such as this are unwarranted, given the data. Aside from some effects for cortisol, little evidence of specificity was presented, and most of the significant findings reported would be expected by chance alone. I also contend that Denson et al. failed to consider some important sources of evidence bearing on the specificity hypothesis, particularly how appraisals and emotions couple with autonomic nervous system endpoints and functional indices of immune response. If selective pressures did give rise to an integrated stress response, such pathways almost certainly would have been involved. By omitting such outcomes from the meta-analysis, Denson et al. overlooked what are probably the most definitive tests of the specificity hypothesis. As a result, the field is back where it started: with a lot of affection for the concept of integrated specificity but little in the way of definitive evidence to refute or accept it.

  16. Manual of selected physico-chemical analytical methods. IV

    International Nuclear Information System (INIS)

    Beran, M.; Klosova, E.; Krtil, J.; Sus, F.; Kuvik, V.; Vrbova, L.; Hamplova, M.; Lengyel, J.; Kelnar, L.; Zakouril, K.

    1990-11-01

    The Central Testing Laboratory of the Nuclear Research Institute at Rez has for a decade been participating in the development of analytical procedures and has been providing analyses of samples of different types and origin. The analytical procedures developed have been published in special journals, and a number of them in the Manuals of analytical methods, in three parts. The 4th part of the Manual contains selected physico-chemical methods developed or modified by the Laboratory in the years 1986-1990 within the project ''Development of physico-chemical analytical methods''. In most cases, the techniques involved are for non-nuclear applications. Some can find wider application, especially in analyses of environmental samples. Others have been developed for specific cases of sample analysis or require special instrumentation (a mass spectrometer), which partly restricts their applicability for other institutions. (author)

  17. Analytical model of contamination during the drying of cylinders of jamonable muscle

    Science.gov (United States)

    Montoya Arroyave, Isabel

    2014-05-01

    For a cylinder of jamonable (ham-grade) muscle of radius R and length much greater than R, assuming that the internal resistance to water transfer is much greater than the external resistance and that the internal resistance is a certain function of the distance to the axis, the pointwise moisture distribution in the cylinder is computed analytically in terms of Bessel functions. During the drying and salting process the cylinder is susceptible to contamination by bacteria and protozoa from the environment. An analytical model of contamination is presented using the diffusion equation with sources and sinks, which is solved by the method of the Laplace transform, the Bromwich integral, the residue theorem, and some special functions such as the Bessel and Heun functions. The critical time intervals of drying and salting are computed in order to obtain the minimum possible contamination. It is assumed that both external moisture and contaminants decrease exponentially with time. Contaminant profiles are plotted, and some possible techniques of contaminant detection are discussed. All computations are executed using computer algebra, specifically Maple. The results are relevant to the food industry, and some future lines of research are suggested.
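    For the constant-diffusivity special case with the surface held at the equilibrium moisture, the classical solution has the Bessel-series form below (a textbook sketch; the paper's radius-dependent internal resistance modifies this picture).

```latex
% Textbook constant-D radial drying of a long cylinder, surface at M_e;
% lambda_n are the positive roots of J_0(lambda) = 0:
\frac{M(r,t)-M_e}{M_0-M_e}
  = \sum_{n=1}^{\infty} \frac{2}{\lambda_n J_1(\lambda_n)}\,
    J_0\!\left(\lambda_n \frac{r}{R}\right)
    \exp\!\left(-\frac{D\,\lambda_n^{2}\,t}{R^{2}}\right).
```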

  18. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g., IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used that is developed independently from the development of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  19. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    The concept of 'core capability' can be a nebulous one. Even at a fairly specific level, where core capability equals maintaining essential services, it is highly dependent upon the perspective of the requester. Samples are submitted to analytical services because the requesters do not have the capability to conduct adequate analyses themselves. Some requests are for general chemical information in support of R and D, process control, or process improvement. Many analyses, however, are part of a product certification package and must comply with higher-level customer quality assurance requirements. So which services are essential to that customer - just those for product certification? Does the customer also (indirectly) need services that support process control and improvement? And what is the timeframe? Capability is often expressed in terms of the currently utilized procedures, and most programmatic customers can only plan a few years out, at best. But should core capability consider the long term, where new technologies, aging facilities, and personnel replacements must be considered? These questions, and a multitude of others, explain why attempts to gain long-term consensus on the definition of core capability have consistently failed. This preliminary report will not try to define core capability for any specific program or set of programs. Instead, it will try to address the underlying concerns that drive the desire to determine core capability. Essentially, programmatic customers want to be able to call upon analytical chemistry services to provide all the assays they need, and they don't want to pay for analytical chemistry services they don't currently use (or use infrequently). This report will focus on explaining how the current analytical capabilities and methods evolved to serve a variety of needs, with a focus on why some analytes have multiple analytical techniques and what determines the infrastructure for these analyses. This information will be ...

  20. Analytical control in metallurgical processes

    International Nuclear Information System (INIS)

    Coedo, A.G.; Dorado, M.T.; Padilla, I.

    1998-01-01

    This paper illustrates the role of analysis in enabling the metallurgical industry to meet quality demands. The steel industry, for example, faces demands from the automotive, aerospace, power generation and tinplate packaging industries, as well as environmental requirements around steel plants. Although chemical analysis technology continues to advance, achieving improved speed, precision and accuracy at lower levels of detection, the competitiveness of manufacturing industry continues to drive property demands at least at the same rate. Narrower specification ranges, lower levels of residual elements and economic pressures prescribe faster process routes, all of which lead to increased demands on the analytical function. These demands are illustrated by examples from several market sectors in which customer issues are considered together with their analytical implications. (Author) 5 refs

  1. Analytical chemistry: Principles and techniques

    International Nuclear Information System (INIS)

    Hargis, L.G.

    1988-01-01

    Although this text seems to have been intended for use in a one-semester course in undergraduate analytical chemistry, it includes the range of topics usually encountered in a two-semester introductory course in chemical analysis. The material is arranged logically for use in a two-semester course: the first 12 chapters contain the subjects most often covered in the first term, and the next 10 chapters pertain to the second (instrumental) term. The overall breadth and level of treatment are standard for an undergraduate text of this sort, and the only major omission is that of kinetic methods (a common omission in analytical texts). In the first 12 chapters, coverage of the basic material is quite good. The emphasis on the underlying principles of the techniques rather than on the specifics and design of instrumentation is welcome. This text may be more useful for the instrumental portion of an analytical chemistry course than for the solution chemistry segment. The instrumental analysis portion is appropriate for an introductory textbook.

  2. Analytical Subthreshold Current and Subthreshold Swing Models for a Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFET with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2017-08-01

    Two-dimensional (2D) analytical models for the subthreshold current and subthreshold swing of the back-gated fully depleted recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET) are presented. The surface potential is determined by solving the 2D Poisson equation in both channel and buried-oxide (BOX) regions, considering suitable boundary conditions. To derive closed-form expressions for the subthreshold characteristics, the virtual cathode potential expression has been derived in terms of the minimum of the front and back surface potentials. The effect of various device parameters such as gate oxide and Si film thicknesses, thickness of source/drain penetration into BOX, applied back-gate bias voltage, etc. on the subthreshold current and subthreshold swing has been analyzed. The validity of the proposed models is established using the Silvaco ATLAS™ 2D device simulator.
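    For orientation, the quantities being modeled can be written generically as follows (the notation is assumed here rather than taken from the paper).

```latex
% Subthreshold current is controlled by the virtual cathode potential, the
% minimum of the front- and back-surface potentials along the channel:
I_{\mathrm{sub}} \propto \exp\!\left(\frac{q\,\psi_{\mathrm{vc}}}{kT}\right),
\qquad \psi_{\mathrm{vc}} = \min\left(\psi_f,\ \psi_b\right),
% and the subthreshold swing follows from its gate-voltage sensitivity:
SS = \left(\frac{\partial \log_{10} I_D}{\partial V_{GS}}\right)^{-1}
   = \ln 10\,\frac{kT}{q}
     \left(\frac{\partial \psi_{\mathrm{vc}}}{\partial V_{GS}}\right)^{-1}.
```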

  3. pyJac: Analytical Jacobian generator for chemical kinetics

    Science.gov (United States)

    Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen

    2017-06-01

    Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. In order to integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, finite differences are used to approximate these numerically, but for larger chemical kinetic models this poses significant computational demands, since the number of chemical source term evaluations scales with the square of the species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (number of species ranging 13-360) by showing agreement within 0.001% of matrices obtained via automatic differentiation. We then demonstrate the performance achievable on CPUs and GPUs using pyJac.
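    A small sketch of the scaling problem that analytical generation avoids (toy source term, not a real kinetic model): a finite-difference Jacobian needs one full source-term evaluation per column, so its cost grows as O(N^2) with species count N.

```python
import numpy as np

def source_term(y):
    """Toy stand-in for chemical source terms (elementwise, so the true
    Jacobian is diagonal); a real kinetic model would be far costlier."""
    return np.sin(y) + y**2

def fd_jacobian(f, y, eps=1e-8):
    """First-order finite-difference Jacobian: one perturbed evaluation of f
    per column, hence O(N) evaluations each costing O(N) -- O(N^2) overall."""
    f0 = f(y)
    J = np.empty((y.size, y.size))
    for j in range(y.size):
        yp = y.copy()
        yp[j] += eps
        J[:, j] = (f(yp) - f0) / eps
    return J

y = np.linspace(0.1, 1.0, 5)
print(np.round(fd_jacobian(source_term, y), 4))
```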

  4. Timeframe Dependent Fragment Ions Observed in In-Source Decay Experiments with β-Casein Using MALDI MS.

    Science.gov (United States)

    Sekiya, Sadanori; Nagoshi, Keishiro; Iwamoto, Shinichi; Tanaka, Koichi; Takayama, Mitsuo

    2015-09-01

    The fragment ions observed with time-of-flight (TOF) and quadrupole ion trap (QIT) TOF mass spectrometers (MS) combined with matrix-assisted laser desorption/ionization in-source decay (MALDI-ISD) experiments of phosphorylated analytes β-casein and its model peptide were compared from the standpoint of the residence timeframe of analyte and fragment ions in the MALDI ion source and QIT cell. The QIT-TOF MS gave fragment c-, z'-, z-ANL, y-, and b-ions, and further degraded fragments originating from the loss of neutrals such as H2O, NH3, CH2O (from serine), C2H4O (from threonine), and H3PO4, whereas the TOF MS merely showed MALDI source-generated fragment c-, z'-, z-ANL, y-, and w-ions. The fragment ions observed in the QIT-TOF MS could be explained by the injection of the source-generated ions into the QIT cell or a cooperative effect of a little internal energy deposition, a long residence timeframe (140 ms) in the QIT cell, and specific amino acid effects on low-energy CID, whereas the source-generated fragments (c-, z'-, z-ANL, y-, and w-ions) could be a result of prompt radical-initiated fragmentation of hydrogen-abundant radical ions [M + H + H]+ and [M + H - H]- within the 53 ns timeframe, which corresponds to the delayed extraction time. The further degraded fragment b/y-ions produced in the QIT cell were confirmed by positive- and negative-ion low-energy CID experiments performed on the source-generated ions (c-, z'-, and y-ions). The loss of phosphoric acid (98 u) from analyte and fragment ions can be explained by a slow ergodic fragmentation independent of positive and negative charges.

  5. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World’s Largest Open Source Data Sets

    Directory of Open Access Journals (Sweden)

    J. Piburn

    2017-10-01

    Full Text Available Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  6. Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2018-01-01

    A novel 2-D analytical drain current model of a Dual Metal Gate Tunnel Field Effect Transistor based on a MOSFET structure (DMG-TFET) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region, hence making it act as a P+ source region. The electric field is derived and utilized to extract an expression for the drain current by analytically integrating the band-to-band tunneling generation rate over the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and the electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. To validate the present model, the analytical results have been compared with the SILVACO ATLAS device simulator, and good agreement is found.
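    Assuming the common Kane-type band-to-band tunneling model (an assumption here; the paper may use a different generation-rate form), the drain current follows from integrating the generation rate over the tunneling region.

```latex
% Kane-type band-to-band tunneling generation rate and resulting current:
G_{\mathrm{BTBT}} = A\,\frac{|E|^{2}}{\sqrt{E_g}}
  \exp\!\left(-B\,\frac{E_g^{3/2}}{|E|}\right),
\qquad
I_D = q \int_{\text{tunneling region}} G_{\mathrm{BTBT}}\; \mathrm{d}V ,
% where E is the local electric field from the 2-D Poisson solution and
% A, B are material-dependent tunneling parameters.
```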

  8. 77 FR 56176 - Analytical Methods Used in Periodic Reporting

    Science.gov (United States)

    2012-09-12

    ... informal rulemaking proceeding to consider changes in analytical principles (Proposals Six and Seven) used... (Proposals Six and Seven), September 4, 2012 (Petition). Proposal Six: Use of Foreign Postal Settlement System as Sole Source for Reporting of Inbound International Revenue, Pieces, and Weights. The Postal...

  9. Road Transportable Analytical Laboratory (RTAL) system

    International Nuclear Information System (INIS)

    1993-01-01

    The goal of this contractual effort is the development and demonstration of a Road Transportable Analytical Laboratory (RTAL) system to meet the unique needs of the Department of Energy (DOE) for rapid, accurate analysis of a wide variety of hazardous and radioactive contaminants in soil, groundwater, and surface waters. This laboratory system will be designed to provide the field and laboratory analytical equipment necessary to detect and quantify radionuclides, organics, heavy metals and other inorganics, and explosive materials. The planned laboratory system will consist of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific needs

  10. A very high yield electron impact ion source for analytical mass spectrometry

    International Nuclear Information System (INIS)

    Koontz, S.L.; Bonner Denton, M.

    1981-01-01

    A novel ion source designed for use in mass spectrometric determination of organic compounds is described. The source is designed around a low pressure, large volume, hot cathode Penning discharge. The source operates in the 10⁻⁴-10⁻⁷ torr pressure domain and is capable of producing focusable current densities several orders of magnitude greater than those produced by conventional Nier-type sources. Mass spectra of n-butane and octafluoro-2-butene are presented. An improved signal-to-noise ratio is demonstrated with a General Electric Monopole 300 mass spectrometer. (orig.)

  11. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six point framework that can be applied to a variety of library settings for effective system based, data-driven management. Library Improvement Through Data Analytics includes: the basics of statistical concepts; recommended data sources for various library functions and processes, and guidance for using census, university, or government data in analysis; techniques for cleaning data; matching data to appropriate data analysis methods; how to make descriptive statistics m...

  12. The analytic nodal method in cylindrical geometry

    International Nuclear Information System (INIS)

    Prinsloo, Rian H.; Tomasevic, Djordje I.

    2008-01-01

    Nodal diffusion methods have been used extensively in nuclear reactor calculations, not only for their performance advantage but also for their superior accuracy. More specifically, the Analytic Nodal Method (ANM), utilising the transverse integration principle, has been applied to numerous reactor problems with much success. In this work, a nodal diffusion method is developed for cylindrical geometry. Application of this method to three-dimensional (3D) cylindrical geometry has never been satisfactorily addressed, and we propose a solution which entails the use of conformal mapping. A set of 1D equations with an adjusted, geometrically dependent, inhomogeneous source is obtained. This work describes the development of the method and associated test code, as well as its application to realistic reactor problems. Numerical results are given for the PBMR-400 MW benchmark problem, as well as for a 'cylindrisized' version of the well-known 3D LWR IAEA benchmark. Results highlight the improved accuracy and performance over finite-difference core solutions and investigate the applicability of nodal methods to 3D PBMR type problems. Results indicate that cylindrical nodal methods definitely have a place within PBMR applications, yielding performance advantage factors of 10 and 20 for 2D and 3D calculations, respectively, and advantage factors of the order of 1000 in the case of the LWR problem.

  13. Genesis of theory and analysis of practice of applying the analytical procedures in auditing

    OpenAIRE

    Сурніна, К. С.

    2012-01-01

    The article investigates how different researchers define the concept of "analytical procedures" in auditing, and sets out the authors' own view of the need for wide use of analytical procedures in audits. A classification of analytical procedures is presented, taking into account the specificity of the auditing process as a whole.

  14. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    International Nuclear Information System (INIS)

    Chen, Ming; Yu, Hengyong

    2015-01-01

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended to horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. With this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only retain most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  15. Specific absorbed fractions of energy at various ages from internal photon sources: 1, Methods

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

    Specific absorbed fractions (PHI's) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. This volume outlines various methods used to compute the PHI-values and describes how the "best" estimates recommended by us are chosen. These PHI-values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods that Spiers and co-workers developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate PHI in the endosteal cells and can better estimate PHI in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 41 refs., 25 figs., 23 tabs

  16. Life cycle management of analytical methods.

    Science.gov (United States)

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also emerged for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies, too, appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the adoption of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Analytic posteriors for Pearson's correlation coefficient.

    Science.gov (United States)

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.
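
    The paper's exact analytic posterior is implemented in JASP and is not reproduced here; as a rough illustrative stand-in, the classical Fisher z-transform gives a closed-form normal approximation to the posterior of the correlation under a flat prior on the transformed scale. A minimal sketch, clearly not the authors' exact result:

```python
import numpy as np
from scipy import stats

def approx_rho_posterior(r, n, grid=np.linspace(-0.999, 0.999, 2001)):
    """Normal approximation to the posterior of Pearson's rho.

    Uses the Fisher z-transform: atanh(r) | rho ~ N(atanh(rho), 1/(n-3)).
    With a flat prior on atanh(rho), the posterior of z = atanh(rho) is
    N(atanh(r), 1/(n-3)); transform back with the Jacobian 1/(1-rho^2).
    """
    z_hat, se = np.arctanh(r), 1.0 / np.sqrt(n - 3)
    dens = stats.norm.pdf(np.arctanh(grid), z_hat, se) / (1.0 - grid**2)
    return grid, dens / np.trapz(dens, grid)   # normalize on the grid

rho, pdf = approx_rho_posterior(r=0.6, n=30)
print("posterior mode ~", rho[np.argmax(pdf)])
```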

  18. Analytic posteriors for Pearson's correlation coefficient

    OpenAIRE

    Ly, A.; Marsman, M.; Wagenmakers, E.-J.

    2018-01-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  19. Supercritical boiler material selection using fuzzy analytic network process

    Directory of Open Access Journals (Sweden)

    Saikat Ranjan Maity

    2012-08-01

    Full Text Available The development of the world is being adversely affected by the scarcity of power and energy. To survive in the coming generations, it is thus necessary to explore non-conventional energy sources and to consume the available sources efficiently. For efficient exploitation of existing energy sources, great scope lies in the use of Rankine cycle-based thermal power plants. Today, the gross efficiency of Rankine cycle-based thermal power plants is less than 28%, which has been increased up to 40% with reheating and regenerative cycles. It can be further improved up to 47% by using supercritical power plant technology. Supercritical power plants use supercritical boilers which are able to withstand very high temperatures (650-720˚C) and pressures (22.1 MPa) while producing superheated steam. The thermal efficiency of a supercritical boiler greatly depends on the materials of its different components. A supercritical boiler material should possess high creep rupture strength, high thermal conductivity, low thermal expansion, high specific heat and the ability to withstand very high temperatures. This paper considers a list of seven supercritical boiler materials whose performance is evaluated based on seven pivotal criteria. Given the intricacy of this material selection problem, with interactions and interdependencies between different criteria, this paper applies the fuzzy analytic network process to select the most appropriate material for a supercritical boiler. Rene 41 emerges as the best supercritical boiler material, whereas Haynes 230 is the least preferred choice.
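
    The crisp core of any analytic network/hierarchy process is the pairwise comparison step: derive priority weights as the principal eigenvector of a comparison matrix and check its consistency. A minimal sketch of that step is below; the 3x3 matrix and criteria are invented for illustration, and the paper's fuzzy extension and network interdependencies are not reproduced.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights = principal right eigenvector of the comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)                  # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(pairwise):
    """Saaty consistency ratio; RI values for n = 3..7 (standard table)."""
    n = pairwise.shape[0]
    lam = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    return ci / ri

# Invented 3-criteria example (creep strength vs conductivity vs cost)
A = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]], dtype=float)
print("weights:", ahp_weights(A), "CR:", consistency_ratio(A))
```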

  20. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  1. Quality specifications for the extra-analytical phase of laboratory testing: Reference intervals and decision limits.

    Science.gov (United States)

    Ceriotti, Ferruccio

    2017-07-01

    Reference intervals and decision limits are a critical part of the clinical laboratory report. Evaluation of their correct use is a tool to verify post-analytical quality. Four elements are identified as indicators. 1. The use of decision limits for lipids and glycated hemoglobin. 2. The use, whenever possible, of common reference values. 3. The presence of gender-related reference intervals for at least the following common serum measurands (besides, obviously, the fertility-related hormones): alkaline phosphatase (ALP), alanine aminotransferase (ALT), creatine kinase (CK), creatinine, gamma-glutamyl transferase (GGT), IgM, ferritin, iron, transferrin, urate, red blood cells (RBC), hemoglobin (Hb) and hematocrit (Hct). 4. The presence of age-related reference intervals. The problem of specific reference intervals for elderly people is discussed, but their use is not recommended; on the contrary, pediatric age-related reference intervals are necessary at least for the following common serum measurands: ALP, amylase, creatinine, inorganic phosphate, lactate dehydrogenase, aspartate aminotransferase, urate, insulin-like growth factor 1, white blood cells, RBC, Hb, Hct, alpha-fetoprotein and the fertility-related hormones. The lack of such reference intervals may imply significant risks for patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. An analytical optimization method for electric propulsion orbit transfer vehicles

    International Nuclear Information System (INIS)

    Oleson, S.R.

    1993-01-01

    Due to electric propulsion's inherent propellant mass savings over chemical propulsion, electric propulsion orbit transfer vehicles (EPOTVs) are a highly efficient mode of orbit transfer. When selecting an electric propulsion device (ion, MPD, or arcjet) and propellant for a particular mission, it is preferable to use quick, analytical system optimization methods instead of time-intensive numerical integration methods. It is also of interest to determine each thruster's optimal operating characteristics for a specific mission. Analytical expressions are derived which determine the optimal specific impulse (Isp) for each type of electric thruster to maximize payload fraction for a desired thrusting time. These expressions take into account the variation of thruster efficiency with specific impulse. Verification of the method is made with representative electric propulsion values on a LEO-to-GEO mission. Application of the method to specific missions is discussed
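
    The paper's closed-form expressions are not given in the abstract, but the underlying trade-off (higher Isp saves propellant while adding power-system mass for a fixed thrusting time) can be illustrated with a simple Stuhlinger-style payload fraction model. All numbers below (delta-v, thrusting time, specific mass, and the efficiency curve) are assumptions for illustration:

```python
import numpy as np

G0 = 9.81                 # m/s^2
DV = 4700.0               # m/s, representative low-thrust LEO-to-GEO delta-v (assumed)
T_THRUST = 180 * 86400.0  # s, 180-day thrusting time (assumed)
ALPHA = 0.025             # kg/W, power-system specific mass (assumed)

def efficiency(isp):
    # Assumed thruster efficiency rising with Isp (loss-voltage style model)
    return isp**2 / (isp**2 + 1500.0**2)

def payload_fraction(isp):
    ve = G0 * isp
    f_prop = 1.0 - np.exp(-DV / ve)           # propellant fraction (rocket eq.)
    # Power-system fraction: P = m_prop * ve^2 / (2 eta t), mass = ALPHA * P
    f_power = ALPHA * f_prop * ve**2 / (2.0 * efficiency(isp) * T_THRUST)
    return np.exp(-DV / ve) - f_power

isp = np.linspace(500, 6000, 1101)
best = isp[np.argmax(payload_fraction(isp))]
print(f"optimal Isp ~ {best:.0f} s, payload fraction ~ {payload_fraction(best):.3f}")
```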

  3. Open-Source Radiation Exposure Extraction Engine (RE3) with Patient-Specific Outlier Detection.

    Science.gov (United States)

    Weisenthal, Samuel J; Folio, Les; Kovacs, William; Seff, Ari; Derderian, Vana; Summers, Ronald M; Yao, Jianhua

    2016-08-01

    We present an open-source, picture archiving and communication system (PACS)-integrated radiation exposure extraction engine (RE3) that provides study-, series-, and slice-specific data for automated monitoring of computed tomography (CT) radiation exposure. RE3 was built using open-source components and seamlessly integrates with the PACS. RE3 calculations of dose length product (DLP) from the Digital Imaging and Communications in Medicine (DICOM) headers showed high agreement (R² = 0.99) with the vendor dose pages. For study-specific outlier detection, RE3 constructs robust, automatically updating multivariable regression models to predict DLP in the context of patient gender and age, scan length, water-equivalent diameter (Dw), and scanned body volume (SBV). As proof of concept, the model was trained on 811 CT chest, abdomen + pelvis (CAP) exams and 29 outliers were detected. The continuous variables used in the outlier detection model were scan length (R² = 0.45), Dw (R² = 0.70), SBV (R² = 0.80), and age (R² = 0.01). The categorical variables were gender (male average 1182.7 ± 26.3 and female 1047.1 ± 26.9 mGy cm) and pediatric status (pediatric average 710.7 ± 73.6 mGy cm and adult 1134.5 ± 19.3 mGy cm).
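
    The outlier-detection idea (regress DLP on patient and scan covariates, then flag exams with large residuals) can be sketched with ordinary least squares and a simple 3-sigma residual rule. This is an illustrative sketch, not RE3's actual implementation; the covariates are the ones named in the abstract, but the data and threshold are invented.

```python
import numpy as np

def fit_dlp_model(X, dlp):
    """Ordinary least squares: DLP ~ scan length, Dw, SBV, age, etc."""
    A = np.column_stack([np.ones(len(X)), X])       # add intercept
    coef, *_ = np.linalg.lstsq(A, dlp, rcond=None)
    resid = dlp - A @ coef
    return coef, resid.std(ddof=A.shape[1])

def flag_outliers(X, dlp, coef, sigma, k=3.0):
    """Flag exams whose DLP deviates more than k*sigma from the prediction."""
    A = np.column_stack([np.ones(len(X)), X])
    resid = dlp - A @ coef
    return np.abs(resid) > k * sigma

# Synthetic demo: 500 CAP exams (values invented for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                 # scan length, Dw, SBV, age (scaled)
dlp = 1100 + X @ [80, 160, 220, 10] + rng.normal(0, 60, 500)
dlp[:5] += 400                                # implant a few high-dose outliers
coef, sigma = fit_dlp_model(X, dlp)
print("flagged:", np.where(flag_outliers(X, dlp, coef, sigma))[0][:10])
```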

  4. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with other laboratories through external quality control. In this way it has a tool to detect whether the set objectives are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically described as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions
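
    The statistics named above (means, standard deviations, coefficients of variation, and systematic, random, and total error) reduce to a few lines of code. A minimal sketch, assuming the commonly used total-error model TE = |bias| + 1.65 x CV; the control values and target are invented.

```python
import numpy as np

def qc_summary(measurements, target):
    """Basic internal QC statistics for one control material and analyte."""
    x = np.asarray(measurements, dtype=float)
    mean, sd = x.mean(), x.std(ddof=1)
    cv = 100.0 * sd / mean                       # random error, %
    bias = 100.0 * (mean - target) / target      # systematic error, %
    total_error = abs(bias) + 1.65 * cv          # common TE model (assumed here)
    return {"mean": mean, "sd": sd, "cv%": cv, "bias%": bias, "TE%": total_error}

# Example: 20 runs of a control with an assigned target of 5.0 mmol/L
runs = [5.1, 4.9, 5.0, 5.2, 4.8, 5.1, 5.0, 4.9, 5.3, 5.0,
        5.1, 5.0, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0, 5.1, 4.9]
summary = qc_summary(runs, target=5.0)
print({k: round(v, 2) for k, v in summary.items()})
```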

  5. Suitability Evaluation of Specific Shallow Geothermal Technologies Using a GIS-Based Multi Criteria Decision Analysis Implementing the Analytic Hierarchic Process

    Directory of Open Access Journals (Sweden)

    Francesco Tinti

    2018-02-01

    Full Text Available The exploitation potential of shallow geothermal energy is usually defined in terms of site-specific ground thermal characteristics. While true, this assumption limits the complexity of the analysis, since feasibility studies involve many other components that must be taken into account when calculating the effective market viability of a geothermal technology or the economic value of a shallow geothermal project. In addition, the results of a feasibility study are not simply the sum of the various factors since some components may be conflicting while others will be of a qualitative nature only. Different approaches are therefore needed to evaluate the suitability of an area for shallow geothermal installation. This paper introduces a new GIS platform-based multicriteria decision analysis method aimed at comparing as many different shallow geothermal relevant factors as possible. Using the Analytic Hierarchic Process Tool, a geolocalized Suitability Index was obtained for a specific technological case: the integrated technologies developed within the GEOTeCH Project. A suitability map for the technologies in question was drawn up for Europe.

  6. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples, that is, samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  7. Environmental forensic principles for source allocation of polycyclic aromatic hydrocarbons

    International Nuclear Information System (INIS)

    O'Sullivan, G.; Martin, E.; Sandau, C.D.

    2008-01-01

    Polycyclic aromatic hydrocarbons (PAH) are organic compounds composed only of carbon and hydrogen, with a fused ring structure containing at least two six-sided benzene rings but possibly also additional fused rings that are not six-sided. The environmental forensic principles for source allocation of PAHs were examined in this presentation. Specifically, the presentation addressed the structure and physiochemical properties of PAHs; sources and sinks; fate and behaviour; analytical techniques; conventional source identification techniques; and toxic equivalent fingerprinting. It presented a case study where residents had allegedly been exposed to dioxins, PAHs and metals released from a railroad tie treatment plant. PAHs are classified by origin as biogenic, petrogenic, or pyrogenic, classes governed by their formation conditions and thermodynamic properties. A number of techniques were applied, including chemical fingerprinting; molecular diagnostic ratios; cluster analysis; principal component analysis; and TEF fingerprinting. These techniques have shown that the suspected impacted sites do not all share similar PAH signatures, indicating the potential for various sources. Several sites shared signatures similar to background locations. tabs., figs
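
    Molecular diagnostic ratios, one of the techniques listed, are simple concentration ratios of PAH isomer pairs compared against literature cut-offs. A minimal sketch using two widely quoted ratios is below; the thresholds are the commonly cited ones and the concentrations are invented, so treat the classification as indicative only.

```python
def pah_diagnostic_ratios(conc):
    """Classify a PAH source from isomer ratios (ng/g concentrations).

    Cut-offs follow values commonly quoted in the PAH forensics literature:
      Ant/(Ant+Phe) < 0.1  -> petrogenic, otherwise pyrogenic
      Fla/(Fla+Pyr) < 0.4  -> petrogenic; 0.4-0.5 fossil fuel combustion;
                              > 0.5 biomass/coal combustion
    """
    r_ant = conc["Ant"] / (conc["Ant"] + conc["Phe"])
    r_fla = conc["Fla"] / (conc["Fla"] + conc["Pyr"])
    src_ant = "petrogenic" if r_ant < 0.10 else "pyrogenic"
    if r_fla < 0.40:
        src_fla = "petrogenic"
    elif r_fla <= 0.50:
        src_fla = "fossil fuel combustion"
    else:
        src_fla = "biomass/coal combustion"
    return {"Ant/(Ant+Phe)": (round(r_ant, 2), src_ant),
            "Fla/(Fla+Pyr)": (round(r_fla, 2), src_fla)}

# Invented sample concentrations (ng/g)
print(pah_diagnostic_ratios({"Ant": 12, "Phe": 150, "Fla": 95, "Pyr": 70}))
```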

  8. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  9. Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.

    Science.gov (United States)

    Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin

    2013-09-01

    It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write the analytics and are not clear on how to make the analytics work in real time on high-velocity data. Our paper focuses on the applications relevant to a healthcare analytics scenario, specifically the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine learning-based categorizer, running within a Storm environment, to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
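
    The R-to-Storm reengineering and the trained categorizer cannot be reconstructed from the abstract; as an illustrative stand-in for the streaming part, the sketch below flags high RR-interval variability, a crude arrhythmia indicator, over a sliding window of an (assumed normalized) ECG sample stream. All thresholds are invented.

```python
import numpy as np
from collections import deque

class RRMonitor:
    """Streaming RR-interval monitor; flags windows with irregular rhythm."""

    def __init__(self, fs=250.0, window=10, cv_limit=0.15):
        self.fs, self.cv_limit = fs, cv_limit
        self.rr = deque(maxlen=window)      # last N RR intervals (s)
        self.last_peak = None
        self.n = 0

    def push(self, sample, threshold=0.6):
        """Feed one ECG sample; returns True when the window looks irregular."""
        # Crude R-peak detection: amplitude threshold + 0.25 s refractory period
        if sample > threshold and (self.last_peak is None
                                   or self.n - self.last_peak > 0.25 * self.fs):
            if self.last_peak is not None:
                self.rr.append((self.n - self.last_peak) / self.fs)
            self.last_peak = self.n
        self.n += 1
        if len(self.rr) == self.rr.maxlen:
            rr = np.array(self.rr)
            return rr.std() / rr.mean() > self.cv_limit   # high variability
        return False
```

    In a Storm-style topology, an instance of this class would live inside a bolt, with `push` called once per tuple arriving from the ECG spout.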

  10. Determination of uranium in ground water using different analytical techniques

    International Nuclear Information System (INIS)

    Sahu, S.K.; Maity, Sukanta; Bhangare, R.C.; Pandit, G.G.; Sharma, D.N.

    2014-10-01

    Concern over the presence of natural radionuclides like uranium in drinking water has been growing recently. The contamination of aquifers with radionuclides depends on a number of factors. The geology of an area is the most important factor, along with anthropogenic activities like mining, coal ash disposal from thermal power plants, use of phosphate fertilizers, etc. Whatever the source, the presence of uranium in drinking water is a matter of great concern for public health. Studies show that uranium is a chemo-toxic and nephrotoxic heavy metal, affecting the kidneys and bones in particular. In view of the potential health hazards from natural radionuclides in drinking water, many countries worldwide have adopted the guideline activity concentrations for drinking water quality recommended by the WHO (2011). For uranium, the WHO has set a limit of 30 μg L⁻¹ in drinking water. The geological distribution of uranium and its migration in the environment are of interest because the element poses environmental and exposure concerns. It is therefore desirable to use an analytical technique for uranium analysis in water which is highly sensitive, especially at trace levels, specific and precise in the presence of other naturally occurring major and trace metals, and which needs only a small amount of sample. Various analytical methods based on different techniques have been developed in the past for the determination of uranium in geological samples. The determination of uranium requires high selectivity due to its strong association with other elements. Several trace-level wet chemistry analytical techniques have been reported for uranium determination, but most of these involve tedious and painstaking procedures, high detection limits, interferences, etc. Each analytical technique has its own merits and demerits. Comparative assessment by different techniques can provide better quality control and assurance. In the present study, uranium was analysed in ground water samples

  11. Specific absorbed fractions of energy at various ages from internal photon sources: 7, Adult male

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

    Specific absorbed fractions (PHI's) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. In this volume PHI-values are tabulated for an adult male (70-kg Reference Man). These PHI-values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate PHI in the endosteal cells and can better estimate PHI in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 12 refs., 2 tabs

  12. Identification of specific sources of airborne particles emitted from within a complex industrial (steelworks) site

    Science.gov (United States)

    Beddows, D. C. S.; Harrison, Roy M.

    2018-06-01

    A case study is provided of the development and application of methods to identify and quantify specific sources of emissions from within a large, complex industrial site. Methods include directional analysis of concentrations, chemical source tracers and correlations with gaseous emissions. Extensive measurements of PM10, PM2.5, trace gases, particulate elements and single particle mass spectra were made at sites around the Port Talbot steelworks in 2012. By using wind direction data in conjunction with real-time or hourly-average pollutant concentration measurements, it has been possible to locate areas within the steelworks associated with enhanced pollutant emissions. Directional analysis highlights the Slag Handling area of the works as the most substantial source of elevated PM10 concentrations during the measurement period. Chemical analyses of air sampled from the relevant wind directions are consistent with the anticipated composition of slags, as are single particle mass spectra. Elevated concentrations of PM10 are related to the inverse distance from the Slag Handling area, and concentrations increase with increased wind speed, consistent with a wind-driven resuspension source. There also appears to be a lesser source associated with Sinter Plant emissions affecting PM10 concentrations at the Fire Station monitoring site. The results are compared with an ME-2 study using some of the same data, and shown to give a clearer view of the location and characteristics of emission sources, including fugitive dusts.
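
    The directional analysis described amounts to conditioning pollutant concentrations on wind direction. A minimal sketch computing the mean PM10 per wind-direction sector (a simple pollution rose) is below, with invented hourly data standing in for the Port Talbot measurements.

```python
import numpy as np

def directional_means(conc, wind_dir_deg, sector_width=30):
    """Mean concentration per wind-direction sector (e.g. 12 x 30 degrees)."""
    edges = np.arange(0, 360 + sector_width, sector_width)
    sectors = np.digitize(np.mod(wind_dir_deg, 360), edges) - 1
    return {f"{edges[s]}-{edges[s+1]} deg":
            float(np.mean(conc[sectors == s]))
            for s in range(len(edges) - 1) if np.any(sectors == s)}

# Invented hourly data: PM10 elevated when wind blows from 90-120 degrees
rng = np.random.default_rng(1)
wd = rng.uniform(0, 360, 2000)
pm10 = rng.gamma(4, 5, 2000) + np.where((wd > 90) & (wd < 120), 40, 0)
rose = directional_means(pm10, wd)
print(max(rose.items(), key=lambda kv: kv[1]))   # sector with highest mean
```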

  13. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2 requires that the waste form producers must report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16-17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms

  14. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
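
    All estimators in this family ultimately evaluate the initial temperature slope after a power step, since at t = 0+ conduction and perfusion have not yet acted and SAR = c (dT/dt). The sketch below implements the simple early-window linear variant (the commonly used method the paper improves upon), not the paper's radial-Gaussian analytical fit; the tissue properties and heating curve are invented.

```python
import numpy as np

def sar_from_initial_slope(t, temperature, c_tissue=3600.0, fit_window=1.0):
    """Estimate SAR (W/kg) from the initial slope of a heating curve.

    SAR = c * dT/dt at t = 0+, with c the tissue specific heat (J/kg/K).
    Fits a line over the first `fit_window` seconds after power-on; the
    paper's analytical method instead fits a radial-Gaussian solution to
    reduce the bias this early-window fit incurs from conduction.
    """
    mask = (t >= 0) & (t <= fit_window)
    slope = np.polyfit(t[mask], temperature[mask], 1)[0]   # K/s
    return c_tissue * slope

# Synthetic heating curve: true SAR 25 W/kg, with conduction slowing the rise
t = np.linspace(0, 10, 501)
temp = (25.0 / 3600.0) * 2.0 * (1 - np.exp(-t / 2.0))     # toy curve, K
print(f"SAR estimate ~ {sar_from_initial_slope(t, temp):.1f} W/kg")
```

    On this toy curve the linear estimate lands noticeably below the true 25 W/kg, illustrating the bias the analytical method is designed to remove.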

  15. Improvement of Analytical Technique for Determination of Gold in ...

    African Journals Online (AJOL)

    This article elucidates the improvement of analytical technique for determination of gold in geological matrix. Samples suspected to have gold in them were subjected to neutron flux from the Nigeria Research Reactor (NRR-1), a Miniature Neutron Source Reactor (MNSR). Two geological samples – one sample was ...

  16. Analytic continuation by duality estimation of the S parameter

    International Nuclear Information System (INIS)

    Ignjatovic, S. R.; Wijewardhana, L. C. R.; Takeuchi, T.

    2000-01-01

    We investigate the reliability of the analytic continuation by duality (ACD) technique in estimating the electroweak S parameter for technicolor theories. The ACD technique, which is an application of finite energy sum rules, relates the S parameter for theories with unknown particle spectra to known OPE coefficients. We identify the sources of error inherent in the technique and evaluate them for several toy models to see if they can be controlled. The evaluation of errors is done analytically and all relevant formulas are provided in appendixes including analytical formulas for approximating the function 1/s with a polynomial in s. The use of analytical formulas protects us from introducing additional errors due to numerical integration. We find that it is very difficult to control the errors even when the momentum dependence of the OPE coefficients is known exactly. In realistic cases in which the momentum dependence of the OPE coefficients is only known perturbatively, it is impossible to obtain a reliable estimate. (c) 2000 The American Physical Society
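
    The polynomial approximation of 1/s that underlies the ACD technique is easy to reproduce numerically. A least-squares sketch is below; the interval and degree are arbitrary choices for illustration rather than values from the paper.

```python
import numpy as np

def approximate_inverse_s(s_min, s_max, degree, n_pts=2000):
    """Least-squares polynomial fit of f(s) = 1/s on [s_min, s_max]."""
    s = np.linspace(s_min, s_max, n_pts)
    coeffs = np.polyfit(s, 1.0 / s, degree)
    max_err = np.max(np.abs(np.polyval(coeffs, s) - 1.0 / s))
    return coeffs, max_err

# Degree-4 fit of 1/s on [1, 4] (GeV^2, say); the fit error feeds directly
# into the error budget of the duality estimate
coeffs, err = approximate_inverse_s(1.0, 4.0, degree=4)
print("coefficients:", np.round(coeffs, 4), " max |error|:", f"{err:.2e}")
```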

  17. Intuitive versus analytical decision making modulates trust in e-commerce

    Directory of Open Access Journals (Sweden)

    Paola Iannello

    2014-11-01

    Full Text Available The hypothesis that intuitive and analytical processes affect trust in e-commerce differently was tested. Participants were offered products by a series of sellers via the Internet. In the intuitive condition, pictures of the sellers were followed by neutral descriptions and participants had less time to decide whether to trust the seller. In the analytical condition, participants were given an informative description of the seller and had a longer time to decide. Interactions among condition, price and trust emerged in behavioral and psychophysiological responses. EMG signals increased during analytical processing, suggesting a cognitive effort, whereas higher cardiovascular measures mirrored the emotional involvement when facing untrustworthy sellers. The study supports the fruitful application of the intuitive vs. analytical approach to e-commerce and the combination of different sources of information about the buyers while they have to choose whether to trust the seller in a financial transaction over the Internet.

  18. Species-specific variation in the phosphorus nutritional sources by microphytoplankton in a Mediterranean estuary

    Directory of Open Access Journals (Sweden)

    MARLY CAROLINA MARTINEZ SOTO

    2015-08-01

    Full Text Available We investigated the species-specific phosphorus (P) nutrition sources in the microphytoplankton community in the Mahon estuary (Minorca, Western Mediterranean) in 2011, under two contrasting hydrographic scenarios. Estuarine flow, nutrient concentrations, phytoplankton community composition and enzyme-labeled fluorescence (ELF) were measured in June and October, corresponding to the beginning and the end of summer. Dissolved inorganic nitrogen (DIN) and inorganic phosphate (Pi) exhibited enhanced concentrations in the inner estuary, where N:P molar ratios suggested P-limitation in both surveys. Pi was low and variable (0.09±0.02 μmol l⁻¹ in June and 0.06±0.02 μmol l⁻¹ in October), whereas organic phosphorus remained a more reliable P source. Even though ambient Pi concentrations were slightly higher in June, when the microphytoplankton assemblage was dominated by dinoflagellates, the percentage of cells expressing ELF labeling was notably higher (65% of total cells) than in October (12%), when the presence of diatoms characterized the microphytoplankton community. ELF was mainly expressed by dinoflagellate taxa, whereas diatoms only expressed significant AP in the inner estuary during the June survey. A P-addition bioassay in which the response of AP to Pi enrichment was evaluated showed a remarkable reduction in AP with increasing Pi. However, some dinoflagellate species maintained AP even when Pi was supplied in excess. We suggest that in some dinoflagellate species AP is not as tightly controlled by ambient Pi as previously believed. AP activity in these species could indicate selective use of organic phosphorus, or a slow metabolic response to changes in P forms, rather than physiological stress due to low Pi availability. We emphasize the importance of identifying the links between the different P sources and the species-specific requirements, in order to understand the ecological response to anthropogenic biogeochemical perturbations.

  19. Analytical and physical electrochemistry

    CERN Document Server

    Girault, Hubert H

    2004-01-01

    The study of electrochemistry is pertinent to a wide variety of fields, including bioenergetics, environmental sciences, and engineering sciences. In addition, electrochemistry plays a fundamental role in specific applications as diverse as the conversion and storage of energy and the sequencing of DNA.Intended both as a basic course for undergraduate students and as a reference work for graduates and researchers, Analytical and Physical Electrochemistry covers two fundamental aspects of electrochemistry: electrochemistry in solution and interfacial electrochemistry. By bringing these two subj

  20. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
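
    The Compton part of a first-order analytical scatter estimator weights each voxel-to-detector path by the Klein-Nishina differential cross section. The sketch below implements that standard physics kernel only; it is not the authors' full GPU estimator.

```python
import numpy as np

RE2 = 7.94078e-26          # classical electron radius squared, cm^2
MEC2 = 511.0               # electron rest energy, keV

def klein_nishina(energy_kev, theta):
    """Differential Compton cross section d(sigma)/d(Omega), cm^2/sr.

    energy_kev : incident photon energy
    theta      : scattering angle (rad)
    """
    k = energy_kev / MEC2
    ratio = 1.0 / (1.0 + k * (1.0 - np.cos(theta)))   # E'/E after scatter
    return 0.5 * RE2 * ratio**2 * (ratio + 1.0 / ratio - np.sin(theta)**2)

# Example: relative probability of 30- vs 90-degree scatter at 60 keV
p30 = klein_nishina(60.0, np.deg2rad(30))
p90 = klein_nishina(60.0, np.deg2rad(90))
print(f"d(sigma)/d(Omega) at 30 deg / 90 deg = {p30 / p90:.2f}")
```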

  1. Experimental research on the specific activity of ²⁴Na using a Chinese reference man phantom irradiated by a ²⁵²Cf neutron source

    International Nuclear Information System (INIS)

    Wang Yuexing; Yang Yifang; Lu Yongjie; Zhang Jianguo; Xing Hongchuan

    2011-01-01

    Objective: To investigate the specific activity of ²⁴Na per unit neutron fluence, A_B/Φ, in blood produced for the Chinese reference man irradiated by a ²⁵²Cf neutron source, and to analyze the effects on it of neutrons scattered from the ground, walls, and ceiling of the irradiation site. Methods: A ²⁵²Cf neutron source of 3×10⁸ n/s and an anthropomorphic phantom were used for the experiments. The phantom was made from a 4 mm thick outer covering of perspex filled with a liquid tissue-equivalent substitute, and its dimensions fit the data for the Chinese reference man, with the weight ratios of H, N, O and C in the substitute matching the reference composition. The distances from the source to the long axis of the phantom were 1.1, 2.1, 3.1 and 4.1 m, respectively. Both the neutron source and the xiphisternum of the phantom were 1.6 m above the floor. Results: The average specific activity of ²⁴Na per unit neutron fluence was related to the irradiation distance d, and its maximum value, A_B/Φ,M, deduced from the experimental data, was about 1.85×10⁻⁷ Bq·cm²·g⁻¹. Conclusions: A_B/Φ,M corresponds to that of a phantom irradiated by plane-parallel beams, and the value is about 3% higher than that reported in the literature for the BOMAB phantom. Floor- and wall-scattered neutrons in the irradiation site were shown to contribute significantly to the specific activity of ²⁴Na but relatively little to the induced neutron doses. Consequently, when using the specific activity of ²⁴Na to assess accidental neutron doses received by an individual, the contribution of scattered neutrons at the accident site will lead the dose to be overestimated and needs to be corrected. (authors)

  2. (U) An Analytic Examination of Piezoelectric Ejecta Mass Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tregillis, Ian Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-02

    Ongoing efforts to validate a Richtmyer-Meshkov instability (RMI) based ejecta source model [1, 2, 3] in LANL ASC codes use ejecta areal masses derived from piezoelectric sensor data [4, 5, 6]. However, the standard technique for inferring masses from sensor voltages implicitly assumes instantaneous ejecta creation [7], which is not a feature of the RMI source model. To investigate the impact of this discrepancy, we define separate “areal mass functions” (AMFs) at the source and sensor in terms of typically unknown distribution functions for the ejecta particles, and derive an analytic relationship between them. Then, for the case of single-shock ejection into vacuum, we use the AMFs to compare the analytic (or “true”) accumulated mass at the sensor with the value that would be inferred from piezoelectric voltage measurements. We confirm the inferred mass is correct when creation is instantaneous, and furthermore prove that when creation is not instantaneous, the inferred values will always overestimate the true mass. Finally, we derive an upper bound for the error imposed on a perfect system by the assumption of instantaneous ejecta creation. When applied to shots in the published literature, this bound is frequently less than several percent. Errors exceeding 15% may require velocities or timescales at odds with experimental observations.
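
    The standard voltage-to-mass inversion referenced here assumes all ejecta are created instantaneously at time t0, so a particle arriving at time t crossed the source-to-sensor gap at v = gap/(t - t0), and the momentum-flux record p(t) integrates to an accumulated areal mass m(t) = ∫ p(t')(t' - t0)/gap dt'. A sketch of that conventional inversion follows; the units and the synthetic pressure trace are assumptions.

```python
import numpy as np

def inferred_areal_mass(t, pressure, gap_cm, t0=0.0):
    """Accumulated areal mass from a piezoelectric momentum-flux record.

    Assumes instantaneous ejecta creation at t0 (the assumption the paper
    examines): a particle arriving at time t flew at v = gap / (t - t0),
    and p = (dm_A/dt) * v, so m_A(t) = integral of p(t') (t'-t0)/gap dt'.
    """
    v = gap_cm / (t - t0)                       # free-flight velocity, cm/s
    dm = pressure / v                           # areal mass rate, g/cm^2/s
    steps = 0.5 * (dm[1:] + dm[:-1]) * np.diff(t)   # trapezoid rule
    return np.concatenate(([0.0], np.cumsum(steps)))

# Synthetic trace: arrivals between 1 and 5 microseconds over a 1 cm gap
t = np.linspace(1e-6, 5e-6, 400)                        # s
p = 1e4 * np.exp(-(t - 2e-6)**2 / (2 * (0.5e-6)**2))    # dyn/cm^2 (invented)
m = inferred_areal_mass(t, p, gap_cm=1.0)
print(f"inferred areal mass ~ {m[-1]:.3e} g/cm^2")
```

    Per the paper's result, when creation is actually spread over time this inversion systematically overestimates the true accumulated mass.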

  3. Biodiesel Analytical Methods: August 2002--January 2004

    Energy Technology Data Exchange (ETDEWEB)

    Van Gerpen, J.; Shanks, B.; Pruszko, R.; Clements, D.; Knothe, G.

    2004-07-01

    Biodiesel is an alternative fuel for diesel engines that is receiving great attention worldwide. The material contained in this book is intended to provide the reader with information about biodiesel engines and fuels, analytical methods used to measure fuel properties, and specifications for biodiesel quality control.

  4. Solvent effects in time-dependent self-consistent field methods. II. Variational formulations and analytical gradients

    International Nuclear Information System (INIS)

    Bjorgaard, J. A.; Velizhanin, K. A.; Tretiak, S.

    2015-01-01

    This study describes variational energy expressions and analytical excited state energy gradients for time-dependent self-consistent field methods with polarizable solvent effects. Linear response, vertical excitation, and state-specific solvent models are examined. Enforcing a variational ground state energy expression in the state-specific model is found to reduce it to the vertical excitation model. Variational excited state energy expressions are then provided for the linear response and vertical excitation models and analytical gradients are formulated. Using semiempirical model chemistry, the variational expressions are verified by numerical and analytical differentiation with respect to a static external electric field. Lastly, analytical gradients are further tested by performing microcanonical excited state molecular dynamics with p-nitroaniline
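
    The verification step described (checking analytical gradients against numerical differentiation with respect to a static external field) follows a generic pattern, sketched below with a toy quadratic energy standing in for the semiempirical excited-state energy; the dipole and polarizability values are invented.

```python
import numpy as np

def central_difference_grad(energy_fn, field, h=1e-4):
    """Numerical gradient of an energy w.r.t. an external field vector."""
    grad = np.zeros_like(field)
    for i in range(field.size):
        e = np.zeros_like(field); e[i] = h
        grad[i] = (energy_fn(field + e) - energy_fn(field - e)) / (2 * h)
    return grad

# Toy stand-in for an excited-state energy: E(F) = E0 - mu.F - 0.5 F.alpha.F
mu = np.array([0.1, -0.2, 0.05])                    # "dipole" (invented)
alpha = np.diag([1.0, 1.5, 0.8])                    # "polarizability" (invented)
energy = lambda F: 1.0 - mu @ F - 0.5 * F @ alpha @ F
analytic = lambda F: -mu - alpha @ F                # closed-form gradient

F = np.array([0.01, 0.02, -0.01])
print("max |analytic - numeric|:",
      np.max(np.abs(analytic(F) - central_difference_grad(energy, F))))
```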

  5. Analytical Approximation of Spectrum for Pulse X-ray Tubes

    International Nuclear Information System (INIS)

    Vavilov, S; Fofanof, O; Koshkin, G; Udod, V

    2016-01-01

    Among the main characteristics of pulsed X-ray apparatuses, the spectral energy characteristics are the most important: the spectral distribution of photon energy and the effective and maximum quantum energies. Knowing the spectral characteristics of the radiation of pulsed sources is very important for their practical use in non-destructive testing. We have attempted an analytical approximation of the pulsed X-ray apparatus spectra obtained in different experimental papers. The results of the analytical approximation of the energy spectrum for a pulsed X-ray tube are presented. The obtained formulas fit the experimental data adequately and can be used in designing pulsed X-ray apparatuses. (paper)

  6. Gravitational wave generation from bubble collisions in first-order phase transitions: An analytic approach

    International Nuclear Information System (INIS)

    Caprini, Chiara; Durrer, Ruth; Servant, Geraldine

    2008-01-01

    Gravitational wave production from bubble collisions was calculated in the early 1990s using numerical simulations. In this paper, we present an alternative analytic estimate, relying on a different treatment of stochasticity. In our approach, we provide a model for the bubble velocity power spectrum, suitable for both detonations and deflagrations. From this, we derive the anisotropic stress and analytically solve the gravitational wave equation. We provide analytical formulas for the peak frequency and the shape of the spectrum which we compare with numerical estimates. In contrast to the previous analysis, we do not work in the envelope approximation. This paper focuses on a particular source of gravitational waves from phase transitions. In a companion article, we will add together the different sources of gravitational wave signals from phase transitions: bubble collisions, turbulence and magnetic fields and discuss the prospects for probing the electroweak phase transition at LISA

  7. Oracle Exalytics: Engineered for Speed-of-Thought Analytics

    Directory of Open Access Journals (Sweden)

    Gabriela GLIGOR

    2011-12-01

    Full Text Available One of the biggest product announcements at 2011's Oracle OpenWorld user conference was Oracle Exalytics In-Memory Machine, the latest addition to the "Exa"-branded suite of Oracle-Sun engineered software-hardware systems. Analytics is all about gaining insights from data for better decision making. However, the vision of delivering fast, interactive, insightful analytics has remained elusive for most organizations. Most enterprise IT organizations continue to struggle to deliver actionable analytics due to time-sensitive, sprawling requirements and ever-tightening budgets. The issue is further exacerbated by the fact that most enterprise analytics solutions require dealing with a number of hardware, software, storage and networking vendors, and precious resources are wasted integrating the hardware and software components to deliver a complete analytical solution. Oracle Exalytics Business Intelligence Machine is the world’s first engineered system specifically designed to deliver high performance analysis, modeling and planning. Built using industry-standard hardware, market-leading business intelligence software and in-memory database technology, Oracle Exalytics is an optimized system that delivers answers to all your business questions with unmatched speed, intelligence, simplicity and manageability.

  8. Analytical and functional similarity of Amgen biosimilar ABP 215 to bevacizumab.

    Science.gov (United States)

    Seo, Neungseon; Polozova, Alla; Zhang, Mingxuan; Yates, Zachary; Cao, Shawn; Li, Huimin; Kuhns, Scott; Maher, Gwendolyn; McBride, Helen J; Liu, Jennifer

    ABP 215 is a biosimilar product to bevacizumab. Bevacizumab acts by binding to vascular endothelial growth factor A, inhibiting endothelial cell proliferation and new blood vessel formation, thereby leading to tumor vasculature normalization. The ABP 215 analytical similarity assessment was designed to assess the structural and functional similarity of ABP 215 and bevacizumab sourced from both the United States (US) and the European Union (EU). Similarity assessment was also made between the US- and EU-sourced bevacizumab to assess the similarity between the two products. The physicochemical properties and structural similarity of ABP 215 and bevacizumab were characterized using sensitive state-of-the-art analytical techniques capable of detecting small differences in product attributes. ABP 215 has the same amino acid sequence and exhibits similar post-translational modification profiles compared to bevacizumab. The functional similarity assessment employed orthogonal assays designed to interrogate all expected biological activities, including those known to affect the mechanisms of action for ABP 215 and bevacizumab. More than 20 batches of bevacizumab (US) and bevacizumab (EU), and 13 batches of ABP 215 representing unique drug substance lots were assessed for similarity. The large dataset allows meaningful comparisons and garners confidence in the overall conclusion for the analytical similarity assessment of ABP 215 to both US- and EU-sourced bevacizumab. The structural and purity attributes, and biological properties of ABP 215 are demonstrated to be highly similar to those of bevacizumab.

  9. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. Providing high-quality laboratory tests is therefore mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a portion of laboratory tests is done in offsite central laboratories after specimen shipping. As factors affecting laboratory tests, individual and inter-individual variations are well known. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. It is concluded that, to overcome the problems arising in pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  10. Eco-analytical Methodology in Environmental Problems Monitoring

    Science.gov (United States)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions determine the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The focus of governments, corporations, scientists and nations on the production and consumption of material goods causes great damage to the environment; as a result, environmental protection activity develops rather spontaneously, as a mere complement to productive activities. The challenge that environmental problems pose to science is therefore the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity, with the aim of finding an optimal trajectory of industrial development that prevents irreversible damage to the biosphere that could halt the progress of civilization.

  11. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from the Kawerau geothermal field during 1998-2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is clear indication that the present-day heat source relates to young volcanism within the field. However, because that source lies at the margins of the explored reservoir, little is presently known of its characteristics. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system, in which the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data are accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  12. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  13. Making Sense of Video Analytics: Lessons Learned from Clickstream Interactions, Attitudes, and Learning Outcome in a Video-Assisted Course

    Directory of Open Access Journals (Sweden)

    Michail N. Giannakos

    2015-02-01

    Full Text Available Online video lectures have been considered an instructional medium for various pedagogic approaches, such as the flipped classroom and open online courses. In comparison to other instructional media, online video affords the opportunity for recording student clickstream patterns within a video lecture. Video analytics within lecture videos may provide insights into student learning performance and inform the improvement of video-assisted teaching tactics. Nevertheless, video analytics are not accessible to learning stakeholders, such as researchers and educators, mainly because online video platforms do not broadly share the interactions of the users with their systems. For this purpose, we have designed an open-access video analytics system for use in a video-assisted course. In this paper, we present a longitudinal study, which provides valuable insights through the lens of the collected video analytics. In particular, we found that there is a relationship between video navigation (repeated views) and the level of cognition/thinking required for a specific video segment. Our results indicated that learning performance progress was slightly improved and stabilized after the third week of the video-assisted course. We also found that attitudes regarding easiness, usability, usefulness, and acceptance of this type of course remained at the same levels throughout the course. Finally, we triangulate analytics from diverse sources, discuss them, and provide the lessons learned for further development and refinement of video-assisted courses and practices.

  14. Humidity Effects on Fragmentation in Plasma-Based Ambient Ionization Sources.

    Science.gov (United States)

    Newsome, G Asher; Ackerman, Luke K; Johnson, Kevin J

    2016-01-01

    Post-plasma ambient desorption/ionization (ADI) sources are fundamentally dependent on surrounding water vapor to produce protonated analyte ions. There are two reports of humidity effects on ADI spectra. However, it is unclear whether humidity will affect all ADI sources and analytes, and by what mechanism humidity affects spectra. Flowing atmospheric pressure afterglow (FAPA) ionization and direct analysis in real time (DART) mass spectra of various surface-deposited and gas-phase analytes were acquired at ambient temperature and pressure across a range of observed humidity values. A controlled humidity enclosure around the ion source and mass spectrometer inlet was used to create programmed humidity and temperatures. The relative abundance and fragmentation of molecular adduct ions for several compounds consistently varied with changing ambient humidity and also were controlled with the humidity enclosure. For several compounds, increasing humidity decreased protonated molecule and other molecular adduct ion fragmentation in both FAPA and DART spectra. For others, humidity increased fragment ion ratios. The effects of humidity on molecular adduct ion fragmentation were caused by changes in the relative abundances of different reagent protonated water clusters and, thus, a change in the average difference in proton affinity between an analyte and the population of water clusters. Control of humidity in ambient post-plasma ion sources is needed to create spectral stability and reproducibility.

  15. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover 1) analytical chemistry and the environment 2) environmental radiochemistry 3) automated instrumentation 4) advances in analytical mass spectrometry 5) fourier transform spectroscopy 6) analytical chemistry of plutonium 7) nuclear analytical chemistry 8) chemometrics and 9) nuclear fuel technology

  16. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    Science.gov (United States)

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    Science.gov (United States)

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large gap in information on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows one to establish the presence of chemicals in the environment, estimate their concentration levels, identify sources and determine their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demand the application of wide-scope methods; the low concentrations at which these contaminants are present in the samples require high detection sensitivity; and high demands on confirmation and structural information must be met for the characterisation of unknowns. New developments in analytical instrumentation have been applied to solve these difficulties. Furthermore, and not less important, has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. Thus, the use of sophisticated software tools has allowed successful screening analysis, determining several hundred analytes, and has assisted in the structural elucidation of unknown compounds in a timely manner.

  18. Evaluated Rayleigh integrals for pulsed planar expanding ring sources

    International Nuclear Information System (INIS)

    Warshaw, S.I.

    1985-01-01

    Time-domain analytic and semianalytic pressure fields acoustically radiated from expanding pulsed ring sources imbedded in a planar rigid baffle have been calculated. The source functions are radially symmetric delta-function distributions whose amplitude and argument have simple functional dependencies on radius and time. Certain cases yield closed analytic results, while others result in elliptic integrals, which are evaluated to high accuracy by Gauss-Chebyshev and modified Gauss-Legendre quadrature. These results are of value for calibrating computer simulations and convolution procedures, and estimating fields from more complex planar radiators. 3 refs., 4 figs
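
    The abstract names the quadrature schemes but gives no formulas; as a rough, self-contained illustration of the idea (not the author's actual procedure), the following Python sketch evaluates a complete elliptic integral of the first kind by Gauss-Legendre quadrature and checks it against SciPy's reference value. The node count n = 32 is an arbitrary illustrative choice.

      import numpy as np
      from scipy.special import ellipk

      def gauss_legendre_ellipk(m, n=32):
          # K(m) = integral from 0 to pi/2 of dtheta / sqrt(1 - m sin^2(theta))
          x, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
          theta = 0.25 * np.pi * (x + 1.0)           # map nodes to [0, pi/2]
          f = 1.0 / np.sqrt(1.0 - m * np.sin(theta) ** 2)
          return 0.25 * np.pi * np.dot(w, f)         # include Jacobian pi/4

      m = 0.8
      print(gauss_legendre_ellipk(m))  # quadrature estimate
      print(ellipk(m))                 # SciPy reference value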

  19. Determining the depth of certain gravity sources without a priori specification of their structural index

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian

    2015-11-01

    We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori, generalizing the Tilt-depth method to depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward continuation heights, which effectively reduces the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method to the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to previous drilling and seismic interpretation results.
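
    The abstract does not reproduce the paper's formulas; as a schematic of the tilt-angle idea on which Tilt-depth methods build, the Python sketch below computes the tilt of the vertical gravity component over a buried point mass from analytically known gradients. The point-mass geometry, depth and all constants are invented for illustration, and sign conventions vary between authors.

      import numpy as np

      G, mass, z0 = 6.674e-11, 1.0e10, 500.0        # SI units; assumed depth 500 m
      x = np.linspace(-2000.0, 2000.0, 401)         # observation profile at y = 0
      r2 = x**2 + z0**2

      gz = G * mass * z0 / r2**1.5                  # vertical gravity component
      dgz_dx = -3.0 * G * mass * z0 * x / r2**2.5   # horizontal gradient (analytic)
      dgz_dz = G * mass * (r2 - 3.0 * z0**2) / r2**2.5  # vertical gradient (analytic)

      # Tilt angle: vertical derivative over the magnitude of the horizontal one.
      tilt = np.degrees(np.arctan2(dgz_dz, np.abs(dgz_dx)))
      # Tilt-depth-style reasoning: the horizontal spacing between characteristic
      # tilt contours (e.g. 0 and +/-45 degrees) scales with the source depth.
      print(tilt.min(), tilt.max())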

  20. Affect, Reason, and Persuasion: Advertising Strategies That Predict Affective and Analytic-Cognitive Responses.

    Science.gov (United States)

    Chaudhuri, Arjun; Buck, Ross

    1995-01-01

    Develops and tests hypotheses concerning the relationship of specific advertising strategies to affective and analytic cognitive responses of the audience. Analyses undergraduate students' responses to 240 advertisements. Demonstrates that advertising strategy variables accounted substantially for the variance in affective and analytic cognition.…

  1. Micro-optics for microfluidic analytical applications.

    Science.gov (United States)

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  2. ANALYTICAL SOLUTIONS OF SINGULAR ISOTHERMAL QUADRUPOLE LENS

    International Nuclear Information System (INIS)

    Chu Zhe; Lin, W. P.; Yang Xiaofeng

    2013-01-01

    Using an analytical method, we study the singular isothermal quadrupole (SIQ) lens system, which is the simplest lens model that can produce four images. In this case, the radial mass distribution is in accord with the profile of the singular isothermal sphere lens, and the tangential distribution is given by adding a quadrupole on the monopole component. The basic properties of the SIQ lens are studied in this Letter, including the deflection potential, deflection angle, magnification, critical curve, caustic, pseudo-caustic, and transition locus. Analytical solutions of the image positions and magnifications for a source on the axes are derived. We find that naked cusps appear when the relative intensity k of the quadrupole to the monopole is larger than 0.6. According to the magnification invariant theory of the SIQ lens, the sum of the signed magnifications of the four images should be equal to unity, as found by Dalal. However, if a source lies in the naked cusp, the summed magnification of the remaining three images is smaller than the invariant 1. With this simple lens system, we study the situations where a point source approaches arbitrarily close to a cusp or a fold. The sum of the magnifications of the cusp image triplet is usually not equal to 0, and it is usually positive for major cusps and negative for minor cusps. Similarly, the sum of the magnifications of the fold image pair is usually not equal to 0 either. Nevertheless, the cusp and fold relations are still equal to 0, in that the sum values are divided by infinite absolute magnifications by definition.
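
    For readers unfamiliar with the quantities mentioned above, a schematic statement in standard lensing notation follows; the quadrupole parameterization shown is a common choice and not necessarily the authors' exact notation.

      % SIS monopole plus a quadrupole term in the deflection potential
      % (k is the relative intensity of quadrupole to monopole)
      \psi(r,\theta) = b\,r\left[\,1 + k\cos 2(\theta-\theta_0)\,\right]

      % Lens equation and the signed-magnification invariant for four images
      \boldsymbol{\beta} = \boldsymbol{\theta} - \nabla\psi(\boldsymbol{\theta}),
      \qquad \sum_{i=1}^{4}\mu_i = 1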

  3. New neutron-based isotopic analytical methods; An explorative study of resonance capture and incoherent scattering

    NARCIS (Netherlands)

    Perego, R.C.

    2004-01-01

    Two novel neutron-based analytical techniques are treated in this thesis: Neutron Resonance Capture Analysis (NRCA), employing a pulsed neutron source, and Neutron Incoherent Scattering (NIS), making use of a cold neutron source. With the NRCA method, isotopes are identified by the energies of their neutron capture resonances.

  4. Pre-analytical factors influencing the stability of cerebrospinal fluid proteins

    DEFF Research Database (Denmark)

    Simonsen, Anja H; Bahl, Justyna M C; Danborg, Pia B

    2013-01-01

    Cerebrospinal fluid (CSF) is a potential source of new biomarkers due to its proximity to the brain. This study aimed to clarify the stability of the CSF proteome when subjected to pre-analytical factors. We investigated the effects of repeated freeze/thaw cycles, protease inhibitors and delayed s...

  5. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  6. Analytical Modeling of Triple-Metal Hetero-Dielectric DG SON TFET

    Science.gov (United States)

    Mahajan, Aman; Dash, Dinesh Kumar; Banerjee, Pritha; Sarkar, Subir Kumar

    2018-02-01

    In this paper, a 2-D analytical model of a triple-metal hetero-dielectric DG TFET is presented by combining the concepts of triple-material gate engineering and hetero-dielectric engineering. Three metals with different work functions are used as both front- and back-gate electrodes to modulate the barrier at the source/channel and channel/drain interfaces. In addition, the front-gate dielectric consists of high-K HfO2 at the source end and low-K SiO2 at the drain side, whereas the back-gate dielectric is replaced by air to further improve the ON current of the device. The surface potential and electric field of the proposed device are formulated by solving the 2-D Poisson equation with Young's approximation. Based on this electric field expression, the tunneling current is obtained by using Kane's model. Several device parameters are varied to examine the behavior of the proposed device. The analytical model is validated against TCAD simulation results, proving the accuracy of our proposed model.
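
    The abstract invokes Kane's model without formulas; a commonly used local form of the band-to-band tunneling generation rate is sketched below, where E is the electric field magnitude, E_g the band gap, and A, B material-dependent prefactors (none of these values are taken from the paper).

      % Local Kane model (schematic) and the resulting drain current
      G_{\mathrm{BTBT}} = A\,\frac{|E|^{2}}{\sqrt{E_g}}
          \exp\!\left(-\frac{B\,E_g^{3/2}}{|E|}\right),
      \qquad
      I_{DS} \simeq q \int_{\mathrm{channel}} G_{\mathrm{BTBT}}\,\mathrm{d}V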

  7. Analytical Radiation Transport Benchmarks for The Next Century

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    2005-01-01

    Verification of large-scale computational algorithms used in nuclear engineering and radiological applications is an essential element of reliable code performance. For this reason, the development of a suite of multidimensional semi-analytical benchmarks has been undertaken to provide independent verification of proper operation of codes dealing with the transport of neutral particles. The benchmarks considered cover several one-dimensional, multidimensional, monoenergetic and multigroup, fixed-source and critical transport scenarios. The first approach is based on the Green's function: in slab geometry, the Green's function is incorporated into a set of integral equations for the boundary fluxes, and through a numerical Fourier transform inversion and subsequent matrix inversion for the boundary fluxes, a semi-analytical benchmark emerges. Multidimensional solutions in a variety of infinite media are also based on the slab Green's function. In a second approach, a new converged SN method is developed, in which the SN solution is 'mined' to bring out hidden high-quality solutions. For this case, multigroup fixed-source and criticality transport problems are considered. Remarkably accurate solutions can be obtained with this new method, called the Multigroup Converged SN (MGCSN) method, as will be demonstrated

  8. ATLAS Analytics and Machine Learning Platforms

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Legger, Federica; Gardner, Robert

    2018-01-01

    In 2015 ATLAS Distributed Computing started to migrate its monitoring systems away from Oracle DB and decided to adopt new big data platforms that are open source, horizontally scalable, and offer the flexibility of NoSQL systems. Three years later, the full software stack is in place, the system is considered in production and operating at near maximum capacity (in terms of storage capacity and tightly coupled analysis capability). The new model provides several tools for fast and easy to deploy monitoring and accounting. The main advantages are: ample ways to do complex analytics studies (using technologies such as java, pig, spark, python, jupyter), flexibility in reorganization of data flows, near real time and inline processing. The analytics studies improve our understanding of different computing systems and their interplay, thus enabling whole-system debugging and optimization. In addition, the platform provides services to alarm or warn on anomalous conditions, and several services closing feedback l...

  9. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  10. Analytical Chemistry: A retrospective view on some current trends.

    Science.gov (United States)

    Niessner, Reinhard

    2018-04-01

    In a retrospective view, some current trends in Analytical Chemistry are outlined and connected to work published more than a hundred years ago in the same field. For example, gravimetric microanalysis after specific precipitation, once the sole basis for chemical analysis, has been transformed into a mass-sensitive transducer in combination with compound-specific receptors. Molecular spectroscopy, which still practises the classical absorption/emission techniques for detecting elements or molecules, is experiencing a shift towards Raman spectroscopy, which now allows analysis of a multitude of additional features. Chemical sensors are now used to perform a vast number of analytical measurements. Especially paper-based devices (dipsticks, microfluidic pads) celebrate a revival, as they can potentially revolutionize medicine in the developing world. Industry 4.0 will lead to a further increase in sensor applications. Preceding separation and enrichment of analytes from complicated matrices remains the backbone of a successful analysis, despite increasing attempts to avoid clean-up. Continuous separation techniques will become a key element for 24/7 production of goods with certified quality. Attempts to obtain instantaneous and specific chemical information by optical or electrical transduction will need highly selective receptors in large quantities. Further understanding of ligand–receptor complex structures is the key to successful generation of artificial bio-inspired receptors. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quality assurance management plan (QAPP) special analytical support (SAS)

    Energy Technology Data Exchange (ETDEWEB)

    LOCKREM, L.L.

    1999-05-20

    It is the policy of Special Analytical Support (SAS) that the analytical aspects of all environmental data generated and processed in the laboratory, subject to the Environmental Protection Agency (EPA), U.S. Department of Energy or other project specific requirements, be of known and acceptable quality. It is the intention of this QAPP to establish and assure that an effective quality controlled management system is maintained in order to meet the quality requirements of the intended use(s) of the data.

  12. Quality assurance management plan (QAPP) special analytical support (SAS)

    International Nuclear Information System (INIS)

    LOCKREM, L.L.

    1999-01-01

    It is the policy of Special Analytical Support (SAS) that the analytical aspects of all environmental data generated and processed in the laboratory, subject to the Environmental Protection Agency (EPA), U.S. Department of Energy or other project specific requirements, be of known and acceptable quality. It is the intention of this QAPP to establish and assure that an effective quality controlled management system is maintained in order to meet the quality requirements of the intended use(s) of the data

  13. An open-source toolbox for multiphase flow in porous media

    Science.gov (United States)

    Horgue, P.; Soulaine, C.; Franc, J.; Guibert, R.; Debenest, G.

    2015-02-01

    Multiphase flow in porous media has a wide range of applications, from environmental understanding (aquifers, site pollution) to industrial process improvements (oil production, waste management). Modeling of such flows involves specific volume-averaged equations and therefore specific computational fluid dynamics (CFD) tools. In this work, we develop a toolbox for modeling multiphase flow in porous media with OpenFOAM®, an open-source platform for CFD. The underlying idea of this approach is to provide an easily adaptable tool that can be used in further studies to test new mathematical models or numerical methods. The package provides the most common effective-property models of the literature (relative permeability, capillary pressure) and specific boundary conditions related to porous media flows. To validate this package, solvers based on the IMplicit Pressure Explicit Saturation (IMPES) method are developed in the toolbox. The numerical validation is performed by comparison with analytical solutions on academic cases. Then, a satisfactory parallel efficiency of the solver is shown on a more complex configuration.
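
    The IMPES scheme named above is easy to illustrate in one dimension. The Python sketch below is a minimal analogue (not part of the OpenFOAM toolbox) for incompressible two-phase flow without capillarity, with invented parameter values; in 1-D the implicit pressure step degenerates to a constant total velocity, whereas in 2-D/3-D it becomes a sparse linear solve.

      import numpy as np

      nx, L, phi = 200, 1.0, 0.2        # cells, domain length [m], porosity
      dx = L / nx
      ut = 1.0e-5                       # total Darcy velocity [m/s] ("pressure" step)
      mu_w, mu_o = 1.0e-3, 5.0e-3       # water and oil viscosities [Pa s]
      S = np.zeros(nx)                  # initial water saturation

      def frac_flow(S):
          krw, kro = S**2, (1.0 - S)**2           # Corey-type relative permeabilities
          lam_w, lam_o = krw / mu_w, kro / mu_o   # phase mobilities
          return lam_w / (lam_w + lam_o)          # water fractional flow

      # CFL-limited explicit step from the steepest fractional-flow slope
      s = np.linspace(0.0, 1.0, 1001)
      dt = 0.5 * phi * dx / (ut * np.max(np.gradient(frac_flow(s), s)))

      t, t_end = 0.0, 2.0e3             # simulate 2000 s (illustrative)
      while t < t_end:
          f = frac_flow(S)
          f_up = np.concatenate(([1.0], f[:-1]))  # upwind flux; inlet injects water
          S -= ut * dt / (phi * dx) * (f - f_up)  # explicit saturation update
          t += dt

      print("approximate front position [m]:", dx * np.argmax(S < 0.05))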

  14. IAEA coordinated research project (CRP) on 'Analytical and experimental benchmark analyses of accelerator driven systems'

    International Nuclear Information System (INIS)

    Abanades, Alberto; Aliberti, Gerardo; Gohar, Yousry; Talamo, Alberto; Bornos, Victor; Kiyavitskaya, Anna; Carta, Mario; Janczyszyn, Jerzy; Maiorino, Jose; Pyeon, Cheolho; Stanculescu, Alexander; Titarenko, Yury; Westmeier, Wolfram

    2008-01-01

    In December 2005, the International Atomic Energy Agency (IAEA) started a Coordinated Research Project (CRP) on 'Analytical and Experimental Benchmark Analyses of Accelerator Driven Systems'. The overall objective of the CRP, performed within the framework of the Technical Working Group on Fast Reactors (TWGFR) of the IAEA's Nuclear Energy Department, is to increase the capability of interested Member States in developing and applying advanced reactor technologies in the area of long-lived radioactive waste utilization and transmutation. The specific objective of the CRP is to improve the present understanding of the coupling of an external neutron source (e.g. a spallation source) with a multiplicative sub-critical core. The participants are performing computational and experimental benchmark analyses using integrated calculation schemes and simulation methods. The CRP aims at integrating some of the planned experimental demonstration projects of the coupling between a sub-critical core and an external neutron source (e.g. YALINA Booster in Belarus, and Kyoto University's Critical Assembly (KUCA)). The objective of these experimental programs is to validate computational methods, obtain high-energy nuclear data, characterize the performance of sub-critical assemblies driven by external sources, and develop and improve techniques for sub-criticality monitoring. The paper summarizes preliminary results obtained to date for some of the CRP benchmarks. (authors)

  15. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Science.gov (United States)

    2010-04-01

    ...; (2) Clinical laboratories regulated under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), as qualified to perform high complexity testing under 42 CFR part 493 or clinical laboratories... analytical or clinical performance. (e) The laboratory that develops an in-house test using the ASR shall...

  16. Seminal plasma as a source of prostate cancer peptide biomarker candidates for detection of indolent and advanced disease.

    Directory of Open Access Journals (Sweden)

    Jochen Neuhaus

    Full Text Available BACKGROUND: Extensive prostate specific antigen screening for prostate cancer generates a high number of unnecessary biopsies and over-treatment due to insufficient differentiation between indolent and aggressive tumours. We hypothesized that seminal plasma is a robust source of novel prostate cancer (PCa) biomarkers with the potential to improve primary diagnosis and to distinguish advanced from indolent disease. METHODOLOGY/PRINCIPAL FINDINGS: In an open-label case/control study, 125 patients (70 PCa, 21 benign prostatic hyperplasia, 25 chronic prostatitis, 9 healthy controls) were enrolled in 3 centres. Biomarker panels were sought (a) for PCa diagnosis (PCa patients versus benign controls) and (b) for advanced disease (patients stratified by post-surgery Gleason score). Independent cohorts were used for proteomic biomarker discovery and for testing the performance of the identified biomarker profiles. Seminal plasma was profiled using capillary electrophoresis mass spectrometry. Pre-analytical stability and analytical precision of the proteome analysis were determined. Support vector machine learning was used for classification. Stepwise application of two biomarker signatures with 21 and 5 biomarkers provided 83% sensitivity and 67% specificity for PCa detection in a test set of samples. A panel of 11 biomarkers for advanced disease discriminated between patients with higher Gleason scores and those with organ-confined disease in a preliminary validation setting. Seminal profiles showed excellent pre-analytical stability. Eight biomarkers were identified as fragments of N-acetyllactosaminide beta-1,3-N-acetylglucosaminyltransferase, prostatic acid phosphatase, stabilin-2, GTPase IMAP family member 6, semenogelin-1 and -2. Restricted sample size was the major limitation of the study. CONCLUSIONS/SIGNIFICANCE: Seminal plasma represents a robust source of potential peptide markers.

  17. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  18. On-chip bio-analyte detection utilizing the velocity of magnetic microparticles in a fluid

    KAUST Repository

    Giouroudi, Ioanna; van den Driesche, Sander; Kosel, Jürgen; Grössinger, Roland; Vellekoop, Michael J.

    2011-01-01

    change when analyte is attached to their surface via antibody–antigen binding. When the magnetic microparticles are attracted by a magnetic field within a microfluidic channel their velocity depends on the presence of analyte. Specifically, their velocity

  19. Preservatives and neutralizing substances in milk: analytical sensitivity of official specific and nonspecific tests, microbial inhibition effect, and residue persistence in milk

    Directory of Open Access Journals (Sweden)

    Livia Cavaletti Corrêa da Silva

    2015-09-01

    Full Text Available Milk fraud has been a recurring problem in Brazil; thus, it is important to know the effects of the most frequently used preservatives and neutralizing substances, as well as the detection capability of the official tests. The objective of this study was to evaluate the analytical sensitivity of the legislation-described tests and of nonspecific microbial inhibition tests, and to investigate the effect of such substances on microbial growth inhibition and the persistence of detectable residues after 24/48 h of refrigeration. Batches of raw milk, free from any contaminant, were divided into aliquots and mixed with different concentrations of formaldehyde, hydrogen peroxide, sodium hypochlorite, chlorine, chlorinated alkaline detergent, or sodium hydroxide. The analytical sensitivity of the official tests was 0.005%, 0.003%, and 0.013% for formaldehyde, hydrogen peroxide, and hypochlorite, respectively. Chlorine and chlorinated alkaline detergent were not detected by the regulatory tests. In the tests for neutralizing substances, sodium hydroxide could not be detected when acidity was accurately neutralized. The yogurt culture test gave results similar to those obtained by the official tests for the detection of specific substances. Concentrations of 0.05% of formaldehyde, 0.003% of hydrogen peroxide and 0.013% of sodium hypochlorite significantly reduced (P

  20. General analytical shakedown solution for structures with kinematic hardening materials

    Science.gov (United States)

    Guo, Baofeng; Zou, Zongyuan; Jin, Miao

    2016-09-01

    The effect of kinematic hardening behavior on the shakedown behavior of structures has been investigated by performing shakedown analysis for some specific problems. The results obtained so far only show that the shakedown limit loads of structures with a kinematic hardening model are larger than or equal to those with a perfectly plastic model of the same initial yield stress. To further investigate the rules governing the different shakedown behaviors of kinematic hardening structures, the extended shakedown theorem for limited kinematic hardening is applied, the shakedown condition is proposed, and a general analytical solution for the structural shakedown limit load is derived. The analytical shakedown limit loads for fully reversed cyclic loading and non-fully reversed cyclic loading are then given based on the general solution. The resulting analytical solution is applied to some specific problems: a hollow specimen subjected to tension and torsion, a flanged pipe subjected to pressure and axial force, and a square plate with a small central hole subjected to biaxial tension. The results obtained are compared with those in the literature and are consistent with each other. Based on the resulting general analytical solution, the rules governing the general effects of kinematic hardening behavior on the shakedown behavior of structures are clearly revealed.
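
    The extended static (Melan-type) shakedown condition referred to above can be written schematically as follows, with sigma^E the purely elastic stress response, rho-bar a time-independent self-equilibrated residual stress field, pi-bar a time-independent back-stress field, f the yield function, sigma_Y the initial yield stress and sigma_U the hardening-limit stress; this is a generic textbook form, not the paper's exact statement.

      % Shakedown occurs if time-independent fields \bar\rho, \bar\pi exist with
      f\bigl(\sigma^{E}(\mathbf{x},t) + \bar{\rho}(\mathbf{x})
             - \bar{\pi}(\mathbf{x})\bigr) \le \sigma_Y ,
      \qquad
      f\bigl(\bar{\pi}(\mathbf{x})\bigr) \le \sigma_U - \sigma_Y
      % for all points x and all load histories in the prescribed load domain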

  1. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    Science.gov (United States)

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tends to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications are reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in the developing country, Saudi Arabia. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  2. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  3. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    Science.gov (United States)

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin as well as on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  4. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    Directory of Open Access Journals (Sweden)

    Mafalda Jesus

    2016-01-01

    Full Text Available Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin as well as on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  5. 3-D discrete analytical ridgelet transform.

    Science.gov (United States)

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform within the discrete analytical geometry theory, by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin, defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension, the Local-DART (with smooth windowing), to the denoising of 3-D images and color video. These experimental results show that simple thresholding of the 3-D DART coefficients is efficient.
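
    As a conceptual 2-D analogue of the Fourier-domain construction described above (much simpler than the paper's 3-D discrete analytical lines with arithmetical thickness), the Python sketch below samples a nearest-neighbour discrete line through the origin of a 2-D FFT and recovers a Radon projection via the central-slice theorem; the toy image and angle are arbitrary.

      import numpy as np

      n = 64
      img = np.zeros((n, n))
      img[24:40, 24:40] = 1.0                        # toy image: a square
      F = np.fft.fftshift(np.fft.fft2(img))          # centred 2-D spectrum

      theta = np.deg2rad(30.0)                       # direction of the discrete line
      r = np.arange(-n // 2, n // 2)
      ix = np.clip(np.round(n // 2 + r * np.cos(theta)).astype(int), 0, n - 1)
      iy = np.clip(np.round(n // 2 + r * np.sin(theta)).astype(int), 0, n - 1)
      central_slice = F[iy, ix]                      # nearest-neighbour discrete line

      # Central-slice theorem: the 1-D inverse FFT of the slice approximates
      # the Radon projection of the image at angle theta.
      projection = np.real(np.fft.ifft(np.fft.ifftshift(central_slice)))
      print(projection.shape)                        # (64,)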

  6. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  7. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  8. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol for sample handling and preparation prior to spectroscopic analysis. The pre-analytical phase is one of the most important sources of analytical errors. For the technique to be translated into clinics, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on aspects of the pre-analytical phase in the development of high-throughput Fourier transform infrared (FTIR) spectroscopy of some of the most common biofluids, such as serum, plasma and bile. Pre-analytical considerations that can impact either the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or the spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operator dependence…), and consequently the quality and reproducibility of spectral data, will be discussed in this report.

  9. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Full Text Available Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter's (1983) five forces that drive industry competition, complemented by Nalebuff/Brandenburger's (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model's outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it combines, for the first time and explicitly, elements of the general environment, the industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to other industries apart from the airline industry by substituting the element "airline business model" with other industries' corresponding elements related to their different specific business models.

  10. 100-N Area Decision Unit Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-N Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  11. Analytic, two fluid, field reversed configuration equilibrium with sheared rotation

    International Nuclear Information System (INIS)

    Sobehart, J.R.

    1989-01-01

    A two fluid model is used to derive an analytical equilibrium for elongated field reversed configurations containing shear in both the electron and ion velocity profiles. Like some semiempirical models used previously, the analytical expressions obtained provide a satisfactory fit to the experimental results for all radii with a few key parameters. The present results reduce to the rigid rotor model and the infinite conductivity case for a specific choice of the parameters

  12. Web Analytics 2.0 The Art of Online Accountability and Science of Customer Centricity

    CERN Document Server

    Kaushik, Avinash

    2009-01-01

    Adeptly address today's business challenges with this powerful new book from web analytics thought leader Avinash Kaushik. Web Analytics 2.0 presents a new framework that will permanently change how you think about analytics. It provides specific recommendations for creating an actionable strategy, applying analytical techniques correctly, solving challenges such as measuring social media and multichannel campaigns, achieving optimal success by leveraging experimentation, and employing tactics for truly listening to your customers. The book will help your organization become more data driven w

  13. Evaluation of the effect of coal cleaning of fugitive elements. Part II. Analytical methods. Final report, Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Bosshart, R.E.; Price, A.A.; Ford, C.T.

    1980-03-01

    This report contains the analytical and test methods which were used routinely at Bituminous Coal Research, Inc. during the project. The procedures contained herein should aid coal industry laboratories and others, including commercial laboratories, who might be required to determine trace elements in coal. Some of the procedures have been presented in previous BCR reports; however, this report includes additional procedures which are described in greater detail. Also presented are many of the more basic coal methods which have been in use at BCR for many years, or which have been adapted or refined from other standard reference sources for coal and water. The basis for choosing specific analytical procedures for trace elements in coal is somewhat complex. At BCR, atomic absorption was selected as the basic method in the development of these procedures. The choice was based on sensitivity, selectivity, accuracy, precision, practicability, and economy. Whenever possible, the methods developed had to be both adequate and amenable for use by coal industry laboratories by virtue of relative simplicity and cost. This is not to imply that the methods described are simple or inexpensive; however, atomic absorption techniques do meet these criteria in relation to more complex and costly methods such as neutron activation, mass spectrometry, and X-ray fluorescence, some of which require highly specialized personnel as well as access to sophisticated nuclear and computational facilities. Many of the analytical procedures for trace elements in coal have been developed or adapted specifically for the BCR studies. Their presentation is the principal purpose of this report.

  14. Molecular property diagnostic suite (MPDS): Development of disease-specific open source web portals for drug discovery.

    Science.gov (United States)

    Nagamani, S; Gaur, A S; Tanneeru, K; Muneeswaran, G; Madugula, S S; Consortium, Mpds; Druzhilovskiy, D; Poroikov, V V; Sastry, G N

    2017-11-01

    Molecular property diagnostic suite (MPDS) is a Galaxy-based open source drug discovery and development platform. MPDS web portals are designed for several diseases, such as tuberculosis, diabetes mellitus, and other metabolic disorders, specifically aimed to evaluate and estimate the drug-likeness of a given molecule. MPDS consists of three modules, namely data libraries, data processing, and data analysis tools which are configured and interconnected to assist drug discovery for specific diseases. The data library module encompasses vast information on chemical space, wherein the MPDS compound library comprises 110.31 million unique molecules generated from public domain databases. Every molecule is assigned with a unique ID and card, which provides complete information for the molecule. Some of the modules in the MPDS are specific to the diseases, while others are non-specific. Importantly, a suitably altered protocol can be effectively generated for another disease-specific MPDS web portal by modifying some of the modules. Thus, the MPDS suite of web portals shows great promise to emerge as disease-specific portals of great value, integrating chemoinformatics, bioinformatics, molecular modelling, and structure- and analogue-based drug discovery approaches.

  15. Specific classification of financial analysis of enterprise activity

    Directory of Open Access Journals (Sweden)

    Synkevych Nadiia I.

    2014-01-01

    Full Text Available Although one can find a wide variety of classifications of the types of financial analysis of enterprise activity in modern scientific literature, differing in their approach to classification, in the number of classification features, and in content, no comprehensive comparison and analysis of these existing classifications has been carried out. This explains the urgency of this study. The article reviews the classifications of types of financial analysis proposed by various scholars and presents its own approach to this problem. Based on the results of the analysis, the article refines and builds up a specific classification of financial analysis of enterprise activity, proposing classification by the following features: objects, subjects, goals of study, automation level, time period of the analytical base, scope of study, organisation system, classification features of the subject, spatial belonging, sufficiency, information sources, periodicity, criterial base, method of data selection for analysis, and time direction. All types of financial analysis differ significantly in their inherent properties and parameters depending on the goals of the analysis. The developed specific classification enables the subjects of financial analysis of enterprise activity to identify the specific type of financial analysis that correctly meets the set goals.

  16. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Science.gov (United States)

    2010-07-01

    ... approves the use of E. coli as a fecal indicator for source water monitoring under this paragraph (a). If the repeat sample collected from the ground water source is E. coli positive, the system must comply... listed in paragraph (c)(2) of this section for the presence of E. coli, enterococci, or coliphage...

  17. Renewable energy integration in smart grids-multicriteria assessment using the fuzzy analytical hierarchy process

    OpenAIRE

    JANJIC, ALEKSANDAR; SAVIC, SUZANA; VELIMIROVIC, LAZAR; NIKOLIC, VESNA

    2015-01-01

    Unlike the traditional way of assessing the efficiency of renewable energy source integration, the smart grid concept introduces new goals and objectives regarding increased use of renewable electricity sources, grid security, energy conservation, energy efficiency, and a deregulated energy market. Possible benefits brought by renewable source integration are evaluated by the degree of approach to the ideal smart grid. In this paper, the fuzzy analytical hierarchy process methodology for the...
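
    The abstract is truncated in this record, but the underlying technique is standard. Below is a minimal sketch of how fuzzy AHP weights can be derived, using Buckley's geometric-mean method with triangular fuzzy numbers; the criteria names and pairwise comparison values are hypothetical, not taken from the paper.

```python
# Illustrative fuzzy AHP weighting (Buckley's geometric-mean method).
# Criteria and comparison values are hypothetical placeholders.
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for 3 smart-grid criteria:
# renewables share, grid security, energy efficiency.
F = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

# Fuzzy geometric mean of each row, component-wise over (l, m, u).
g = np.prod(F, axis=1) ** (1 / F.shape[0])

# Fuzzy weights: divide each row mean by the column totals (bounds reversed,
# so lower bounds are divided by the upper-bound total and vice versa).
total = g.sum(axis=0)          # (sum_l, sum_m, sum_u)
w_fuzzy = g / total[::-1]

# Defuzzify by centroid and normalize to crisp weights.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(dict(zip(["renewables", "security", "efficiency"], w.round(3))))
```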

  18. Sources of pre-analytical variations in yield of DNA extracted from blood samples: analysis of 50,000 DNA samples in EPIC.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

    Full Text Available The European Prospective Investigation into Cancer and Nutrition (EPIC) is a long-term, multi-centric prospective study in Europe investigating the relationships between cancer and nutrition. This study has served as a basis for a number of Genome-Wide Association Studies (GWAS) and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples have been extracted using an automated DNA extraction platform. Here we have evaluated the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as center of subject recruitment, age, gender, body-mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were observed with degree of haemolysis and with center of subject recruitment. Age, gender, body-mass index, cancer case or control status and tobacco consumption also significantly impacted DNA yield. Feedback from laboratories which have analyzed DNA with different SNP genotyping technologies demonstrates that the vast majority of samples (approximately 88%) performed adequately in different types of assays. To our knowledge this study is the largest to date to evaluate the sources of pre-analytical variations in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.

  19. Analytical detection methods for irradiated foods

    International Nuclear Information System (INIS)

    1991-03-01

    The present publication is a review of the scientific literature on the analytical identification of foods treated with ionizing radiation and the quantitative determination of the absorbed dose of radiation. Because of the extremely low level of chemical changes resulting from irradiation, or because of the lack of specificity to irradiation of any chemical changes, only a few methods for the quantitative determination of absorbed dose have shown promise so far. On the other hand, the present review has identified several possible methods which could be used, following further research and testing, for the identification of irradiated foods. An IAEA Co-ordinated Research Programme on Analytical Detection Methods for Irradiation Treatment of Food ('ADMIT'), established in 1990, is currently investigating many of the methods cited in the present document. Refs and tab

  20. PROGRESSIVE DATA ANALYTICS IN HEALTH INFORMATICS USING AMAZON ELASTIC MAPREDUCE (EMR

    Directory of Open Access Journals (Sweden)

    J S Shyam Mohan

    2016-04-01

    Full Text Available Identifying, diagnosing, and treating cancer involves a thorough investigation requiring the collection of large volumes of data (big data) from multiple, heterogeneous sources, which is helpful for effective and quick decision making. Similarly, data analytics is used to find remedial actions for newly emerging diseases whose data are spread across multiple warehouses. Analytics can be performed on collected or available data from various data clusters that contain pieces of data. We provide an effective framework for decision making using Amazon EMR. Through various experiments done on different biological datasets, we reveal the advantages of the proposed model and present numerical results. These results indicate that the proposed framework can efficiently perform analytics over any biological dataset and obtain results in optimal time, thereby maintaining the quality of the result.
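
    As an illustration of the kind of workflow the paper builds on, the following sketch submits a single Spark step to an EMR cluster with boto3; the cluster name, S3 path, and instance settings are hypothetical placeholders, not the authors' configuration.

```python
# Minimal sketch: launching an EMR cluster with one Spark step via boto3.
# All names, paths, and instance settings are hypothetical placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="genomics-analytics",                 # hypothetical cluster name
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate after the step
    },
    Steps=[{
        "Name": "analyze-biological-dataset",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/jobs/analyze.py"],  # hypothetical
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster:", response["JobFlowId"])
```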

  1. A method for determining the analytical form of a radionuclide depth distribution using multiple gamma spectrometry measurements

    Energy Technology Data Exchange (ETDEWEB)

    Dewey, Steven Clifford, E-mail: sdewey001@gmail.com [United States Air Force School of Aerospace Medicine, Occupational Environmental Health Division, Health Physics Branch, Radiation Analysis Laboratories, 2350 Gillingham Drive, Brooks City-Base, TX 78235 (United States); Whetstone, Zachary David, E-mail: zacwhets@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States); Kearfott, Kimberlee Jane, E-mail: kearfott@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States)

    2011-06-15

    When characterizing environmental radioactivity, whether in the soil or within concrete building structures undergoing remediation or decommissioning, it is highly desirable to know the radionuclide depth distribution. This is typically modeled using continuous analytical expressions, whose forms are believed to best represent the true source distributions. In situ gamma ray spectroscopic measurements are combined with these models to fully describe the source. Currently, the choice of analytical expressions is based upon prior experimental core sampling results at similar locations, any known site history, or radionuclide transport models. This paper presents a method, employing multiple in situ measurements at a single site, for determining the analytical form that best represents the true depth distribution present. The measurements can be made using a variety of geometries, each of which has a different sensitivity variation with source spatial distribution. Using non-linear least squares numerical optimization methods, the results can be fit to a collection of analytical models and the parameters of each model determined. The analytical expression that results in the fit with the lowest residual is selected as the most accurate representation. A cursory examination is made of the effects of measurement errors on the method. - Highlights: > A new method for determining radionuclide distribution as a function of depth is presented. > Multiple measurements are used, with enough measurements to determine the unknowns in analytical functions that might describe the distribution. > The measurements must be as independent as possible, which is achieved through special collimation of the detector. > Although the effects of measurement errors on the results may be significant, an improvement over other methods is anticipated.
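
    The model-selection step described above can be sketched as follows: fit several candidate analytical depth profiles to the measurements and keep the one with the lowest residual. The data below are synthetic and the candidate forms are merely plausible examples; the paper's actual fits use detector responses from multiple collimated geometries.

```python
# Sketch of the model-selection idea: fit several candidate analytical depth
# profiles to a set of measurements and keep the one with the lowest residual.
# The "measurements" here are synthetic; real use needs detector response data.
import numpy as np
from scipy.optimize import curve_fit

depth = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # cm, hypothetical
activity = np.array([0.82, 0.65, 0.41, 0.17, 0.03])  # relative units

models = {
    "exponential": lambda z, a, k: a * np.exp(-k * z),
    "linear":      lambda z, a, b: np.clip(a - b * z, 0, None),
    "gaussian":    lambda z, a, s: a * np.exp(-(z / s) ** 2),
}

best = None
for name, f in models.items():
    try:
        popt, _ = curve_fit(f, depth, activity, p0=(1.0, 0.5), maxfev=5000)
    except RuntimeError:
        continue  # fit failed to converge; skip this candidate form
    resid = np.sum((f(depth, *popt) - activity) ** 2)
    if best is None or resid < best[2]:
        best = (name, popt, resid)

print("Best-fitting form:", best[0], "params:", best[1].round(3))
```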

  2. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Organization for Standardization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  3. An analytical study of double bend achromat lattice

    Energy Technology Data Exchange (ETDEWEB)

    Fakhri, Ali Akbar, E-mail: fakhri@rrcat.gov.in; Kant, Pradeep; Singh, Gurnam; Ghodke, A. D. [Raja Ramanna Centre for Advanced Technology, Indore 452 013 (India)

    2015-03-15

    In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice, a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance under the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of QF and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which the QF-QD-QF configuration has been adopted in the achromat part, is also presented.

  4. An analytical study of double bend achromat lattice.

    Science.gov (United States)

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A D

    2015-03-01

    In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice, a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance under the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of QF and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which the QF-QD-QF configuration has been adopted in the achromat part, is also presented.

  5. An analytical study of double bend achromat lattice

    International Nuclear Information System (INIS)

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A. D.

    2015-01-01

    In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice, a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance under the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of QF and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which the QF-QD-QF configuration has been adopted in the achromat part, is also presented.
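
    As a minimal illustration of the thin-lens approach mentioned in the three records above, the following sketch composes 2x2 horizontal transfer matrices for a QF-QD-QF arrangement; the drift lengths and focal strengths are illustrative, not the Indus-2 values.

```python
# Thin-lens sketch of an achromat cell: 2x2 horizontal transfer matrices for
# drifts and quadrupoles, composed in a QF-QD-QF arrangement.
import numpy as np

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    # f > 0 focusing (QF), f < 0 defocusing (QD) in the horizontal plane
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# QF - drift - QD - drift - QF, composed right-to-left along the beamline.
# Lengths in metres and focal lengths are illustrative only.
M = thin_quad(1.2) @ drift(0.5) @ thin_quad(-0.8) @ drift(0.5) @ thin_quad(1.2)
print(M)
```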

  6. Making advanced analytics work for you.

    Science.gov (United States)

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberate approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  7. Design specification for the European Spallation Source neutron generating target element

    International Nuclear Information System (INIS)

    Aguilar, A.; Sordo, F.; Mora, T.; Mena, L.; Mancisidor, M.; Aguilar, J.; Bakedano, G.; Herranz, I.; Luna, P.; Magan, M.; Vivanco, R.; Jimenez-Villacorta, F.; Sjogreen, K.; Oden, U.; Perlado, J.M.

    2017-01-01

    The paper addresses some of the most relevant issues concerning the thermal hydraulics and radiation damage of the neutron generation target to be built at the European Spallation Source, as recently approved after a critical design review. The target unit consists of a set of tungsten blocks placed inside a wheel of 2.5 m diameter which rotates at some 0.5 Hz in order to distribute the heat generated by the incoming protons, which reach the target in the radial direction. The spallation material elements are composed of an array of tungsten pieces which rest on a rotating steel support (the cassette) and are distributed in a cross-flow configuration. The thermal, mechanical and radiation effects resulting from the impact of a 2 GeV proton pulse are analysed in detail, and the inventory of spallation products is evaluated. The current design is found to conform to specifications and to be robust enough to deal with several accident scenarios.

  8. Design specification for the European Spallation Source neutron generating target element

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar, A. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Sordo, F., E-mail: fernando.sordo@essbilbao.org [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Mora, T. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Mena, L. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Mancisidor, M.; Aguilar, J.; Bakedano, G.; Herranz, I.; Luna, P. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Magan, M.; Vivanco, R. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Jimenez-Villacorta, F. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Sjogreen, K.; Oden, U. [European Spallation Source ERIC, P.O Box 176, SE-221 00 Lund (Sweden); Perlado, J.M. [Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); and others

    2017-06-01

    The paper addresses some of the most relevant issues concerning the thermal hydraulics and radiation damage of the neutron generation target to be built at the European Spallation Source, as recently approved after a critical design review. The target unit consists of a set of tungsten blocks placed inside a wheel of 2.5 m diameter which rotates at some 0.5 Hz in order to distribute the heat generated by the incoming protons, which reach the target in the radial direction. The spallation material elements are composed of an array of tungsten pieces which rest on a rotating steel support (the cassette) and are distributed in a cross-flow configuration. The thermal, mechanical and radiation effects resulting from the impact of a 2 GeV proton pulse are analysed in detail, and the inventory of spallation products is evaluated. The current design is found to conform to specifications and to be robust enough to deal with several accident scenarios.

  9. Integration Of Data From Heterogeneous Sources Using Etl Technology.

    Directory of Open Access Journals (Sweden)

    Marek Macura

    2014-01-01

    Full Text Available Data integration is a crucial issue in environments of heterogeneous data sources, and such heterogeneity is now widespread. Whenever we want to gain useful information and knowledge from various data sources, we must solve the data integration problem so that appropriate analytical methods can be applied to comprehensive and uniform data. Such activity is known as the knowledge discovery from data process. Approaches to the data integration problem are therefore very interesting and bring us closer to the "age of information". The paper presents an architecture which implements the knowledge discovery from data process. The solution combines ETL technology with the wrapper layer known from mediated systems. It also provides semantic integration through a connections mechanism between data elements. The solution allows for integration of any data sources and implementation of analytical methods in one environment. The proposed environment is verified by applying it to data sources on the foundry industry.
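
    A minimal sketch of the extract-transform-load pattern the paper builds on: two heterogeneous sources (a CSV file and a JSON file) are mapped to a uniform schema and loaded into SQLite. File names and fields are hypothetical; the paper's architecture additionally includes a wrapper layer and semantic connections between data elements.

```python
# Minimal ETL sketch: extract from heterogeneous sources (CSV and JSON),
# transform to a uniform schema, load into SQLite. Names are hypothetical.
import csv, json, sqlite3

def extract():
    with open("furnace_log.csv", newline="") as f:   # hypothetical source 1
        for row in csv.DictReader(f):
            yield {"ts": row["timestamp"], "temp_c": float(row["temp"])}
    with open("sensors.json") as f:                  # hypothetical source 2
        for rec in json.load(f):
            # transform: convert Fahrenheit readings to the uniform Celsius schema
            yield {"ts": rec["time"], "temp_c": (rec["temp_f"] - 32) / 1.8}

def load(rows):
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, temp_c REAL)")
    con.executemany("INSERT INTO readings VALUES (?, ?)",
                    [(r["ts"], r["temp_c"]) for r in rows])
    con.commit()
    con.close()

load(extract())
```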

  10. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    International Nuclear Information System (INIS)

    Caballero-Casero, N.; Lunar, L.; Rubio, S.

    2016-01-01

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent, and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC-MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. Main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk from mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied, and they are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  11. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Casero, N.; Lunar, L.; Rubio, S., E-mail: qa1rubrs@uco.es

    2016-02-18

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent, and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC-MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. Main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk from mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied, and they are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  12. A Low-Cost, Simplified Platform of Interchangeable, Ambient Ionization Sources for Rapid, Forensic Evidence Screening on Portable Mass Spectrometric Instrumentation

    Directory of Open Access Journals (Sweden)

    Patrick W. Fedick

    2018-03-01

    Full Text Available Portable mass spectrometers (MS are becoming more prevalent due to improved instrumentation, commercialization, and the robustness of new ionization methodologies. To increase utility towards diverse field-based applications, there is an inherent need for rugged ionization source platforms that are simple, yet robust towards analytical scenarios that may arise. Ambient ionization methodologies have evolved to target specific real-world problems and fulfill requirements of the analysis at hand. Ambient ionization techniques continue to advance towards higher performance, with specific sources showing variable proficiency depending on application area. To realize the full potential and applicability of ambient ionization methods, a selection of sources may be more prudent, showing a need for a low-cost, flexible ionization source platform. This manuscript describes a centralized system that was developed for portable MS systems that incorporates modular, rapidly-interchangeable ionization sources comprised of low-cost, commercially-available parts. Herein, design considerations are reported for a suite of ambient ionization sources that can be crafted with minimal machining or customization. Representative spectral data is included to demonstrate applicability towards field processing of forensic evidence. While this platform is demonstrated on portable instrumentation, retrofitting to lab-scale MS systems is anticipated.

  13. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    Science.gov (United States)

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  14. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    Science.gov (United States)

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decision making and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, when diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism, because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  15. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  16. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept is today widely leading the market. This PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work. This thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in the complex vertical flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed

  17. Improving the trust in results of numerical simulations and scientific data analytics

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Hovland, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Peterka, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Carolyn [Argonne National Lab. (ANL), Argonne, IL (United States); Snir, Marc [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-30

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general

  18. Analytic sensing for multi-layer spherical models with application to EEG source imaging

    OpenAIRE

    Kandaswamy, Djano; Blu, Thierry; Van De Ville, Dimitri

    2013-01-01

    Source imaging maps back boundary measurements to underlying generators within the domain; e.g., retrieving the parameters of the generating dipoles from electrical potential measurements on the scalp, such as in electroencephalography (EEG). Fitting such a parametric source model is non-linear in the positions of the sources, and renewed interest in mathematical imaging has led to several promising approaches. One important step in these methods is the application of a sensing principle that ...

  19. Crowd-Sourced Intelligence Agency: Prototyping counterveillance

    OpenAIRE

    Jennifer Gradecki; Derek Curry

    2017-01-01

    This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA), can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT) system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automate...

  20. Analytic expressions for mode conversion in a plasma with a parabolic density profile: Generalized results

    International Nuclear Information System (INIS)

    Hinkel-Lipsker, D.E.; Fried, B.D.; Morales, G.J.

    1993-01-01

    This study provides an analytic solution to the general problem of mode conversion in an unmagnetized plasma. Specifically, an electromagnetic wave of frequency ω propagating through a plasma with a parabolic density profile of scale length Lp is examined. The mode conversion points are located a distance Δ0 from the peak of the profile, where the electron plasma frequency ωp(z) matches the wave frequency ω. The corresponding reflection, transmission, and mode conversion coefficients are expressed analytically in terms of parabolic cylinder functions for all values of Δ0. The method of solution is based on a source approximation technique that is valid when the electromagnetic and electrostatic scale lengths are well separated. For large Δ0, i.e., (cLp/ω)^(1/2) ≪ Δ0 < Lp, the appropriately scaled result [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 559 (1992)] for a linear density profile is recovered as the parabolic cylinder functions asymptotically become Airy functions. When Δ0 → 0, the special case of conversion at the peak of the profile [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 1772 (1992)] is obtained.

  1. Analytic theory of the gyrotron

    International Nuclear Information System (INIS)

    Lentini, P.J.

    1989-06-01

    An analytic theory is derived for a gyrotron operating in the linear gain regime. The gyrotron is a coherent source of microwave and millimeter wave radiation based on an electron beam emitting at the cyclotron resonance Ω in a strong, uniform magnetic field. Relativistic equations of motion and first order perturbation theory are used. Results are obtained in both laboratory and normalized variables. An expression for cavity threshold gain is derived in the linear regime. An analytic expression for the electron phase angle in momentum space shows that the effect of the RF field is to form bunches: the phase equals the unperturbed transit phase plus a correction term which varies as the sine of the input phase angle. The expression for the phase angle is plotted, and bunching effects in phase (0) and out of phase (-π) with respect to the RF field are evident for detunings leading to gain and absorption, respectively. For exact resonance, field frequency ω = Ω, a bunch also forms at a phase of -π/2. This beam yields the same energy exchange with the RF field as an unbunched (nonrelativistic) beam. 6 refs., 10 figs

  2. Enabling analytics on sensitive medical data with secure multi-party computation

    NARCIS (Netherlands)

    M. Veeningen (Meilof); S. Chatterjea (Supriyo); A.Z. Horváth (Anna Zsófia); G. Spindler (Gerald); E. Boersma (Eric); P. van der Spek (Peter); O. van der Galiën (Onno); J. Gutteling (Job); W. Kraaij (Wessel); P.J.M. Veugen (Thijs)

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multiparty computation can enable such data
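
    The record is truncated, but the core idea of secure multiparty computation can be illustrated with a toy additive secret-sharing scheme: several parties jointly compute a sum without any party seeing another's input. This is a didactic sketch, not the protocol used in the paper.

```python
# Toy illustration of the secure multi-party computation principle via
# additive secret sharing: three parties jointly compute a sum of private
# values without revealing any individual input.
import secrets

P = 2**61 - 1  # a large prime modulus

def share(x, n=3):
    """Split secret x into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

inputs = [120, 85, 240]                     # each party's private value
all_shares = [share(x) for x in inputs]

# Party i locally adds the i-th share of every input...
partial_sums = [sum(s[i] for s in all_shares) % P for i in range(3)]
# ...and only the combined partial sums reveal the total.
print(sum(partial_sums) % P)                # 445, with no input disclosed
```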

  3. Interpretative approaches to identifying sources of hydrocarbons in complex contaminated environments

    International Nuclear Information System (INIS)

    Sauer, T.C.; Brown, J.S.; Boehm, P.D.

    1993-01-01

    Recent advances in analytical instrumental hardware and software have permitted the use of more sophisticated approaches in identifying or fingerprinting sources of hydrocarbons in complex matrix environments. In natural resource damage assessments and contaminated site investigations of both terrestrial and aquatic environments, chemical fingerprinting has become an important interpretative tool. The alkyl homologues of the major polycyclic and heterocyclic aromatic hydrocarbons (e.g., phenanthrenes/anthracenes, dibenzothiophenes, chrysenes) have been found to be the most valuable hydrocarbons in differentiating hydrocarbon sources, but there are other hydrocarbon analytes, such as the chemical biomarkers steranes and triterpanes and the alkyl homologues of benzene, and chemical methodologies, such as scanning UV fluorescence, that have been found to be useful in certain environments. This presentation will focus on recent data interpretative approaches for hydrocarbon source identification assessments. Selection of appropriate target analytes and data quality requirements will be discussed, and example cases, including results from the Arabian Gulf War oil spill, will be presented

  4. Big data, mining, and analytics components of strategic decision making

    CERN Document Server

    Kudyba, Stephan

    2014-01-01

    Kudyba again has put together an all-star cast in his new book focused on leveraging data, including the more traditional structured and also the unstructured, incomprehensible sources, to generate actionable information. This most current book provides a framework for both the advanced data jockeys and the more analytically focused data-driven decision makers. A must-read for those wishing to be on the cutting edge of leveraging the multitude of data sources that businesses capture today.-Jeff Nicola, VP of Retail Sales at one of the nation's largest health insurance firms, and a Six

  5. Analytical design of sensors for measuring atmospheric temperature during terminal phase of planetary entry

    Science.gov (United States)

    Millard, J. P.; Green, M. J.; Sommer, S. C.

    1972-01-01

    An analytical study was conducted to develop a sensor for measuring the temperature of a planetary atmosphere from an entry vehicle traveling at supersonic speeds and having a detached shock. Such a sensor has been used in the Planetary Atmosphere Experiments Test Probe (PAET) mission and is planned for the Viking-Mars mission. The study specifically considered butt-welded thermocouple sensors stretched between two support posts; however, the factors considered are sufficiently general to apply to other sensors as well. This study included: (1) an investigation of the relation between sensor-measured temperature and free-stream conditions; (2) an evaluation of the effects of extraneous sources of heat; (3) the development of a computer program for evaluating sensor response during entry; and (4) a parametric study of sensor design characteristics.

  6. Analytical Solution for 2D Inter-Well Porous Flow in a Rectangular Reservoir

    Directory of Open Access Journals (Sweden)

    Junfeng Ding

    2018-04-01

    Full Text Available Inter-well fluid flows through porous media are commonly encountered in the production of groundwater, oil, and geothermal energy. In this paper, inter-well porous flow inside a rectangular reservoir is solved based on complex variable function theory combined with the method of mirror images. In order to derive the solution analytically, the inter-well flow is modeled as a 2D flow in a homogeneous and isotropic porous medium. The resulting exact analytical solution takes the form of an infinite series, but it can be truncated to give a high-accuracy approximation. The applications of the obtained analytical solution are demonstrated for nine cases of inter-well porous flow associated with enhanced geothermal systems, and the convergence properties of the truncated series are investigated. It is shown that the convergence rate of the truncated series increases with the symmetry level of the well distribution inside the reservoir, and that the adoption of the Euler transform significantly accelerates the convergence of the alternating series cases associated with asymmetric well distributions. In principle, the analytical solution proposed in this paper can be applied to other scientific and engineering fields, as long as the involved problem is governed by the 2D Laplace equation in a rectangular domain and subject to similar source/sink and boundary conditions, i.e., isolated point sources/sinks and uniform Dirichlet or homogeneous Neumann boundary conditions.
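
    A didactic sketch of the mirror-image construction on which the solution is based: each well inside the rectangle is reflected across the boundaries to a lattice of images, and the resulting series is truncated. The geometry, well positions, and truncation order below are hypothetical, and the paper's full solution (including the Euler-transform acceleration) is not reproduced.

```python
# Sketch of the mirror-image construction for steady 2D flow between a
# balanced injector-producer pair (a doublet) in a rectangle with no-flow
# walls: each well is reflected to images at (2*i*Lx +/- xw, 2*j*Ly +/- yw)
# and the series is truncated at N reflections.
import numpy as np

def potential(x, y, wells, Lx, Ly, N=40):
    """Dimensionless potential at (x, y); wells = [(xw, yw, strength), ...]."""
    phi = 0.0
    for i in range(-N, N + 1):
        for j in range(-N, N + 1):
            for sx in (1, -1):
                for sy in (1, -1):
                    for xw, yw, q in wells:
                        r2 = (x - (2*i*Lx + sx*xw))**2 + (y - (2*j*Ly + sy*yw))**2
                        # q * ln(r) / (2*pi), written via ln(r^2) / (4*pi)
                        phi += q * np.log(r2) / (4 * np.pi)
    return phi

# Unit injector (+1) and producer (-1) inside a hypothetical 1000 x 500 box.
wells = [(250.0, 250.0, +1.0), (750.0, 250.0, -1.0)]
print(potential(500.0, 250.0, wells, 1000.0, 500.0))  # ~0 at the midpoint, by symmetry
```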

  7. The analytical investigation of the super-Gaussian pump source on ...

    Indian Academy of Sciences (India)

    In this paper, we assume that the fiber core and first cladding are exposed to a pump source with a super-Gaussian profile of order four. The effects of this non-uniform heat deposition on thermal, stress and thermo-optic properties, such as the temperature-dependent change of refractive index and thermally induced stress, have ...
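
    For reference, a super-Gaussian pump profile of order n is commonly written I(r) = I0 exp(-2 (r/w)^(2n)); conventions for the factor of 2 vary, so the sketch below is an assumption about the form rather than the paper's exact expression.

```python
# Super-Gaussian pump profile of order n in one common convention; the
# factor of 2 and the exponent form are assumptions, not the paper's formula.
import numpy as np

def super_gaussian(r, I0=1.0, w=50e-6, n=4):
    return I0 * np.exp(-2.0 * (r / w) ** (2 * n))

r = np.linspace(0, 100e-6, 5)
print(super_gaussian(r))  # near-flat top out to ~w, then a steep roll-off
```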

  8. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principles of analytic trigonometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  9. Role of modern analytical techniques in the production of uranium metal

    International Nuclear Information System (INIS)

    Hareendran, K.N.; Roy, S.B.

    2009-01-01

    Production of nuclear grade uranium metal conforming to its stringent specification with respect to metallic and non-metallic impurities necessitates implementation of a comprehensive quality control regime. The founding members of the Uranium Metal Plant realised the importance of this aspect of metal production, and a quality control laboratory was set up as part of the production plant. In the initial stages of its existence, the laboratory mainly catered to the process control analysis of the plant process samples, while the Spectroscopy Division and Analytical Division of BARC provided analysis of trace metallic impurities in the intermediates as well as in the product uranium metal. This laboratory also provided invaluable R and D support for the optimization of the process involving both calciothermy and magnesiothermy. Prior to 1985, the analytical procedures used were limited to classical methods of analysis with minimal instrumental procedures. The first major analytical instrument, a flame AAS, was installed in 1985, and a beginning was made on trace analysis. During the last 15 years, however, the Quality Control Section has modernized the analytical set-up by acquiring appropriate instruments. Presently the facility has implemented a complete quality control and quality assurance program covering all aspects of uranium metal production, viz. analysis of raw materials, process samples and waste disposal samples, and determination of all the specification elements in uranium metal. The current analytical practices followed in QCS are presented here

  10. Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.

    Science.gov (United States)

    Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-12-01

    Developed under the Intelligence Advanced Research Projects Activity (IARPA) Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years.
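
    A minimal sketch of a ZeroMQ-plus-JSON backbone in the style the abstract describes: a publisher emits JSON warnings and a subscriber consumes them. The endpoint and message fields are hypothetical, not EMBERS internals; requires pyzmq.

```python
# Minimal ZeroMQ pub/sub sketch with JSON as the wire format.
import time
import zmq

ctx = zmq.Context()

pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all messages

time.sleep(0.2)  # let the subscription propagate before publishing

# A hypothetical warning in an EMBERS-like JSON wire format
pub.send_json({"event": "civil_unrest", "country": "BR",
               "date": "2014-06-12", "confidence": 0.71})
print(sub.recv_json())
```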

  11. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.

  12. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155
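
    A sketch of the kind of DynamoDB write such a module performs with boto3; the table name, key schema, and item fields are hypothetical, and the table is assumed to already exist.

```python
# Sketch of storing one survey response in DynamoDB with boto3.
# Table name and fields are hypothetical; the table is assumed to exist
# with a "device_id" partition key and "ts" sort key.
import time
import boto3

table = boto3.resource("dynamodb", region_name="us-east-1").Table("SurveyResponses")

table.put_item(Item={
    "device_id": "a3f9c2e1",        # anonymized install ID
    "ts": int(time.time()),         # epoch seconds as the sort key
    "question": "blood_glucose_mgdl",
    "answer": "112",
    "app_version": "1.4.2",
})
```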

  13. Specific absorbed fractions of energy at various ages from internal photon sources: 3, Five-year-old

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

    Specific absorbed fractions (PHI's) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. In this volume PHI-values are tabulated for a five-year-old or 19-kg person. These PHI-values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate PHI in the endosteal cells and can better estimate PHI in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 12 refs., 2 tabs

  14. Big data analytics for the Future Circular Collider reliability and availability studies

    Science.gov (United States)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  15. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  16. A survey on platforms for big data analytics.

    Science.gov (United States)

    Singh, Dilpreet; Reddy, Chandan K

    The primary purpose of this paper is to provide an in-depth analysis of different platforms available for performing big data analytics. This paper surveys different hardware platforms available for big data analytics and assesses the advantages and drawbacks of each of these platforms based on various metrics such as scalability, data I/O rate, fault tolerance, real-time processing, data size supported and iterative task support. In addition to the hardware, a detailed description of the software frameworks used within each of these platforms is also discussed along with their strengths and drawbacks. Some of the critical characteristics described here can potentially aid the readers in making an informed decision about the right choice of platform depending on their computational needs. Using a star ratings table, a rigorous qualitative comparison between different platforms is also discussed for each of the six characteristics that are critical for the algorithms of big data analytics. In order to provide more insights into the effectiveness of each platform in the context of big data analytics, specific implementation-level details of the widely used k-means clustering algorithm on various platforms are also described in the form of pseudocode.
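
    For concreteness, here is a single-node reference version of the k-means loop that the survey presents in pseudocode form; distributed platforms differ mainly in how they parallelize the assignment and update steps.

```python
# Single-node reference implementation of the standard k-means loop.
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: nearest center for every point.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: recompute each center as the mean of its cluster.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break  # converged
        centers = new
    return centers, labels

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centers, labels = kmeans(X, k=2)
print(centers.round(2))
```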

  17. A Generic analytical solution for modelling pumping tests in wells intersecting fractures

    Science.gov (United States)

    Dewandel, Benoît; Lanini, Sandra; Lachassagne, Patrick; Maréchal, Jean-Christophe

    2018-04-01

    The behaviour of transient flow due to pumping in fractured rocks has been studied for at least the past 80 years. Analytical solutions were proposed for solving the issue of a well intersecting and pumping from one vertical, horizontal or inclined fracture in homogeneous aquifers, but their domain of application (even if covering various fracture geometries) was restricted to isotropic or anisotropic aquifers whose potential boundaries had to be parallel or orthogonal to the fracture direction. The issue thus remains unsolved for many field cases: for example, a well intersecting and pumping from a fracture in a multilayer or dual-porosity aquifer; intersected fractures that are not parallel or orthogonal to the aquifer boundaries; several fractures with various orientations intersecting the well; or pumping not only from the fractures but also from the aquifer through the screened interval of the well. Using a mathematical demonstration, we show that integrating the well-known Theis analytical solution (Theis, 1935) along the fracture axis is identical to the equally well-known analytical solution of Gringarten et al. (1974) for a uniform-flux fracture fully penetrating a homogeneous aquifer. This result implies that any existing line- or point-source solution can be used for implementing one or more discrete fractures that are intersected by the well. Several theoretical examples are presented and discussed: a single vertical fracture in a dual-porosity aquifer or in a multi-layer system (with a partially intersecting fracture); and one and two inclined fractures in a leaky-aquifer system with pumping either only from the fracture(s), or also from the aquifer between the fracture(s) in the screened interval of the well. For the cases with several pumping sources, analytical solutions for the flowrate contribution of each individual source (fractures and well) are presented, together with the drawdown behaviour according to the length of the pumped screened interval of the well.
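
    For reference, the key identity can be sketched in standard notation (drawdown s, pumping rate Q, transmissivity T, storativity S); this is a sketch of the well-known formulas, not a reproduction of the paper's derivation. The Theis solution and its integration along a uniform-flux fracture of half-length x_f read:

        s(r,t) = \frac{Q}{4\pi T}\, W(u), \qquad
        u = \frac{r^{2} S}{4 T t}, \qquad
        W(u) = \int_{u}^{\infty} \frac{e^{-x}}{x}\,\mathrm{d}x,

        s_{\mathrm{frac}}(x,y,t) = \frac{Q}{4\pi T\,(2 x_f)}
        \int_{-x_f}^{x_f} W\!\left(\frac{\left[(x-x')^{2}+y^{2}\right] S}{4 T t}\right)\mathrm{d}x',

    the second expression being the integral that the authors show to coincide with the Gringarten et al. (1974) uniform-flux solution.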

  18. Public perception of analytical risk assessments

    International Nuclear Information System (INIS)

    Waite, D.A.; McCormack, W.D.

    1990-01-01

    Most analytical assessments of potential impacts on the environment from US Department of Energy (DOE) activities receive, at some point in their development, public scrutiny. The objective of this paper is to discuss the apparent perception of these assessments held by the public reviewers, based on the written and verbal comments that they have offered. The discussion begins with a short overview of the analytical assessment process most often used on DOE projects. The process is described in terms of the basic process elements and data sources involved. Based on this outline of the assessment process, the key elements from the public's perspective are identified and examined on the basis of Importance Criteria and the Perception Framework in which the Importance Criteria appear to be applied. The paper concludes with an analysis of the key elements of the public's perception; this section couples observational evidence of public perception difficulties with key assessment elements, and these difficulties with potential alternative approaches that serve the same purpose but are more acceptable to the public.

  19. An analytical model for studying noise effects in PWR type reactors

    International Nuclear Information System (INIS)

    Meyer, K.

    1975-10-01

    An analytical model based on the one-group diffusion method is described. It has been used for calculating the axial dependence of the spectral density of the ionization chamber noise supposing a site-independent stationary neutron flux distribution. Coolant inlet temperature fluctuations are considered as noise sources. (author)
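
    In standard one-group notation (a generic sketch; the abstract does not reproduce the model equations), such noise models linearize the one-group diffusion equation around the stationary flux and treat the inlet-temperature-driven absorption fluctuation as the noise source:

        \frac{1}{v}\frac{\partial\,\delta\phi}{\partial t}
        = \nabla\cdot D\,\nabla\,\delta\phi
        + \left(\nu\Sigma_f - \Sigma_a\right)\delta\phi
        - \delta\Sigma_a(z,t)\,\phi_0(z),

    from which the spectral density of the ionization chamber signal follows by Fourier transformation of the flux fluctuation.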

  20. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    Science.gov (United States)

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  1. STEPS: source term estimation based on plant status phase 0 - the technical specifications of the containment module

    International Nuclear Information System (INIS)

    Vamanu, D.V.

    1998-01-01

    In the framework of Project RODOS (Real-Time On-Line Decision Support System for Nuclear Emergencies in Europe), the European Atomic Energy Community (EAEC) of the Commission of the European Communities has commissioned the development of a unified concept, body of knowledge, models and software package meant to assist the evaluation of the source term of severe nuclear accidents in light water reactors of the types prevailing on the Continent. Code-named STEPS, for 'Source Term Estimation based on Plant Status', the project has evolved as Contract RODOS D (FI4P-CT96-0048) between EAEC and a consortium of expert European centres, including Commissariat a l'Energie Atomique, Institut de Protection et de Surete Nucleaire (CEA-DPI-SEAC) as Coordinator, and Forschungszentrum Karlsruhe GmbH (FZK-INR), the Finnish Centre for Radiation and Nuclear Safety (STUK-NSD), the Technical Research Centre of Finland, Energy, Nuclear Energy (VTT-ET-NE), and Eidgenossische Technische Hochschule - ETH Zurich, Centre of Excellence (ETH-CERS) as Contractors. For Phase 0 of the project, an IFIN-HH expert was assigned by ETH-CERS to develop the Technical Specifications of the delivery component of the intended STEPS package, the CONTAINMENT Module. Sponsored by ETH-CERS headquarters in Zurich, the work was done on the premises and with the logistic support of CEA-DPI-SEAC at Fontenay-aux-Roses, with the feedback processing and computer code development subsequently performed in Bucharest. The Technical Specifications of the STEPS CONTAINMENT Module were guided by specific terms of reference, including: (i) the capability of the software to function as a source term interface between targeted nuclear power plants and the RODOS System; and (ii) the comparable capability of the system to be operated as a stand-alone assessment and decision support tool for a comprehensive variety of plants and nuclear emergency classes. On the technical side, the specifications had to focus on the possible

  2. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics) … How well these two components are orchestrated will determine the level of success an organization has in …

  3. Analytical workflow for rapid screening and purification of bioactives from venom proteomes

    NARCIS (Netherlands)

    Otvos, R.A.; Heus, F.A.M.; Vonk, F.J.; Halff, J.; Bruynzeel, B.; Paliukhovich, I.; Smit, A.B.; Niessen, W.M.A.; Kool, J.

    2013-01-01

    Animal venoms are important sources for finding new pharmaceutical lead molecules. We used an analytical platform for initial rapid screening and identification of bioactive compounds from these venoms, followed by fast and straightforward LC-MS-only guided purification to obtain bioactives for

  4. Laser-induced plasmas as an analytical source for quantitative analysis of gaseous and aerosol systems: Fundamentals of plasma-particle interactions

    Science.gov (United States)

    Diwakar, Prasoon K.

    2009-11-01

    Laser-induced Breakdown Spectroscopy (LIBS) is a relatively new analytical diagnostic technique which has gained serious attention in the recent past due to its simplicity, robustness, portability, and multi-element analysis capabilities. LIBS has been used successfully for the analysis of elements in different media including solids, liquids and gases. From 1963, when the first breakdown study was reported, to 1983, when the first LIBS experiments were reported, the technique has come a long way, but the majority of the fundamental understanding of the processes that occur has been gained in the last few years, which has propelled LIBS in the direction of becoming a well-established analytical technique. This study, which mostly focuses on LIBS involving aerosols, has been able to unravel some of the mysteries and provide knowledge that will be valuable to the LIBS community as a whole. LIBS processes can be broken down into three basic steps, namely plasma formation, analyte introduction, and plasma-analyte interactions. In this study, these three steps have been investigated in laser-induced plasma, focusing mainly on the plasma-particle interactions. Understanding plasma-particle interactions and the fundamental processes involved is important in advancing laser-induced breakdown spectroscopy as a reliable and accurate analytical technique. Critical understanding of plasma-particle interactions includes study of the plasma evolution, analyte atomization, and particle dissociation and diffusion. In this dissertation, temporal and spatial studies have been performed to understand the fundamentals of the LIBS processes, including the breakdown of gases by the laser pulse, plasma inception mechanisms, plasma evolution, analyte introduction, and plasma-particle interactions and their influence on the LIBS signal. Spectral measurements were performed in a laser-induced plasma and the results reveal localized perturbations in the plasma properties in the vicinity of the analyte species, for
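
    A standard diagnostic in such studies (a common LIBS technique, not one attributed to this dissertation) is the Boltzmann plot, which extracts the plasma excitation temperature from relative emission line intensities; a minimal sketch with made-up line data follows.

        # Boltzmann-plot sketch: plasma temperature from emission line intensities.
        # ln(I*lam/(g*A)) is linear in the upper-level energy E with slope -1/(k_B*T).
        import numpy as np

        K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

        # Hypothetical line data: intensity I, wavelength lam (nm), transition
        # probability A (1/s), upper-level degeneracy g, upper-level energy E (eV).
        I   = np.array([1200.0, 800.0, 300.0, 90.0])
        lam = np.array([404.4, 415.8, 420.1, 430.0])
        A   = np.array([3.0e6, 1.4e7, 9.0e6, 5.0e6])
        g   = np.array([7, 5, 5, 3])
        E   = np.array([3.05, 3.30, 3.55, 3.80])

        y = np.log(I * lam / (g * A))
        slope, intercept = np.polyfit(E, y, 1)
        T = -1.0 / (K_B_EV * slope)
        print(f"Excitation temperature ~ {T:.0f} K")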

  5. Guided Text Search Using Adaptive Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Symons, Christopher T [ORNL; Senter, James K [ORNL; DeNap, Frank A [ORNL

    2012-10-01

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques facilitating individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analysts' information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
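
    The re-ranking idea can be sketched in a few lines of Python; the following is an illustrative stand-in (scikit-learn's LabelSpreading over tf-idf vectors, with made-up documents and labels), not the actual Gryffin implementation.

        # Sketch: re-rank unlabeled search records using analyst feedback.
        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.semi_supervised import LabelSpreading

        docs = ["pipeline outage report", "bridge inspection notes",
                "unrelated sports article", "power grid incident log"]
        # Analyst feedback: 1 = relevant, 0 = not relevant, -1 = unlabeled.
        labels = np.array([1, -1, 0, -1])

        X = TfidfVectorizer().fit_transform(docs).toarray()
        model = LabelSpreading(kernel="knn", n_neighbors=2).fit(X, labels)

        # Probability of relevance drives the new ranking of the records.
        relevance = model.predict_proba(X)[:, list(model.classes_).index(1)]
        order = np.argsort(-relevance)
        print([docs[i] for i in order])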

  6. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  7. Analytical modeling of worldwide medical radiation use

    International Nuclear Information System (INIS)

    Mettler, F.A. Jr.; Davis, M.; Kelsey, C.A.; Rosenberg, R.; Williams, A.

    1987-01-01

    An analytical model was developed to estimate the availability and frequency of medical radiation use on a worldwide basis. This model includes medical and dental x-ray, nuclear medicine, and radiation therapy. The development of an analytical model is necessary as the first step in estimating the radiation dose to the world's population from this source. Since there is no data about the frequency of medical radiation use in more than half the countries in the world and only fragmentary data in an additional one-fourth of the world's countries, such a model can be used to predict the uses of medical radiation in these countries. The model indicates that there are approximately 400,000 medical x-ray machines worldwide and that approximately 1.2 billion diagnostic medical x-ray examinations are performed annually. Dental x-ray examinations are estimated at 315 million annually and approximately 22 million in-vivo diagnostic nuclear medicine examinations. Approximately 4 million radiation therapy procedures or courses of treatment are undertaken annually

  8. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use (dissolved pressurization gases); detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  9. Livermore Accelerator Source for Radionuclide Science (LASRS)

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bleuel, Darren [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johnson, Micah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rusnak, Brian [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Soltz, Ron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Tonchev, Anton [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-05-05

    The Livermore Accelerator Source for Radionuclide Science (LASRS) will generate intense photon and neutron beams to address important gaps in the study of radionuclide science that directly impact Stockpile Stewardship, Nuclear Forensics, and Nuclear Material Detection. The co-location of MeV-scale neutral and photon sources with radiochemical analytics provides a unique facility to meet current and future challenges in nuclear security and nuclear science.

  10. Programming system for analytic geometry

    International Nuclear Information System (INIS)

    Raymond, Jacques

    1970-01-01

    After having outlined the characteristics of computing centres which do not suit engineering tasks, notably the time required for all the different tasks to be performed when developing software (assembly, compilation, link edition, loading, run), and having identified constraints specific to engineering, the author identifies the characteristics a programming system should have to suit engineering tasks. He discusses existing conversational systems, their programming languages, and their main drawbacks. He then presents a system which aims at facilitating programming and addressing problems of analytic geometry and trigonometry.

  11. Mantle cloaks for elliptical cylinders excited by an electric line source

    DEFF Research Database (Denmark)

    Kaminski, Piotr Marek; Yakovlev, Alexander B.; Arslanagic, Samel

    2016-01-01

    We investigate the ability of surface impedance mantle cloaks to cloak elliptical cylinders excited by an electric line source. The exact analytical solution of the problem utilizing Mathieu functions is obtained and is used to derive optimal surface impedances to cloak a number of configurations.

  12. Networking in the Desert - Operational and Analytical Challenges for MINUSMA in Mali

    DEFF Research Database (Denmark)

    Haugegaard, Rikke

    This paper initiates a discussion of the operational and analytical challenges in understanding network dynamics in Mali and of how these dynamics can be seen as one source of conflict. The paper is based on a field visit to Mali in 2014.

  13. Performance Marketing with Google Analytics Strategies and Techniques for Maximizing Online ROI

    CERN Document Server

    Tonkin, Sebastian

    2010-01-01

    An unparalleled author trio shares valuable advice for using Google Analytics to achieve your business goals. Google Analytics is a free tool used by millions of Web site owners across the globe to track how visitors interact with their Web sites, where they arrive from, and which visitors drive the most revenue and sales leads. This book offers clear explanations of practical applications drawn from the real world. The author trio of Google Analytics veterans starts with a broad explanation of performance marketing and gets progressively more specific, closing with step-by-step analysis and a

  14. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    Science.gov (United States)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    imagery to existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimum amount of external dependencies (i.e. CURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
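
    As a minimal illustration of the GDAL side of such a workflow (the GDAL Python bindings, not the authors' SciDB driver; the file name is hypothetical), a tile can be read into a NumPy array prior to ingestion:

        # Sketch: read one earth-observation tile with the GDAL Python bindings.
        from osgeo import gdal

        ds = gdal.Open("tile_2016_04.tif")   # hypothetical GeoTIFF tile
        band = ds.GetRasterBand(1)
        arr = band.ReadAsArray()             # NumPy array of pixel values
        print(ds.RasterXSize, ds.RasterYSize, arr.mean())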

  15. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    Science.gov (United States)

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations, there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin-layer as well as high-pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, as well as processing, are pointed out and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  16. User-Centered Evaluation of Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jean C.

    2017-10-01

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI and interested in visual analytics will find this information useful as well as a discussion on changes that need to be made to current HCI practices to make them more suitable to

  17. Source inversion in the full-wave tomography; Full wave tomography ni okeru source inversion

    Energy Technology Data Exchange (ETDEWEB)

    Tsuchiya, T [DIA Consultants Co. Ltd., Tokyo (Japan)

    1997-10-22

    In order to consider the effects of vibration source characteristics in full-wave tomography (FWT), a study has been performed on a method to invert vibration source parameters together with the Vp/Vs distribution. The study has expanded an analysis method based on the gradient method of Tarantola and the subspace method of Sambridge, and numerical experiments have been conducted. Experiment No. 1 performed inversion of only the vibration source parameters, and experiment No. 2 executed simultaneous inversion of the Vp/Vs distribution and the vibration source parameters. The discussions revealed that an effective analytical procedure would be as follows: in order to predict maximum stress, the average vibration source parameters and the property parameters are first inverted simultaneously; in order to estimate each vibration source parameter with high accuracy, the property parameters are fixed and each vibration source parameter is inverted individually; and finally the derived vibration source parameters are fixed and the property parameters are again inverted from the initial values. 5 figs., 2 tabs.
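
    A schematic Python illustration of gradient-based parameter inversion follows (the actual FWT implementation is far more involved; the forward model, data and step size here are invented for illustration): minimize the misfit between observed and modelled data over the source parameters by gradient descent with a numerical gradient.

        # Schematic gradient-descent inversion of source parameters.
        import numpy as np

        def forward(params):
            # Placeholder forward model: maps source parameters to synthetic data.
            a, b = params
            t = np.linspace(0.0, 1.0, 50)
            return a * np.sin(2 * np.pi * t) + b * t

        rng = np.random.default_rng(0)
        observed = forward(np.array([1.3, 0.4])) + 0.01 * rng.normal(size=50)

        def misfit(params):
            r = forward(params) - observed
            return 0.5 * float(r @ r)

        params, step, eps = np.array([1.0, 0.0]), 0.05, 1e-6
        for _ in range(200):
            # Central-difference gradient of the misfit.
            grad = np.array([(misfit(params + eps * e) - misfit(params - eps * e)) / (2 * eps)
                             for e in np.eye(2)])
            params -= step * grad
        print(params)   # should approach [1.3, 0.4]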

  18. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in fission reactor systems, and a tool enabling analysis of its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Due to these features, BOTANIC has the capability to analyze a wide range of tritium-level systems with higher accuracy, as it can solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will be focused on total system verification.
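
    For orientation, the diffusion-limited permeation flux through a metallic wall, which a distributed permeation model resolves locally over the surface, is commonly written in the Richardson form (a sketch in standard notation, not taken from the paper):

        J = \frac{\Phi}{d}\left(\sqrt{p_{\mathrm{up}}} - \sqrt{p_{\mathrm{down}}}\right),

    where \Phi is the permeability of the wall material, d its thickness, and p_up and p_down the tritium partial pressures on the two sides.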

  19. Analytical characterization of radiation fields generated by certain witch-type distributed axi-symmetrical ion beams

    International Nuclear Information System (INIS)

    Timus, D.M.; Kalla, S.L.; Abbas, M.I.

    2005-01-01

    Increasing interest is being shown in obtaining accurate predictions of the radiation fields produced by ion beams impinging on homogeneous plane targets and giving rise to exothermic nuclear reactions. Previous theoretical studies by the authors have focused on radiation fields generated by homogeneous plane disk- or ring-shaped sources, based on a unified treatment of the radiation field distribution developed by Hubbell and co-workers. In the case of an equivalent homogeneous source emitting anisotropically in non-dispersive media, the Legendre polynomial series expansion method for the specific emissivity function can be successfully applied when the conditions for convergence of the approximating series are satisfied. We have developed an analytical expression for the radiation field distribution around a homogeneous disk-shaped target bombarded by Witch-type distributed (in the transverse plane) ion beams whose elementary areas emit anisotropically following a cos-type law in non-dispersive media. The results of this investigation can be extended to various experimental situations in which the assumption of an angularly omni-directional as well as constant space distribution of nuclear reaction emissivity over the accelerator target surface, or over other kinds of axi-symmetric plane radiation sources, is no longer valid. Animated 3D graphics visualization is suggested.
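
    The Legendre expansion referred to above has the generic form (a sketch in standard notation): the angular emissivity is represented as

        \varepsilon(\theta) = \sum_{n=0}^{\infty} a_n\, P_n(\cos\theta), \qquad
        a_n = \frac{2n+1}{2}\int_{0}^{\pi} \varepsilon(\theta)\, P_n(\cos\theta)\,\sin\theta\,\mathrm{d}\theta,

    with the convergence of the truncated series being the applicability condition mentioned above.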

  20. An Analytical Method for the Abel Inversion of Asymmetrical Gaussian Profiles

    International Nuclear Information System (INIS)

    Xu Guosheng; Wan Baonian

    2007-01-01

    An analytical algorithm for fast calculation of the Abel inversion for density profile measurements in tokamaks is developed. Based on the assumptions that the particle source is negligibly small in the plasma core region, that density profiles can be approximated by an asymmetrical Gaussian distribution controlled by only one parameter V0/D, and that V0/D is constant along the radial direction, the analytical algorithm is presented and examined against a test profile. Its validity is confirmed by benchmarking against the standard Abel inversion method and the theoretical profile. The scope of application as well as the error analysis is also discussed in detail.
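
    For reference, the Abel transform pair underlying such algorithms, in standard notation (line-integrated measurement F(y), local profile f(r), plasma radius R; a generic sketch, not the paper's closed-form result):

        F(y) = 2\int_{y}^{R} \frac{f(r)\, r}{\sqrt{r^{2}-y^{2}}}\,\mathrm{d}r, \qquad
        f(r) = -\frac{1}{\pi}\int_{r}^{R} \frac{\mathrm{d}F}{\mathrm{d}y}\,\frac{\mathrm{d}y}{\sqrt{y^{2}-r^{2}}},

    the asymmetrical-Gaussian assumption being what allows the second integral to be evaluated analytically rather than numerically.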

  1. Halal authenticity of gelatin using species-specific PCR.

    Science.gov (United States)

    Shabani, Hessam; Mehdizadeh, Mehrangiz; Mousavi, Seyed Mohammad; Dezfouli, Ehsan Ansari; Solgi, Tara; Khodaverdi, Mahdi; Rabiei, Maryam; Rastegar, Hossein; Alebouyeh, Mahmoud

    2015-10-01

    Consumption of food products derived from porcine sources is strictly prohibited in Islam. Gelatin, mostly derived from bovine and porcine sources, has many applications in the food and pharmaceutical industries. To ensure that food products comply with halal regulations, development of valid and reliable analytical methods is very much required. In this study, a species-specific polymerase chain reaction (PCR) assay using conserved regions of mitochondrial DNA (cytochrome b gene) was performed to evaluate the halal authenticity of gelatin. After isolation of DNA from gelatin powders with known origin, conventional PCR using species-specific primers was carried out on the extracted DNA. The amplified expected PCR products of 212 and 271 bp were observed for porcine and bovine gelatin, respectively. The sensitivity of the method was tested on binary gelatin mixtures containing 0.1%, 1%, 10%, and 100% (w/w) of porcine gelatin within bovine gelatin and vice versa. Although most of the DNA is degraded due to the severe processing steps of gelatin production, the minimum level of 0.1% w/w of both porcine and bovine gelatin was detected. Moreover, eight food products labeled as containing bovine gelatin and eight capsule shells were subjected to PCR examination. The results showed that all samples contained bovine gelatin, and the absence of porcine gelatin was verified. This method of species authenticity is very useful to verify whether gelatin and gelatin-containing food products are derived from halal ingredients. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Strand Invasion Based Amplification (SIBA®): a novel isothermal DNA amplification technology demonstrating high specificity and sensitivity for a single molecule of target analyte.

    Science.gov (United States)

    Hoser, Mark J; Mansukoski, Hannu K; Morrical, Scott W; Eboigbodin, Kevin E

    2014-01-01

    Isothermal nucleic acid amplification technologies offer significant advantages over polymerase chain reaction (PCR) in that they do not require thermal cycling or sophisticated laboratory equipment. However, non-target-dependent amplification has limited the sensitivity of isothermal technologies and complex probes are usually required to distinguish between non-specific and target-dependent amplification. Here, we report a novel isothermal nucleic acid amplification technology, Strand Invasion Based Amplification (SIBA). SIBA technology is resistant to non-specific amplification, is able to detect a single molecule of target analyte, and does not require target-specific probes. The technology relies on the recombinase-dependent insertion of an invasion oligonucleotide (IO) into the double-stranded target nucleic acid. The duplex regions peripheral to the IO insertion site dissociate, thereby enabling target-specific primers to bind. A polymerase then extends the primers onto the target nucleic acid leading to exponential amplification of the target. The primers are not substrates for the recombinase and are, therefore unable to extend the target template in the absence of the IO. The inclusion of 2'-O-methyl RNA to the IO ensures that it is not extendible and that it does not take part in the extension of the target template. These characteristics ensure that the technology is resistant to non-specific amplification since primer dimers or mis-priming are unable to exponentially amplify. Consequently, SIBA is highly specific and able to distinguish closely-related species with single molecule sensitivity in the absence of complex probes or sophisticated laboratory equipment. Here, we describe this technology in detail and demonstrate its use for the detection of Salmonella.

  3. Strand Invasion Based Amplification (SIBA®): a novel isothermal DNA amplification technology demonstrating high specificity and sensitivity for a single molecule of target analyte.

    Directory of Open Access Journals (Sweden)

    Mark J Hoser

    Full Text Available Isothermal nucleic acid amplification technologies offer significant advantages over polymerase chain reaction (PCR) in that they do not require thermal cycling or sophisticated laboratory equipment. However, non-target-dependent amplification has limited the sensitivity of isothermal technologies and complex probes are usually required to distinguish between non-specific and target-dependent amplification. Here, we report a novel isothermal nucleic acid amplification technology, Strand Invasion Based Amplification (SIBA). SIBA technology is resistant to non-specific amplification, is able to detect a single molecule of target analyte, and does not require target-specific probes. The technology relies on the recombinase-dependent insertion of an invasion oligonucleotide (IO) into the double-stranded target nucleic acid. The duplex regions peripheral to the IO insertion site dissociate, thereby enabling target-specific primers to bind. A polymerase then extends the primers onto the target nucleic acid leading to exponential amplification of the target. The primers are not substrates for the recombinase and are, therefore, unable to extend the target template in the absence of the IO. The inclusion of 2'-O-methyl RNA to the IO ensures that it is not extendible and that it does not take part in the extension of the target template. These characteristics ensure that the technology is resistant to non-specific amplification since primer dimers or mis-priming are unable to exponentially amplify. Consequently, SIBA is highly specific and able to distinguish closely-related species with single molecule sensitivity in the absence of complex probes or sophisticated laboratory equipment. Here, we describe this technology in detail and demonstrate its use for the detection of Salmonella.

  4. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of the REQUASUD and IPH PT schemes, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post from which to highlight analytical problems that would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. The study draws the attention of laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.
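
    A hedged sketch of the standard PT performance statistic follows (the z-score convention of ISO 13528; the paper does not reproduce it, and the assigned value, standard deviation and thresholds below are the usual convention with invented numbers, not REQUASUD/IPH data).

        # Sketch: z-scores for proficiency-testing results (ISO 13528 convention).
        import numpy as np

        assigned = 5.2       # assigned value, e.g. log10 CFU/g from the PT provider
        sigma_pt = 0.25      # standard deviation for proficiency assessment
        reported = np.array([5.1, 5.5, 4.3, 6.1])   # hypothetical lab results

        z = (reported - assigned) / sigma_pt
        for lab, score in enumerate(z, start=1):
            verdict = ("satisfactory" if abs(score) <= 2
                       else "questionable" if abs(score) <= 3
                       else "unsatisfactory")
            print(f"lab {lab}: z = {score:+.1f} ({verdict})")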

  5. Role of thermo-analytical techniques in compositional characterization of nuclear materials

    International Nuclear Information System (INIS)

    Raje, Naina

    2015-01-01

    The study of heat effects on different materials has a long history. The extraction of metals from ores, pottery production, glass making, etc. are examples where the performance of the products obtained from raw materials depends on the processing temperatures. Concrete, pottery, bricks, etc. are severely damaged by uncontrolled high temperatures. Therefore, heating raw materials in a controlled manner is of pivotal importance for obtaining products of the desired quality. Thermo-analytical techniques provide information on the effect of heat under controlled heating conditions: physical properties of materials are measured as a function of temperature. Simultaneous thermo-analytical techniques, which measure two or more signals on the same sample at the same time in the same instrument, are advantageous compared to any single thermo-analytical technique and are nowadays extensively used for the analysis of materials. Ammonium diuranate (ADU) and magnesium diuranate (MDU), also known as yellowcake, are intermediate precursors in the fuel fabrication process, with stringent specifications and a need to understand their thermal behavior. In the processing of low-grade ores, higher levels of impurities are encountered in the leach solution, which affect the properties of ADU/MDU. In order to meet the fuel specifications, quality assurance of these nuclear materials is essential. The current studies describe the application of simultaneous thermogravimetry (TG) - differential thermal analysis (DTA) - evolved gas analysis (EGA) techniques for the compositional characterization of ADU/MDU with respect to the impurities present in the matrices.

  6. Analytic evidence for the Gubser-Mitra conjecture

    International Nuclear Information System (INIS)

    Miyamoto, Umpei

    2008-01-01

    A simple master equation for the static perturbation of charged black strings is derived employing the gauge proposed by Kol. The sign changes of the potential in the master equation and of the specific heat of the background exactly coincide. That is, for black strings with positive specific heat, the potential becomes positive definite, forbidding the bound state and implying the onset of the Gregory-Laflamme instability. It can safely be said that this is the first analytic and explicit evidence for the Gubser-Mitra conjecture, correlating the classical and thermodynamic instabilities of black branes. Possible generalizations of the analysis are also discussed.

  7. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  8. Adolescents show sex-specific preferences on media when pornography is a major source of sexual knowledge

    DEFF Research Database (Denmark)

    Rasmussen, Anna Lund; Svarrer, Rebekka; Lauszus, Finn Friis

    2017-01-01

    photographs; thus, these magazines constituted a major source for adolescent girls. Girls knew the gestational age limit of legal abortion in Denmark and had their knowledge from non-explicit magazines, while this was not the case for boys (p=0.004). Pupils who stated their knowledge on sex came from these magazines knew … the first sign of pregnancy (menostasia), the correct facts of legal abortion, and STI. Conclusions: Pornography in different media is used by the vast majority of adolescents and its use is sex-specific. Knowledge on STI, pregnancy and legal abortion was variably associated with the type of media. … with focus on pornography and what media was used. Pornography was divided according to five media subcategories. Knowledge on sexually transmitted infection (STI), pregnancy and abortion and their associations with pornography were explored. Results: Pornography was reported as the second largest source

  9. Cold moderators for pulsed neutron sources

    International Nuclear Information System (INIS)

    Carpenter, J.M.

    1990-01-01

    This paper reviews cold moderators in pulsed neutron sources and provides details of the performance of different cold moderator materials and configurations. Analytical forms are presented which describe wavelength spectra and emission time distributions. Several types of cooling arrangements used in pulsed source moderators are described. Choices of materials are surveyed. The author examines some of the radiation damage effects in cold moderators, including the phenomenon of ''burping'' in irradiated cold solid methane. 9 refs., 15 figs., 4 tabs
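
    The analytical forms in question typically combine a Maxwellian thermal component with an epithermal slowing-down tail; in wavelength form the Maxwellian flux reads (a standard textbook sketch, not the paper's exact parametrization), with \lambda_T = h/\sqrt{2 m k_B T} the characteristic thermal wavelength:

        \phi_M(\lambda) = \Phi_{\mathrm{th}}\,\frac{2\lambda_T^{4}}{\lambda^{5}}\,
        \exp\!\left(-\frac{\lambda_T^{2}}{\lambda^{2}}\right).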

  10. Factors controlling leaching of polycyclic aromatic hydrocarbons from petroleum source rock using nonionic surfactant

    Energy Technology Data Exchange (ETDEWEB)

    Akinlua, Akinsehinwa [Obafemi Awolowo Univ., Ile-Ife (Nigeria). Fossil Fuels and Environmental Geochemistry Group; Jochmann, Maik A.; Qian, Yuan; Schmidt, Torsten C. [Duisburg-Essen Univ., Essen (Germany). Instrumental Analytical Chemistry; Sulkowski, Martin [Duisburg-Essen Univ., Essen (Germany). Inst. of Environmental Analytical Chemistry

    2012-03-15

    The extraction of polycyclic aromatic hydrocarbons (PAHs) from petroleum source rock by nonionic surfactants with the assistance of microwave irradiation was investigated and the conditions for maximum yield were determined. The results showed that the extraction temperature and the type of surfactant have significant effects on the extraction yields of PAHs. Factors such as surfactant concentration, irradiation power, sample/solvent ratio and the mixing of surfactants (i.e., mixtures of surfactants at specific ratios) also influence the extraction efficiencies for these compounds. The optimum temperature for microwave-assisted nonionic surfactant extraction of PAHs from petroleum source rock was 120 °C and the best suited surfactant was Brij 35. The new method showed extraction efficiencies comparable to those afforded by the Soxhlet extraction method, while the reduction of extraction times and the environmental friendliness of the new nonionic surfactant extraction system are clear advantages. The results also show that microwave-assisted nonionic surfactant extraction is a good and efficient green analytical preparatory technique for the geochemical evaluation of petroleum source rock. (orig.)

  11. Application of ''Confirm tank T is an appropriate feed source for High-Level waste feed batch X'' to specific feed batches

    International Nuclear Information System (INIS)

    JO, J.

    1999-01-01

    This document addresses the characterization needs of tanks as set forth in the Data Quality Objectives for TWRS Privatization Phase I: Confirm Tank T is an Appropriate Feed Source for High-Level Waste Feed Batch X (Crawford et al. 1998). The primary purpose of this document is to collect existing data and identify the data needed to determine whether or not the feed source(s) are appropriate for a specific batch. To answer these questions, the existing tank data must be collected and a detailed review performed. If the existing data are insufficient to complete a full comparison, additional data must be obtained from the feed source(s). Additional information requirements need to be identified and formally documented, then the source tank waste must be sampled or resampled and analyzed. Once the additional data are obtained, the data shall be incorporated into the existing database for the source tank and a reevaluation of the data against the Data Quality Objective (DQO) must be made

  12. Application of ''Confirm tank T is an appropriate feed source for Low-Activity waste feed batch X'' to specific feed batches

    International Nuclear Information System (INIS)

    JO, J.

    1999-01-01

    This document addresses the characterization needs of tanks as set forth in the ''Confirm Tank T is an Appropriate Feed Source for Low-Activity Waste Feed Batch X'' Data Quality Objective (DQO) (Certa and Jo 1998). The primary purpose of this document is to collect existing data and identify the data needed to determine whether or not the feed source(s) are appropriate for a specific batch before transfer is made to the feed staging tanks. To answer these questions, the existing tank data must be collected and a detailed review performed. If the existing data are insufficient to complete a full comparison, additional data must be obtained from the feed source(s). Additional information requirements need to be identified and formally documented, then the source tank waste must be sampled or resampled and analyzed. Once the additional data are obtained, the data shall be incorporated into the existing database for the source tank and a reevaluation of the data against the DQO must be made

  13. Modern methods in analytical acoustics lecture notes

    CERN Document Server

    Crighton, D G; Williams, J E Ffowcs; Heckl, M; Leppington, F G

    1992-01-01

    Modern Methods in Analytical Acoustics considers topics fundamental to the understanding of noise, vibration and fluid mechanics. The series of lectures on which this material is based began some twenty-five years ago and has been developed and expanded ever since. Acknowledged experts in the field have given this course many times in Europe and the USA. Although the scope of the course has widened considerably, the primary aim of teaching analytical techniques of acoustics alongside specific areas of wave motion and unsteady fluid mechanics remains. The distinguished authors of this volume are drawn from Departments of Acoustics, Engineering or Applied Mathematics in Berlin, Cambridge and London. Their intention is to reach a wider audience of all those concerned with acoustic analysis than has been able to attend the course.

  14. Model-based Engineering for the Integration of Manufacturing Systems with Advanced Analytics

    OpenAIRE

    Lechevalier , David; Narayanan , Anantha; Rachuri , Sudarsan; Foufou , Sebti; Lee , Y Tina

    2016-01-01

    Part 3: Interoperability and Systems Integration; International audience; To employ data analytics effectively and efficiently on manufacturing systems, engineers and data scientists need to collaborate closely to bring their domain knowledge together. In this paper, we introduce a domain-specific modeling approach to integrate a manufacturing system model with advanced analytics, in particular neural networks, to model predictions. Our approach combines a set of meta-models and transformations...

  15. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    Full Text Available An important task in the internal audit of expenses is obtaining sufficient and reliable audit evidence. This can be achieved by using analytical procedures in the audit process. Identifying analytical procedures with the financial analysis of business activities prevents their efficient usage in the internal audit of restaurant business expenses, and internal auditors' knowledge of the instructional techniques of analytical procedures, and of their tasks depending on the verification steps, is insufficient. The purpose of the article is to develop methods for the internal audit of restaurant business expenses based on an integrated application of analytical procedures. The nature and purpose of analytical procedures are investigated in the article. The factors influencing the auditor's decision about the choice of the analytical procedures complex are identified: it is recommended to consider among them the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures, depending on the verification steps, are identified. An analytical procedures complex is offered as a part of the internal audit of restaurant business expenses. This complex contains a list of the analytical procedures, the instructional techniques of analysis used in each procedure, and a brief overview of the content of each procedure.

  16. Coupling impedance of an in-vacuum undulator: Measurement, simulation, and analytical estimation

    Science.gov (United States)

    Smaluk, Victor; Fielder, Richard; Blednykh, Alexei; Rehm, Guenther; Bartolini, Riccardo

    2014-07-01

    One of the important issues of in-vacuum undulator design is the coupling impedance of the vacuum chamber, which includes tapered transitions with variable gap size. To get complete and reliable information on the impedance, analytical estimates, numerical simulations and beam-based measurements have been performed at Diamond Light Source, a forthcoming upgrade of which includes introducing additional insertion device (ID) straights. The impedance of an already existing ID vessel geometrically similar to the new one has been measured using the orbit bump method. The measurement results are discussed in this paper in comparison with analytical estimations and numerical simulations.

  17. Concurrent sourcing as a mechanism for safeguarding specific investments from opportunism

    DEFF Research Database (Denmark)

    Mols, Niels Peter

    This paper identifies when concurrent sourcing is an effective safeguard. Concurrent sourcing shortens the period that a buyer needs in order to internalize production and thus shortens the period in which an external supplier is able to hold up a buyer. Concurrent sourcing also allows for short-run expansion of production and reduces the costs of lost customers. However, when complementarities and diseconomies of scale make concurrent sourcing an efficient choice for a buyer, the same complementarities and diseconomies of scale also weaken the threat that the internal production unit may

  18. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996, serving to verify declared activities and to detect undeclared ones. The environmental sampling program has brought a new series of analytical challenges and driven a need for advances in verification technology. Environmental swipe samples are often extremely low in analyte concentration (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method for determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass-separated and analyzed using magnetic sector instruments due to their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material actually detected to material present) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing the ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, increasing the sensitivity of environmental sampling

  19. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    Science.gov (United States)

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
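
    One of the simple adjustments mentioned above, correcting an observed positivity rate for imperfect sensitivity and specificity, is the classical Rogan-Gladen estimator; a minimal sketch with invented numbers follows.

        # Rogan-Gladen correction of an observed positivity rate.
        def true_prevalence(apparent, sensitivity, specificity):
            """Adjust an apparent prevalence for known test error rates."""
            p = (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)
            return min(max(p, 0.0), 1.0)   # clamp to the valid range

        # Example: 30% of upper-respiratory specimens test positive with a
        # test that is 90% sensitive and 80% specific.
        print(true_prevalence(0.30, 0.90, 0.80))   # ~0.143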

  20. Analytical modeling of multi-layered printed circuit board using multi-stacked via clusters as component heat spreader

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2016-01-01

    Full Text Available In order to help the electronic designer to determine early the limits of the power dissipation of an electronic component, an analytical model was established to allow fast insight into the relevant design parameters of a multi-layered electronic board construction. The proposed steady-state approach, based on the Fourier series method, provides a practical solution for quickly investigating the potential gain of multi-layered thermal via clusters. Good agreement has generally been shown between the results obtained with the proposed analytical model and those given by electronics cooling software widely used in industry. Some results highlight the fact that conventional practices for Printed Circuit Board modeling can dramatically underestimate source temperatures, in particular with smaller sources. Moreover, the analytic solution can be applied to optimize the heat spreading in the board structure with a local modification of the effective thermal conductivity of the layers.
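
    In outline, such Fourier-series solutions expand the temperature field of each rectangular layer (side lengths a and b, adiabatic side walls) as follows; this is a generic sketch of the method, not the paper's exact formulation:

        T(x,y,z) = \sum_{m,n}\cos(\lambda_m x)\cos(\delta_n y)
        \left(A_{mn} e^{\gamma_{mn} z} + B_{mn} e^{-\gamma_{mn} z}\right), \qquad
        \lambda_m = \frac{m\pi}{a},\quad \delta_n = \frac{n\pi}{b},\quad
        \gamma_{mn} = \sqrt{\lambda_m^{2}+\delta_n^{2}},

    where the coefficients A_{mn} and B_{mn} are fixed layer by layer from the source map and the interface conditions, which is where the thermal via clusters enter as locally modified conductivities.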

  1. Random source generating far field with elliptical flat-topped beam profile

    International Nuclear Information System (INIS)

    Zhang, Yongtao; Cai, Yangjian

    2014-01-01

    Circular and rectangular multi-Gaussian Schell-model (MGSM) sources which generate far fields with circular and rectangular flat-topped beam profiles were introduced just recently (Sahin and Korotkova 2012 Opt. Lett. 37 2970; Korotkova 2014 Opt. Lett. 39 64). In this paper, a random source named an elliptical MGSM source is introduced. An analytical expression for the propagation factor of an elliptical MGSM beam is derived. Furthermore, an analytical propagation formula for an elliptical MGSM beam passing through a stigmatic ABCD optical system is derived, and its propagation properties in free space are studied. It is interesting to find that an elliptical MGSM source generates a far field with an elliptical flat-topped beam profile, being qualitatively different from that of circular and rectangular MGSM sources. The ellipticity and the flatness of the elliptical flat-topped beam profile in the far field are determined by the initial coherence widths and the beam index, respectively. (paper)
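
    For reference, the degree of coherence of an MGSM source of beam index M has the form below in the cited Sahin-Korotkova papers; writing separate coherence widths delta_x and delta_y along the two axes is the natural elliptical generalization assumed here:

        \mu(x_d, y_d) = \frac{1}{C_0} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{m}
                        \exp\!\left( -\frac{x_d^2}{2 m \delta_x^2}
                                     -\frac{y_d^2}{2 m \delta_y^2} \right),
        \qquad
        C_0 = \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{m}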

  2. Column-Oriented Databases, an Alternative for Analytical Environment

    Directory of Open Access Journals (Sweden)

    Gheorghe MATEI

    2010-12-01

    Full Text Available It is widely accepted that a data warehouse is the central place of a Business Intelligence system. It stores all data that is relevant for the company, acquired from both internal and external sources. Such a repository stores data spanning more years than a transactional system can, and offers valuable information that allows its users to make the best decisions based on accurate and reliable data. As the volume of data stored in an enterprise data warehouse grows larger and larger, new approaches are needed to make the analytical system more efficient. This paper presents column-oriented databases, which are considered an element of the new generation of DBMS technology. The paper emphasizes the need for and the advantages of these databases in an analytical environment and gives a short presentation of two DBMSs built on a columnar approach.
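
    A toy illustration of why the columnar layout favors analytical workloads (plain Python/NumPy standing in for a DBMS, not any specific columnar product):

        import numpy as np

        n = 100_000
        # Row-oriented layout: whole records stored together.
        row_store = [(i, i % 50, float(i % 1000)) for i in range(n)]
        # Column-oriented layout: one contiguous array per attribute.
        col_store = {
            "id":     np.arange(n),
            "age":    np.arange(n) % 50,
            "amount": (np.arange(n) % 1000).astype(np.float64),
        }

        # A typical analytical query aggregates few attributes over many rows.
        total_row = sum(r[2] for r in row_store)   # must walk every full record
        total_col = col_store["amount"].sum()      # scans one packed column only
        assert total_row == total_col              # same answer, less data touched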

  3. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    Science.gov (United States)

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
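
    As a hedged sketch of the kind of predictive model the record describes, the toy below fits a logistic regression to synthetic data; every feature, coefficient and outcome here is invented for illustration:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5_000
        # Hypothetical predictors: age, number of medications, prior missed refills.
        X = np.column_stack([
            rng.normal(65, 12, n),
            rng.poisson(4, n),
            rng.poisson(1, n),
        ])
        # Synthetic outcome: noncompliance risk grows with polypharmacy and misses.
        logit = -4.0 + 0.35 * X[:, 1] + 0.8 * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
        # Ranking patients by predicted risk lets a pharmacist target interventions.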

  4. Analytic scattering kernels for neutron thermalization studies

    International Nuclear Information System (INIS)

    Sears, V.F.

    1990-01-01

    Current plans call for the inclusion of a liquid hydrogen or deuterium cold source in the NRU replacement vessel. This report is part of an ongoing study of neutron thermalization in such a cold source. Here, we develop a simple analytical model for the scattering kernel of monatomic and diatomic liquids. We also present the results of extensive numerical calculations based on this model for liquid hydrogen, liquid deuterium, and mixtures of the two. These calculations demonstrate the dependence of the scattering kernel on the incident and scattered-neutron energies, the behavior near rotational thresholds, the dependence on the centre-of-mass pair correlations, the dependence on the ortho concentration, and the dependence on the deuterium concentration in H₂/D₂ mixtures. The total scattering cross sections are also calculated and compared with available experimental results.

  5. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation of the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. The concern about precision and accuracy in such analyses, which have socio-economic bearing, has emphasized the use of Certified Reference Materials and of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash, which poses problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic and lead may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources, using a combination of nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICP-AES, cold vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb and Hg. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  6. Substrate specificity of glucose dehydrogenase and carbon source utilization pattern of Pantoea dispersa strain P2 and its radiation induced mutants

    International Nuclear Information System (INIS)

    Lee, Young Keun; Murugesan, Senthilkumar

    2009-01-01

    Mineral phosphate-solubilizing Pantoea dispersa strain P2 produced 5.5 mM and 42.6 mM of gluconic acid after 24 h and 72 h of incubation, respectively. Strain P2 exhibited a glucose dehydrogenase (GDH) specific activity of 0.32 IU mg⁻¹ protein. We studied the substrate specificity of GDH as well as the carbon source utilization pattern of strain P2. GDH of strain P2 did not use ribose as a substrate. Utilization of lactose with a specific activity of 0.65 IU mg⁻¹ protein indicated that the enzyme belongs to the GDH type B isozyme. Arabinose, galactose, ribose, sucrose and xylose did not induce the synthesis of the GDH enzyme, while mannose induced the synthesis of GDH with the highest specific activity of 0.58 IU mg⁻¹ protein. Through radiation mutagenesis, the substrate specificity of GDH was modified in order to utilize a wider range of sugars available in root exudates. Ribose, originally not a substrate for GDH of strain P2, was utilized as a substrate by mutants P2-M5 and P2-M6 with specific activities of 0.44 and 0.57 IU mg⁻¹ protein, respectively. The specific activity of GDH on media containing lactose and galactose was also improved, to 1.2 and 0.52 IU mg⁻¹ protein in P2-M5 and P2-M6, respectively. Based on the carbon source availability in root exudates, the mutants can be selected and utilized as efficient biofertilizers under P-deficient soil conditions.

  7. Gravitational Radiation from Post-Newtonian Sources and Inspiralling Compact Binaries

    Directory of Open Access Journals (Sweden)

    Blanchet Luc

    2006-06-01

    Full Text Available The article reviews the current status of a theoretical approach to the problem of the emission of gravitational waves by isolated systems in the context of general relativity. Part A of the article deals with general post-Newtonian sources. The exterior field of the source is investigated by means of a combination of analytic post-Minkowskian and multipolar approximations. The physical observables in the far zone of the source are described by a specific set of radiative multipole moments. By matching the exterior solution to the metric of the post-Newtonian source in the near zone, we obtain the explicit expressions of the source multipole moments. The relationships between the radiative and source moments involve many non-linear multipole interactions, among them those associated with the tails (and tails-of-tails) of gravitational waves. Part B of the article is devoted to the application to compact binary systems. We present the equations of binary motion, and the associated Lagrangian and Hamiltonian, at the third post-Newtonian (3PN) order beyond the Newtonian acceleration. The gravitational-wave energy flux, taking consistently into account the relativistic corrections in the binary moments as well as the various tail effects, is derived through 3.5PN order with respect to the quadrupole formalism. The binary's orbital phase, whose prior knowledge is crucial for searching and analyzing the signals from inspiralling compact binaries, is deduced from an energy balance argument.

  8. Analytical Study of 90Sr Betavoltaic Nuclear Battery Performance Based on p-n Junction Silicon

    International Nuclear Information System (INIS)

    Rahastama, Swastya; Waris, Abdul

    2016-01-01

    Previously, an analytical calculation of a ⁶³Ni p-n junction betavoltaic battery has been published. As a basic approach, we reproduced the analytical simulation of the ⁶³Ni betavoltaic battery and then compared it to previous results using the same battery design. Furthermore, we calculated its maximum power output and radiation-electricity conversion efficiency using a semiconductor analysis method. Then, the same method was applied to calculate and analyse the performance of a ⁹⁰Sr betavoltaic battery. The aim of this project is to compare the analytical performance results of the ⁹⁰Sr betavoltaic battery to those of the ⁶³Ni betavoltaic battery, and to assess the influence of source activity on performance. Since it has a higher power density, the ⁹⁰Sr betavoltaic battery yields more power than the ⁶³Ni betavoltaic battery, but less radiation-electricity conversion efficiency. However, beta particles emitted from the ⁹⁰Sr source can travel further inside the silicon, corresponding to the stopping range of the beta particles; thus the ⁹⁰Sr betavoltaic battery can be designed thicker than the ⁶³Ni betavoltaic battery to achieve a higher conversion efficiency. (paper)

  9. A family of analytical solutions of a nonlinear diffusion-convection equation

    Science.gov (United States)

    Hayek, Mohamed

    2018-01-01

    Despite its popularity in many engineering fields, the nonlinear diffusion-convection equation has no general analytical solution. This work presents a family of closed-form analytical traveling wave solutions for the nonlinear diffusion-convection equation with power law nonlinearities. This kind of equation typically appears in nonlinear problems of flow and transport in porous media. The solutions that are addressed are simple and fully analytical. Three classes of analytical solutions are presented, depending on the type of the nonlinear diffusion coefficient (increasing, decreasing or constant). It is shown that the structure of the traveling wave solution is strongly related to the diffusion term. The main advantage of the proposed solutions is that they are presented in a unified form, in contrast to existing solutions in the literature, where the derivation of each solution depends on the specific values of the diffusion and convection parameters. The proposed closed-form solutions are simple to use, do not require any numerical implementation, and may be implemented in a simple spreadsheet. The analytical expressions are also useful for mathematically analyzing the structure and properties of the solutions.
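
    For orientation, such power-law diffusion-convection equations are commonly written in the generic form below; the symbols are the conventional ones, not notation taken from the paper:

        \frac{\partial \theta}{\partial t}
          = \frac{\partial}{\partial x}\!\left( D(\theta)\, \frac{\partial \theta}{\partial x} \right)
          - \frac{\partial K(\theta)}{\partial x},
        \qquad D(\theta) = D_0\, \theta^{m}, \quad K(\theta) = K_0\, \theta^{n}

    Substituting the traveling-wave ansatz \theta(x,t) = f(\xi) with \xi = x - ct reduces the PDE to the ODE (D(f) f')' - K(f)' + c f' = 0, whose closed-form solutions then depend on how D varies with \theta.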

  10. Angular dependence of response of dosimeters exposed to an extended radioactive source

    International Nuclear Information System (INIS)

    Manai, K.; Trabelsi, A.; Madouri, F.

    2014-01-01

    This study was carried out to investigate the angular dependence of dosimeter response when dosimeters are exposed to the extended gamma source of an irradiation facility. Using analytical and Monte Carlo analysis, we show that the dosimeter response does not exhibit the angular dependence claimed by a previous study. The dose rate formula we derived takes into account the path length of the photons in the dosimeter. Experimental data have been used to validate our analytical and Monte Carlo methods. Furthermore, the effects of dosimeter size, geometry and orientation on the responses have been investigated and, within statistical errors, no angular dependence was found. - Highlights: • We investigate the exposure-angle dependence of dosimeter response to a gamma source. • Analytical and Monte Carlo analyses show no angular dependence, contrary to a previous claim. • We derive the dose rate formula taking into account the path length of photons. • Analytical and Monte Carlo models have been validated using experimental data

  11. Selenium speciation from food source to metabolites: a critical review

    Energy Technology Data Exchange (ETDEWEB)

    Dumont, Emmie; Vanhaecke, Frank; Cornelis, Rita [Ghent University, Department of Analytical Chemistry, Ghent (Belgium)

    2006-08-15

    Especially in the last decade, a vast number of papers on Se and its role in health issues have been published. This review gives a brief, critical overview of the main analytical findings reported in these papers. Of particular interest is the Se content in different food sources worldwide and the extent to which their consumption is reflected in the Se content of human tissues and body fluids. Several food sources, both natural (Brazil nuts, garlic, Brassica juncea) and Se-enriched (yeast-based supplements), are discussed with respect to origin, characteristics, Se metabolism and the impact of their consumption on the human body. The continuous development of new analytical techniques and the improvement of existing ones have provided powerful tools to unravel the Se species and their functions. An up-to-date literature study on Se speciation analysis is given, illustrating how analytical chemistry in its different facets aids in the identification of Se compounds and provides insight into the complete metabolic pathway of Se throughout the human body. This review includes a detailed image of the current state of the art of Se speciation analysis in these food sources and in human tissues and body fluids. (orig.)

  12. Physics Mining of Multi-source Data Sets, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to implement novel physics mining algorithms with analytical capabilities to derive diagnostic and prognostic numerical models from multi-source...

  13. People analytics in the era of big data changing the way you attract, acquire, develop, and retain talent

    CERN Document Server

    Isson, Jean Paul

    2016-01-01

    Apply predictive analytics throughout all stages of workforce management People Analytics in the Era of Big Data provides a blueprint for leveraging your talent pool through the use of data analytics. Written by the Global Vice President of Business Intelligence and Predictive Analytics at Monster Worldwide, this book is packed full of actionable insights to help you source, recruit, acquire, engage, retain, promote, and manage the exceptional talent your organization needs. With a unique approach that applies analytics to every stage of the hiring process and the entire workforce planning and management cycle, this informative guide provides the key perspective that brings analytics into HR in a truly useful way. You're already inundated with disparate employee data, so why not mine that data for insights that add value to your organization and strengthen your workforce? This book presents a practical framework for real-world talent analytics, backed by groundbreaking examples of workforce analytics in a...

  14. Bioimaging of cells and tissues using accelerator-based sources.

    Science.gov (United States)

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotron) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. These facilities now match at least one of the three analytical features required for the biological field, such as a spatial resolution sufficient for single-cell analysis. The pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens are discussed.

  15. Development and analytical characterization of a Grimm-type glow discharge ion source operated with high gas flow rates and coupled to a mass spectrometer with high mass resolution

    International Nuclear Information System (INIS)

    Beyer, Claus; Feldmann, Ingo; Gilmour, Dave; Hoffmann, Volker; Jakubowski, Norbert

    2002-01-01

    A Grimm-type glow discharge ion source has been developed and coupled to a commercial inductively coupled plasma mass spectrometer (ICP-MS) with high mass resolution (Axiom, ThermoElemental, Winsford, UK) by exchanging only the front plate of the ICP-MS interface system. In addition to high discharge powers of up to 70 W, which are typical for a Grimm-type design, this source can be operated at relatively high gas flow rates of up to 240 ml min⁻¹. In combination with a high discharge voltage, the signal intensities reach a constant level within the first 20 s after the discharge has started. An analytical characterization of this source is given, based on a calibration with the steel standard reference materials NIST 1261A-1265A. The sensitivity for the investigated elements, measured at a resolution of 4000, is in the range of 500-6000 cps μg⁻¹ g⁻¹, and a relative standard deviation (R.S.D.) of the measured isotopes relative to Fe of less than 8% has been achieved for the major and minor components of the sample. Limits of detection at ng g⁻¹ levels could be obtained.

  16. Two-dimensional semi-analytic nodal method for multigroup pin power reconstruction

    International Nuclear Information System (INIS)

    Seung Gyou, Baek; Han Gyu, Joo; Un Chul, Lee

    2007-01-01

    A pin power reconstruction method applicable to multigroup problems involving square fuel assemblies is presented. The method is based on a two-dimensional semi-analytic nodal solution which consists of eight exponential terms and 13 polynomial terms. The 13 polynomial terms represent the particular solution obtained under the condition of a two-dimensional 13-term source expansion. In order to achieve a better approximation of the source distribution, the least-squares fitting method is employed. The eight exponential terms represent a part of the analytically obtained homogeneous solution, and the eight coefficients are determined by imposing constraints on the four surface average currents and four corner point fluxes. The surface average currents determined from a transverse-integrated nodal solution are used directly, whereas the corner point fluxes are determined during the course of the reconstruction by employing an iterative scheme that realizes the corner point balance condition. An outgoing-current-based corner point flux determination scheme is newly introduced. The accuracy of the proposed method is demonstrated with the L336C5 benchmark problem. (authors)

  17. Opinions on Drug Interaction Sources in Anticancer Treatments and Parameters for an Oncology-Specific Database by Pharmacy Practitioners in Asia

    Directory of Open Access Journals (Sweden)

    2010-01-01

    Full Text Available Cancer patients undergoing chemotherapy are particularly susceptible to drug-drug interactions (DDIs). Practitioners should keep themselves updated with the most current DDI information, particularly for interactions involving new anticancer drugs (ACDs). Databases can be useful for obtaining up-to-date DDI information in a timely and efficient manner. Our objective was to investigate the DDI information sources of pharmacy practitioners in Asia and their views on the usefulness of an oncology-specific database for ACD interactions. A qualitative, cross-sectional survey was done to collect information on the respondents' practice characteristics, sources of DDI information and parameters useful in an ACD interaction database. The response rate was 49%. Electronic databases (70%), drug interaction textbooks (69%) and drug compendia (64%) were most commonly used. The majority (93%) indicated that a database catering to ACD interactions would be useful. Essential parameters that should be included in the database were the mechanism and severity of the detected interaction, and the presence of a management plan (98% each). This study has improved our understanding of the usefulness of various DDI information sources for ACD interactions among pharmacy practitioners in Asia. An oncology-specific DDI database targeting ACD interactions is definitely attractive for clinical practice.

  18. Monte Carlo and analytic simulations in nanoparticle-enhanced radiation therapy

    Directory of Open Access Journals (Sweden)

    Paro AD

    2016-09-01

    Full Text Available Autumn D Paro,1 Mainul Hossain,2 Thomas J Webster,1,3,4 Ming Su1,4 1Department of Chemical Engineering, Northeastern University, Boston, MA, USA; 2NanoScience Technology Center and School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, Florida, USA; 3Excellence for Advanced Materials Research, King Abdulaziz University, Jeddah, Saudi Arabia; 4Wenzhou Institute of Biomaterials and Engineering, Chinese Academy of Science, Wenzhou Medical University, Zhejiang, People’s Republic of China Abstract: Analytical and Monte Carlo simulations have been used to predict dose enhancement factors in nanoparticle-enhanced X-ray radiation therapy. Both simulations predict an increase in dose enhancement in the presence of nanoparticles, but the two methods predict different levels of enhancement over the studied energy, nanoparticle materials, and concentration regime for several reasons. The Monte Carlo simulation calculates energy deposited by electrons and photons, while the analytical one only calculates energy deposited by source photons and photoelectrons; the Monte Carlo simulation accounts for electron–hole recombination, while the analytical one does not; and the Monte Carlo simulation randomly samples photon or electron path and accounts for particle interactions, while the analytical simulation assumes a linear trajectory. This study demonstrates that the Monte Carlo simulation will be a better choice to evaluate dose enhancement with nanoparticles in radiation therapy. Keywords: nanoparticle, dose enhancement, Monte Carlo simulation, analytical simulation, radiation therapy, tumor cell, X-ray 
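
    A minimal toy illustration of the methodological contrast, not the simulations from the paper: a 1D Monte Carlo estimate of a dose enhancement factor under an assumed increase in the linear attenuation coefficient when nanoparticles are present:

        import random

        def absorbed_fraction(mu_per_cm, depth_cm, n=200_000, seed=1):
            """Fraction of photons absorbed within depth_cm (toy 1D transport)."""
            rng = random.Random(seed)
            hits = sum(1 for _ in range(n) if rng.expovariate(mu_per_cm) < depth_cm)
            return hits / n

        mu_tissue = 0.20    # assumed attenuation coefficient, 1/cm
        mu_doped  = 0.26    # assumed value with nanoparticles loaded
        depth     = 2.0     # cm of target material
        def_ratio = absorbed_fraction(mu_doped, depth) / absorbed_fraction(mu_tissue, depth)
        print(f"toy dose enhancement factor ~ {def_ratio:.2f}")
        # A real Monte Carlo code also tracks secondary electrons, scattering and
        # recombination, which is why it diverges from simple analytical estimates.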

  19. Fluxball magnetic field analysis using a hybrid analytical/FEM/BEM with equivalent currents

    International Nuclear Information System (INIS)

    Fernandes, João F.P.; Camilo, Fernando M.; Machado, V. Maló

    2016-01-01

    In this paper, a fluxball electric machine is analyzed concerning the magnetic flux, force and torque. A novel method is proposed based in a special hybrid FEM/BEM (Finite Element Method/Boundary Element Method) with equivalent currents by using an analytical treatment for the source field determination. The method can be applied to evaluate the magnetic field in axisymmetric problems, in the presence of several magnetic materials. Same results obtained by a commercial Finite Element Analysis tool are presented for validation purposes with the proposed method. - Highlights: • The Fluxball machine magnetic field is analyzed by a new FEM/BEM/Analytical method. • The method is adequate for axisymmetric non homogeneous magnetic field problems. • The source magnetic field is evaluated considering a non-magnetic equivalent problem. • Material magnetization vectors are accounted by using equivalent currents. • A strong reduction of the finite element domain is achieved.

  20. Using online analytical processing to manage emergency department operations.

    Science.gov (United States)

    Gordon, Bradley D; Asplin, Brent R

    2004-11-01

    The emergency department (ED) is a unique setting in which to explore and evaluate the utility of information technology to improve health care operations. A potentially useful software tool in managing this complex environment is online analytical processing (OLAP). An OLAP system has the ability to provide managers, providers, and researchers with the necessary information to make decisions quickly and effectively by allowing them to examine patterns and trends in operations and patient flow. OLAP software quickly summarizes and processes data acquired from a variety of data sources, including computerized ED tracking systems. It allows the user to form a comprehensive picture of the ED from both system-wide and patient-specific perspectives and to interactively view the data using an approach that meets his or her needs. This article describes OLAP software tools and provides examples of potential OLAP applications for care improvement projects, primarily from the perspective of the ED. While OLAP is clearly a helpful tool in the ED, it is far more useful when integrated into the larger continuum of health information systems across a hospital or health care delivery system.
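
    As a sketch of the OLAP-style slicing described, with hypothetical field names and pandas standing in for an OLAP engine:

        import pandas as pd

        # Hypothetical extract from a computerized ED tracking system.
        visits = pd.DataFrame({
            "hour":        [8, 8, 9, 9, 9, 10, 10],
            "acuity":      ["high", "low", "high", "low", "low", "high", "low"],
            "disposition": ["admit", "discharge", "admit", "discharge",
                            "admit", "discharge", "discharge"],
            "los_min":     [340, 95, 410, 120, 260, 180, 75],
        })

        # Roll patient-level rows up into a cube-like summary: median length of
        # stay by hour of arrival and acuity, a classic OLAP drill-down view.
        cube = visits.pivot_table(index="hour", columns="acuity",
                                  values="los_min", aggfunc="median")
        print(cube)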

  1. Innovative technology summary report: Road Transportable Analytical Laboratory (RTAL)

    International Nuclear Information System (INIS)

    1998-10-01

    The Road Transportable Analytical Laboratory (RTAL) has been used in support of US Department of Energy (DOE) site and waste characterization and remediation planning at Fernald Environmental Management Project (FEMP) and is being considered for implementation at other DOE sites, including the Paducah Gaseous Diffusion Plant. The RTAL laboratory system consists of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific analysis needs. The prototype RTAL, deployed at FEMP Operable Unit 1 Waste Pits, has been designed to be synergistic with existing analytical laboratory capabilities, thereby reducing the occurrence of unplanned rush samples that are disruptive to efficient laboratory operations

  2. Analytical solution using computer algebra of a biosensor for detecting toxic substances in water

    Science.gov (United States)

    Rúa Taborda, María Isabel

    2014-05-01

    In a relatively recent paper, an electrochemical biosensor for water toxicity detection based on a bio-chip as a whole cell was proposed, numerically solved and analyzed. In that paper, the kinetic processes in a miniaturized electrochemical biosensor system were described using the equations for a specific enzymatic reaction and the diffusion equation. The numerical solution showed excellent agreement with the measured data, but such a numerical solution is not enough to design the corresponding bio-chip efficiently. For this reason, an analytical solution is needed. The object of the present work is to provide such an analytical solution and then to give algebraic guidelines for designing the bio-sensor. The analytical solution is obtained using computer algebra software, specifically Maple. The method of solution is the Laplace transform, with the Bromwich integral and the residue theorem. The final solution is given as a series of Bessel functions, and the effective time for the bio-sensor is computed. It is claimed that the analytical solutions that were obtained will be very useful for predicting current variations in similar systems with different geometries, materials and biological components. Besides this, the analytical solution provided is very useful for investigating the relationship between different chamber parameters, such as cell radius, cell height and electrode radius.
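
    Solutions in this family are Bessel-function series. As a hedged illustration only (the textbook series for diffusion in a cylinder with an absorbing boundary, not the paper's specific solution), evaluated with SciPy:

        import numpy as np
        from scipy.special import j0, j1, jn_zeros

        def cylinder_series(r, t, R=1.0e-3, D=1.0e-9, c0=1.0, n_terms=50):
            """c(r,t) for diffusion in a cylinder with c(R,t)=0 and c(r,0)=c0."""
            alpha = jn_zeros(0, n_terms)            # positive roots of J0
            terms = (2.0 * c0 / (alpha * j1(alpha))
                     * j0(alpha * r / R)
                     * np.exp(-D * alpha**2 * t / R**2))
            return terms.sum()

        # Concentration at the chamber axis after 0.05 s (all parameters assumed).
        print(cylinder_series(r=0.0, t=0.05))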

  3. SIMMER-III analytic thermophysical property model

    International Nuclear Information System (INIS)

    Morita, K; Tobita, Y.; Kondo, Sa.; Fischer, E.A.

    1999-05-01

    An analytic thermophysical property model using general function forms is developed for a reactor safety analysis code, SIMMER-III. The function forms are designed to represent correct behavior of properties of reactor-core materials over wide temperature ranges, especially for the thermal conductivity and the viscosity near the critical point. The most up-to-date and reliable sources for uranium dioxide, mixed-oxide fuel, stainless steel, and sodium available at present are used to determine parameters in the proposed functions. This model is also designed to be consistent with a SIMMER-III model on thermodynamic properties and equations of state for reactor-core materials. (author)

  4. Analytical methods for predicting contaminant transport

    International Nuclear Information System (INIS)

    Pigford, T.H.

    1989-09-01

    This paper summarizes some of the previous and recent work at the University of California on analytical solutions for predicting contaminant transport in porous and fractured geologic media. Emphasis is given here to the theories for predicting near-field transport, needed to derive the time-dependent source term for predicting far-field transport and overall repository performance. New theories summarized include solubility-limited release rate with flow through backfill in rock, near-field transport of radioactive decay chains, interactive transport of colloid and solute, transport of carbon-14 as carbon dioxide in unsaturated rock, and flow of gases out of a waste container through cracks and penetrations. 28 refs., 4 figs.

  5. Validation of Prototype Continuous Real-Time Vital Signs Video Analytics Monitoring System CCATT Viewer

    Science.gov (United States)

    2018-01-26

    traditional monitors, this capability will facilitate management of a group of patients. Innovative visual analytics of the complex array of real-time... redundant system could be useful in managing hundreds of bedside monitor data sources. With too many data sources, a single central server may suffer... collection rate. 3.2 Viewer Elements Design: For detailed elements to display, as well as their color, line styles, and locations on the screen, we

  6. Analytical Lie-algebraic solution of a 3D sound propagation problem in the ocean

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, P.S., E-mail: petrov@poi.dvo.ru [Il' ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Prants, S.V., E-mail: prants@poi.dvo.ru [Il' ichev Pacific Oceanological Institute, 43 Baltiyskaya str., Vladivostok, 690041 (Russian Federation); Petrova, T.N., E-mail: petrova.tn@dvfu.ru [Far Eastern Federal University, 8 Sukhanova str., 690950, Vladivostok (Russian Federation)

    2017-06-21

    The problem of sound propagation in a shallow sea with variable bottom slope is considered. The sound pressure field produced by a time-harmonic point source in such inhomogeneous 3D waveguide is expressed in the form of a modal expansion. The expansion coefficients are computed using the adiabatic mode parabolic equation theory. The mode parabolic equations are solved explicitly, and the analytical expressions for the modal coefficients are obtained using a Lie-algebraic technique. - Highlights: • A group-theoretical approach is applied to a problem of sound propagation in a shallow sea with variable bottom slope. • An analytical solution of this problem is obtained in the form of modal expansion with analytical expressions of the coefficients. • Our result is the only analytical solution of the 3D sound propagation problem with no translational invariance. • This solution can be used for the validation of the numerical propagation models.
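
    Schematically, the solution strategy writes the pressure field as a local-mode expansion; the form below is the generic adiabatic-mode ansatz used in such studies, with conventional symbols rather than the paper's notation:

        P(x, y, z, \omega) = \sum_{n} A_n(x, y)\, \phi_n(z; x, y)

    Here \phi_n are the local vertical modes of the waveguide at horizontal position (x, y), and the coefficients A_n satisfy mode parabolic equations, which are solved in closed form via the Lie-algebraic technique.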

  7. A nonlinear analytic function expansion nodal method for transient calculations

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Han Gyu; Park, Sang Yoon; Cho, Byung Oh; Zee, Sung Quun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    The nonlinear analytic function expansion nodal (AFEN) method is applied to the solution of the time-dependent neutron diffusion equation. Since the AFEN method requires both the particular solution and the homogeneous solution to the transient fixed source problem, the derivation of the solution method is focused on finding the particular solution efficiently. To avoid complicated particular solutions, the source distribution is approximated by quadratic polynomials and the transient source is constructed such that the error due to the quadratic approximation is minimized. In addition, this paper presents a new two-node solution scheme that is derived by imposing the constraint of current continuity at the interface corner points. The method is verified through a series of applications to the NEACRP PWR rod ejection benchmark problems. 6 refs., 2 figs., 1 tab. (Author)

  9. Nuclear and nuclear related analytical methods applied in environmental research

    International Nuclear Information System (INIS)

    Popescu, Ion V.; Gheboianu, Anca; Bancuta, Iulian; Cimpoca, G. V; Stihi, Claudia; Radulescu, Cristiana; Oros Calin; Frontasyeva, Marina; Petre, Marian; Dulama, Ioana; Vlaicu, G.

    2010-01-01

    Nuclear analytical methods can be used in research activities on environmental studies like water quality assessment, pesticide residues, global climatic change (transboundary), pollution and remediation. Heavy metal pollution is a problem associated with areas of intensive industrial activity. In this work, the moss biomonitoring technique was employed to study atmospheric deposition in Dambovita County, Romania. Complementary nuclear and atomic analytical methods were also used: Neutron Activation Analysis (NAA), Atomic Absorption Spectrometry (AAS) and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). These high-sensitivity analytical methods were used to determine the chemical composition of moss samples placed in areas with different industrial pollution sources. The concentrations of Cr, Fe, Mn, Ni and Zn were determined. The concentration of Fe in the same samples was determined using all of these methods, and we obtained very good agreement, within statistical limits, which demonstrates the capability of these analytical methods to be applied to a large spectrum of environmental samples with consistent results. (authors)

  10. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF... establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and... cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their

  11. Gender-partitioned patient medians of serum albumin requested by general practitioners for the assessment of analytical stability.

    Science.gov (United States)

    Hansen, Steen Ingemann; Petersen, Per Hyltoft; Lund, Flemming; Fraser, Callum G; Sölétormos, György

    2018-04-25

    Recently, the use of separate gender-partitioned patient medians of serum sodium has revealed potential for monitoring analytical stability within the optimum analytical performance specifications for laboratory medicine. The serum albumin concentration depends on whether a patient is sitting or recumbent during phlebotomy. We therefore investigated only examinations requested by general practitioners (GPs), to provide data from sitting patients. Weekly and monthly patient medians of serum albumin requested by GPs for both male and female patients were calculated from the raw data obtained from three analysers in the hospital laboratory, for samples from patients >18 years of age. The half-range of the medians was applied as an estimate of the maximum bias. Further, the ratios between the two medians (female/male) were calculated. The medians for male and female patients were closely related, despite considerable variation due to the current analytical variation. This relationship was confirmed by the calculated half-range of the monthly ratio between the genders of 0.44%, which is within the optimum analytical performance specification for bias of serum albumin (0.72%). The weekly ratio had a half-range of 1.83%, which is within the minimum analytical performance specification of 2.15%. Monthly gender-partitioned patient medians of serum albumin are useful for monitoring long-term analytical stability, with the gender medians providing two independent estimates of changes in (delta) bias; only results requested by GPs are of value in this application, as this ensures that patients were sitting during phlebotomy.
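
    A minimal sketch of the monitoring idea, on synthetic data with hypothetical column names: compute monthly gender-partitioned medians, then use the half-range of the female/male ratio as the bias estimate:

        import numpy as np
        import pandas as pd

        # Synthetic stand-in for a laboratory extract of GP-requested results.
        rng = np.random.default_rng(0)
        n = 20_000
        df = pd.DataFrame({
            "sample_date": pd.to_datetime("2017-01-01")
                           + pd.to_timedelta(rng.integers(0, 365, n), unit="D"),
            "sex": rng.choice(["F", "M"], n),
            "albumin_g_L": rng.normal(40.0, 3.5, n),
        })

        monthly = (df.groupby([df["sample_date"].dt.to_period("M"), "sex"])
                     ["albumin_g_L"].median().unstack("sex"))
        ratio = monthly["F"] / monthly["M"]          # female/male median ratio
        half_range_pct = (ratio.max() - ratio.min()) / 2 * 100
        print(f"half-range of monthly F/M median ratio: {half_range_pct:.2f} %")
        # Compared against e.g. the 0.72 % optimum bias specification for albumin.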

  12. Analytical developments in reprocessing at the CEA

    International Nuclear Information System (INIS)

    Buffereau, M.

    1989-01-01

    Analytical developments in reprocessing, which are based on extensive basic research, are aimed at fulfilling the current requirements of R and D laboratories, pilot plants and industrial plants. They are also intended to propose and provide new opportunities. On-line measurements are a long-term goal, and one must be confident of their outcome. New equipment and procedures must be tested and their specifications determined, first at the laboratory level and then in a pilot plant. In this respect, we are considering equipment which will be in operation in the ATALANTE laboratories, and the APM is also both a necessary and useful resource. However, many measurements must still be done, and will continue to have to be done, in analytical laboratories. Along with improving accuracy, the main developments aim at reducing manpower requirements, effluents and waste releases.

  13. Analytical solutions for tomato peeling with combined heat flux and convective boundary conditions

    Science.gov (United States)

    Cuccurullo, G.; Giordano, L.; Metallo, A.

    2017-11-01

    Peeling of tomatoes by radiative heating is a valid alternative to steam or lye peeling, which are expensive and polluting methods. Suitable energy densities are required in order to achieve short processing times, thus involving only a thin layer under the tomato surface. This paper aims to predict the temperature field in rotating tomatoes exposed to the source irradiation. Therefore, a 1D unsteady analytical model is presented, which involves a semi-infinite slab subjected to time-dependent heating while convective heat transfer takes place on the exposed surface. In order to account for the tomato rotation, the heat source is described as the positive half-wave of a sinusoidal function. The problem being linear, the solution is derived following the Laplace transform method. In addition, an easy-to-handle solution for the problem at hand is presented, which assumes a differentiable function approximating the source while neglecting convective cooling, the latter contribution turning out to be negligible in the context at hand. A satisfying agreement between the two analytical solutions is found; therefore, an easy procedure for the proper design of the dry heating system can be set up, avoiding the use of numerical simulations.
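
    For orientation, when convective cooling is neglected (the paper finds it negligible here), the surface temperature of a semi-infinite solid under a time-varying flux follows from Duhamel's integral; the half-wave-rectified source and the symbols q0, omega, rho, c and k below are generic, not the paper's notation:

        q(t) = q_0 \max\{0,\ \sin(\omega t)\},
        \qquad
        T(0,t) - T_i = \frac{1}{\sqrt{\pi \rho c k}} \int_0^t \frac{q(\tau)}{\sqrt{t-\tau}}\, d\tau

    For a constant flux q_0 this reduces to the familiar 2 q_0 \sqrt{t} / \sqrt{\pi \rho c k} surface-temperature rise, which is a quick sanity check on the integral.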

  14. Interlanguage comparison of sport discourse (on the material of sport analytic article

    Directory of Open Access Journals (Sweden)

    Gavryushina Ekaterina Alexandrovna

    2016-06-01

    Full Text Available The article is devoted to the study of the cultural and mental specificity of language units in sport communication. The study was conducted on English, Russian and German analytical articles thematically related to tennis. Cross-language comparison reveals significant characteristic parameters of sport discourse. The proposed comparative procedure consists of three stages of analysis: linguistic, cognitive-communicative, and linguistic-cultural. At each stage of the analysis, criteria specific to sport discourse in the three linguistic societies were identified. Sport communication reflects not only the specificity and originality of the language, but also the traditions, history, mentality, culture, and behavior patterns of the modern professional sport community. The comparative study of sport discourse reveals the cultural, linguistic and cognitive features of the sport sublanguage and provides an overall view of the structure of sport analytical articles.

  15. Low energy ion beam systems for surface analytical and structural studies

    International Nuclear Information System (INIS)

    Nelson, G.C.

    1980-01-01

    This paper reviews the use of low energy ion beam systems for surface analytical and structural studies. Areas where analytical methods which utilize ion beams can provide a unique insight into materials problems are discussed. The design criteria of ion beam systems for performing materials studies are described and the systems now being used by a number of laboratories are reviewed. Finally, several specific problems are described where the solution was provided at least in part by information provided by low energy ion analysis techniques

  16. Enhanced forensic discrimination of pollutants by position-specific isotope analysis using isotope ratio monitoring by (13)C nuclear magnetic resonance spectrometry.

    Science.gov (United States)

    Julien, Maxime; Nun, Pierrick; Höhener, Patrick; Parinet, Julien; Robins, Richard J; Remaud, Gérald S

    2016-01-15

    In forensic environmental investigations the main issue concerns the inference of the original source of the pollutant, for determining the liable party. Isotope measurements in geochemistry, combined with complementary techniques for contaminant identification, have contributed significantly to source determination at polluted sites. In this work we have determined the intramolecular (13)C profiles of several molecules well known as pollutants. By giving additional analytical parameters, position-specific isotope analysis performed by isotope ratio monitoring by (13)C nuclear magnetic resonance (irm-(13)C NMR) spectrometry gives new information to help answer the major question: what is the origin of the detected contaminant? We have shown that isotope profiling of the core of a molecule reveals both the raw materials and the process used in its manufacture. It can also reveal processes occurring between the contamination 'source' site and the sampling site. Thus, irm-(13)C NMR is shown to be a very good complement to the compound-specific isotope analysis currently performed by mass spectrometry for assessing polluted sites involving substantial spills of pollutant. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Long-term effects of total and source-specific particulate air pollution on incident cardiovascular disease in Gothenburg, Sweden.

    Science.gov (United States)

    Stockfelt, Leo; Andersson, Eva M; Molnár, Peter; Gidhagen, Lars; Segersson, David; Rosengren, Annika; Barregard, Lars; Sallsten, Gerd

    2017-10-01

    Long-term exposure to air pollution increases cardiopulmonary morbidity and mortality, but it is not clear which components of air pollution are the most harmful, nor which time window of exposure is most relevant. Further studies at low exposure levels have also been called for. We analyzed two Swedish cohorts to investigate the effects of total and source-specific particulate matter (PM) on incident cardiovascular disease for different time windows of exposure. Two cohorts initially recruited to study predictors of cardiovascular disease (the PPS cohort and the GOT-MONICA cohort) were followed from 1990 to 2011. We collected data on residential addresses and assigned each individual yearly total and source-specific PM and nitrogen oxide (NOₓ) exposures based on dispersion models. Using multivariable Cox regression models with time-dependent exposure, we studied the association between three different time windows (lag 0, lag 1-5, and exposure at study start) of residential PM and NOₓ exposure, and the incidence of ischemic heart disease, stroke, heart failure and atrial fibrillation. During the study period, there were 2266 new-onset cases of ischemic heart disease, 1391 of stroke, 925 of heart failure and 1712 of atrial fibrillation. The majority of cases were in the PPS cohort, where participants were older. Exposure levels during the study period were moderate (median: 13 µg/m³ for PM₁₀ and 9 µg/m³ for PM₂.₅), and similar in both cohorts. Road traffic and residential heating were the largest local sources of PM air pollution, and long-distance transport the largest PM source in total. In the PPS cohort, there were positive associations between PM in the last five years and both ischemic heart disease (HR: 1.24 [95% CI: 0.98-1.59] per 10 µg/m³ of PM₁₀, and HR: 1.38 [95% CI: 1.08-1.77] per 5 µg/m³ of PM₂.₅) and heart failure. In the GOT-MONICA cohort, there were positive but generally non-significant associations between PM and stroke (HR: 1

  18. ASTM Data Banks and Chemical Information Sources

    Science.gov (United States)

    Batik, Albert; Hale, Eleanor

    1972-01-01

    Among the data described are infrared indexes, mass spectral data, chromatographic data, X-ray emission data, odor and taste threshold data, and thermodynamic data. This paper provides the chemical documentarian with a complete reference source for a wide variety of analytical data. (Author/NH)

  19. Descriptive and analytic epidemiology. Bridges to cancer control

    International Nuclear Information System (INIS)

    Mettlin, C.

    1988-01-01

    Epidemiology serves as a bridge between basic science and cancer control. The two major orientations of epidemiology are descriptive and analytic. The former is useful in assessing the scope and dimensions of the cancer problem, and the latter is used to assess environmental and lifestyle sources of cancer risk. A recent development in descriptive epidemiology is the use of functional measures of disease, such as lost life expectancy. In analytical epidemiology, there is new or renewed interest in several lifestyle factors, including diet and exercise, as well as environmental factors such as involuntary tobacco exposure and radon in dwellings. Review of the evidence should consider the strengths and weaknesses of different research procedures. Each method is inconclusive by itself, but the different research designs of epidemiology collectively may represent a hierarchy of proof. Although the roles of many factors remain to be defined, the aggregate epidemiologic data continue to demonstrate the special importance of personal behavior and lifestyle in affecting cancer risk.

  20. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    Science.gov (United States)

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and the factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses the analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.