WorldWideScience

Sample records for sources specific analytical

  1. Specification of brachytherapy sources

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

BCRU recommends that the following specification of gamma-ray brachytherapy sources be adopted. Unless otherwise stated, the output of a cylindrical source should be specified in air kerma rate at a point in free space at a distance of 1 m from the source on the radial plane of symmetry, i.e. the plane bisecting the active length and perpendicular to the cylindrical axis of the source. For a wire source the output should be specified for a 1 cm length. For any other construction of source, the point at which the output is specified should be stated. It is also recommended that the units in which the air kerma rate is expressed should be micrograys per hour (μGy/h).
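    As a worked illustration of this specification, the air kerma rate at another distance d on the radial plane follows from free-space inverse-square scaling (assuming no attenuation or scatter, and distances large compared with the active length):

```latex
\dot{K}_{\mathrm{air}}(d) \;=\; \dot{K}_{\mathrm{air}}(1\,\mathrm{m})\left(\frac{1\,\mathrm{m}}{d}\right)^{2},
\qquad \text{e.g.}\quad \dot{K}_{\mathrm{air}}(0.5\,\mathrm{m}) \;=\; 4\,\dot{K}_{\mathrm{air}}(1\,\mathrm{m}).
```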

  2. Electrospray ion source with reduced analyte electrochemistry

    Science.gov (United States)

Kertesz, Vilmos [Knoxville, TN]; Van Berkel, Gary [Clinton, TN]

    2011-08-23

    An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  3. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
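    A minimal sketch of the extraction step described above, assuming the Apache Tika Python bindings (the `tika` package, which auto-starts a local Tika server); the file name and regular expressions are illustrative only, not part of the project's actual pipeline:

```python
import re
from tika import parser  # Apache Tika Python bindings

# Extract raw text from an unstructured PDF (file name is illustrative).
text = parser.from_file("hyspiri_publication.pdf").get("content") or ""

# Toy proximity search: pair a spatial-resolution figure with an accuracy figure,
# mimicking the "1 m resolution -> 82-84% users accuracy" example above.
resolution = re.search(r"spatial resolution\s*\((\d+(?:\.\d+)?)\s*m\)", text)
accuracy = re.search(r"accuracy of\s+(\d+)(?:\s+to\s+\d+)?\s*%", text)
if resolution and accuracy:
    print(f"{resolution.group(1)} m resolution -> {accuracy.group(1)}% accuracy")
```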

  4. Compound-specific radiocarbon analysis - Analytical challenges and applications

    Science.gov (United States)

    Mollenhauer, G.; Rethemeyer, J.

    2009-01-01

    Within the last decades, techniques have become available that allow measurement of isotopic compositions of individual organic compounds (compound-specific isotope measurements). Most often the carbon isotopic composition of these compounds is studied, including stable carbon (δ13C) and radiocarbon (Δ14C) measurements. While compound-specific stable carbon isotope measurements are fairly simple, and well-established techniques are widely available, radiocarbon analysis of specific organic compounds is a more challenging method. Analytical challenges include difficulty obtaining adequate quantities of sample, tedious and complicated laboratory separations, the lack of authentic standards for measuring realistic processing blanks, and large uncertainties in values of Δ14C at small sample sizes. The challenges associated with sample preparation for compound-specific Δ14C measurements will be discussed in this contribution. Several years of compound-specific radiocarbon analysis have revealed that in most natural samples, purified organic compounds consist of heterogeneous mixtures of the same compound. These mixtures could derive from multiple sources, each having a different initial reservoir age but mixed in the same terminal reservoir, from a single source but mixed after deposition, or from a prokaryotic organism using variable carbon sources including mobilization of ancient carbon. These processes not only represent challenges to the interpretation of compound-specific radiocarbon data, but provide unique tools for the understanding of biogeochemical and sedimentological processes influencing the preserved organic geochemical records in marine sediments. We will discuss some examples where compound-specific radiocarbon analysis has provided new insights for the understanding of carbon source utilization and carbon cycling.

  5. 21 CFR 864.4020 - Analyte specific reagents.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analyte specific reagents. 864.4020 Section 864.4020 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Specimen Preparation Reagents § 864.4020 Analyte specific...

  6. Analytic Approximation to Radiation Fields from Line Source Geometry

    International Nuclear Information System (INIS)

    Michieli, I.

    2000-01-01

Line sources with slab shields represent a typical source-shield configuration in gamma-ray attenuation problems. Such shielding problems often lead to generalized Secant integrals of a specific form. Besides the numerical integration approach, various expansions and rational approximations with limited applicability are in use for computing the value of such integral functions. Lately, the author developed a rapidly convergent infinite series representation of generalized Secant integrals involving incomplete Gamma functions. Validity of that representation was established for zero and positive values of the integral parameter a (a ≥ 0). In this paper recurrence relations for generalized Secant integrals are derived, allowing simple approximate analytic calculation of the integral for arbitrary a values. It is demonstrated how the truncated series representation can be used as the basis for such calculations when possibly negative a values are encountered. (author)
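    For orientation, the generalized secant integral referred to here is commonly written in the shielding literature in a form like the following (the notation is assumed, since the abstract does not reproduce it); the recurrence relations mentioned connect integrals of neighbouring orders a:

```latex
I_{a}(b,\psi) \;=\; \int_{0}^{\psi} \sec^{a}\theta \; e^{-b\sec\theta}\, d\theta,
\qquad 0 < \psi \le \tfrac{\pi}{2}.
```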

  7. Light Source Estimation with Analytical Path-tracing

    OpenAIRE

    Kasper, Mike; Keivan, Nima; Sibley, Gabe; Heckman, Christoffer

    2017-01-01

    We present a novel algorithm for light source estimation in scenes reconstructed with a RGB-D camera based on an analytically-derived formulation of path-tracing. Our algorithm traces the reconstructed scene with a custom path-tracer and computes the analytical derivatives of the light transport equation from principles in optics. These derivatives are then used to perform gradient descent, minimizing the photometric error between one or more captured reference images and renders of our curre...

  8. SPANDOM - source projection analytic nodal discrete ordinates method

    International Nuclear Information System (INIS)

    Kim, Tae Hyeong; Cho, Nam Zin

    1994-01-01

We describe a new discrete ordinates nodal method for the two-dimensional transport equation. We solve the discrete ordinates equation analytically after the source term is projected and represented in polynomials. The method is applied to two fast reactor benchmark problems and compared with the TWOHEX code. The results indicate that the present method accurately predicts not only the multiplication factor but also the flux distribution
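    For context, a minimal statement of the equation being solved: in each discrete direction m the two-dimensional transport equation reads as below, and the method's step is to expand the source on the right-hand side in polynomials so the streaming operator can be inverted analytically within a node (textbook notation, not taken from the paper):

```latex
\mu_{m}\,\frac{\partial \psi_{m}}{\partial x}
+ \eta_{m}\,\frac{\partial \psi_{m}}{\partial y}
+ \sigma_{t}\,\psi_{m}(x,y) \;=\; S_{m}(x,y).
```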

  9. Pentaho Business Analytics: a Business Intelligence Open Source Alternative

    Directory of Open Access Journals (Sweden)

    Diana TÂRNĂVEANU

    2012-10-01

Most organizations strive to obtain fast, interactive and insightful analytics in order to underpin the most effective and profitable decisions. They need to incorporate huge amounts of data in order to run analysis based on queries and reports with collaborative capabilities. The large variety of Business Intelligence solutions on the market makes it very difficult for organizations to select one and evaluate the impact of the selected solution on the organization. Hence the need for a strategy to help organizations choose the best solution for their investment. In the past, the Business Intelligence (BI) market was dominated by closed source and commercial tools, but in recent years open source solutions have developed everywhere. An Open Source Business Intelligence solution can be an option in the face of time-sensitive, sprawling requirements and tightening budgets. This paper presents a practical solution implemented in a suite of Open Source Business Intelligence products called Pentaho Business Analytics, which provides data integration, OLAP services, reporting, dashboarding, data mining and ETL capabilities. The study conducted in this paper suggests that the open source phenomenon could become a valid alternative to commercial platforms within the BI context.

  10. Analytical performance specifications for external quality assessment - definitions and descriptions.

    Science.gov (United States)

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 proposed three models for setting APS, and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) on clear terminology for EQA APS. The recommended terminology covers six elements required to understand an APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which the APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.
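    The six recommended elements lend themselves to a simple structured record; the following sketch captures them as a Python dataclass (field names and the example values are illustrative, since the recommendation defines the elements, not a serialization format):

```python
from dataclasses import dataclass

@dataclass
class EqaAps:
    """The six descriptive elements recommended above for reporting an EQA APS."""
    material_matrix_and_commutability: str   # 1) EQA material matrix / commutability
    target_value_assignment: str             # 2) method used to assign the target value
    data_set_applied_to: str                 # 3) single or multiple data points, etc.
    analytical_property: str                 # 4) total error, bias, imprecision, uncertainty
    aps_rationale: str                       # 5) rationale for selecting the APS
    milan_model: str                         # 6) Milan model(s) used to set the APS

example = EqaAps(
    material_matrix_and_commutability="commutable pooled serum",
    target_value_assignment="reference method value",
    data_set_applied_to="single data point per survey",
    analytical_property="total error",
    aps_rationale="harmonisation across schemes",
    milan_model="model 3 (state of the art)",
)
```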

  11. Subsurface Shielding Source Term Specification Calculation

    International Nuclear Information System (INIS)

S. Su

    2001-01-01

The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M&O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M&O 2001. Development and performance of this calculation conforms to the procedure AP-3.12Q, Calculations.

  12. An Analytical Study of Prostate-Specific Antigen Dynamics.

    Science.gov (United States)

    Esteban, Ernesto P; Deliz, Giovanni; Rivera-Rodriguez, Jaileen; Laureano, Stephanie M

    2016-01-01

The purpose of this research is to carry out a quantitative study of prostate-specific antigen (PSA) dynamics for patients with prostatic diseases, such as benign prostatic hyperplasia (BPH) and localized prostate cancer (LPC). The proposed PSA mathematical model was implemented using clinical data of 218 Japanese patients with histologically proven BPH and 147 Japanese patients with LPC (stages T2a and T2b). For prostatic diseases (BPH and LPC) a nonlinear equation was obtained and solved in closed form to predict PSA progression with patients' age. The general solution describes PSA dynamics for patients with both diseases, LPC and BPH. Particular solutions allow studying PSA dynamics for patients with BPH or LPC alone. Analytical solutions have been obtained in closed form to develop nomograms for a better understanding of PSA dynamics in patients with BPH and LPC. This study may be useful to improve the diagnosis and prognosis of prostatic diseases.
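    The abstract does not reproduce the model equation itself. Purely as an illustration of the kind of nonlinear growth law that admits a closed-form solution (not the paper's actual model), consider the logistic equation:

```latex
\frac{dP}{dt} = r\,P\left(1-\frac{P}{K}\right)
\;\Longrightarrow\;
P(t) = \frac{K}{1+\left(\dfrac{K}{P_{0}}-1\right)e^{-rt}} ,
```

    where a closed-form P(t) of this type is what makes nomogram construction straightforward.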

  13. An Analytical Method of Auxiliary Sources Solution for Plane Wave Scattering by Impedance Cylinders

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2004-01-01

    Analytical Method of Auxiliary Sources solutions for plane wave scattering by circular impedance cylinders are derived by transformation of the exact eigenfunction series solutions employing the Hankel function wave transformation. The analytical Method of Auxiliary Sources solution thus obtained...

  14. Application of californium-252 neutron sources for analytical chemistry

    International Nuclear Information System (INIS)

    Ishii, Daido

    1976-01-01

Research on the application of Cf-252 neutron sources to analytical chemistry during the period from 1970 to 1974, and partly 1975, is reviewed. The first part is the introduction. The second part gives a general review of symposia, publications and the like. Attention is directed to ERDA's periodical 'Californium-252 Progress' and to a study group on Cf-252 utilization held by the Japanese Radioisotope Association in 1974. The third part deals with application to radioactivation analysis. The automated absolute activation analysis (AAAA) of Savannah River is briefly explained, and the joint experiment of the Savannah River operations office with the New Brunswick laboratory is mentioned. A Cf-252 radiation source was used for the non-destructive analysis of elements in river water. Fast neutrons from Cf-252 were used for the quantitative analysis of lead in paints. Many applications to industrial control processes have been reported. Attention is drawn to the application of Cf-252 neutron sources to field exploration for natural resources; for example, a logging sonde for locating uranium resources was developed. The fourth part deals with analysis using neutron-capture gamma rays; for example, a borehole sonde and the process-control analysis of sulfur in fuel utilized capture gamma rays. Prompt neutron-capture gamma rays may also be used for non-destructive analysis of the environment. (Iwakiri, K.)

  15. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
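    A minimal sketch of the Gaussian machinery behind Method 2, using scipy's normal distribution in place of Excel's NORMINV; the placement of the limits at ±1.96 and the normalization of bias and imprecision to the reference-distribution SD are modelling assumptions made here for illustration:

```python
from math import sqrt
from scipy.stats import norm  # norm.cdf/ppf play the role of Excel's NORMDIST/NORMINV

def pct_outside(bias: float, imprecision: float) -> float:
    """Percentage of reference individuals falling outside common reference limits.

    bias and imprecision are the analytical bias and analytical SD, both
    normalized to the SD of the reference distribution; limits sit at +/-1.96.
    """
    s = sqrt(1.0 + imprecision**2)          # total SD seen by the laboratory
    below = norm.cdf((-1.96 - bias) / s)    # fraction below the lower limit
    above = 1.0 - norm.cdf((1.96 - bias) / s)  # fraction above the upper limit
    return 100.0 * (below + above)

print(pct_outside(0.0, 0.0))    # no analytical error: 5.0% by construction
print(pct_outside(0.25, 0.15))  # one candidate bias/imprecision combination
```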

  16. Requirements Specification for Open Source Software Selection

    OpenAIRE

    YANG, YING

    2008-01-01

    Open source software has been widely used. The software world is enjoying the advantages of collaboration and cooperation in software development and use with the advent of open source movement. However, little research is concerned about the practical guidelines of OSS selection. It is hard for an organization to make a decision whether they should use the OSS or not, and to select an appropriate one from a number of OSS candidates. This thesis studies how to select an open source software f...

  17. Civil Society In Tanzania: An Analytical Review Of Sources Of ...

    African Journals Online (AJOL)

Sixty percent of civil societies deal with social development programmes. Additionally, results show that most civil societies had disproportionate staffing problems; sixty-six percent depended on international sources of funding, while 46% reported that they secured funds from both local and foreign sources of financing.

  18. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

There are two NPPs in operation in the Czech Republic. Both have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, also with WESE as the leading organization. Plant specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. The analytical support of plant specific SAMG development is performed by NRI Rez within the validation process. The basic conditions, and how NRI Rez fulfils them, concern the analysts, the analytical tools and their applications. A more detailed description is devoted to the approach of preparing the MELCOR code application for the evaluation of hydrogen risk, the validation of the recent set of passive autocatalytic hydrogen recombiners, and the definition of proposals to amend the hydrogen removal system. Such parametric calculations will require a very wide set of runs. This is not possible with the whole-plant model, and decoupling the calculation, with the mass and energy sources into the containment stored separately, is the only way. An example of this decoupling for the LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission products blowdown through the cold leg break, fluid blowdown through a break in the reactor pressure vessel bottom head, fission products through a break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without taking fission products into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that, though some problematic features appeared. The stand-alone test with fission products was possible only after changes in the source code

  19. Analytical and semi-analytical formalism for the voltage and the current sources of a superconducting cavity under dynamic detuning

    CERN Document Server

    Doleans, M

    2003-01-01

Elliptical superconducting radio frequency (SRF) cavities are sensitive to frequency detuning because they have a high Q value in comparison with normal-conducting cavities, together with weak mechanical properties. Radiation pressure on the cavity walls, microphonics, and the tuning system are possible sources of dynamic detuning during pulsed operation of SRF cavities. A general analytic relation between the cavity voltage, the dynamic detuning function, and the RF control function is developed. The voltage envelope of a cavity under dynamic detuning and dynamic RF control is expressed analytically through an integral formulation. A semi-analytical scheme is derived to calculate the voltage behavior in any practical case. Examples of voltage envelope behavior for different cases of dynamic detuning and RF control functions are shown. The RF control function for a cavity under dynamic detuning is also investigated, and as an application various filling schemes are presented.
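    A standard first-order envelope model consistent with the integral formulation described here, written with assumed symbols (ω½ the cavity half-bandwidth ω0/2QL, Δω(t) the dynamic detuning, Vg(t) the RF drive, V(0) = 0); the closed-form integral follows from the integrating-factor solution of the linear ODE:

```latex
\frac{dV}{dt} + \left[\omega_{1/2} - i\,\Delta\omega(t)\right] V \;=\; \omega_{1/2}\,V_{g}(t)
\;\;\Longrightarrow\;\;
V(t) \;=\; \omega_{1/2}\int_{0}^{t} V_{g}(\tau)\,
\exp\!\left(-\int_{\tau}^{t}\left[\omega_{1/2} - i\,\Delta\omega(u)\right]du\right) d\tau .
```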

  20. Setting analytical performance specifications based on outcome studies - is it possible?

    NARCIS (Netherlands)

    Horvath, Andrea Rita; Bossuyt, Patrick M. M.; Sandberg, Sverre; John, Andrew St; Monaghan, Phillip J.; Verhagen-Kamerbeek, Wilma D. J.; Lennartz, Lieselotte; Cobbaert, Christa M.; Ebert, Christoph; Lord, Sarah J.

    2015-01-01

    The 1st Strategic Conference of the European Federation of Clinical Chemistry and Laboratory Medicine proposed a simplified hierarchy for setting analytical performance specifications (APS). The top two levels of the 1999 Stockholm hierarchy, i.e., evaluation of the effect of analytical performance

  1. Application of radioactive sources in analytical instruments for planetary exploration

    International Nuclear Information System (INIS)

    Economou, T.E.

    2008-01-01

Full text: In the past 50 years or so, many types of radioactive sources have been used in space exploration. 238Pu is often used in space missions in Radioisotope Heater Units (RHU) and Radioisotope Thermoelectric Generators (RTG) for heat and power generation, respectively. In the 1960s, 242Cm alpha sources were used for the first time in space applications, on three Surveyor spacecraft, to obtain the chemical composition of the lunar surface with an instrument based on Rutherford backscattering of the alpha particles from nuclei in the analyzed sample. 242Cm is an emitter of 6.1 MeV alpha particles. Its half-life, 163 days, is short enough to allow sources to be prepared with the necessary high intensity per unit area (up to 470 mCi and a FWHM of about 1.5% in the lunar instruments), which results in a narrow energy distribution, yet long enough that the sources have adequate lifetimes for short-duration missions. 242Cm is readily prepared in curie quantities by irradiation of 241Am with neutrons in nuclear reactors, followed by chemical separation of the curium from the americium and fission products. For long-duration missions, for example missions to Mars, comets, and asteroids, the isotope 244Cm (T1/2 = 18.1 y, Eα = 5.8 MeV) is a better source because of its much longer half-life. Both of these isotopes are also excellent x-ray excitation sources and have been used for that purpose on several planetary missions. For the light elements the excitation is caused mainly by the alpha particles, while for the heavier elements (> Ca) the excitation is mainly due to the x-rays from the Pu L-lines (Ex = 14-18 keV). 244Cm has been used in several variations of the Alpha Proton X-ray Spectrometer (APXS): PHOBOS 1 and 2, Pathfinder, the Russian Mars-96 mission, the Mars Exploration Rovers (MER) and Rosetta. Other sources used in x-ray fluorescence instruments in space are 55Fe and 109Cd (Viking 1 and 2, Beagle 2), and 57Co is used in Moessbauer

  2. Heat-source specification 500 watt(e) RTG

    International Nuclear Information System (INIS)

    1983-02-01

This specification establishes the requirements for a 90SrF2 heat source and its fuel capsule for application in a 500 W(e) thermoelectric generator. The specification covers: fuel composition and quantity; the Hastelloy S fuel capsule material and fabrication; and the quality assurance requirements for the assembled heat source.

  3. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States)]; Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States)]; Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)]; Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States)]; Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia)]; White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)]

    2016-08-15

Purpose: A previously reported dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data

  4. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends and spotting anomalies. Although there are a number of open-source spatial analysis libraries like geopandas and shapely available today, most of them have been restricted to manipulation and analysis of geometric objects with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces the network overhead, as the complete data need not be replicated to the user's local system and only a subset of the entire dataset need be fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components: 1) a connection to dashDB, represented by the instance IdaDataBase, which uses a middleware API, namely pypyodbc or jaydebeapi, to establish the database connection via
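    A hedged sketch of the connection and lazy-evaluation pattern described above, using the base ibmdbpy entry points (IdaDataBase/IdaDataFrame); the JDBC DSN, credentials, and table name are placeholders, and the exact geospatial frame API of ibmdbpy-spatial is not shown since the abstract does not document it:

```python
from ibmdbpy import IdaDataBase, IdaDataFrame

# Connect to dashDB through one of the middleware drivers named above
# (a jaydebeapi-style JDBC string is shown; values are placeholders).
idadb = IdaDataBase(dsn="jdbc:db2://host:50000/BLUDB:user=USER;password=PW;")

# An in-database frame: no data moves until a result is actually requested.
ida_df = IdaDataFrame(idadb, "SAMPLES.GEO_CUSTOMER")
print(ida_df.head())   # fetches only a small subset into local memory

idadb.close()
```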

  5. Sealed radionuclide sources - new technical specifications and current practice

    Energy Technology Data Exchange (ETDEWEB)

    Brabec, D

    1987-03-01

Basic technical specifications valid in Czechoslovakia for sealed radionuclide sources, based on international ISO and CMEA standards, are discussed. Described are the standardization of terminology, the relationships between tests, testing methods, the types of sealed sources and their applications, and relations to the Czechoslovak regulations on radiation protection and to the IAEA specifications for radioactive material shipment, etc. The practical impact of introducing the new standards governing sealed sources on the national economy is shown, and the purpose of the various documents issued with sealed sources is explained. (author). 2 figs., 45 refs.

  6. Review and evaluation of spark source mass spectrometry as an analytical method

    International Nuclear Information System (INIS)

    Beske, H.E.

    1981-01-01

The analytical features and most important fields of application of spark source mass spectrometry are described with respect to the trace analysis of high-purity materials and the multielement analysis of technical alloys, geochemical and cosmochemical, biological and radioactive materials, as well as in environmental analysis. Comparisons are made to other analytical methods. The distribution of the method as well as opportunities for contract analysis are indicated and developmental tendencies discussed. (orig.)

  7. A comparison of average wages with age-specific wages for assessing indirect productivity losses: analytic simplicity versus analytic precision.

    Science.gov (United States)

    Connolly, Mark P; Tashjian, Cole; Kotsopoulos, Nikolaos; Bhatt, Aomesh; Postma, Maarten J

    2017-07-01

Numerous approaches are used to estimate indirect productivity losses, applying various wage estimates to poor health in working-aged adults. Considering the different wage estimation approaches observed in the published literature, we sought to assess the variation in productivity loss estimates when using average wages compared with age-specific wages. Published estimates of average and age-specific combined male/female wages were obtained from the UK Office of National Statistics. A polynomial interpolation was used to convert 5-year age-banded wage data into annual age-specific wage estimates. To compare indirect cost estimates, average wages and age-specific wages were used to project productivity losses at various stages of life based on the human capital approach. Discount rates of 0, 3, and 6% were applied to projected age-specific and average wage losses. Using average wages was found to overestimate lifetime wages in conditions afflicting those aged 1-27 and 57-67, while underestimating lifetime wages in those aged 27-57. The difference was most significant for children, where the average wage overestimated wages by 15%, and for 40-year-olds, where it underestimated wages by 14%. Large differences in projected productivity losses exist when the average wage is applied over a lifetime. Specifically, use of average wages overestimates productivity losses by between 8 and 15% for childhood illnesses, while during prime working years it underestimates productivity losses by 14%. We suggest that to achieve more precise estimates of productivity losses, age-specific wages should become the standard analytic approach.
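    A minimal sketch of the two-step procedure described above: polynomial interpolation of age-banded wages into annual age-specific wages, then a discounted human-capital present value. All wage figures below are made up for illustration; only the structure (interpolation, then discounting at 0/3/6%) follows the abstract:

```python
import numpy as np

# Illustrative 5-year age-banded average wages (currency units/year); values are invented.
band_midpoints = np.array([20, 25, 30, 35, 40, 45, 50, 55, 60, 65])
band_wages = np.array([14, 20, 25, 28, 30, 30, 29, 27, 24, 18]) * 1000.0

# Polynomial interpolation of banded data into annual age-specific wages
# (the degree is an arbitrary choice for this sketch).
coeffs = np.polyfit(band_midpoints, band_wages, deg=4)
ages = np.arange(18, 68)
age_specific = np.polyval(coeffs, ages)

def pv_lifetime_wages(wages: np.ndarray, rate: float) -> float:
    """Human-capital present value of a stream of future annual wages."""
    t = np.arange(len(wages))
    return float(np.sum(wages / (1.0 + rate) ** t))

for r in (0.0, 0.03, 0.06):   # the discount rates applied in the study
    flat = pv_lifetime_wages(np.full_like(age_specific, age_specific.mean()), r)
    age_sp = pv_lifetime_wages(age_specific, r)
    print(f"rate={r:.0%}: average-wage PV / age-specific PV = {flat / age_sp:.3f}")
```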

  8. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

The Halbach type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-efficient field source, which has attracted intense interest in many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
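    For reference, the best-known special case of such analytic solutions is the ideal dipole Halbach cylinder, whose bore field is uniform and depends only on the remanence Br and the outer and inner radii:

```latex
B_{\mathrm{bore}} \;=\; B_{r}\,\ln\!\left(\frac{r_{o}}{r_{i}}\right).
```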

  9. Analytical modeling of Schottky tunneling source impact ionization MOSFET with reduced breakdown voltage

    Directory of Open Access Journals (Sweden)

    Sangeeta Singh

    2016-03-01

In this paper, we have investigated a novel Schottky tunneling source impact ionization MOSFET (STS-IMOS) to lower the breakdown voltage of the conventional impact ionization MOS (IMOS) and developed an analytical model for the same. In STS-IMOS there is a cumulative effect of both impact ionization and source-induced barrier tunneling. The silicide source offers very low parasitic resistance, the outcome of which is an increased voltage drop across the intrinsic region for the same applied bias. This reduces the operating voltage and hence the device exhibits a significant reduction in both breakdown and threshold voltage. STS-IMOS shows high immunity against hot electron damage, and as a result device reliability increases significantly. The analytical model for the impact ionization current (Iii) is developed based on the integration of the ionization integral (M). Similarly, to obtain the Schottky tunneling current (ITun) expression, the Wentzel-Kramers-Brillouin (WKB) approximation is employed. Analytical models for threshold voltage and subthreshold slope are optimized against Schottky barrier height (ϕB) variation. The expression for the drain current is computed as a function of gate-to-drain bias via an integral expression. It is validated by comparing it with technology computer-aided design (TCAD) simulation results as well. In essence, this analytical framework provides the physical background for a better understanding of STS-IMOS and its performance estimation.
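    The two ingredients named above have familiar textbook forms, reproduced here for orientation (the paper's exact expressions are not given in the abstract): the ionization integral M over the high-field region of width W, with avalanche breakdown approached as M tends to 1, and the WKB tunneling probability through the barrier between classical turning points x1 and x2:

```latex
M \;=\; \int_{0}^{W} \alpha\big(E(x)\big)\,dx ,
\qquad
T_{\mathrm{WKB}} \;\approx\; \exp\!\left(-2\int_{x_{1}}^{x_{2}} \lvert k(x)\rvert\, dx\right).
```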

  10. Reconstruction of sound source signal by analytical passive TR in the environment with airflow

    Science.gov (United States)

    Wei, Long; Li, Min; Yang, Debin; Niu, Feng; Zeng, Wu

    2017-03-01

In the acoustic design of air vehicles, the time-domain signals of noise sources on the surface of the vehicle can serve as data support to reveal the noise source generation mechanism, analyze acoustic fatigue, and take measures for noise insulation and reduction. To rapidly reconstruct the time-domain sound source signals in an environment with flow, a method combining the analytical passive time reversal mirror (AP-TR) with a shear flow correction is proposed. In this method, the negative influence of flow on sound wave propagation is suppressed by the shear flow correction, yielding the corrected acoustic propagation time delay and path. The corrected time delay and path, together with the microphone array signals, are then submitted to the AP-TR, reconstructing more accurate sound source signals in the environment with airflow. As an analytical method, AP-TR offers a supplementary way in 3D space to reconstruct the signal of a sound source in an environment with airflow, instead of the numerical TR. Experiments on the reconstruction of the sound source signals of a pair of loudspeakers are conducted in an anechoic wind tunnel with subsonic airflow to validate the effectiveness and advantages of the proposed method. Moreover, a comparison, by theory and experiment, between AP-TR and time-domain beamforming in reconstructing the sound source signal is also discussed.
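    To illustrate the kind of time-delay correction involved, the sketch below computes the source-to-microphone propagation delay in a uniform mean flow (a simplification: the paper uses a shear flow correction, which this stand-in does not reproduce). The delay solves |mic − src − u·t| = c·t, a quadratic in t:

```python
import numpy as np

def convected_delay(src, mic, flow, c=343.0):
    """Propagation delay (s) from src to mic in a uniform flow field.

    Solves |mic - src - flow*t| = c*t for the positive root; a
    uniform-convection stand-in for the shear-flow correction above.
    """
    d = np.asarray(mic, float) - np.asarray(src, float)
    a = np.dot(flow, flow) - c**2        # quadratic coefficients of
    b = -2.0 * np.dot(d, flow)           #   a*t^2 + b*t + cc = 0
    cc = np.dot(d, d)
    roots = np.roots([a, b, cc])
    t = roots[np.isreal(roots) & (roots.real > 0)].real
    return float(t.min())

# Delay with and without a 30 m/s axial flow: the difference is the kind of
# correction that must be fed to the passive time-reversal step.
print(convected_delay([0, 0, 0], [2.0, 0, 0], np.array([30.0, 0.0, 0.0])))  # ~2/373 s
print(convected_delay([0, 0, 0], [2.0, 0, 0], np.array([0.0, 0.0, 0.0])))   # ~2/343 s
```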

  11. Analytical performance, reference values and decision limits. A need to differentiate between reference intervals and decision limits and to define analytical quality specifications

    DEFF Research Database (Denmark)

    Petersen, Per Hyltoft; Jensen, Esther A; Brandslund, Ivan

    2012-01-01

    of the values of analytical components measured on reference samples from reference individuals. Decision limits are based on guidelines from national and international expert groups defining specific concentrations of certain components as limits for decision about diagnosis or well-defined specific actions....... Analytical quality specifications for reference intervals have been defined for bias since the 1990s, but in the recommendations specified in the clinical guidelines analytical quality specifications are only scarcely defined. The demands for negligible biases are, however, even more essential for decision...... limits, as the choice is no longer left to the clinician, but emerge directly from the concentration. Even a small bias will change the number of diseased individuals, so the demands for negligible biases are obvious. A view over the analytical quality as published gives a variable picture of bias...

  12. Analytic solution of magnetic induction distribution of ideal hollow spherical field sources

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-12-01

The Halbach type hollow spherical permanent magnet arrays (HSPMA) are volume-compact, energy-efficient field sources capable of producing a multi-tesla field in the cavity of the array, which have attracted intense interest in many practical applications. Here, we present analytical solutions of the magnetic induction of the ideal HSPMA in the entire space: outside the array, within its cavity, and in the interior of the magnet. We obtain solutions using the concept of magnetic charge to solve the Poisson and Laplace equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing optimized ones.

  13. Intense neutron source: high-voltage power supply specifications

    International Nuclear Information System (INIS)

    Riedel, A.A.

    1980-08-01

    This report explains the need for and sets forth the electrical, mechanical and safety specifications for a high-voltage power supply to be used with the intense neutron source. It contains sufficient information for a supplier to bid on such a power supply

  14. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    Science.gov (United States)

    Barrett, Steven R. H.; Britter, Rex E.

Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean
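    The classical Gaussian line-source solution mentioned above, for an infinite crosswind ground-level line source, is C = sqrt(2/π)·qL/(σz·u). The sketch below evaluates it with a power-law σz(x), which is a placeholder assumption standing in for a real stability-dependent parameterization:

```python
import numpy as np

def line_source_conc(x, q_l=1.0, u=5.0, a=0.08, b=0.9):
    """Ground-level concentration downwind of an infinite crosswind line source.

    Classic Gaussian result C = sqrt(2/pi) * q_l / (sigma_z * u), with the
    power law sigma_z = a * x**b as an illustrative stability parameterization.
    q_l: emission rate per unit length (g m^-1 s^-1); u: wind speed (m/s); x: m.
    """
    sigma_z = a * np.asarray(x, float) ** b
    return np.sqrt(2.0 / np.pi) * q_l / (sigma_z * u)

# An area source can then be approximated by summing many such line sources,
# which is the costly decomposition the paper's closed-form solutions replace.
print(line_source_conc(np.array([100.0, 500.0, 1000.0])))
```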

  15. Generalized Analytical Treatment Of The Source Strength In The Solution Of The Diffusion Equation

    International Nuclear Information System (INIS)

    Essa, Kh.S.M.; EI-Otaify, M.S.

    2007-01-01

The source release strength (which is an integral part of the mathematical formulation of the diffusion equation), together with the boundary conditions, leads to three different forms of the diffusion equation. The obtained forms have been solved analytically under different boundary conditions, using transformation of axes and cosine and Fourier transformations. Three equivalent alternative mathematical formulations of the problem have been obtained. The estimated solution of the concentrations at the ground source has been used for comparison with observed concentration data from SF6 tracer experiments in low wind and unstable conditions at the IIT Delhi sports ground. A good agreement between estimated and observed concentrations is found
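    A typical starting point for formulations of this kind is the steady advection-diffusion equation, with the source strength Q entering through a boundary condition at the release point; the textbook form below is assumed here, since the abstract does not reproduce the equations:

```latex
u\,\frac{\partial C}{\partial x}
\;=\;
\frac{\partial}{\partial y}\!\left(K_{y}\,\frac{\partial C}{\partial y}\right)
+\frac{\partial}{\partial z}\!\left(K_{z}\,\frac{\partial C}{\partial z}\right),
\qquad
u\,C(0,y,z) \;=\; Q\,\delta(y)\,\delta(z-h).
```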

  16. Source specific risk assessment of indoor aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    Koivisto, A.J.

    2013-05-15

In the urban environment, atmospheric aerosols consist mainly of pollutants from anthropogenic sources. The majority of these originate from traffic and other combustion processes. A fraction of these pollutants will penetrate indoors via ventilation. However, indoor air concentrations are usually predominated by indoor sources due to the small amount of dilution air. In modern societies, people spend most of their time indoors; thus, their exposure is controlled mainly by indoor concentrations from indoor sources. During the last decades, engineering of nanosized structures has created a new field of material science. Some of these materials have been shown to be potentially toxic to human health. The greatest potential for exposure to engineered nanomaterials (ENMs) occurs in the workplace during production and handling of ENMs. In an exposure assessment, both gaseous and particulate matter pollutants need to be considered. The toxicities of the particles usually depend on their source and age. With time, particle morphology and composition change due to their tendency to undergo coagulation, condensation and evaporation. The PM exposure risk is related to source-specific emissions, and thus, in risk assessment one needs to define source-specific exposures. This thesis describes methods for source-specific risk assessment of airborne particulate matter. It consists of studies related to workers' ENM exposures during the synthesis of nanoparticles, packing of agglomerated TiO2 nanoparticles, and handling of nanodiamonds. Background particles were distinguished from the ENM concentrations by using different measurement techniques and indoor aerosol modeling. Risk characterization was performed by using source-specific exposures and calculated dose levels in units of particle number and mass. The exposure risk was estimated by using non-health-based occupational exposure limits for ENMs. For the nanosized TiO2, the risk was also assessed from dose

  17. Minimum analytical quality specifications of inter-laboratory comparisons: agreement among Spanish EQAP organizers.

    Science.gov (United States)

    Ricós, Carmen; Ramón, Francisco; Salas, Angel; Buño, Antonio; Calafell, Rafael; Morancho, Jorge; Gutiérrez-Bassini, Gabriella; Jou, Josep M

    2011-11-18

Four Spanish scientific societies organizing external quality assessment programs (EQAP) formed a working group to promote the use of common minimum quality specifications for clinical tests. Laboratories that do not meet the minimum specifications are encouraged to make an immediate review of the analytical procedure affected and to implement corrective actions if necessary. The philosophy was to use the 95th percentile of results sent to the EQAP (expressed in terms of percentage deviation from the target value), obtained for all results (except outliers) during a one-year cycle. The target value for a number of analytes of the basic biochemistry program was established as the overall mean. However, because of the substantial discrepancies between routine methods for basic hematology, hormones, proteins, therapeutic drugs and tumor markers, the target in these cases was the peer-group mean. The resulting specifications were quite similar to those established in the US (CLIA) and Germany (Richtlinie). The proposed specifications represent the minimum level of quality to be attained by laboratories, to assure harmonized service performance. They are not intended to satisfy clinical requirements, which are the final level of quality to be reached, a goal strongly promoted in our organizations by means of documents, courses, symposia and all types of educational activities.
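    A minimal sketch of the percentile rule described above, on synthetic data; the 3-SD outlier screen is an assumption made for illustration, since the abstract only says outliers are excluded:

```python
import numpy as np

def minimum_spec(deviations_pct, coverage=95.0):
    """Minimum quality specification from one EQA cycle: the 95th percentile of
    participants' absolute percentage deviations from the target value, after
    outlier exclusion (a simple 3*SD screen is assumed here)."""
    d = np.abs(np.asarray(deviations_pct, float))
    d = d[d <= d.mean() + 3.0 * d.std()]   # illustrative outlier screen
    return np.percentile(d, coverage)

rng = np.random.default_rng(1)
devs = rng.normal(0.0, 4.0, size=2000)     # synthetic % deviations from target
print(f"minimum specification: +/-{minimum_spec(devs):.1f}%")
```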

  18. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of emerging trajectory data. It offers data management capability and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  19. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    Science.gov (United States)

    Plebani, Mario

    2017-07-01

An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet in the literature no data are available on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also assure guidance for promoting improvement processes and guaranteeing quality care to patients.

  20. PB-AM: An open-source, fully analytical linear Poisson-Boltzmann solver.

    Science.gov (United States)

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using Visual Molecular Dynamics (VMD), a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to the larger group of scientists, educators, and students who are more familiar with the APBS framework.
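    The equation solved analytically here is the linearized Poisson-Boltzmann equation, shown below in its standard Gaussian-units form (φ the electrostatic potential, ε the position-dependent dielectric coefficient, κ̄ the screening coefficient, ρf the fixed molecular charge density):

```latex
\nabla\cdot\big[\epsilon(\mathbf{r})\,\nabla\phi(\mathbf{r})\big]
\;-\;\bar{\kappa}^{2}(\mathbf{r})\,\phi(\mathbf{r})
\;=\;
-4\pi\,\rho^{f}(\mathbf{r}).
```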

  1. The analytical benchmark solution of spatial diffusion kinetics in source driven systems for homogeneous media

    International Nuclear Information System (INIS)

    Oliveira, F.L. de; Maiorino, J.R.; Santos, R.S.

    2007-01-01

This paper describes a closed-form solution obtained by the expansion method for the general time-dependent diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. Thus, first an analytical solution of the one-group model without precursors was derived, followed by one considering a single precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used for such a problem. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was also solved and the results intercompared with results obtained by the previous one-group models for a given fast homogeneous medium and different types of source transients. The results are compared with those obtained by numerical methods. (author)
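    The model class in question is, in its textbook one-group form with R delayed-precursor families and an external source S (notation assumed, since the abstract gives no equations):

```latex
\frac{1}{v}\frac{\partial \phi}{\partial t}
= \nabla\cdot D\nabla\phi - \Sigma_{a}\phi + (1-\beta)\,\nu\Sigma_{f}\phi
+ \sum_{i=1}^{R}\lambda_{i}C_{i} + S(\mathbf{r},t),
\qquad
\frac{\partial C_{i}}{\partial t} = \beta_{i}\,\nu\Sigma_{f}\phi - \lambda_{i}C_{i}.
```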

  2. Evaluation and analytical validation of a handheld digital refractometer for urine specific gravity measurement

    Directory of Open Access Journals (Sweden)

    Sara P. Wyness

    2016-08-01

Objectives: Refractometers are commonly used to determine urine specific gravity (SG) in the assessment of hydration status and urine specimen validity testing. Few comprehensive performance evaluations are available demonstrating refractometer capability from a clinical laboratory perspective. The objective of this study was therefore to conduct an analytical validation of a handheld digital refractometer used for human urine SG testing. Design and methods: A MISCO Palm Abbe™ refractometer was used for all experiments, including device familiarization, carryover, precision, accuracy, linearity, analytical sensitivity, evaluation of potential substances which contribute to SG (i.e. "interference"), and reference interval evaluation. A manual refractometer, a urine osmometer, and a solute score (the sum of urine chloride, creatinine, glucose, potassium, sodium, total protein, and urea nitrogen, all in mg/dL) were used as comparative methods for accuracy assessment. Results: Significant carryover was not observed; a wash step was still included as good laboratory practice. Low imprecision (%CV <0.01) was demonstrated using low and high QC material. Accuracy studies showed strong correlation to manual refractometry. Linear correlation was also demonstrated between SG, osmolality, and solute score. Linearity of Palm Abbe performance was verified with an observed error of ≤0.1%. Increases in SG were observed with increasing concentrations of albumin, creatinine, glucose, hemoglobin, sodium chloride, and urea. Transference of a previously published urine SG reference interval of 1.0020-1.0300 was validated. Conclusions: The Palm Abbe digital refractometer was a fast, simple, and accurate way to measure urine SG. Analytical validity was confirmed by the present experiments. Keywords: Specific gravity, Osmolality, Digital refractometry, Hydration, Sports medicine, Urine drug testing, Urine adulteration

  3. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that can store a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytics capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART can store a variety of data types and has a growing user community, including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform not only with several dynamic visual analytics workflows, but also with its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  4. Pulsed voltage electrospray ion source and method for preventing analyte electrolysis

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-12-27

    An electrospray ion source and method of operation include the application of pulsed voltage to prevent electrolysis of analytes with a low electrochemical potential. The electrospray ion source can include an emitter, a counter electrode, and a power system. The emitter can include a liquid conduit, a primary working electrode having a liquid contacting surface, and a spray tip, where the liquid conduit and the working electrode are in liquid communication. The counter electrode can be proximate to, but separated from, the spray tip. The power system can supply voltage to the working electrode in the form of a pulse wave, where the pulse wave oscillates between at least an energized voltage and a relaxation voltage. The relaxation duration of the relaxation voltage can range from 1 millisecond to 35 milliseconds. The pulse duration of the energized voltage can be less than 1 millisecond, and the frequency of the pulse wave can range from 30 to 800 Hz.
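
    The quoted timing windows are mutually consistent, since the pulse period is the energized duration plus the relaxation duration; a small arithmetic check, treating the quoted bounds as the only inputs:

```python
# period = pulse duration + relaxation duration; frequency = 1 / period
pulse_ms = 1.0                 # upper bound on the energized-pulse duration (ms)
for relax_ms in (1.0, 35.0):   # quoted relaxation range (ms)
    period_ms = pulse_ms + relax_ms
    print(f"relaxation {relax_ms:4.0f} ms -> max frequency {1000.0 / period_ms:6.1f} Hz")
# ~28-500 Hz with a full 1 ms pulse; shorter pulses push the ceiling toward 800 Hz
```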

  5. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    Science.gov (United States)

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli, including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than to the other films. However, within subjects, lesbian women showed significantly different arousal responses, suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.
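
    The methodological point (averages can hide shape differences that a spline fit over time reveals) is easy to reproduce on synthetic data; the signals below are invented stand-ins, not the study's photoplethysmography data:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)

# Two synthetic time courses with (nearly) identical means but different shapes
rise_fall = np.sin(np.pi * t) + rng.normal(0.0, 0.1, t.size)
ramp      = (4.0 / np.pi) * t + rng.normal(0.0, 0.1, t.size)
print(f"means: {rise_fall.mean():.3f} vs {ramp.mean():.3f}")  # averaging sees no difference

# Smoothing splines recover the distinct temporal patterns the means hide
s1 = UnivariateSpline(t, rise_fall, s=2.0)
s2 = UnivariateSpline(t, ramp, s=2.0)
for x in (0.25, 0.50, 0.75):
    print(f"t={x:.2f}: spline A {float(s1(x)):+.2f}, spline B {float(s2(x)):+.2f}")
```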

  6. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    … for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision … are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation …
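
    Excel's NORMINV is simply the inverse normal CDF, so the mechanism of Method 2 can be mimicked with scipy: a normalized bias shifts the mean, added analytical imprecision widens the SD, and the fraction of reference individuals outside the common limits follows. This sketch illustrates only that mechanism with assumed values, not the paper's specific construction that yields the constant 4.4%:

```python
from scipy.stats import norm

# Excel NORMINV(p, m, s) == norm.ppf(p, loc=m, scale=s)
lower, upper = norm.ppf(0.025), norm.ppf(0.975)   # common 95% reference limits

def fraction_outside(bias, imprecision):
    """Fraction of reference individuals outside the common limits when the
    assay adds a normalized bias and extra imprecision (both in units of the
    between-subject SD); total SD = sqrt(1 + imprecision**2)."""
    sd = (1.0 + imprecision**2) ** 0.5
    return norm.cdf(lower, bias, sd) + norm.sf(upper, bias, sd)

for b, s in [(0.0, 0.0), (0.25, 0.5), (0.5, 0.25)]:
    print(f"bias={b:.2f}, imprecision={s:.2f} -> "
          f"{100.0 * fraction_outside(b, s):.2f}% outside")
```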

  7. Breed-specific variation of hematologic and biochemical analytes in healthy adult Bernese Mountain dogs

    DEFF Research Database (Denmark)

    Nielsen, Lise; Kjelgaard-Hansen, Mads; Jensen, Asger Lundorff

    2010-01-01

    Background: Hematology and serum biochemistry reference intervals in dogs may be affected by internal factors, such as breed and age, and external factors, such as the environment, diet, and lifestyle. In humans, it is well established that geographic origin and age may have an impact on reference intervals and, therefore, more specific reference intervals are sought for subpopulations. Objective: The objective of this study was to validate and transfer standard laboratory reference intervals for healthy Bernese Mountain dogs and to create new intervals for analytes where the established laboratory reference intervals were rejected. Methods: The procedure was performed using the human Clinical and Laboratory Standards Institute-approved model modified for veterinary use. Thirty-two dogs were included in the study using a direct a priori method, as recommended. Results: While 23 of the standard …
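
    The transference step mentioned here follows the usual CLSI pattern: measure about 20 healthy individuals and accept the candidate interval if no more than 2 results fall outside it. A simplified sketch with hypothetical hematocrit values and limits:

```python
def transference_ok(values, low, high, max_outside=2, n_required=20):
    """Simplified CLSI-style transference check (assumed 20-sample rule)."""
    sample = values[:n_required]
    if len(sample) < n_required:
        raise ValueError("need at least 20 results for the validation check")
    outside = sum(1 for v in sample if v < low or v > high)
    return outside <= max_outside, outside

# Hypothetical hematocrit results (%) from healthy Bernese Mountain dogs
hct = [44, 47, 51, 39, 45, 48, 52, 43, 46, 50,
       41, 49, 55, 42, 47, 53, 40, 46, 48, 45]
ok, n_out = transference_ok(hct, low=37.0, high=55.0)
print(f"{n_out} of 20 outside -> {'accept' if ok else 'reject'} the interval")
```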

  8. Potential sources of analytical bias and error in selected trace element data-quality analyses

    Science.gov (United States)

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements, where concentrations were greater in filtered samples than in paired unfiltered samples, were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS). Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues have not only been attributed to data generated by the USGS NWQL but have been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries in six States, along with matrix spike recoveries and standard reference materials. Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results of filtered samples treated with HCl from those left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated unfiltered sample.
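
    The core bookkeeping in such a study (counting filtered-greater-than-unfiltered pairs and asking whether the differences exceed measurement variability) is straightforward; the data and the 10% variability threshold below are assumptions for illustration:

```python
import numpy as np

# Hypothetical paired trace-element results (ug/L) for one analyte
filtered   = np.array([0.52, 0.48, 0.61, 0.55, 0.43, 0.66])
unfiltered = np.array([0.50, 0.51, 0.58, 0.57, 0.40, 0.64])

exceed = filtered > unfiltered
print(f"filtered > unfiltered in {exceed.sum()} of {exceed.size} pairs")

# Flag pairs whose difference exceeds an assumed combined measurement
# variability of 10% of the unfiltered value
flagged = np.abs(filtered - unfiltered) > 0.10 * unfiltered
print("pairs beyond measurement variability:", np.flatnonzero(flagged).tolist())
```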

  9. Waste minimization methods for treating analytical instrumentation effluents at the source

    International Nuclear Information System (INIS)

    Ritter, J.A.; Barnhart, C.

    1995-01-01

    The primary goal of this project was to reduce the amount of hazardous waste being generated by the Savannah River Site Defense Waste Processing Technology-Analytical Laboratory (DWPT-AL). A detailed characterization study was performed on 12 of the liquid effluent streams generated within the DWPT-AL. Two of the streams were not hazardous, and are now being collected separately from the 10 hazardous streams. A secondary goal of the project was to develop in-line methods, using primarily adsorption/ion exchange columns, to treat liquid effluent as it emerges from the analytical instrument as a slow, dripping flow. Samples from the 10 hazardous streams were treated by adsorption in an experimental apparatus that resembled an in-line or at-source column apparatus. The layered adsorbent bed contained activated carbon and ion exchange resin. The column technique did not work on the first three samples of the spectroscopy waste stream, but worked well on the next three samples, which were treated in a different column. It was determined that an unusual form of mercury was present in the first three samples. Similarly, two samples of a combined waste stream were rendered nonhazardous, but the last two samples contained acetonitrile that prevented analysis. The characteristics of these streams changed from the initial characterization study; therefore, continual, in-depth stream characterization is the key to making this project successful.

  10. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    Science.gov (United States)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.
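
    The unmixing idea behind ICA (recovering statistically independent time courses from linear mixtures) can be shown on toy signals; this uses plain FastICA rather than the probabilistic PICA variant cited in the abstract, and everything below is synthetic:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0, 2000)

# Two synthetic "source" time courses and their observed linear mixtures
s1 = np.sin(2.0 * t)
s2 = np.sign(np.sin(3.0 * t))
S = np.c_[s1, s2] + 0.05 * rng.normal(size=(t.size, 2))
A = np.array([[1.0, 0.5], [0.4, 1.0]])   # mixing matrix
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)          # unmixed component time courses
corr = [abs(np.corrcoef(recovered[:, i], s1)[0, 1]) for i in range(2)]
print(f"best |corr| with source 1: {max(corr):.3f}")  # ~1 up to sign/order
```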

  11. Kinetic calculations for miniature neutron source reactor using analytical and numerical techniques

    International Nuclear Information System (INIS)

    Ampomah-Amoako, E.

    2008-06-01

    The analytical methods (step change in reactivity and ramp change in reactivity) as well as the numerical methods (fixed-point iteration and Runge-Kutta-Gill) were used to simulate the initial build-up of neutrons in a miniature neutron source reactor with and without the temperature feedback effect. The methods were modified to include the photoneutron concentration. PARET 7.3 was used to simulate the transient behaviour of Ghana Research Reactor-1. The PARET code was capable of simulating the transients for 2.1 mk and 4 mk insertions of reactivity, with peak powers of 49.87 kW and 92.34 kW, respectively. The PARET code, however, failed to simulate the 6.71 mk insertion of reactivity that was predicted by Akaho et al. through TEMPFED. (au)

  12. Analytical investigation of low temperature lift energy conversion systems with renewable energy source

    International Nuclear Information System (INIS)

    Lee, Hoseong; Hwang, Yunho; Radermacher, Reinhard

    2014-01-01

    The efficiency of a renewable-energy-powered energy conversion system is typically low due to its moderate heat source temperature. Therefore, improving its energy efficiency is essential. In this study, the performance of an energy conversion system with a renewable energy source was theoretically investigated in order to explore its design aspects. For this purpose, a computer model of an n-stage low temperature lift energy conversion (LTLEC) system was developed. The results showed that, under given operating conditions such as temperatures and mass flow rates of the heat source and heat sink fluids, the unit power generation of the system increased with the number of stages, and it became saturated when the number of stages reached four. Investigation of several possible working fluids for the optimum-stage LTLEC system revealed that ethanol could be an alternative to ammonia. The heat exchanger effectiveness is a critical factor in the system performance. The power generation was increased by 7.83% for the evaporator and 9.94% for the condenser with a 10% increase in heat exchanger effectiveness. When these low temperature source fluids are applied to the LTLEC system, the heat exchanger performance is very critical and it has to be designed accordingly. - Highlights: •Energy conversion system with renewable energy is analytically investigated. •A model of multi-stage low temperature lift energy conversion systems was developed. •The system performance increases as the stage number is increased. •The unit power generation is increased with increase of HX effectiveness. •Ethanol is found to be a good alternative to ammonia

  13. A 2D semi-analytical model for Faraday shield in ICP source

    International Nuclear Information System (INIS)

    Zhang, L.G.; Chen, D.Z.; Li, D.; Liu, K.F.; Li, X.F.; Pan, R.M.; Fan, M.W.

    2016-01-01

    Highlights: • In this paper, a 2D model of an ICP source with a Faraday shield is proposed, considering the complex structure of the Faraday shield. • An analytical solution is found to evaluate the electromagnetic field in the ICP source with the Faraday shield. • The collision-free motion of electrons in the source is investigated and the results show that the electrons oscillate along the radial direction, which brings insight into how the RF power couples to the plasma. - Abstract: The Faraday shield is a thin copper structure with a large number of slits which is usually used in inductively coupled plasma (ICP) sources. RF power is coupled into the plasma through these slits; therefore the Faraday shield plays an important role in the ICP discharge. However, due to the complex structure of the Faraday shield, the resulting electromagnetic field is quite hard to evaluate. In this paper, a 2D model is proposed on the assumptions that the Faraday shield is sufficiently long, that the RF coil is uniformly distributed, and that the copper can be treated as an ideal conductor. Under these conditions, the magnetic field inside the source is uniform with only the axial component, while the electric field can be decomposed into a vortex field generated by the changing magnetic field together with a gradient field generated by the electric charge accumulated on the Faraday shield surface, which can easily be found by solving Laplace's equation. The motion of the electrons in the electromagnetic field is investigated and the results show that the electrons oscillate along the radial direction when collisions are neglected. This interesting result brings insight into how the RF power couples into the plasma.
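
    The uniform axial field makes the vortex part of the electric field a one-line consequence of Faraday's law: for B(t) uniform along the axis, the induced azimuthal field on a circle of radius r is E_phi = -(r/2) dB/dt. A sketch with an assumed field amplitude and RF frequency:

```python
import numpy as np

B0 = 2.0e-4             # T, assumed axial field amplitude
f  = 1.0e6              # Hz, assumed RF frequency
w  = 2.0 * np.pi * f

def E_phi(r, t):
    """Vortex field from Faraday's law for B(t) = B0*cos(w*t) uniform on axis:
    E_phi(r, t) = -(r/2) * dB/dt = (r/2) * B0 * w * sin(w*t)."""
    return 0.5 * r * B0 * w * np.sin(w * t)

print(f"E_phi = {E_phi(0.05, 2.5e-7):.1f} V/m at r = 5 cm, quarter period")
```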

  14. Analytical determination of specific 4,4'-methylene diphenyl diisocyanate hemoglobin adducts in human blood.

    Science.gov (United States)

    Gries, Wolfgang; Leng, Gabriele

    2013-09-01

    4,4'-Methylene diphenyl diisocyanate (MDI) is one of the most important isocyanates in the industrial production of polyurethane and other MDI-based synthetics. Because of its high reactivity, it is known as a sensitizing agent, an effect caused by protein adducts. Analysis of MDI is routinely done by determination of the nonspecific 4,4'-methylenedianiline as a marker for MDI exposure in urine and blood. Since several publications have reported specific adducts of MDI with albumin or hemoglobin, more information about their occurrence in humans is necessary. Specific adducts of MDI and hemoglobin had previously been reported only in rats after high-dose MDI inhalation. The aim of this investigation was to detect the hemoglobin adduct 5-isopropyl-3-[4-(4-aminobenzyl)phenyl]hydantoin (ABP-Val-Hyd) in human blood for the first time. We found values up to 5.2 ng ABP-Val-Hyd/g globin (16 pmol/g) in blood samples of workers exposed to MDI. Because there was no information available about possible amounts of this specific MDI marker, the analytical method focused on optimal sensitivity and selectivity. Using gas chromatography-high-resolution mass spectrometry with negative chemical ionization, we achieved a detection limit of 0.02 ng ABP-Val-Hyd/g globin (0.062 pmol/g). The robustness of the method was confirmed by relative standard deviations between 3.0 and 9.8%. Combined with a linear detection range up to 10 ng ABP-Val-Hyd/g globin (31 pmol/g), the enhanced precision parameters demonstrate that the described method is optimized for screening studies of the human population.
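
    The two unit systems quoted for the adduct levels are mutually consistent; a quick conversion check, assuming a molar mass of roughly 323 g/mol for ABP-Val-Hyd (C19H21N3O2):

```python
MW = 323.4                      # g/mol, assumed molar mass of ABP-Val-Hyd

for ng_per_g in (5.2, 0.02):    # reported level and detection limit (ng/g globin)
    pmol_per_g = ng_per_g / MW * 1000.0   # ng/g -> pmol/g
    print(f"{ng_per_g:>5} ng/g -> {pmol_per_g:.3f} pmol/g")
# ~16.1 and ~0.062 pmol/g, matching the values quoted in the abstract
```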

  15. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches …
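
    Step (i), rebalancing an imbalanced cohort, can be done in its simplest form by randomly oversampling the minority class to parity; a toy sketch on synthetic data (real pipelines would prefer cross-validation-aware resampling):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced cohort: 90 controls (label 0) vs 10 cases (label 1)
X = rng.normal(size=(100, 5))
y = np.array([0] * 90 + [1] * 10)

# Random oversampling of the minority class to parity
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=(y == 0).sum() - minority.size, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print(np.bincount(y), "->", np.bincount(y_bal))   # [90 10] -> [90 90]
```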

  16. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model …

  17. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  18. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    Science.gov (United States)

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.

  19. Analytical formulae to calculate the solid angle subtended at an arbitrarily positioned point source by an elliptical radiation detector

    International Nuclear Information System (INIS)

    Abbas, Mahmoud I.; Hammoud, Sami; Ibrahim, Tarek; Sakr, Mohamed

    2015-01-01

    In this article, we introduce a direct analytical mathematical method for calculating the solid angle, Ω, subtended at a point by closed elliptical contours. The solid angle is required in many areas of optical and nuclear physics to estimate the flux of a particle beam of radiation and to determine the activity of a radioactive source. The validity of the derived analytical expressions was successfully confirmed by comparison with some published data (numerical method).
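
    Analytic solid-angle formulas of this kind are conveniently cross-checked by direct numerical integration of dΩ = h dA / R³ over the contour; a sketch for an ellipse in the z = 0 plane seen from an arbitrary point (geometry and values assumed):

```python
import numpy as np
from scipy.integrate import dblquad

def solid_angle_ellipse(a, b, x0, y0, h):
    """Solid angle subtended at (x0, y0, h) by an ellipse with semi-axes a, b
    centered at the origin in the z = 0 plane (direct numerical integration)."""
    integrand = lambda y, x: h / ((x - x0)**2 + (y - y0)**2 + h**2)**1.5
    ymax = lambda x: b * np.sqrt(max(0.0, 1.0 - (x / a)**2))
    val, _ = dblquad(integrand, -a, a, lambda x: -ymax(x), ymax)
    return val

# Sanity check: a far on-axis point should give Omega ~ (ellipse area)/d^2
a, b, d = 2.0, 1.0, 50.0
print(solid_angle_ellipse(a, b, 0.0, 0.0, d), np.pi * a * b / d**2)
```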

  20. Quality specifications for the extra-analytical phase of laboratory testing: Reference intervals and decision limits.

    Science.gov (United States)

    Ceriotti, Ferruccio

    2017-07-01

    Reference intervals and decision limits are a critical part of the clinical laboratory report. The evaluation of their correct use represents a tool to verify post-analytical quality. Four elements are identified as indicators. 1. The use of decision limits for lipids and glycated hemoglobin. 2. The use, whenever possible, of common reference values. 3. The presence of gender-related reference intervals for at least the following common serum measurands (besides, obviously, the fertility-related hormones): alkaline phosphatase (ALP), alanine aminotransferase (ALT), creatine kinase (CK), creatinine, gamma-glutamyl transferase (GGT), IgM, ferritin, iron, transferrin, urate, red blood cells (RBC), hemoglobin (Hb) and hematocrit (Hct). 4. The presence of age-related reference intervals. The problem of specific reference intervals for elderly people is discussed, but their use is not recommended; on the contrary, pediatric age-related reference intervals are necessary at least for the following common serum measurands: ALP, amylase, creatinine, inorganic phosphate, lactate dehydrogenase, aspartate aminotransferase, urate, insulin-like growth factor 1, white blood cells, RBC, Hb, Hct, alpha-fetoprotein and the fertility-related hormones. The lack of such reference intervals may imply significant risks for the patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  1. Selection of site specific vibration equation by using analytic hierarchy process in a quarry

    Energy Technology Data Exchange (ETDEWEB)

    Kalayci, Ulku, E-mail: ukalayci@istanbul.edu.tr; Ozer, Umit, E-mail: uozer@istanbul.edu.tr

    2016-01-15

    This paper presents a new approach for the selection of the most accurate SSVA (Site Specific Vibration Attenuation) equation for blasting processes in a quarry located near settlements in Istanbul, Turkey. In this context, the SSVA equations obtained from the same study area in the literature were considered in terms of the distance between the shot points and buildings and the amount of explosive charge. For this purpose, the forecasting capabilities of 11 different SSVA equations obtained from the study area over the past 12 years were investigated under newly designated conditions, using 102 vibration records from the study area as test data. In this study, AHP (Analytic Hierarchy Process) was selected as the analysis method to determine the most accurate equation among the 11 SSVA equations, and parameters such as year, distance, charge, and r² of the equations were used as criteria for the AHP. Finally, the most appropriate equation was selected among the existing ones, and the process of selection according to different target criteria was presented. Furthermore, it was noted that the forecasting results of the selected equation are more accurate than those of an equation formed using the test results. - Highlights: • The optimum Site Specific Vibration Attenuation equation for blasting in a quarry located near settlements was determined. • It is indicated that SSVA equations changing over the years don't always give accurate estimates under changing conditions. • Selection of the blast-induced SSVA equation was made using AHP. • The equation selection method was highlighted based on parameters such as charge, distance, and quarry geometry changes (year).

  2. Selection of site specific vibration equation by using analytic hierarchy process in a quarry

    International Nuclear Information System (INIS)

    Kalayci, Ulku; Ozer, Umit

    2016-01-01

    This paper presents a new approach for the selection of the most accurate SSVA (Site Specific Vibration Attenuation) equation for blasting processes in a quarry located near settlements in Istanbul, Turkey. In this context, the SSVA equations obtained from the same study area in the literature were considered in terms of the distance between the shot points and buildings and the amount of explosive charge. For this purpose, the forecasting capabilities of 11 different SSVA equations obtained from the study area over the past 12 years were investigated under newly designated conditions, using 102 vibration records from the study area as test data. In this study, AHP (Analytic Hierarchy Process) was selected as the analysis method to determine the most accurate equation among the 11 SSVA equations, and parameters such as year, distance, charge, and r² of the equations were used as criteria for the AHP. Finally, the most appropriate equation was selected among the existing ones, and the process of selection according to different target criteria was presented. Furthermore, it was noted that the forecasting results of the selected equation are more accurate than those of an equation formed using the test results. - Highlights: • The optimum Site Specific Vibration Attenuation equation for blasting in a quarry located near settlements was determined. • It is indicated that SSVA equations changing over the years don't always give accurate estimates under changing conditions. • Selection of the blast-induced SSVA equation was made using AHP. • The equation selection method was highlighted based on parameters such as charge, distance, and quarry geometry changes (year).
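
    AHP itself reduces to a small linear-algebra exercise: build a pairwise comparison matrix on Saaty's 1-9 scale, take its principal eigenvector as the priority weights, and check the consistency ratio. The comparison values below are illustrative only, not the paper's actual judgments:

```python
import numpy as np

# Toy AHP: rank three candidate equations against one criterion.
# Pairwise comparison matrix on Saaty's 1-9 scale (illustrative values only).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
weights = np.abs(vecs[:, k].real)
weights /= weights.sum()              # priority vector

lam_max = vals[k].real
CI = (lam_max - 3.0) / (3.0 - 1.0)    # consistency index
CR = CI / 0.58                        # random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), " CR:", round(CR, 3))  # CR < 0.1 is acceptable
```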

  3. From Web Analytics to Product Analytics: The Internet of Things as a New Data Source for Enterprise Information Systems

    OpenAIRE

    Klat, Wilhelm; Stummer, Christian; Decker, Reinhold

    2016-01-01

    Part 4: Advanced Manufacturing and Management Aspects. The internet of things (IoT) paves the way for a new generation of consumer products that collect and exchange data, constituting a new data source for enterprise information systems (EIS). These IoT-ready products use built-in sensors and wireless communication technologies to capture and share data about product usage and the environment in which the products are used. The dissemination of the internet into the p…

  4. Analytical description of photon beam phase spaces in inverse Compton scattering sources

    Directory of Open Access Journals (Sweden)

    C. Curatolo

    2017-08-01

    We revisit the description of inverse Compton scattering sources and the photon beams generated therein, emphasizing the behavior of their phase space density distributions and how they depend upon those of the two colliding beams of electrons and photons. The main objective is to provide practical formulas for bandwidth, spectral density, and brilliance, which are valid in general for any value of the recoil factor, i.e. both in the Thomson regime of negligible electron recoil and in the deep recoil-dominated Compton region, which is of interest for gamma-gamma colliders and Compton sources for the production of multi-GeV photon beams. We adopt a description based on the center-of-mass reference system of the electron-photon collision, in order to underline the role of the electron recoil and how it controls the relativistic Doppler/boost effect in various regimes. Using the center-of-mass reference frame greatly simplifies the treatment, allowing us to derive simple formulas expressed in terms of the rms momenta of the two colliding beams (emittance, energy spread, etc.) and the collimation angle in the laboratory system. Comparisons with Monte Carlo simulations of inverse Compton scattering in various scenarios are presented, showing very good agreement with the analytical formulas: in particular we find that the bandwidth dependence on the electron beam emittance, of paramount importance in the Thomson regime, as it limits the amount of focusing imparted to the electron beam, becomes much less sensitive in the deep Compton regime, allowing a stronger focusing of the electron beam to enhance luminosity without loss of mono-chromaticity. A similar effect occurs concerning the bandwidth dependence on the frequency spread of the incident photons: in the deep recoil regime the bandwidth turns out to be much less dependent on the frequency spread. The set of formulas derived here is very helpful in designing inverse Compton sources in diverse regimes, giving a …
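
    The central quantity is the recoil factor; for a head-on collision X = 4 E_e E_L / (m_e c²)², and the maximum scattered photon energy E_max = X E_e / (1 + X) interpolates smoothly between the Thomson limit (X << 1, E_max ≈ 4γ²E_L) and the deep-recoil regime. A sketch with assumed beam parameters:

```python
ME = 0.511e-3   # electron rest energy, GeV

def e_gamma_max(E_e_GeV, E_L_eV):
    """Head-on inverse Compton: recoil factor X and max scattered energy."""
    E_L = E_L_eV * 1.0e-9                  # eV -> GeV
    X = 4.0 * E_e_GeV * E_L / ME**2
    return X, X * E_e_GeV / (1.0 + X)

for E_e, E_L in [(0.05, 1.2), (5.0, 1.2)]:  # 50 MeV vs 5 GeV electrons, 1.2 eV laser
    X, Emax = e_gamma_max(E_e, E_L)
    print(f"E_e = {E_e*1e3:6.0f} MeV: X = {X:.3g}, E_max = {Emax*1e3:.3g} MeV")
```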

  5. Analytical solution of spatial kinetics of the diffusion model for subcritical homogeneous systems driven by external source

    International Nuclear Information System (INIS)

    Oliveira, Fernando Luiz de

    2008-01-01

    This work describes an analytical solution, obtained by the expansion method, for the spatial kinetics of the diffusion model with delayed emission for source transients in homogeneous media. Starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. The one-group problem without precursors was solved analytically, followed by the case with one precursor family. The general case of G groups with R precursor families, although it has a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used to solve such a problem. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was solved and the numerical results of a finite-difference code were compared with the exact results for different transients. (author)

  6. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  7. Generalizing Source Geometry of Site Contamination by Simulating and Analyzing Analytical Solution of Three-Dimensional Solute Transport Model

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2014-01-01

    Due to the uneven distribution of pollution and the blurred edges of pollutant areas, there is uncertainty in the source-term shape of advection-diffusion models of contaminant transport. How to generalize such irregular source terms and deal with these uncertainties is critical, but has rarely been studied in previous research. In this study, the fate and transport of contaminant from rectangular and elliptic source geometries were simulated with a three-dimensional analytical solute transport model, and a source geometry generalization guideline was developed by comparing the migration of the contaminant. The results indicated that variation of the source area size had no effect on pollution plume migration once the plume had migrated as far as five times the source side length. The migration of the pollution plume became slower with increasing aquifer thickness. The contaminant concentration decreased with increasing scale factor, and the differences among the various scale factors became smaller with increasing distance from the field.
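
    A common analytic building block for such transport models is the steady-state plume of a continuous point source in uniform flow with isotropic dispersion, C = Q exp(-v(R - x)/(2D)) / (4πDR); a simpler cousin of the 3D solution used in the paper, with assumed parameter values:

```python
import numpy as np

def steady_point_source(Q, D, v, x, y, z):
    """Steady concentration from a continuous point source at the origin in a
    uniform flow along +x with isotropic dispersion D:
        C = Q / (4*pi*D*R) * exp(-v*(R - x) / (2*D)),  R = |r|."""
    R = np.sqrt(x**2 + y**2 + z**2)
    return Q / (4.0 * np.pi * D * R) * np.exp(-v * (R - x) / (2.0 * D))

# Centerline vs off-axis: the plume is carried downstream and spreads laterally
Q, D, v = 1.0, 0.5, 0.25   # source strength, dispersion, seepage velocity
for y in (0.0, 2.0, 5.0):
    print(f"y = {y}: C = {steady_point_source(Q, D, v, 10.0, y, 0.0):.4g}")
```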

  8. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Science.gov (United States)

    2010-04-01

    ...; (2) Clinical laboratories regulated under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), as qualified to perform high complexity testing under 42 CFR part 493 or clinical laboratories... analytical or clinical performance. (e) The laboratory that develops an in-house test using the ASR shall...

  9. Specific Human Capital as a Source of Superior Team Performance

    OpenAIRE

    Egon Franck; Stephan Nüesch; Jan Pieper

    2009-01-01

    In this paper, we empirically investigate the performance effect of team-specific human capital in highly interactive teams. Based on the tenets of the resource-based view of the firm and on the ideas of typical learning functions, we hypothesize that team members’ shared experience in working together positively impacts team performance, but at diminishing rates. Holding a team’s stock of general human capital and other potential drivers constant, we find support for this prediction. Implica...

  10. A two-dimensional transient analytical solution for a ponded ditch drainage system under the influence of source/sink

    Science.gov (United States)

    Sarmah, Ratan; Tiwari, Shubham

    2018-03-01

    An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniform steady ponding field at the soil surface under the influence of a source/sink in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic in nature, and to have finite extents in the horizontal and vertical directions. The drains are assumed to be standing vertical and penetrating down to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked by developing a numerical code and also against the existing analytical solution for the simplified case. The study highlights the significance of the source/sink influence on the subsurface flow. With the imposition of the source and sink terms in the flow domain, the pathlines and travel times of water particles start deviating from their original positions, and beyond that, the side and top discharges to the drains are also observed to be strongly influenced by the source/sink terms. The travel times and pathlines of water particles are also observed to depend on the height of water in the ditches and on the location of the source/sink activation area.

  11. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    International Nuclear Information System (INIS)

    Schuemann, J; Grassberger, C; Paganetti, H; Dowdell, S

    2014-01-01

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend …

  12. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Schuemann, J; Grassberger, C; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Dowdell, S [Illawarra Shoalhaven Local Health District, Wollongong (Australia)

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend …

  13. Prolonged activated prothromboplastin time and breed specific variation in haemostatic analytes in healthy adult Bernese Mountain dogs

    DEFF Research Database (Denmark)

    Nielsen, Lise; Wiinberg, Bo; Kjelgaard-Hansen, Mads

    2011-01-01

    Coagulation tests are often performed in dogs suspected of haemostatic dysfunction and are interpreted according to validated laboratory reference intervals (RIs). Breed-specific RIs for haematological and biochemical analytes have previously been identified in Bernese Mountain dogs, but it remains to be determined if breed-specific RIs are necessary for haemostasis tests. Activated prothromboplastin time (aPTT), prothrombin time (PT), selected coagulation factors, D-dimers, fibrinogen, von Willebrand factor and thromboelastography (TEG) were analyzed in healthy Bernese Mountain dogs using the CLSI model …

  14. Improvement of spatial discretization error on the semi-analytic nodal method using the scattered source subtraction method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro

    2006-01-01

    In this paper, the scattered source subtraction (SSS) method is newly proposed to improve the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. In the SSS method, the scattered source is subtracted from both sides of the diffusion or transport equation to make the spatial variation of the source term small. The same neutron balance equation is still used in the SSS method. Since the SSS method just modifies the coefficients of the node coupling equations (those used to evaluate the response of partial currents), its implementation is easy. The validity of the present method is verified through test calculations carried out in PWR multi-assembly configurations. The calculation results show that the SSS method can significantly improve the spatial discretization error. Since the SSS method does not have any negative impact on execution time, convergence behavior or memory requirements, it will be useful for reducing the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. (author)

  15. An analytical calculation of the peak efficiency for cylindrical sources perpendicular to the detector axis in gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Julio C. [Autoridad Regulatoria Nuclear, Laboratorio de Espectrometria Gamma-CTBTO, Av. Del Libertador 8250, C1429BNP Buenos Aires (Argentina)], E-mail: jaguiar@sede.arn.gov.ar

    2008-08-15

    An analytical expression for the so-called full-energy peak efficiency ε(E) for a cylindrical source whose axis is perpendicular to that of an HPGe detector is derived using point-source measurements. The formula covers different measuring distances, matrix compositions, densities and gamma-ray energies; the only assumption is that the radioactivity is homogeneously distributed within the source. The term for photon self-attenuation is included in the calculation. Measurements were made using three different-sized cylindrical sources of ²⁴¹Am, ⁵⁷Co, ¹³⁷Cs, ⁵⁴Mn, and ⁶⁰Co with corresponding peaks at 59.5, 122, 662, 835, 1173, and 1332 keV, respectively, and one measurement of a radioactive waste drum for 662, 1173, and 1332 keV.

  16. Comparison of analytic source models for head scatter factor calculation and planar dose calculation for IMRT

    International Nuclear Information System (INIS)

    Yan Guanghua; Liu, Chihray; Lu Bo; Palta, Jatinder R; Li, Jonathan G

    2008-01-01

    The purpose of this study was to choose an appropriate head scatter source model for the fast and accurate independent planar dose calculation for intensity-modulated radiation therapy (IMRT) with MLC. The performance of three different head scatter source models regarding their ability to model head scatter and facilitate planar dose calculation was evaluated. A three-source model, a two-source model and a single-source model were compared in this study. In the planar dose calculation algorithm, in-air fluence distribution was derived from each of the head scatter source models while considering the combination of Jaw and MLC opening. Fluence perturbations due to tongue-and-groove effect, rounded leaf end and leaf transmission were taken into account explicitly. The dose distribution was calculated by convolving the in-air fluence distribution with an experimentally determined pencil-beam kernel. The results were compared with measurements using a diode array and passing rates with 2%/2 mm and 3%/3 mm criteria were reported. It was found that the two-source model achieved the best agreement on head scatter factor calculation. The three-source model and single-source model underestimated head scatter factors for certain symmetric rectangular fields and asymmetric fields, but similar good agreement could be achieved when monitor back scatter effect was incorporated explicitly. All the three source models resulted in comparable average passing rates (>97%) when the 3%/3 mm criterion was selected. The calculation with the single-source model and two-source model was slightly faster than the three-source model due to their simplicity

  17. Comparison of analytic source models for head scatter factor calculation and planar dose calculation for IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Yan Guanghua [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Liu, Chihray; Lu Bo; Palta, Jatinder R; Li, Jonathan G [Department of Radiation Oncology, University of Florida, Gainesville, FL 32610-0385 (United States)

    2008-04-21

    The purpose of this study was to choose an appropriate head scatter source model for the fast and accurate independent planar dose calculation for intensity-modulated radiation therapy (IMRT) with MLC. The performance of three different head scatter source models regarding their ability to model head scatter and facilitate planar dose calculation was evaluated. A three-source model, a two-source model and a single-source model were compared in this study. In the planar dose calculation algorithm, in-air fluence distribution was derived from each of the head scatter source models while considering the combination of Jaw and MLC opening. Fluence perturbations due to tongue-and-groove effect, rounded leaf end and leaf transmission were taken into account explicitly. The dose distribution was calculated by convolving the in-air fluence distribution with an experimentally determined pencil-beam kernel. The results were compared with measurements using a diode array and passing rates with 2%/2 mm and 3%/3 mm criteria were reported. It was found that the two-source model achieved the best agreement on head scatter factor calculation. The three-source model and single-source model underestimated head scatter factors for certain symmetric rectangular fields and asymmetric fields, but similar good agreement could be achieved when monitor back scatter effect was incorporated explicitly. All the three source models resulted in comparable average passing rates (>97%) when the 3%/3 mm criterion was selected. The calculation with the single-source model and two-source model was slightly faster than the three-source model due to their simplicity.

  18. A Model To Estimate the Sources of Tobacco-Specific Nitrosamines in Cigarette Smoke.

    Science.gov (United States)

    Lipowicz, Peter J; Seeman, Jeffrey I

    2017-08-21

    Tobacco-specific nitrosamines (TSNAs) are one of the most extensively and continually studied classes of compounds found in tobacco and cigarette smoke.1-5 The TSNAs N-nitrosonornicotine (NNN) and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) have been characterized by the US Food and Drug Administration (FDA) as harmful and potentially harmful constituents in tobacco products,6 and cigarette manufacturers report their levels in cigarette tobacco filler and cigarette smoke to the FDA. NNN and NNK are classified by IARC as carcinogenic to humans.7 TSNAs transfer from tobacco to smoke by evaporation, driven by heat and the flow of gases down the cigarette rod. Other TSNA sources in smoke include pyrorelease, where room-temperature-unextractable TSNAs are released by smoking, and pyrosynthesis, where TSNAs are formed by reactions during smoking. We propose the first model that quantifies these three sources of TSNA in smoke. In our model, the evaporative transfer efficiency of a TSNA is equated to the evaporative transfer efficiency of nicotine. Smoke TSNA measured in excess of what is transferred by evaporation is termed "pyrogeneration," which is the net sum of pyrorelease and pyrosynthesis minus pyrodegradation. This model requires no internal standard, is applicable to commercial cigarettes "as is," and uses existing analytical methods. This model was applied to archived Philip Morris USA data. For commercial blended cigarettes, NNN pyrogeneration appears to be unimportant, but NNK pyrogeneration contributes roughly 30-70% of NNK in smoke, with the greater contribution at lower tobacco NNK levels. This means there is an opportunity to significantly reduce smoke NNK by up to 70% if pyrogeneration can be decreased or eliminated, perhaps by finding a way to grow and cure tobacco with reduced matrix-bound NNK. For burley research cigarettes, pyrogeneration may account for 90% or more of both NNN and NNK in smoke.
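
    The accounting identity behind the model is simple: the evaporative contribution is the filler TSNA level scaled by nicotine's transfer efficiency, and whatever is measured in smoke beyond that is attributed to pyrogeneration. A sketch with entirely hypothetical numbers:

```python
# Evaporative transfer is assumed equal to nicotine's transfer efficiency
nicotine_filler_mg = 10.0   # mg/cig in filler (hypothetical)
nicotine_smoke_mg  = 1.0    # mg/cig in smoke (hypothetical)
transfer_efficiency = nicotine_smoke_mg / nicotine_filler_mg   # 10%

nnk_filler_ng = 100.0       # ng/cig NNK in filler (hypothetical)
nnk_smoke_ng  = 25.0        # ng/cig NNK measured in smoke (hypothetical)

evaporative_ng   = transfer_efficiency * nnk_filler_ng
pyrogenerated_ng = nnk_smoke_ng - evaporative_ng
print(f"evaporation: {evaporative_ng:.1f} ng/cig; "
      f"pyrogeneration: {pyrogenerated_ng:.1f} ng/cig "
      f"({100.0 * pyrogenerated_ng / nnk_smoke_ng:.0f}% of smoke NNK)")
```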

  19. Analytic sensing for multi-layer spherical models with application to EEG source imaging

    OpenAIRE

    Kandaswamy, Djano; Blu, Thierry; Van De Ville, Dimitri

    2013-01-01

    Source imaging maps back boundary measurements to underlying generators within the domain; e. g., retrieving the parameters of the generating dipoles from electrical potential measurements on the scalp such as in electroencephalography (EEG). Fitting such a parametric source model is non-linear in the positions of the sources and renewed interest in mathematical imaging has led to several promising approaches. One important step in these methods is the application of a sensing principle that ...

  20. A very high yield electron impact ion source for analytical mass spectrometry

    International Nuclear Information System (INIS)

    Koontz, S.L.; Bonner Denton, M.

    1981-01-01

    A novel ion source designed for use in the mass spectrometric determination of organic compounds is described. The source is designed around a low-pressure, large-volume, hot-cathode Penning discharge. The source operates in the 10⁻⁴–10⁻⁷ torr pressure domain and is capable of producing focusable current densities several orders of magnitude greater than those produced by conventional Nier-type sources. Mass spectra of n-butane and octafluoro-2-butene are presented. An improved signal-to-noise ratio is demonstrated with a General Electric Monopole 300 mass spectrometer. (orig.)

  1. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Science.gov (United States)

    2010-07-01

    ... approves the use of E. coli as a fecal indicator for source water monitoring under this paragraph (a). If the repeat sample collected from the ground water source is E. coli positive, the system must comply... listed in paragraph (c)(2) of this section for the presence of E. coli, enterococci, or coliphage...

  2. A systematic quantification of the sources of variation of process analytical measurements in the steel industry

    NARCIS (Netherlands)

    Jellema, R.H.; Louwerse, D.J.; Smilde, A.K.; Gerritsen, M.J.P.; Guldemond, D.; Voet, van der H.; Vereijken, P.F.G.

    2003-01-01

    A strategy is proposed for the identification and quantification of sources of variation in a manufacturing process. The strategy involves six steps: identification and selection of factors, model selection, design of the experiments, performing the experiments, estimation of sources of variation, ...

  3. Temperature field due to time-dependent heat sources in a large rectangular grid - Derivation of analytical solution

    International Nuclear Information System (INIS)

    Claesson, J.; Probert, T.

    1996-01-01

    The temperature field in rock due to a large rectangular grid of heat-releasing canisters containing nuclear waste is studied. By superposition, the solution is divided into different parts. There is a global temperature field due to the large rectangular canister area, while a local field accounts for the remaining heat source problem. The global field is reduced to a single integral. The local field is also solved analytically using solutions for a finite line heat source and for an infinite grid of point sources. The local solution is reduced to three parts, each of which depends on two spatial coordinates only. The temperatures at the envelope of a canister are given by a single thermal resistance, which is given by an explicit formula. The results are illustrated by a few numerical examples dealing with the KBS-3 concept for storage of nuclear waste. 8 refs
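    The building block behind such superpositions is the classical continuous point-source conduction kernel. The sketch below superposes that kernel over a small canister grid; the layout, power, and rock properties are illustrative assumptions, not KBS-3 parameters, and this is not the authors' full global/local decomposition.

    ```python
    import math

    def point_source_dT(r_m, t_s, Q_w=1000.0, k=3.0, alpha=1.6e-6):
        """Temperature rise (K) at radius r from a continuous point source of
        constant power Q in an infinite medium (classical conduction kernel):
            dT = Q / (4*pi*k*r) * erfc(r / (2*sqrt(alpha*t)))
        k: thermal conductivity (W/m/K); alpha: thermal diffusivity (m^2/s).
        """
        return Q_w / (4.0 * math.pi * k * r_m) * math.erfc(r_m / (2.0 * math.sqrt(alpha * t_s)))

    # Superposing a grid of canister-like sources (hypothetical 5 x 5 layout,
    # 6 m spacing); observation point chosen off the grid nodes.
    grid = [(x, y) for x in range(0, 30, 6) for y in range(0, 30, 6)]
    year = 3.156e7  # seconds
    obs = (15.0, 13.0)
    dT = sum(point_source_dT(math.hypot(px - obs[0], py - obs[1]), 30 * year)
             for px, py in grid)
    print(f"temperature rise at {obs} after 30 years: {dT:.1f} K")
    ```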

  4. Partial and specific source memory for faces associated to other- and self-relevant negative contexts.

    Science.gov (United States)

    Bell, Raoul; Giang, Trang; Buchner, Axel

    2012-01-01

    Previous research has shown a source memory advantage for faces presented in negative contexts. As yet it remains unclear whether participants remember the specific type of context in which the faces were presented or whether they can only remember that the face was associated with negative valence. In the present study, participants saw faces together with descriptions of two different types of negative behaviour and neutral behaviour. In Experiment 1, we examined whether the participants were able to discriminate between two types of other-relevant negative context information (cheating and disgusting behaviour) in a source memory test. In Experiment 2, we assessed source memory for other-relevant negative (threatening) context information (other-aggressive behaviour) and self-relevant negative context information (self-aggressive behaviour). A multinomial source memory model was used to separately assess partial source memory for the negative valence of the behaviour and specific source memory for the particular type of negative context the face was associated with. In Experiment 1, source memory was specific for the particular type of negative context presented (i.e., cheating or disgusting behaviour). Experiment 2 showed that source memory for other-relevant negative information was more specific than source memory for self-relevant information. Thus, emotional source memory may vary in specificity depending on the degree to which the negative emotional context is perceived as threatening.

  5. Determining the analytical specificity of PCR-based assays for the diagnosis of IA: What is Aspergillus?

    Science.gov (United States)

    Morton, C Oliver; White, P Lewis; Barnes, Rosemary A; Klingspor, Lena; Cuenca-Estrella, Manuel; Lagrou, Katrien; Bretagne, Stéphane; Melchers, Willem; Mengoli, Carlo; Caliendo, Angela M; Cogliati, Massimo; Debets-Ossenkopp, Yvette; Gorton, Rebecca; Hagen, Ferry; Halliday, Catriona; Hamal, Petr; Harvey-Wood, Kathleen; Jaton, Katia; Johnson, Gemma; Kidd, Sarah; Lengerova, Martina; Lass-Florl, Cornelia; Linton, Chris; Millon, Laurence; Morrissey, C Orla; Paholcsek, Melinda; Talento, Alida Fe; Ruhnke, Markus; Willinger, Birgit; Donnelly, J Peter; Loeffler, Juergen

    2017-06-01

    A wide array of PCR tests has been developed to aid the diagnosis of invasive aspergillosis (IA), providing technical diversity but limiting standardisation and acceptance. Methodological recommendations for testing blood samples using PCR exist, based on achieving optimal assay sensitivity to help exclude IA. Conversely, when testing more invasive samples (BAL, biopsy, CSF) emphasis is placed on confirming disease, so analytical specificity is paramount. This multicenter study examined the analytical specificity of PCR methods for detecting IA by blind testing a panel of DNA extracted from various fungal species, to explore the range of Aspergillus species that could be detected but also potential cross-reactivity with other fungal species. Positivity rates were calculated and regression analysis was performed to determine any associations between technical specifications and performance. The accuracy of Aspergillus genus-specific assays (71.8%) was significantly greater than that of assays targeting individual Aspergillus species (47.2%). For genus-specific assays the most often missed species were A. lentulus (25.0%), A. versicolor (24.1%), A. terreus (16.1%), A. flavus (15.2%), A. niger (13.4%), and A. fumigatus (6.2%). There was a significant positive association between accuracy and using an Aspergillus genus PCR assay targeting the rRNA genes (P = .0011). Conversely, there was a significant association between rRNA PCR targets and false positivity (P = .0032). To conclude, current Aspergillus PCR assays are better suited for detecting A. fumigatus, with inferior detection of most other Aspergillus species. The use of an Aspergillus genus-specific PCR assay targeting the rRNA genes is preferential. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. In Situ Near Infrared Spectroscopy for Analyte-Specific Monitoring of Glucose and Ammonium in Streptomyces coelicolor Fermentations

    DEFF Research Database (Denmark)

    Petersen, Nanna; Ödman, Peter; Cervera Padrell, Albert Emili

    2010-01-01

    was used as a model process. Partial least squares (PLS) regression models were calibrated for glucose and ammonium based on NIR spectra collected in situ. To ensure that the models were calibrated based on analyte-specific information, semisynthetic samples were used for model calibration in addition...... resulting in a RMSEP of 1.1 g/L. The prediction of ammonium based on NIR spectra collected in situ was not satisfactory. A comparison with models calibrated based on NIR spectra collected off line suggested that this is caused by signal attenuation in the optical fibers in the region above 2,000 nm...

  7. Analytical calculation of the solid angle subtended by an arbitrarily positioned ellipsoid to a point source

    International Nuclear Information System (INIS)

    Heitz, Eric

    2017-01-01

    We present a geometric method for computing an ellipse that subtends the same solid-angle domain as an arbitrarily positioned ellipsoid. With this method we can extend existing analytical solid-angle calculations of ellipses to ellipsoids. Our idea consists of applying a linear transformation on the ellipsoid such that it is transformed into a sphere from which a disk that covers the same solid-angle domain can be computed. We demonstrate that by applying the inverse linear transformation on this disk we obtain an ellipse that subtends the same solid-angle domain as the ellipsoid. We provide a MATLAB implementation of our algorithm and we validate it numerically.
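    As a numerical cross-check of the kind this analytic construction is validated against (the authors provide a MATLAB implementation; the sketch below is not their algorithm, only a Monte Carlo reference), the solid angle subtended by an arbitrarily positioned ellipsoid can be estimated by sampling ray directions:

    ```python
    import numpy as np

    def ellipsoid_solid_angle_mc(center, semi_axes, n_samples=1_000_000, seed=0):
        """Monte Carlo estimate of the solid angle subtended at the origin by an
        axis-aligned ellipsoid (x-c)^T A (x-c) = 1, A = diag(1/a^2, 1/b^2, 1/c^2).
        Assumes the origin lies outside the ellipsoid.
        """
        rng = np.random.default_rng(seed)
        d = rng.normal(size=(n_samples, 3))
        d /= np.linalg.norm(d, axis=1, keepdims=True)   # uniform directions
        A = np.diag(1.0 / np.asarray(semi_axes, float) ** 2)
        c = np.asarray(center, float)
        # Ray t*d (t > 0) hits the quadric when the quadratic in t,
        #   (d^T A d) t^2 - 2 (d^T A c) t + (c^T A c - 1) = 0,
        # has a real positive root.
        a2 = np.einsum('ij,jk,ik->i', d, A, d)
        a1 = d @ (A @ c)
        a0 = c @ A @ c - 1.0                            # > 0: origin outside
        hits = (a1 ** 2 - a2 * a0 >= 0.0) & (a1 > 0.0)
        return 4.0 * np.pi * hits.mean()

    # Sanity check against a sphere: cap formula 2*pi*(1 - sqrt(1 - (R/d)^2)).
    R, dist = 1.0, 4.0
    print(ellipsoid_solid_angle_mc([0.0, 0.0, dist], [R, R, R]))
    print(2 * np.pi * (1 - np.sqrt(1 - (R / dist) ** 2)))
    ```

    Against a sphere the estimate converges to the exact cap formula; the point of the analytic ellipse construction is to remove exactly this sampling cost.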

  8. Analytical calculation of the solid angle subtended by an arbitrarily positioned ellipsoid to a point source

    Energy Technology Data Exchange (ETDEWEB)

    Heitz, Eric, E-mail: eheitz.research@gmail.com

    2017-04-21

    We present a geometric method for computing an ellipse that subtends the same solid-angle domain as an arbitrarily positioned ellipsoid. With this method we can extend existing analytical solid-angle calculations of ellipses to ellipsoids. Our idea consists of applying a linear transformation on the ellipsoid such that it is transformed into a sphere from which a disk that covers the same solid-angle domain can be computed. We demonstrate that by applying the inverse linear transformation on this disk we obtain an ellipse that subtends the same solid-angle domain as the ellipsoid. We provide a MATLAB implementation of our algorithm and we validate it numerically.

  9. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  10. Analytical calculations of the total efficiency of gamma scintillators for coaxial disk sources

    Energy Technology Data Exchange (ETDEWEB)

    Selim, Y S; Abbas, M I; Fawzy, M A [Physics Department, Faculty of Science, Alexandria University, Aleaxndria (Egypt)

    1997-12-31

    The total efficiency of a clad right circular cylindrical NaI(Tl) scintillation detector for a coaxial isotropically radiating circular disk source has been calculated by the use of rigid mathematical expressions. Results were tabulated for various gamma energies. 2 figs., 5 tabs.

  11. INAA in combination with other analytical techniques in the study of urban aerosol sources

    International Nuclear Information System (INIS)

    Binh, N.T.; Truong, Y.; Ngo, N.T.; Sieu, L.N.; Hien, P.D.

    2000-01-01

    Concentrations of elements in fine and coarse PM10 samples collected in Ho Chi Minh City were determined by INAA for the purpose of characterising air pollution sources using multivariate receptor modeling techniques. Seven sources common to the coarse and fine samples were identified. Resuspended soil dust is dominant in the coarse samples, accounting for 41% of the particulate mass. In the fine samples, vehicle emissions and coal burning are most important, accounting for about 20% each. Although a great number of elements were included in the input data for receptor modeling, the interpretation of emission sources was not always straightforward, and information on other source markers was needed. Therefore, a polarography method was used for quantifying lead, and recently an ion chromatography method became available for quantifying secondary sulphates, nitrates and other water-soluble ions. (author)

  12. Analytic and Unambiguous Phase-Based Algorithm for 3-D Localization of a Single Source with Uniform Circular Array

    Directory of Open Access Journals (Sweden)

    Le Zuo

    2018-02-01

    This paper presents an analytic algorithm for estimating the three-dimensional (3-D) localization of a single source with uniform circular array (UCA) interferometers. Fourier transforms are exploited to expand the phase distribution of a single source, and the localization problem is reformulated as an equivalent spectrum manipulation problem. The 3-D parameters are decoupled to different spectra in the Fourier domain. Algebraic relations are established between the 3-D localization parameters and the Fourier spectra. The Fourier sampling theorem ensures that the minimum element number for 3-D localization of a single source with a UCA is five. Accuracy analysis provides mathematical insight into the 3-D localization algorithm, showing that a larger number of elements gives higher estimation accuracy. In addition, the phase-based high-order difference invariance (HODI) property of a UCA is found and exploited to realize phase range compression. Following phase range compression, ambiguity resolution is addressed by the HODI of a UCA. A major advantage of the algorithm is that the ambiguity resolution and 3-D localization estimation are both analytic and are processed simultaneously, hence computationally efficient. Numerical simulations and experimental results are provided to verify the effectiveness of the proposed 3-D localization algorithm.
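    A far-field simplification of the Fourier-domain idea fits in a few lines: the element phases of a plane wave on a UCA form a single spatial harmonic whose argument encodes azimuth and whose magnitude encodes the polar angle. The sketch below is an assumed simplification; it omits the range estimation and HODI ambiguity resolution of the full algorithm and keeps kR small so the phases stay unambiguous.

    ```python
    import numpy as np

    # Plane wave on a UCA: phase at element m is
    #   phi_m = k * R * sin(el) * cos(gamma_m - az),
    # with el the polar angle from the array axis. The first DFT harmonic over
    # the ring is c1 = (M/2) * k * R * sin(el) * exp(-1j * az).
    M, R, lam = 8, 0.05, 0.30                  # elements, radius (m), wavelength (m)
    k = 2 * np.pi / lam
    gamma = 2 * np.pi * np.arange(M) / M       # element angles on the ring
    az_true, el_true = np.deg2rad(40.0), np.deg2rad(25.0)

    phases = k * R * np.sin(el_true) * np.cos(gamma - az_true)  # |phi_m| < pi here
    c1 = phases @ np.exp(-1j * gamma)          # first spatial harmonic
    az_est = -np.angle(c1)
    el_est = np.arcsin(np.clip(2 * np.abs(c1) / (M * k * R), -1.0, 1.0))
    print(np.rad2deg(az_est), np.rad2deg(el_est))   # ~40.0, ~25.0
    ```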

  13. Shielding Characteristics Using an Ultrasonic Configurable Fan Artificial Noise Source to Generate Modes - Experimental Measurements and Analytical Predictions

    Science.gov (United States)

    Sutliff, Daniel L.; Walker, Bruce E.

    2014-01-01

    An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.

  14. French analytic experiment on the high specific burnup of PWR fuels in normal conditions

    International Nuclear Information System (INIS)

    Bruet, M.; Atabek, R.; Houdaille, B.; Baron, D.

    1982-04-01

    Hydrostatic density determinations made on UO2 pellets of different kinds irradiated in conditions representative of PWR conditions enable the internal swelling rate of the UO2 to be ascertained. A mean value of 0.8% per 10^4 MWd t^-1(U) up to a specific burnup of 45,000 MWd t^-1(U) may be deduced from this experimental basis. These results agree well with those obtained in the TANGO experiments, in which UO2 balls were irradiated in quasi-isothermal conditions and without stress. Further, the open porosity of the oxide closes progressively and the change in total porosity is thus very limited (under 1% at 45,000 MWd t^-1(U)). With respect to the swelling of the pellets, the rise in specific burnup would not appear therefore to be a problem. The behaviour of recrystallized Zircaloy-4 claddings remains satisfactory with respect to creep and growth during irradiation.

  15. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    International Nuclear Information System (INIS)

    Chen, Ming; Yu, Hengyong

    2015-01-01

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphic processing units.

  16. Exact analytical solution of time-independent neutron transport equation, and its applications to systems with a point source

    International Nuclear Information System (INIS)

    Mikata, Y.

    2014-01-01

    Highlights:
    • An exact solution for the one-speed neutron transport equation is obtained.
    • This solution as well as its derivation are believed to be new.
    • Neutron flux for a purely absorbing material with a point neutron source off the origin is obtained.
    • Spherically as well as cylindrically piecewise constant cross sections are studied.
    • Neutron flux expressions for a point neutron source off the origin are believed to be new.
    Abstract: An exact analytical solution of the time-independent monoenergetic neutron transport equation is obtained in this paper. The solution is applied to systems with a point source. Systematic analysis of the solution of the time-independent neutron transport equation and its applications represent the primary goal of this paper. To the best of the author's knowledge, certain key results on the scalar neutron flux as well as their derivations are new. As an application of these results, a scalar neutron flux for a purely absorbing medium with a spherically piecewise constant cross section and an isotropic point neutron source off the origin, as well as that for a cylindrically piecewise constant cross section with a point neutron source off the origin, are obtained. Both of these results are believed to be new.
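    For the purely absorbing case treated here, the flux from an isotropic point source has a textbook closed form, sketched below with illustrative source strength and cross section (not values from the paper); for piecewise constant cross sections, the exponent generalizes to the optical path integral along the ray.

    ```python
    import math

    def uncollided_flux(r_cm, S=1.0e8, sigma_t=0.5):
        """Scalar flux (n/cm^2/s) at distance r from an isotropic point source
        of strength S (n/s) in an infinite, purely absorbing, homogeneous medium:
            phi(r) = S * exp(-sigma_t * r) / (4 * pi * r**2)
        In a purely absorbing medium the total flux equals the uncollided flux.
        """
        return S * math.exp(-sigma_t * r_cm) / (4.0 * math.pi * r_cm ** 2)

    for r in (1.0, 5.0, 10.0):  # cm
        print(f"r = {r:4.1f} cm   phi = {uncollided_flux(r):.3e} n/cm^2/s")
    ```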

  17. Application specific integrated circuit (ASIC) readout technologies for future ion beam analytical instruments

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, Harry J. E-mail: harry_j.whitlow@nuclear.lu.se

    2000-03-01

    New possibilities for ion beam analysis (IBA) are afforded by recent developments in detector technology which facilitate the parallel collection of data from a large number of channels. Application specific integrated circuit (ASIC) technologies, which have been widely employed for multi-channel readout systems in nuclear and particle physics, are more net-cost effective (160/channel for 1000 channels) and a more rational solution for readout of a large number of channels than afforded by conventional electronics. Based on results from existing and on-going chip designs, the possibilities and issues of ASIC readout technology are considered from the IBA viewpoint. Consideration is given to readout chip architecture and how the stringent resolution, linearity and stability requirements for IBA may be met. In addition the implications of the restrictions imposed by ASIC technology are discussed.

  18. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 main frames as well as on a Micro VAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 x 10^-6. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  19. The analytical investigation of the super-Gaussian pump source on ...

    Indian Academy of Sciences (India)

    In this paper, we assumed that the fiber core and first clad are exposed to a pump source with a super-Gaussian profile of order four. The effects of this non-uniform heat deposition on thermal, stress and thermo-optics properties such as temperature-dependent change of refractive index and thermally induced stress have ...

  20. Analytical magmatic source modelling from a joint inversion of ground deformation and focal mechanisms data

    Science.gov (United States)

    Cannavo', Flavio; Scandura, Danila; Palano, Mimmo; Musumeci, Carla

    2014-05-01

    Seismicity and ground deformation represent the principal geophysical methods for volcano monitoring and provide important constraints on subsurface magma movements. The occurrence of migrating seismic swarms, as observed at several volcanoes worldwide, is commonly associated with dike intrusions. In addition, on active volcanoes, (de)pressurization and/or intrusion of magmatic bodies stresses and deforms the surrounding crustal rocks, often causing earthquakes randomly distributed in time within a volume extending about 5-10 km from the wall of the magmatic bodies. Although advances in space-based geodetic and seismic networks have significantly improved volcano monitoring in recent decades at an increasing number of volcanoes worldwide, quantitative models relating deformation and seismicity are not common. The observation of several episodes of volcanic unrest throughout the world, where the movement of magma through the shallow crust was able to produce local rotation of the ambient stress field, introduces an opportunity to improve the estimate of the parameters of a deformation source. In particular, during these episodes of volcanic unrest a radial pattern of P-axes of the focal mechanism solutions, similar to that of ground deformation, has been observed. Therefore, taking into account additional information from focal mechanisms data, we propose a novel approach to volcanic source modeling based on the joint inversion of deformation and focal plane solutions, assuming that both observations are due to the same source. The methodology is first verified against a synthetic dataset of surface deformation and strain within the medium, and then applied to real data from an unrest episode that occurred before the May 13th 2008 eruption at Mt. Etna (Italy). The main results clearly indicate that the joint inversion improves the accuracy of the estimated source parameters by about 70%. The statistical tests indicate that the source depth is the parameter with the highest ...

  1. Diagnostic Air Quality Model Evaluation of Source-Specific Primary and Secondary Fine Particulate Carbon

    Science.gov (United States)

    Ambient measurements of 78 source-specific tracers of primary and secondary carbonaceous fine particulate matter collected at four midwestern United States locations over a full year (March 2004–February 2005) provided an unprecedented opportunity to diagnostically evaluate...

  2. Source-specific pollution exposure and associations with pulmonary response in the Atlanta Commuters Exposure Studies.

    Science.gov (United States)

    Krall, Jenna R; Ladva, Chandresh N; Russell, Armistead G; Golan, Rachel; Peng, Xing; Shi, Guoliang; Greenwald, Roby; Raysoni, Amit U; Waller, Lance A; Sarnat, Jeremy A

    2018-01-03

    Concentrations of traffic-related air pollutants are frequently higher within commuting vehicles than in ambient air. Pollutants found within vehicles may include those generated by tailpipe exhaust, brake wear, and road dust sources, as well as pollutants from in-cabin sources. Source-specific pollution, compared to total pollution, may represent regulation targets that can better protect human health. We estimated source-specific pollution exposures and corresponding pulmonary response in a panel study of commuters. We used constrained positive matrix factorization to estimate source-specific pollution factors and, subsequently, mixed effects models to estimate associations between source-specific pollution and pulmonary response. We identified four pollution factors that we named: crustal, primary tailpipe traffic, non-tailpipe traffic, and secondary. Among asthmatic subjects (N = 48), interquartile range increases in crustal and secondary pollution were associated with changes in lung function of -1.33% (95% confidence interval (CI): -2.45, -0.22) and -2.19% (95% CI: -3.46, -0.92) relative to baseline, respectively. Among non-asthmatic subjects (N = 51), non-tailpipe pollution was associated with pulmonary response only at 2.5 h post-commute. We found no significant associations between pulmonary response and primary tailpipe pollution. Health effects associated with traffic-related pollution may vary by source, and therefore some traffic pollution sources may require targeted interventions to protect health.
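    For readers unfamiliar with the receptor-modelling step, the sketch below uses scikit-learn's NMF as a simple non-negative stand-in for the constrained positive matrix factorization used in the study; the data matrix, factor count, and species are hypothetical, and plain NMF omits the measurement-uncertainty weighting that PMF applies.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Hypothetical in-vehicle pollutant matrix: rows = commutes, columns =
    # measured species. Synthetic data built from 4 known source profiles.
    rng = np.random.default_rng(1)
    true_profiles = rng.random((4, 10))            # 4 sources x 10 species
    contributions = rng.random((120, 4))           # 120 commutes x 4 sources
    X = contributions @ true_profiles + 0.01 * rng.random((120, 10))

    model = NMF(n_components=4, init='nndsvda', max_iter=1000, random_state=0)
    G = model.fit_transform(X)                     # source contributions per commute
    F = model.components_                          # source profiles (species signatures)
    print(G.shape, F.shape)                        # (120, 4) (4, 10)
    ```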

  3. Using analytic element models to delineate drinking water source protection areas.

    Science.gov (United States)

    Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve

    2006-01-01

    Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
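    The "volumetric equation" that the AEM delineations were compared against is a one-line calculation: the protection circle drains the cylinder of aquifer pumped within the time of travel. A sketch with hypothetical well parameters (not an Ohio EPA system):

    ```python
    import math

    def volumetric_radius(Q_m3_per_day, t_years, porosity, thickness_m):
        """Calculated fixed radius from the volumetric equation commonly used
        in wellhead protection:  r = sqrt(Q * t / (pi * n * H)),
        where Q is the pumping rate, t the time of travel, n the effective
        porosity, and H the open/saturated thickness.
        """
        t_days = t_years * 365.25
        return math.sqrt(Q_m3_per_day * t_days / (math.pi * porosity * thickness_m))

    # Hypothetical system: 500 m3/day well, 5-year time of travel,
    # porosity 0.25, 20 m saturated thickness -> radius ~240 m.
    print(f"{volumetric_radius(500.0, 5.0, 0.25, 20.0):.0f} m")
    ```

    Because this circle ignores the regional gradient entirely, it misses upgradient capture and overprotects downgradient land, which is exactly the discrepancy quantified above.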

  4. Analytical characteristics of a continuum-source tungsten coil atomic absorption spectrometer.

    Science.gov (United States)

    Rust, Jennifer A; Nóbrega, Joaquim A; Calloway, Clifton P; Jones, Bradley T

    2005-08-01

    A continuum-source tungsten coil electrothermal atomic absorption spectrometer has been assembled, evaluated, and employed in four different applications. The instrument consists of a xenon arc lamp light source, a tungsten coil atomizer, a Czerny-Turner high-resolution monochromator, and a linear photodiode array detector. This instrument provides simultaneous multi-element analyses across a 4 nm spectral window with a resolution of 0.024 nm. Such a device might be useful in many different types of analyses. To demonstrate this broad appeal, four very different applications have been evaluated. First, the temperature of the gas phase was measured during the atomization cycle of the tungsten coil, using tin as a thermometric element. Second, a summation approach for two aluminum absorption lines falling within the same spectral window (305.5-309.5 nm) was evaluated. This approach improves the sensitivity without requiring any additional preconcentration steps. The third application describes a background subtraction technique as applied to the analysis of an oil emulsion sample. Finally, interference effects caused by Na on the atomization of Pb were studied. The simultaneous measurement of Pb and Na suggests that the negative interference arises at least partially from competition between Pb and Na atoms for H2 in the gas phase.

  5. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    International Nuclear Information System (INIS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.; Roberts, Matthew

    2014-01-01

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles in which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both on simulated emission patterns and on experimental data, exhibiting a very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in the view of the crucial role the exciton distribution plays in determining the device performance.
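    The first of the two features, the extrema angles, can be inverted for the mean emitter position directly from the Bragg-type condition quoted above. The sketch below is an assumed toy inversion, 2*n*d*cos(theta) = m*lambda solved for d; it neglects the reflection phase shift at the metal cathode, which the rigorous derivation in the paper accounts for, and all readings are hypothetical.

    ```python
    import numpy as np

    def emission_zone_depth(wavelength_nm, n, theta_int_deg, order):
        """Mean emitter-cathode separation d from one observed interference
        extremum of order m at internal angle theta:
            2 * n * d * cos(theta) = m * lambda  ->  d = m*lambda / (2*n*cos(theta))
        Toy version: cathode reflection phase shift neglected."""
        return order * wavelength_nm / (2.0 * n * np.cos(np.deg2rad(theta_int_deg)))

    # Hypothetical reading: first-order extremum at 30 deg (internal angle),
    # emission at 520 nm, organic layer index n = 1.8 -> d ~ 167 nm.
    print(f"d ~ {emission_zone_depth(520.0, 1.8, 30.0, 1):.0f} nm")
    ```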

  6. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, Ariel, E-mail: ariel.epstein@utoronto.ca; Tessler, Nir, E-mail: nir@ee.technion.ac.il; Einziger, Pinchas D. [Department of Electrical Engineering, Technion-Israel Institute of Technology, Haifa 32000 (Israel); Roberts, Matthew, E-mail: mroberts@cdtltd.co.uk [Cambridge Display Technology Ltd, Building 2020, Cambourne Business Park, Cambourne, Cambridgeshire CB23 6DW (United Kingdom)

    2014-06-14

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles in which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both on simulated emission patterns and on experimental data, exhibiting a very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in the view of the crucial role the exciton distribution plays in determining the device performance.

  7. Radioactive particles in the environment: sources, particle characterization and analytical techniques

    International Nuclear Information System (INIS)

    2011-08-01

    Over the years, radioactive particles have been released to the environment from nuclear weapons testing and nuclear fuel cycle operations. However, measurements of environmental radioactivity and any associated assessments are often based on the average bulk mass or surface concentration, assuming that radionuclides are homogeneously distributed as simple ionic species. It has generally not been recognised that radioactive particles present in the environment often contain a significant fraction of the bulk sample activity, leading to sample heterogeneity problems and false and/or erratic measurement data. Moreover, the inherent differences in the transport and bioavailability of particle-bound radionuclides compared with those existing as molecules or ions have largely been ignored in dose assessments. To date, most studies regarding radionuclide behaviour in the soil-plant system have dealt with soluble forms of radionuclides. When radionuclides are deposited in a less mobile form, or in case of a superposition of different physico-chemical forms, the behaviour of radionuclides becomes much more complicated and extra efforts are required to provide information about the environmental status and behaviour of radioactive particles. There are currently no documents or international guides covering this aspect of environmental impact assessments. To fill this gap, between 2001 and 2008 the IAEA performed a Coordinated Research Programme (CRP-G4.10.03) on the 'Radiochemical, Chemical and Physical Characterization of Radioactive Particles in the Environment' with the objective of development, adoption and application of standardized analytical techniques for the comprehensive study of radioactive particles. The CRP was in line with the IAEA project intended to assist the Member States in building capacity for improving environmental assessments and for management of sites contaminated with radioactive particles. This IAEA-TECDOC presents the findings and achievements of ...

  8. SU-E-T-120: Analytic Dose Verification for Patient-Specific Proton Pencil Beam Scanning Plans

    International Nuclear Information System (INIS)

    Chang, C; Mah, D

    2015-01-01

    Purpose: To independently verify the QA dose of proton pencil beam scanning (PBS) plans using an analytic dose calculation model. Methods: An independent proton dose calculation engine is created using the same commissioning measurements as those employed to build our commercially available treatment planning system (TPS). Each proton PBS plan is exported from the TPS in DICOM format and calculated by this independent dose engine in a standard 40 x 40 x 40 cm water tank. This three-dimensional dose grid is then compared with the QA dose calculated by the commercial TPS, using the standard Gamma criterion. A total of 18 measured pristine Bragg peaks, ranging from 100 to 226 MeV, are used in the model; intermediate proton energies are interpolated. Similarly, optical properties of the spots are measured in air over 15 cm upstream and downstream, and fitted to a second-order polynomial. Multiple Coulomb scattering in water is approximated analytically using the Preston and Kohler formula for faster calculation. The effect of range shifters on spot size is modeled with the generalized Highland formula. Note that the above formulation approximates multiple Coulomb scattering in water; we therefore chose not to use the full Moliere/Hanson form. Results: Initial examination of 3 patient-specific prostate PBS plans shows that agreement exists between 3D dose distributions calculated by the TPS and the independent proton PBS dose calculation engine. Both calculated dose distributions are compared with actual measurements at three different depths per beam, and good agreement is again observed. Conclusion: Results here showed that 3D dose distributions calculated by this independent proton PBS dose engine are in good agreement with both TPS calculations and actual measurements. This tool can potentially be used to reduce the number of different measurement depths required for patient-specific proton PBS QA.
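    The Gamma criterion used for the comparison is standard (the Low et al. formulation); below is a minimal 1-D sketch of that pass/fail metric on synthetic profiles normalized to 100%, so the dose tolerance is a global percentage. Clinical QA evaluates the same quantity on 3-D grids.

    ```python
    import numpy as np

    def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_pct=3.0):
        """Gamma value at each reference point: minimum over the evaluated
        distribution of sqrt((dx/DTA)^2 + (dD/dose_tol)^2)."""
        gammas = []
        for xr, dr in zip(x_ref, d_ref):
            dist2 = ((x_eval - xr) / dta_mm) ** 2
            dose2 = ((d_eval - dr) / dd_pct) ** 2
            gammas.append(np.sqrt(np.min(dist2 + dose2)))
        return np.array(gammas)

    x = np.linspace(0, 100, 201)                   # positions (mm)
    ref = 100 * np.exp(-((x - 50) / 20) ** 2)      # reference profile (% of max)
    ev = 100 * np.exp(-((x - 51) / 20) ** 2)       # evaluated profile, 1 mm shift
    g = gamma_index_1d(x, ref, x, ev)
    print(f"pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")
    ```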

  9. Thulium-170 oxide heat source experimental and analytical radiation and shielding study

    International Nuclear Information System (INIS)

    Tse, A.; Nelson, C.A.

    1970-05-01

    Radiation dose rates from three thulium-170 oxide sources (20.7, 10.0 and 5.0 thermal watts) were measured through three thicknesses (1/4, 1/2 and 1 inch) of absorber by thermoluminescent dosimetry techniques. Absorber materials used were aluminum, stainless steel, lead, tungsten and depleted uranium. Resultant radiation doses were measured at 19 and 100 cm. Comparison of theoretical dose rates calculated by computer with measured dose rates validated the calculation technique for lead, tungsten and uranium absorbers but not for aluminum and stainless steel. Use of infinite-medium build-up factors (B∞) was thus validated in the computation of dose rates for lead, tungsten and uranium absorbers; use of B∞ in the computation of dose rates for aluminum and stainless steel absorbers overestimated dose rates vis-a-vis experimentally determined dose rates by an approximate factor of 2.

  10. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Science.gov (United States)

    Cunningham, Hamish; Tablan, Valentin; Roberts, Angus; Bontcheva, Kalina

    2013-01-01

    This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  11. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Directory of Open Access Journals (Sweden)

    Hamish Cunningham

    This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  12. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  13. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    Science.gov (United States)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  14. Electron capture detector based on a non-radioactive electron source: operating parameters vs. analytical performance

    Directory of Open Access Journals (Sweden)

    E. Bunert

    2017-12-01

    Gas chromatographs with electron capture detectors are widely used for the analysis of electron-affine substances such as pesticides or chlorofluorocarbons. With detection limits in the low pptv range, electron capture detectors are the most sensitive detectors available for such compounds. Based on their operating principle, they require free electrons at atmospheric pressure, which are usually generated by a β− decay. However, the use of radioactive materials leads to regulatory restrictions regarding purchase, operation, and disposal. Here, we present a novel electron capture detector based on a non-radioactive electron source that shows detection limits similar to those of radioactive detectors but is not subject to these limitations and offers further advantages such as adjustable electron densities and energies. In this work we show first experimental results using 1,1,2-trichloroethane and sevoflurane, and investigate the effect of several operating parameters on the analytical performance of this new non-radioactive electron capture detector (ECD).

  15. Crowd-sourcing as an analytical method: Metrology of smartphone measurements in heritage science.

    Science.gov (United States)

    Brigham, Rosie; Grau-Bove, Josep; Rudnicka, Anna; Cassar, May; Strlic, Matija

    2018-04-12

    This research assesses the precision, repeatability and accuracy of crowd-sourced scientific measurements, and whether their quality is sufficient to provide usable results. Measurements of colour and area were chosen because of the possibility of producing them with smartphone cameras. The quality of measurements was estimated experimentally by comparing data contributed by anonymous participants in heritage sites with reference measurements of known accuracy and precision. Participants performed the measurements by taking photographs with their smartphones, from which colour and dimensional data could be extracted. The results indicate that smartphone measurements provided by citizen-scientists can be used to measure changes of colour, but that the performance is strongly dependent on the measured colour coordinate and ranges from a minimum detectable colour change or difference between colours of ΔE 3.1 to ΔE 17.2. The same method is able to measure areas when the difference in colour with the neighbouring areas is higher than ΔE 10. These results render the method useful in some heritage science contexts, but higher precision would be desirable: the human eye can detect differences as small as ΔE 2, and a light-fast pigment fades approximately ΔE 8 in its lifetime. There is scope for further research in the automatization of the post-processing of user contributions and the effect of contextual factors (such as detail in the instructions) in the quality of the raw data. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
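    The ΔE figures quoted here are CIELAB colour differences. A minimal sketch of the CIE76 metric used to express them, with hypothetical smartphone readings of the same painted area a year apart:

    ```python
    import math

    def delta_e76(lab1, lab2):
        """CIE76 colour difference between two CIELAB triplets (L*, a*, b*):
        the Euclidean distance in Lab space."""
        return math.dist(lab1, lab2)

    before = (62.0, 14.5, 21.0)   # hypothetical reading, year 1
    after = (60.5, 13.0, 18.5)    # hypothetical reading, year 2
    de = delta_e76(before, after)
    print(f"dE = {de:.1f}  visible to the eye (~2): {de > 2}  "
          f"above the crowd-sourcing floor (~3.1): {de > 3.1}")
    ```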

  16. Getting Open Source Right for Big Data Analytics: Software Sharing, Governance, Collaboration and Most of All, Fun!

    Science.gov (United States)

    Mattmann, C. A.

    2013-12-01

    A wave of open source big data analytic infrastructure is currently shaping government, the private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software: e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack, which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., the Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities, including low-latency/in-memory support versus record-oriented file I/O, high availability, and support for the MapReduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products: they are all released under an open source license, e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache or the Berkeley AMPLab; all are developed collaboratively; and all provide plug-in architecture models and methodologies for allowing others to contribute and participate via various community models. This talk will cover the open source and governance aspects of the aforementioned big data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and big data initiatives stemming from the Administration, including DARPA's XDATA program, NASA's CMAC program, and NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.

  17. Mixed analytical-stochastic simulation method for the recovery of a Brownian gradient source from probability fluxes to small windows.

    Science.gov (United States)

    Dobramysl, U; Holcman, D

    2018-02-15

    Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
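    A heavily simplified version of the final reconstruction step can be sketched as a least-squares fit. In the small-window limit, the steady flux into window i is roughly proportional to the free-space concentration of the source at that window, which for 3-D diffusion decays like 1/|x_i - x_s| (a Berg-Purcell-style approximation, not the matched-asymptotics formulas derived in the paper); fitting normalized fluxes removes the unknown source strength. All positions and values below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Four small windows on a plane and an unknown source above it.
    windows = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                        [0.0, 2.0, 0.0], [2.0, 2.0, 0.0]])
    source_true = np.array([1.2, 0.4, 1.5])

    def model_fluxes(src):
        # Toy model: flux_i proportional to 1/|x_i - src| (3-D point source).
        return 1.0 / np.linalg.norm(windows - src, axis=1)

    fluxes = model_fluxes(source_true)             # noiseless synthetic data

    def residuals(src):
        f = model_fluxes(src)
        return f / f.sum() - fluxes / fluxes.sum() # compare normalized fluxes

    fit = least_squares(residuals, x0=[1.0, 1.0, 1.0])
    print(fit.x)                                   # ~ [1.2, 0.4, 1.5]
    ```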

  18. A Simple Analytical Model for Predicting the Detectable Ion Current in Ion Mobility Spectrometry Using Corona Discharge Ionization Sources

    Science.gov (United States)

    Kirk, Ansgar Thomas; Kobelt, Tim; Spehlbrink, Hauke; Zimmermann, Stefan

    2018-05-01

    Corona discharge ionization sources are often used in ion mobility spectrometers (IMS) when a non-radioactive ion source with high ion currents is required. Typically, the corona discharge is followed by a reaction region where analyte ions are formed from the reactant ions. In this work, we present a simple yet sufficiently accurate model for predicting the ion current available at the end of this reaction region when operating at reduced pressure, as in High Kinetic Energy Ion Mobility Spectrometers (HiKE-IMS) or most IMS-MS instruments. It yields excellent qualitative agreement with measurement results and is even able to calculate the ion current within an error of 15%. Additional interesting findings of this model are that the ion current at the end of the reaction region is independent of the ion current generated by the corona discharge, and that the ion current in HiKE-IMS grows quadratically when the length of the reaction region is scaled down.

  19. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    International Nuclear Information System (INIS)

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

    The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO2 matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO2 matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO2 matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO2 matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs

  20. Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital.

    Science.gov (United States)

    Campion, Michael C; Ployhart, Robert E; Campion, Michael A

    2017-05-01

    [Correction Notice: An Erratum for this article was reported in Vol 102(5) of Journal of Applied Psychology (see record 2017-14296-001). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected.] This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Mobility and Sector-specific Effects of Changes in Multiple Sources ...

    African Journals Online (AJOL)

    Using the second and third Cameroon household consumption surveys, this study examined mobility and sector-specific effects of changes in multiple sources of deprivation in Cameroon. Results indicated that between 2001 and 2007, deprivations associated with human capital and labour capital reduced, while ...

  2. 78 FR 60700 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Science.gov (United States)

    2013-10-02

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 49 [EPA-R09-OAR-2013-0489; FRL-9901-58-Region 9] Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four... Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional Haze...

  3. 78 FR 41731 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Science.gov (United States)

    2013-07-11

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 49 [EPA-R09-OAR-2013-0489; FRL-9830-5] Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four Corners Power... Implementation Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional...

  4. Source and specificity of chemical cues mediating shelter preference of Caribbean spiny lobsters (Panulirus argus).

    Science.gov (United States)

    Horner, Amy J; Nickles, Scott P; Weissburg, Marc J; Derby, Charles D

    2006-10-01

Caribbean spiny lobsters display a diversity of social behaviors, one of the most prevalent of which is gregarious diurnal sheltering. Previous research has demonstrated that shelter selection is chemically mediated, but the source of release and the identity of the aggregation signal are unknown. In this study, we investigated the source and specificity of the aggregation signal in Caribbean spiny lobsters, Panulirus argus. We developed a relatively rapid test of shelter choice in a 5000-l laboratory flume that simulated flow conditions in the spiny lobster's natural environment, and used it to examine the shelter preference of the animals in response to a variety of odorants. We found that both males and females associated preferentially with shelters emitting conspecific urine of either sex, but not with shelters emitting seawater, food odors, or the scent of a predatory octopus. These results demonstrate specificity in the cues mediating sheltering behavior and show that urine is at least one source of the aggregation signal.

  5. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

Background: A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings: Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
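
    An illustrative sketch of one step named above, rebalancing an imbalanced cohort before classification (synthetic data and a generic classifier, not the PPMI protocol):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                 # stand-in feature matrix
y = (rng.random(500) < 0.15).astype(int)       # ~15% cases: imbalanced cohort

# random oversampling of the minority class to parity (illustration only;
# in practice resampling is done inside each CV fold to avoid leakage)
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=(y == 0).sum() - minority.size, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("balanced CV accuracy:", cross_val_score(clf, X_bal, y_bal, cv=5).mean())
```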

  6. Medial temporal lobe reinstatement of content-specific details predicts source memory

    Science.gov (United States)

    Liang, Jackson C.; Preston, Alison R.

    2016-01-01

Leading theories propose that when remembering past events, medial temporal lobe (MTL) structures reinstate the neural patterns that were active when those events were initially encoded. Accurate reinstatement is hypothesized to support detailed recollection of memories, including their source. While several studies have linked cortical reinstatement to successful retrieval, indexing reinstatement within the MTL network and its relationship to memory performance has proved challenging. Here, we addressed this gap in knowledge by having participants perform an incidental encoding task, during which they visualized people, places, and objects in response to adjective cues. During a surprise memory test, participants saw studied and novel adjectives and indicated the imagery task they had performed for each adjective. A multivariate pattern classifier was trained to discriminate the imagery tasks based on functional magnetic resonance imaging (fMRI) responses from the hippocampus and MTL cortex at encoding. The classifier was then tested on MTL patterns during the source memory task. We found that MTL encoding patterns were reinstated during successful source retrieval. Moreover, when participants made source misattributions, errors were predicted by reinstatement of incorrect source content in MTL cortex. We further observed a gradient of content-specific reinstatement along the anterior-posterior axis of the hippocampus and MTL cortex. Within the anterior hippocampus, we found that reinstatement of person content was related to source memory accuracy, whereas reinstatement of place information across the entire hippocampal axis predicted correct source judgments. Content-specific reinstatement was also graded across MTL cortex, with PRc patterns evincing reactivation of people, and more posterior regions, including PHc, showing evidence for reinstatement of places and objects. Collectively, these findings provide key evidence that source recollection relies on reinstatement of past

  7. CheapStat: an open-source, "do-it-yourself" potentiostat for analytical and educational applications.

    Directory of Open Access Journals (Sweden)

    Aaron A Rowe

Full Text Available Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource-poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.
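
    A short sketch of the kind of potential waveform such a device is described as generating, here a cyclic-voltammetry triangle wave (parameter names are illustrative and not taken from the CheapStat firmware):

```python
import numpy as np

def cv_waveform(e_start, e_vertex, scan_rate, sample_hz, n_cycles=1):
    """Triangular potential sweep for cyclic voltammetry (volts vs. time)."""
    leg_t = abs(e_vertex - e_start) / scan_rate          # one sweep leg [s]
    t_leg = np.arange(0.0, leg_t, 1.0 / sample_hz)
    up = e_start + np.sign(e_vertex - e_start) * scan_rate * t_leg
    cycle = np.concatenate([up, up[::-1]])               # forward + reverse leg
    return np.tile(cycle, n_cycles)

wave = cv_waveform(e_start=-0.2, e_vertex=0.6, scan_rate=0.1, sample_hz=100)
print(wave.size, wave.min(), wave.max())                 # samples, V range
```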

  8. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

Following the relevant technical standards (e.g., IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost involved, a tool should be used that is developed independently of the development of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  9. Sources of variability in fatty acid (FA) biomarkers in the application of compound-specific stable isotopes (CSSIs) to soil and sediment fingerprinting and tracing: A review

    Energy Technology Data Exchange (ETDEWEB)

    Reiffarth, D.G., E-mail: Dominic.Reiffarth@unbc.ca [Natural Resources and Environmental Studies Program, University of Northern British Columbia, 3333 University Way, Prince George, BC V2N 4Z9 (Canada); Petticrew, E.L., E-mail: Ellen.Petticrew@unbc.ca [Geography Program and Quesnel River Research Centre, University of Northern British Columbia, 3333 University Way, Prince George, BC V2N 4Z9 (Canada); Owens, P.N., E-mail: Philip.Owens@unbc.ca [Environmental Science Program and Quesnel River Research Centre, University of Northern British Columbia, 3333 University Way, Prince George, BC, V2N 4Z9 (Canada); Lobb, D.A., E-mail: David.Lobb@umanitoba.ca [Watershed Systems Research Program, University of Manitoba, 13 Freedman Crescent, Winnipeg, MB R3T 2N2 (Canada)

    2016-09-15

Determining soil redistribution and sediment budgets in watersheds is often challenging. One of the methods for making such determinations employs soil and sediment fingerprinting techniques, using sediment properties such as geochemistry, fallout radionuclides, and mineral magnetism. These methods greatly improve the estimation of erosion and deposition within a watershed, but are limited when determining land use-based soil and sediment movement. Recently, compound-specific stable isotopes (CSSIs), which employ fatty acids naturally occurring in the vegetative cover of soils, offer the possibility of refining fingerprinting techniques based on land use, complementing other methods that are currently in use. The CSSI method has been met with some success; however, challenges still remain with respect to scale and resolution due to a potentially large degree of biological, environmental and analytical uncertainty. By better understanding the source of tracers used in CSSI work and the inherent biochemical variability in those tracers, improvement in sample design and tracer selection is possible. Furthermore, an understanding of environmental and analytical factors affecting the CSSI signal will lead to refinement of the approach and the ability to generate more robust data. This review focuses on sources of biological, environmental and analytical variability in applying CSSI to soil and sediment fingerprinting, and presents recommendations based on past work and current research in this area for improving the CSSI technique. A recommendation, based on current information available in the literature, is to use very-long-chain saturated fatty acids and to avoid the use of the ubiquitous saturated fatty acids, C16 and C18. - Highlights: • Compound-specific stable isotopes (CSSIs) of carbon may be used as soil tracers. • The variables affecting CSSI data are: biological, environmental and analytical. • Understanding sources of variability will lead

  10. Sources of variability in fatty acid (FA) biomarkers in the application of compound-specific stable isotopes (CSSIs) to soil and sediment fingerprinting and tracing: A review

    International Nuclear Information System (INIS)

    Reiffarth, D.G.; Petticrew, E.L.; Owens, P.N.; Lobb, D.A.

    2016-01-01

Determining soil redistribution and sediment budgets in watersheds is often challenging. One of the methods for making such determinations employs soil and sediment fingerprinting techniques, using sediment properties such as geochemistry, fallout radionuclides, and mineral magnetism. These methods greatly improve the estimation of erosion and deposition within a watershed, but are limited when determining land use-based soil and sediment movement. Recently, compound-specific stable isotopes (CSSIs), which employ fatty acids naturally occurring in the vegetative cover of soils, offer the possibility of refining fingerprinting techniques based on land use, complementing other methods that are currently in use. The CSSI method has been met with some success; however, challenges still remain with respect to scale and resolution due to a potentially large degree of biological, environmental and analytical uncertainty. By better understanding the source of tracers used in CSSI work and the inherent biochemical variability in those tracers, improvement in sample design and tracer selection is possible. Furthermore, an understanding of environmental and analytical factors affecting the CSSI signal will lead to refinement of the approach and the ability to generate more robust data. This review focuses on sources of biological, environmental and analytical variability in applying CSSI to soil and sediment fingerprinting, and presents recommendations based on past work and current research in this area for improving the CSSI technique. A recommendation, based on current information available in the literature, is to use very-long-chain saturated fatty acids and to avoid the use of the ubiquitous saturated fatty acids, C16 and C18. - Highlights: • Compound-specific stable isotopes (CSSIs) of carbon may be used as soil tracers. • The variables affecting CSSI data are: biological, environmental and analytical. • Understanding sources of variability will lead to more

  11. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    Science.gov (United States)

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  12. Identification of specific sources of airborne particles emitted from within a complex industrial (steelworks) site

    Science.gov (United States)

    Beddows, D. C. S.; Harrison, Roy M.

    2018-06-01

A case study is provided of the development and application of methods to identify and quantify specific sources of emissions from within a large complex industrial site. Methods include directional analysis of concentrations, chemical source tracers and correlations with gaseous emissions. Extensive measurements of PM10, PM2.5, trace gases, particulate elements and single particle mass spectra were made at sites around the Port Talbot steelworks in 2012. By using wind direction data in conjunction with real-time or hourly-average pollutant concentration measurements, it has been possible to locate areas within the steelworks associated with enhanced pollutant emissions. Directional analysis highlights the Slag Handling area of the works as the most substantial source of elevated PM10 concentrations during the measurement period. Chemical analyses of air sampled from relevant wind directions are consistent with the anticipated composition of slags, as are single particle mass spectra. Elevated concentrations of PM10 are related to inverse distance from the Slag Handling area, and concentrations increase with increased wind speed, consistent with a wind-driven resuspension source. There also appears to be a lesser source associated with Sinter Plant emissions affecting PM10 concentrations at the Fire Station monitoring site. The results are compared with an ME2 study using some of the same data, and shown to give a clearer view of the location and characteristics of emission sources, including fugitive dusts.
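
    A sketch of one plausible form of the directional analysis mentioned above, a conditional probability function (CPF) that estimates, per wind sector, the probability that hourly PM10 exceeds a high percentile (synthetic stand-in data, not the study's dataset or exact estimator):

```python
import numpy as np

rng = np.random.default_rng(1)
wd = rng.uniform(0, 360, 2000)                  # hourly wind direction [deg]
pm10 = rng.lognormal(3.0, 0.5, 2000)            # hourly PM10 [ug/m3]
pm10[(wd > 100) & (wd < 140)] *= 2.0            # synthetic "slag area" sector

threshold = np.percentile(pm10, 90)
edges = np.arange(0, 361, 30)                   # twelve 30-degree sectors
for lo, hi in zip(edges[:-1], edges[1:]):
    in_sector = (wd >= lo) & (wd < hi)
    cpf = (pm10[in_sector] > threshold).mean() if in_sector.any() else np.nan
    print(f"{lo:3d}-{hi:3d} deg: CPF = {cpf:.2f}")   # peak flags the source sector
```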

  13. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  14. Analytical performance specifications for changes in assay bias (Δbias) for data with logarithmic distributions as assessed by effects on reference change values

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2016-01-01

BACKGROUND: The distributions of within-subject biological variation are usually described as coefficients of variation, as are analytical performance specifications for bias, imprecision and other characteristics. Estimation of specifications required for reference change values is traditionally ... done using relationship between the batch-related changes during routine performance, described as Δbias, and the coefficients of variation for analytical imprecision (CVA): the original theory is based on standard deviations or coefficients of variation calculated as if distributions were Gaussian. ... METHODS: The distribution of between-subject biological variation can generally be described as log-Gaussian. Moreover, recent analyses of within-subject biological variation suggest that many measurands have log-Gaussian distributions. In consequence, we generated a model for the estimation of analytical ...
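
    A worked sketch of the quantities under discussion. The classical Gaussian reference change value RCV = sqrt(2) * z * sqrt(CVA^2 + CVI^2) is standard; the asymmetric log-Gaussian variant shown is one commonly used form and is included here as an assumption, not the authors' exact model.

```python
import math

def rcv_gaussian(cv_a, cv_i, z=1.96):
    """Classical RCV as a fraction of the mean (Gaussian assumption)."""
    return math.sqrt(2) * z * math.hypot(cv_a, cv_i)

def rcv_lognormal(cv_a, cv_i, z=1.96):
    """Asymmetric RCV limits under a log-Gaussian assumption."""
    sigma = math.sqrt(math.log(cv_a**2 + cv_i**2 + 1))   # log-scale SD
    down = math.exp(-z * math.sqrt(2) * sigma) - 1
    up = math.exp(z * math.sqrt(2) * sigma) - 1
    return down, up

print(rcv_gaussian(0.05, 0.10))    # ~0.31: a 31% change is significant
print(rcv_lognormal(0.05, 0.10))   # asymmetric decrease/increase limits
```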

  15. Design specific joint optimization of masks and sources on a very large scale

    Science.gov (United States)

    Lai, K.; Gabrani, M.; Demaris, D.; Casati, N.; Torres, A.; Sarkar, S.; Strenski, P.; Bagheri, S.; Scarpazza, D.; Rosenbluth, A. E.; Melville, D. O.; Wächter, A.; Lee, J.; Austel, V.; Szeto-Millstone, M.; Tian, K.; Barahona, F.; Inoue, T.; Sakamoto, M.

    2011-04-01

Joint optimization (JO) of source and mask together is known to produce better SMO solutions than sequential optimization of the source and the mask. However, large-scale JO problems are very difficult to solve because the global impact of the source variables causes an enormous number of mask variables to be coupled together. This work presents innovations that minimize this runtime bottleneck. The proposed SMO parallelization algorithm allows separate mask regions to be processed efficiently across multiple CPUs in a high performance computing (HPC) environment, despite the fact that a truly joint optimization is being carried out with source variables that interact across the entire mask. Building on this engine, a progressive deletion (PD) method was developed that can directly compute "binding constructs" for the optimization, i.e., our method can essentially determine the particular feature content which limits the process window attainable by the optimum source. This method allows us to minimize the uncertainty inherent in different clustering/ranking methods when seeking an overall optimum source that results from the use of heuristic metrics. An objective benchmarking of the effectiveness of different pattern sampling methods was performed during post-optimization analysis. The PD serves as a gold standard for developing optimum pattern clustering/ranking algorithms. With this work, it is shown that it is not necessary to exhaustively optimize the entire mask together with the source in order to identify these binding clips. If the number of clips to be optimized exceeds the practical limit of the parallel SMO engine, one can start with a pattern selection step to achieve high clip-count compression before SMO. With this LSSO capability one can address the challenging problem of layout-specific design, or improve the technology source as cell layouts and sample layouts replace lithography test structures in the development cycle.

  16. The sources of the specificity of nuclear law and environmental law

    International Nuclear Information System (INIS)

    Rainaud, J.M.; Cristini, R.

    1983-01-01

This paper analyses the sources of the specificity of nuclear law and its relationship with environmental law as well as with ordinary law. The characteristics of nuclear law are summarized thus: the recent discovery of the atom's uses and the mandatory protection against its effects; the internationalization of its use, leading to a limitation of national authorities' competence. Several international treaties are cited (Antarctic Treaty, NPT, London Dumping Convention, etc.), showing the link between radiation protection and the environment. (NEA) [fr

  17. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

There are many sites in the world whose environments are still affected by contamination from uranium production carried out in the past. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of safety assessment procedures and of the priority-action analyses needed for remediation planning. During recent decades, the analytical laboratories of many of the enterprises currently responsible for establishing site-specific environmental monitoring programs have significantly improved their technical sampling and analytical capacities. However, a lack of experience in optimal site-specific sampling strategy planning, and insufficient experience in applying the required analytical techniques, such as modern alpha-beta radiometers, gamma and alpha spectrometry, and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment, does not allow these laboratories to develop and conduct monitoring programs efficiently as a basis for further safety assessment in decision-making procedures. This paper gives some conclusions gained from the experience of establishing monitoring programs in Ukraine and proposes some practical steps for optimizing sampling strategy planning and the analytical procedures to be applied in areas requiring safety assessment and justification for potential remediation and safe management. (authors)

  18. Source-specific fine particulate air pollution and systemic inflammation in ischaemic heart disease patients

    Science.gov (United States)

    Siponen, Taina; Yli-Tuomi, Tarja; Aurela, Minna; Dufva, Hilkka; Hillamo, Risto; Hirvonen, Maija-Riitta; Huttunen, Kati; Pekkanen, Juha; Pennanen, Arto; Salonen, Iiris; Tiittanen, Pekka; Salonen, Raimo O; Lanki, Timo

    2015-01-01

Objective: To compare short-term effects of fine particles (PM2.5; aerodynamic diameter <2.5 µm) from different sources on blood levels of markers of systemic inflammation. Methods: We followed a panel of 52 ischaemic heart disease patients from 15 November 2005 to 21 April 2006 with clinic visits every second week in the city of Kotka, Finland, and determined nine inflammatory markers from blood samples. In addition, we monitored outdoor air pollution at a fixed site during the study period and conducted a source apportionment of PM2.5 using the Environmental Protection Agency's model EPA PMF 3.0. We then analysed associations between levels of source-specific PM2.5 and markers of systemic inflammation using linear mixed models. Results: We identified five source categories: regional and long-range transport (LRT), traffic, biomass combustion, sea salt, and pulp industry. We found the most evidence for a relation between air pollution and inflammation for LRT, traffic and biomass combustion; the most relevant inflammation markers were C-reactive protein, interleukin-12 and myeloperoxidase. Sea salt was not positively associated with any of the inflammatory markers. Conclusions: Results suggest that PM2.5 from several sources, such as biomass combustion and traffic, promotes systemic inflammation, a risk factor for cardiovascular diseases. PMID:25479755
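
    A hypothetical sketch of the panel analysis described: repeated inflammatory-marker measurements per patient regressed on a source-specific PM2.5 factor with a random-intercept linear mixed model. Variable names and the toy data are placeholders, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 52, 11
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "pm_biomass": rng.gamma(2.0, 2.0, n_subj * n_visits),  # ug/m3, synthetic
})
subj_eff = rng.normal(0, 0.3, n_subj)[df["subject"]]        # random intercepts
df["log_crp"] = 0.05 * df["pm_biomass"] + subj_eff + rng.normal(0, 0.2, len(df))

# random intercept per subject captures within-patient correlation of visits
model = smf.mixedlm("log_crp ~ pm_biomass", df, groups=df["subject"]).fit()
print(model.params["pm_biomass"])   # change in log CRP per ug/m3 increment
```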

  19. Comparison in the analytical performance between krypton and argon glow discharge plasmas as the excitation source for atomic emission spectrometry.

    Science.gov (United States)

    Wagatsuma, Kazuaki

    2009-04-01

The emission characteristics of ionic lines of nickel, cobalt, and vanadium were investigated when argon or krypton was employed as the plasma gas in glow discharge optical emission spectrometry. A dc Grimm-style lamp was employed as the excitation source. Detection limits of the ionic lines in each iron-matrix alloy sample were compared between the krypton and the argon plasmas. Particularly intense ionic lines were observed in the emission spectra depending on the discharge gas (krypton or argon), such as Co II 258.033 nm for krypton and Co II 231.707 nm for argon. The explanation for this is that collisions with the plasma gas ions dominantly populate particular excited levels of the cobalt ion, which can selectively receive the internal energy of each gas ion: for example, the 3d(7)4p (3)G(5) level (6.0201 eV) for krypton and the 3d(7)4p (3)G(4) level (8.0779 eV) for argon. In the determination of nickel as well as cobalt in iron-matrix samples, more sensitive ionic lines could be found in the krypton plasma than in the argon plasma. Detection limits in the krypton plasma were 0.0039 mass% Ni for the Ni II 230.299-nm line and 0.002 mass% Co for the Co II 258.033-nm line. However, in the determination of vanadium, the argon plasma had better analytical performance, giving a detection limit of 0.0023 mass% V for the V II 309.310-nm line.

  20. Analytical solution for the transient wave propagation of a buried cylindrical P-wave line source in a semi-infinite elastic medium with a fluid surface layer

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng

    2018-02-01

This article develops an analytical solution for the transient wave propagation of a cylindrical P-wave line source in a semi-infinite elastic solid with a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined. The Scholte wave is the wave that propagates along the interface between the fluid and solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power series expansion method. Each term of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features in the fluid layer, the interface and the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.

  1. Cause-specific stillbirth and exposure to chemical constituents and sources of fine particulate matter.

    Science.gov (United States)

    Ebisu, Keita; Malig, Brian; Hasheminassab, Sina; Sioutas, Constantinos; Basu, Rupa

    2018-01-01

The stillbirth rate in the United States is relatively high, but limited evidence is available linking stillbirth with fine particulate matter (PM2.5), its chemical constituents and sources. In this study, we explored associations between cause-specific stillbirth and prenatal exposures to those pollutants using live birth and stillbirth records from eight California locations during 2002-2009. ICD-10 codes were used to identify cause of stillbirth from stillbirth records. PM2.5 total mass and chemical constituents were collected from ambient monitors and PM2.5 sources were quantified using Positive Matrix Factorization. Conditional logistic regression was applied using a nested case-control study design (N = 32,262). We found that different causes of stillbirth were associated with different PM2.5 sources and/or chemical constituents. For stillbirths due to fetal growth, the odds ratio (OR) per interquartile range increase in gestational age-adjusted exposure to PM2.5 total mass was 1.23 (95% confidence interval (CI): 1.06, 1.44). Similar associations were found with resuspended soil (OR=1.25, 95% CI: 1.10, 1.42), and secondary ammonium sulfate (OR=1.45, 95% CI: 1.18, 1.78). No associations were found between any pollutants and stillbirths caused by maternal complications. This study highlighted the importance of investigating cause-specific stillbirth and the differential toxicity levels of specific PM2.5 sources and chemical constituents. Copyright © 2017 Elsevier Inc. All rights reserved.
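
    A small sketch of the effect-size convention reported above: converting a fitted logistic-regression coefficient into an odds ratio per interquartile-range (IQR) increase in exposure. All numbers are illustrative.

```python
import numpy as np

beta_per_unit = 0.021          # log-odds per ug/m3 (illustrative coefficient)
exposure = np.random.default_rng(0).gamma(3.0, 4.0, 10_000)  # synthetic PM2.5
iqr = np.percentile(exposure, 75) - np.percentile(exposure, 25)
or_per_iqr = np.exp(beta_per_unit * iqr)       # OR = exp(beta * IQR)
print(f"IQR = {iqr:.1f} ug/m3, OR per IQR = {or_per_iqr:.2f}")
```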

  2. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification.

  3. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    International Nuclear Information System (INIS)

    1995-01-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification

  4. Technical specifications for the provision of heat and steam sources for INPP and Visaginas. Final report

    International Nuclear Information System (INIS)

    2003-01-01

In October 1999, the National Energy Strategy was approved by the Lithuanian Parliament. The National Energy Strategy included the decision to close Unit 1 of INPP before 2005. Later it was decided to close Unit 2 before the end of 2009 as well. The closure and decommissioning will have a heavy impact on the heat supply for the city of Visaginas. Units 1 and 2 of INPP supply hot water and steam to INPP for process purposes and for space heating of residential and commercial buildings. When Unit 1 is permanently shut down, reliable heat and steam sources independent of the power plant's own heat and steam generation facilities are required for safety reasons in the event of shutdown of the remaining unit for maintenance or in an emergency. These steam and heat sources must be operational before single-unit operation is envisaged. Provision of a reliable independent heat and steam source is therefore urgent. After both reactors are shut down permanently, a steam source will be needed at the plant for radioactive waste storage and disposal. INPP and DEA have performed a feasibility study for the provision of a reliable heat source for Ignalina Nuclear Power Plant and Visaginas, and the modernisation of the Visaginas district heating system. The objective of this project is to prepare technical specifications for the provision of new heat and steam sources for INPP and Visaginas, and for rehabilitation of the heat transmission pipeline between INPP, the back-up boiler station and Visaginas City. The results of the study are presented in detail in the reports and technical specifications: 1. Transient analysis for Visaginas DH system, 2. Non-destructive testing of boiler stations, pump stations and transmission lines, 3. Conceptual design, 4. Technical specifications, Packages 1 to 6. The study has suggested: 1. Construction of a new steam boiler station, 2. Construction of a new heat-only boiler station, 3. Renovation of the existing back-up heat-only boiler station, 4

  5. The Utility of Person-Specific Analyses for Investigating Developmental Processes: An Analytic Primer on Studying the Individual

    Science.gov (United States)

    Gayles, Jochebed G.; Molenaar, Peter C. M.

    2013-01-01

    The fields of psychology and human development are experiencing a resurgence of scientific inquiries about phenomena that unfold at the level of the individual. This article addresses the issues of analyzing intraindividual psychological/developmental phenomena using standard analytical techniques for interindividual variation. When phenomena are…

  6. Schedule Analytics

    Science.gov (United States)

    2016-04-30

Warfare, Naval Sea Systems Command. Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses). Schedule Analytics (Jennifer ...). The research comprised the following high-level steps: identify and review primary data sources ... However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  7. Domain-specific impairment of source memory following a right posterior medial temporal lobe lesion.

    Science.gov (United States)

    Peters, Jan; Koch, Benno; Schwarz, Michael; Daum, Irene

    2007-01-01

    This single case analysis of memory performance in a patient with an ischemic lesion affecting posterior but not anterior right medial temporal lobe (MTL) indicates that source memory can be disrupted in a domain-specific manner. The patient showed normal recognition memory for gray-scale photos of objects (visual condition) and spoken words (auditory condition). While memory for visual source (texture/color of the background against which pictures appeared) was within the normal range, auditory source memory (male/female speaker voice) was at chance level, a performance pattern significantly different from the control group. This dissociation is consistent with recent fMRI evidence of anterior/posterior MTL dissociations depending upon the nature of source information (visual texture/color vs. auditory speaker voice). The findings are in good agreement with the view of dissociable memory processing by the perirhinal cortex (anterior MTL) and parahippocampal cortex (posterior MTL), depending upon the neocortical input that these regions receive. (c) 2007 Wiley-Liss, Inc.

  8. Determining the depth of certain gravity sources without a priori specification of their structural index

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian

    2015-11-01

We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori. We develop the new method by generalizing the Tilt-depth method for depth estimation for different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward-continuation heights, which can effectively reduce the influence of noise. Theoretical simulations of gravity source models with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method to the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to previous drilling and seismic interpretation results.
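
    A sketch of a generic Tilt-depth style calculation (an assumed textbook form, not the authors' exact estimator): the tilt is the arctangent of the vertical derivative over the total horizontal derivative of a gridded field; its zero contour tracks the source edge and the spacing of the ±45° contours scales with depth.

```python
import numpy as np

def tilt_angle(g, gz, dx=1.0, dy=1.0):
    """Tilt of gridded field g, given its vertical-derivative grid gz."""
    gy, gx = np.gradient(g, dy, dx)             # horizontal derivatives
    return np.arctan2(gz, np.hypot(gx, gy))     # radians in (-pi/2, pi/2)

# toy anomaly standing in for an observed vertical tensor component
x = np.linspace(-50.0, 50.0, 101)
X, Y = np.meshgrid(x, x)
depth = 10.0

def toy_field(z):
    return z / (X**2 + Y**2 + z**2) ** 1.5      # point-source-like field

g = toy_field(depth)
gz = (toy_field(depth + 1.0) - g) / 1.0         # finite-difference vertical derivative
t = np.degrees(tilt_angle(g, gz))
print(t.min(), t.max())                          # zero contour over the source edge
```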

  9. Theoretical and Numerical Modeling of Transport of Land Use-Specific Fecal Source Identifiers

    Science.gov (United States)

    Bombardelli, F. A.; Sirikanchana, K. J.; Bae, S.; Wuertz, S.

    2008-12-01

    Microbial contamination in coastal and estuarine waters is of particular concern to public health officials. In this work, we advocate that well-formulated and developed mathematical and numerical transport models can be combined with modern molecular techniques in order to predict continuous concentrations of microbial indicators under diverse scenarios of interest, and that they can help in source identification of fecal pollution. As a proof of concept, we present initially the theory, numerical implementation and validation of one- and two-dimensional numerical models aimed at computing the distribution of fecal source identifiers in water bodies (based on Bacteroidales marker DNA sequences) coming from different land uses such as wildlife, livestock, humans, dogs or cats. These models have been developed to allow for source identification of fecal contamination in large bodies of water. We test the model predictions using diverse velocity fields and boundary conditions. Then, we present some preliminary results of an application of a three-dimensional water quality model to address the source of fecal contamination in the San Pablo Bay (SPB), United States, which constitutes an important sub-embayment of the San Francisco Bay. The transport equations for Bacteroidales include the processes of advection, diffusion, and decay of Bacteroidales. We discuss the validation of the developed models through comparisons of numerical results with field campaigns developed in the SPB. We determine the extent and importance of the contamination in the bay for two decay rates obtained from field observations, corresponding to total host-specific Bacteroidales DNA and host-specific viable Bacteroidales cells, respectively. Finally, we infer transport conditions in the SPB based on the numerical results, characterizing the fate of outflows coming from the Napa, Petaluma and Sonoma rivers.
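
    A minimal one-dimensional sketch of the transport physics named above, advection, diffusion and first-order decay of a marker concentration, solved with an explicit upwind/central finite-difference scheme (the grid, coefficients and boundary handling are illustrative, not the authors' 2-D/3-D models):

```python
import numpy as np

L, nx = 10_000.0, 200              # domain length [m], grid cells
dx = L / nx
u, D, k = 0.1, 5.0, 1.2e-5         # velocity [m/s], diffusivity [m2/s], decay [1/s]
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

c = np.zeros(nx)
c[0] = 100.0                        # fixed-concentration source at the inflow
for _ in range(20_000):
    adv = -u * (c[1:-1] - c[:-2]) / dx                # upwind advection (u > 0)
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2  # central diffusion
    c[1:-1] += dt * (adv + dif - k * c[1:-1])         # first-order decay
    c[-1] = c[-2]                   # zero-gradient outflow boundary
    c[0] = 100.0

print(c[::40])                      # decayed, advected concentration profile
```

    Comparing such modeled profiles against the two observed decay rates (total marker DNA versus viable cells) is the kind of exercise the validation against the San Pablo Bay field campaigns entails.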

  10. Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement.

    Science.gov (United States)

    Paisley, Suzy

    2016-06-01

    This paper proposes recommendations for a minimum level of searching for data for key parameters in decision-analytic models of cost effectiveness and describes sources of evidence relevant to each parameter type. Key parameters are defined as treatment effects, adverse effects, costs, resource use, health state utility values (HSUVs) and baseline risk of events. The recommended minimum requirement for treatment effects is comprehensive searching according to available methodological guidance. For other parameter types, the minimum is the searching of one bibliographic database plus, where appropriate, specialist sources and non-research-based and non-standard format sources. The recommendations draw on the search methods literature and on existing analyses of how evidence is used to support decision-analytic models. They take account of the range of research and non-research-based sources of evidence used in cost-effectiveness models and of the need for efficient searching. Consideration is given to what constitutes best evidence for the different parameter types in terms of design and scientific quality and to making transparent the judgments that underpin the selection of evidence from the options available. Methodological issues are discussed, including the differences between decision-analytic models of cost effectiveness and systematic reviews when searching and selecting evidence and comprehensive versus sufficient searching. Areas are highlighted where further methodological research is required.

  11. Interpretation of the source-specific substantive control measures of the Minamata Convention on Mercury.

    Science.gov (United States)

    You, Mingqing

    2015-02-01

Being persistent, toxic, and bio-accumulative, mercury (Hg) seriously affects the environment and human health. Because of Hg's long-range environmental transport across national borders, especially through the atmosphere, no country can fully protect its environment and human health with its own efforts alone, without global cooperation. The Minamata Convention on Mercury, which was formally adopted and opened for signature in October 2013, is the only global environmental regime on the control of Hg pollution. Its main substantive control measures are source-specific: its phasing-out, phasing-down, and other main substantive requirements are all directed at specific categories of pollution sources through the regulation of specific sectors of the economy and social life. This Convention does not take a national quota approach to quantify the Parties' nationwide total allowable consumption or discharge of Hg or Hg compounds, nor does it quantify their nationwide total reduction requirements. This paper attempts to find the underlying reasons for this source-specific approach and offers two interpretations. One possible interpretation is that Hg might be a non-threshold pollutant, i.e., a pollutant without a risk-free concentration. The existence of a reference dose (RfD), reference concentration (RfC), provisional tolerable weekly intake (PTWI), minimal risk level (MRL) or other similar reference values for Hg does not necessarily mean that Hg cannot be regarded as non-threshold, because such reference values have scientific uncertainties and may also involve policy considerations. Another interpretation is that Hg lacks a feasibly determinable total allowable quantity. There is evidence that negotiators might have treated Hg as non-threshold, or at least accepted that Hg lacks a feasibly determinable total allowable quantity: (1) The negotiators were informed about the serious situations of the current emissions, releases, and legacy deposition; (2

  12. 25 CFR 115.702 - What specific sources of money will be accepted for deposit into a trust account?

    Science.gov (United States)

    2010-04-01

... Information § 115.702 What specific sources of money will be accepted for deposit into a trust account? We ... Section 115.702, Indians, BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE ...

  13. SPECTRAL INDEX AS A FUNCTION OF MASS ACCRETION RATE IN BLACK HOLE SOURCES: MONTE CARLO SIMULATIONS AND AN ANALYTICAL DESCRIPTION

    International Nuclear Information System (INIS)

    Laurent, Philippe; Titarchuk, Lev

    2011-01-01

We present herein a theoretical study of correlations between spectral indexes of X-ray emergent spectra and mass accretion rate (ṁ) in black hole (BH) sources, which provide a definitive signature for BHs. It has been firmly established, using the Rossi X-ray Timing Explorer (RXTE) in numerous BH observations during hard-soft state spectral evolution, that the photon index of X-ray spectra increases when ṁ increases and, moreover, the index saturates at high values of ṁ. In this paper, we present theoretical arguments that the observationally established index saturation effect versus mass accretion rate is a signature of the bulk (converging) flow onto the BH. Also, we demonstrate that the index saturation value depends on the plasma temperature of the converging flow. We self-consistently calculate the Compton cloud (CC) plasma temperature as a function of mass accretion rate using the energy balance between energy dissipation and Compton cooling. We explain the observable phenomenon, the index-ṁ correlation, using a Monte Carlo simulation of radiative processes in the innermost part (CC) of a BH source, and we account for the Comptonization processes in the presence of thermal and bulk motions as the basic types of plasma motion. We show that, when ṁ increases, BH sources evolve to high and very soft states (HSS and VSS, respectively), in which strong blackbody (BB)-like and steep power-law components are formed in the resulting X-ray spectrum. The simultaneous detection of these two components strongly depends on the sensitivity of high-energy instruments, given that the relative contribution of the hard power-law tail in the resulting VSS spectrum can be very low, which is why, to date, RXTE observations of the VSS X-ray spectrum have been characterized by the presence of the strong BB-like component only. We also predict specific patterns for the evolution of the high-energy e-fold (cutoff) energy (E_fold) with ṁ for thermal and dynamical (bulk

  14. Travel-time source-specific station correction improves location accuracy

    Science.gov (United States)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may have the effect of shifting the computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events in the context of CTBT verification are particularly critical for the triggering of a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km², and its largest linear dimension cannot be larger than 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network and using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
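
    A sketch of the correction idea in its simplest assumed form: average the travel-time residuals of well-located reference events per station to form a source-specific station correction, then remove it before relocating new events from the same source region (synthetic numbers, not the Japan/Iran/Italy datasets):

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_stations = 40, 12
station_bias = rng.normal(0, 0.8, n_stations)          # unmodeled path effects [s]
residuals = station_bias + rng.normal(0, 0.2, (n_events, n_stations))

sssc = residuals.mean(axis=0)       # source-specific station corrections
corrected = residuals - sssc        # residuals after applying corrections
print(np.abs(residuals).mean(), "->", np.abs(corrected).mean())
```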

  15. Specific absorbed fractions of energy at various ages from internal photon sources: 7, Adult male

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

Specific absorbed fractions (Φ values) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. In this volume, Φ values are tabulated for an adult male (70-kg Reference Man). These Φ values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate Φ in the endosteal cells and can better estimate Φ in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 12 refs., 2 tabs

  16. Specific absorbed fractions of energy at various ages from internal photon sources: 1, Methods

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

Specific absorbed fractions (Φ values) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. This volume outlines the various methods used to compute the Φ values and describes how the "best" estimates recommended by us are chosen. These Φ values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods that Spiers and co-workers developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate Φ in the endosteal cells and can better estimate Φ in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 41 refs., 25 figs., 23 tabs
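
    A worked sketch of how such tabulated specific absorbed fractions are used in MIRD-style bookkeeping; the activity, yields, energies and Φ values below are placeholders, not entries from these volumes:

```python
# dose rate [Gy/s] = A [Bq] * sum_i y_i * E_i [J] * Phi_i [1/kg]
MEV_TO_J = 1.602e-13

A = 5.0e4                      # activity in the source organ [Bq]
photons = [                    # (yield per decay, energy [MeV], Phi [1/kg])
    (0.85, 0.140, 2.1e-3),
    (0.10, 0.018, 4.0e-4),
]
dose_rate = A * sum(y * e * MEV_TO_J * phi for y, e, phi in photons)
print(f"{dose_rate:.3e} Gy/s in the target organ")
```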

  17. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    Energy Technology Data Exchange (ETDEWEB)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J [Pusan National University Yangsan Hospital, Yangsan, Gyeongsangnam-do (Korea, Republic of); Kim, J; Kim, H [Pusan National University, Busan (Korea, Republic of); Cho, M; Yun, S [Samsung electronics Co., Suwon, Gyeonggi-do (Korea, Republic of); Park, D; Kim, W; Ki, Y; Kim, D [Pusan National University Hospital, Busan (Korea, Republic of)

    2016-06-15

Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. Next, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective tool for estimating a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by our framework.
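
    A minimal sketch of the primary-dose step described above: attenuate a ray through voxels with Beer's law and score the interactions in each voxel (attenuation coefficients and geometry are illustrative placeholders, not the authors' HU-derived values or CBCT source model):

```python
import numpy as np

mu = np.full(50, 0.02)          # linear attenuation per voxel [1/mm]
mu[20:30] = 0.05                # an inhomogeneous (e.g., bone-like) insert
dx = 1.0                        # voxel size along the ray [mm]

n0 = 1.0e6                      # photons entering the ray
# photons reaching each voxel: Beer's law over the upstream voxels
n_in = n0 * np.exp(-np.cumsum(np.concatenate([[0.0], mu[:-1]])) * dx)
absorbed = n_in * (1.0 - np.exp(-mu * dx))   # interactions scored per voxel
print(absorbed[:5], absorbed[20:25])         # more deposition in the insert
```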

  18. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    International Nuclear Information System (INIS)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J; Kim, J; Kim, H; Cho, M; Yun, S; Park, D; Kim, W; Ki, Y; Kim, D

    2016-01-01

Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. Next, all voxels constituting the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective tool for estimating a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by our framework.

  19. Species-specific variation in the phosphorus nutritional sources by microphytoplankton in a Mediterranean estuary

    Directory of Open Access Journals (Sweden)

    MARLY CAROLINA MARTINEZ SOTO

    2015-08-01

    Full Text Available We investigated the species-specific phosphorus (P) nutrition sources in the microphytoplankton community of the Mahon estuary (Minorca, Western Mediterranean) in 2011, under two contrasting hydrographic scenarios. Estuarine flow, nutrient concentrations, phytoplankton community composition and enzyme-labeled fluorescence (ELF) were measured in June and October, corresponding to the beginning and the end of summer. Dissolved inorganic nitrogen (DIN) and inorganic phosphate (Pi) exhibited enhanced concentrations in the inner estuary, where N:P molar ratios suggested P-limitation in both surveys. Pi was low and variable (0.09±0.02 μmol l-1 in June and 0.06±0.02 μmol l-1 in October), whereas organic phosphorus remained a more reliable P source. Even though ambient Pi concentrations were slightly higher in June, when the microphytoplankton assemblage was dominated by dinoflagellates, the percentage of cells expressing ELF labeling was notably higher then (65% of total cells) than in October (12%), when diatoms characterized the microphytoplankton community. ELF was mainly expressed by dinoflagellate taxa, whereas diatoms only expressed significant alkaline phosphatase (AP) in the inner estuary during the June survey. A P-addition bioassay evaluating the response of AP to Pi enrichment showed a remarkable reduction in AP with increasing Pi. However, some dinoflagellate species maintained AP even when Pi was supplied in excess. We suggest that in some dinoflagellate species AP is not as tightly controlled by ambient Pi as previously believed. AP activity in these species could indicate selective use of organic phosphorus, or a slow metabolic response to changes in P forms, rather than physiological stress due to low Pi availability. We emphasize the importance of identifying the links between the different P sources and species-specific requirements in order to understand the ecological response to anthropogenic biogeochemical perturbations.

  20. Comprehension and Writing Strategy Training Improves Performance on Content-Specific Source-Based Writing Tasks

    Science.gov (United States)

    Weston-Sementelli, Jennifer L.; Allen, Laura K.; McNamara, Danielle S.

    2018-01-01

    Source-based essays are evaluated both on the quality of the writing and on the appropriate interpretation and use of the source material. Hence, composing a high-quality source-based essay (an essay written on the basis of source material) relies on skills related both to reading (the sources) and to writing (the essay). As such, source-based…

  1. An analytical threshold voltage model for a short-channel dual-metal-gate (DMG) recessed-source/drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Saramekala, G. K.; Santra, Abirmoya; Dubey, Sarvesh; Jit, Satyabrata; Tiwari, Pramod Kumar

    2013-08-01

    In this paper, an analytical short-channel threshold voltage model is presented for a dual-metal-gate (DMG) fully depleted recessed-source/drain (Re-S/D) SOI MOSFET. For the first time, the advantages of the recessed-source/drain structure and of the dual-metal-gate structure are incorporated simultaneously in a fully depleted SOI MOSFET. Analytical surface potential models at the Si-channel/SiO2 interface and the Si-channel/buried-oxide (BOX) interface have been developed by solving the 2D Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. A threshold voltage model is then derived from the minimum surface potential in the channel. The developed model is analyzed extensively for a variety of device parameters, such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the BOX, and the control-to-screen gate length ratio. The validity of the present 2D analytical model is verified with ATLAS™, a 2D device simulator from SILVACO Inc.
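
    For orientation, models of this family start from the 2D Poisson equation in the fully depleted film together with the parabolic ansatz for the potential; the form below uses conventional symbols (N_A doping, φ_s surface potential, φ_F Fermi potential), not the paper's exact notation:

    \[
    \frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial y^2} = \frac{q N_A}{\varepsilon_{\mathrm{Si}}},
    \qquad
    \phi(x,y) \approx \phi_s(x) + c_1(x)\,y + c_2(x)\,y^2,
    \]

    where y is the depth into the silicon film and the coefficients c_1, c_2 follow from the boundary conditions at the two oxide interfaces. The threshold voltage is then the gate voltage at which the minimum of φ_s along the channel reaches the strong-inversion value 2φ_F = 2(kT/q)ln(N_A/n_i).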

  2. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady-state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry-gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data: all significant flow features are correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature; numerical simulations of ionization conditions at two operating points of the MPIS are therefore also presented. It is clearly shown that the dry-gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that, for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid-dynamic forces; however, only the fluid dynamics determines the three-dimensional distribution of the analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.
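
    In its usual RANS form, the additional transport equation mentioned here is a passive-scalar convection-diffusion equation for the mean relative concentration; a generic statement (standard notation assumed, not taken from the paper) is

    \[
    \frac{\partial (\rho \bar{c})}{\partial t} + \nabla \cdot (\rho \bar{\mathbf{u}}\, \bar{c})
    = \nabla \cdot \left[ \left( \frac{\mu}{Sc} + \frac{\mu_t}{Sc_t} \right) \nabla \bar{c} \right],
    \]

    with Sc and Sc_t the laminar and turbulent Schmidt numbers. Because the scalar is passive, the computed flow field alone fixes the analyte distribution, consistent with the conclusion drawn above.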

  3. Analytical Solution of the Hyperbolic Heat Conduction Equation for Moving Semi-Infinite Medium under the Effect of Time-Dependent Laser Heat Source

    Directory of Open Access Journals (Sweden)

    R. T. Al-Khairy

    2009-01-01

    The paper presents an analytical solution of the hyperbolic heat conduction equation for a moving semi-infinite medium subjected to a time-dependent laser heat source, whose capacity is given by g(x,t) = I(t)(1 − R)μ e^(−μx), while the semi-infinite body has an insulated boundary. The solution is obtained by the Laplace transform method, and solutions for different time characteristics of the heat source capacity (constant, instantaneous, and exponential) are discussed. The effect of the absorption coefficient on the temperature profiles is examined in detail. It is found that the closed-form solution derived in the present study reduces to the previously obtained analytical solution when the medium velocity is set to zero.
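
    For context, the hyperbolic (Cattaneo-Vernotte) conduction model underlying such solutions takes, for a stationary medium in one dimension, the standard form below (τ the relaxation time, α the thermal diffusivity, g the volumetric heat source; conventional notation, not the paper's). The paper's contribution is the closed-form solution for a medium moving at constant velocity, which reduces to this case as the velocity goes to zero:

    \[
    \tau \frac{\partial^2 T}{\partial t^2} + \frac{\partial T}{\partial t}
    = \alpha \frac{\partial^2 T}{\partial x^2}
    + \frac{1}{\rho c_p} \left( g + \tau \frac{\partial g}{\partial t} \right).
    \]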

  4. A two dimensional analytical modeling of surface potential in triple metal gate (TMG) fully-depleted Recessed-Source/Drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Priya, Anjali; Mishra, Ram Awadh

    2016-04-01

    In this paper, an analytical model of the surface potential is proposed for a new triple-metal-gate (TMG) fully depleted recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET). The metal with the highest work function is placed near the source region and the one with the lowest work function near the drain. The Re-S/D SOI MOSFET has a higher drain current than the conventional SOI MOSFET because of its larger source and drain regions. The surface potential model, developed from the 2D Poisson's equation, is verified against simulation results from the two-dimensional ATLAS simulator. The model is compared with DMG (dual-metal-gate) and SMG (single-metal-gate) devices and analysed for different device parameters. The ratio of the metal gate lengths is varied to optimize the result.

  5. Applying Advanced Analytical Approaches to Characterize the Impact of Specific Clinical Gaps and Profiles on the Management of Rheumatoid Arthritis.

    Science.gov (United States)

    Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven

    2016-01-01

    The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula, to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology can not only characterize learning gaps but also identify the proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to rheumatologists' knowledge of, and confidence concerning, 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities toward those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.

  6. Reduction of PM emissions from specific sources reflected on key components concentrations of ambient PM10

    Science.gov (United States)

    Minguillon, M. C.; Querol, X.; Monfort, E.; Alastuey, A.; Escrig, A.; Celades, I.; Miro, J. V.

    2009-04-01

    The relationship between specific particulate emission controls and ambient levels of some PM10 components (Zn, As, Pb, Cs, Tl) was evaluated. To this end, the industrial area of Castellón (Eastern Spain) was selected, where around 40% of EU glazed ceramic tiles and a high proportion of EU ceramic frits (an intermediate product for the manufacture of ceramic glaze) are produced. The PM10 emissions from the ceramic processes were calculated over the period 2000 to 2007, taking into account the degree of implementation of corrective measures throughout the study period. Abatement systems (mainly bag filters) were implemented in the majority of the fusion kilns for frit manufacture in the area as a result of the application of Directive 1996/61/EC, leading to a marked decrease in PM10 emissions. Ambient PM10 sampling was carried out from April 2002 to July 2008 at three urban sites and one suburban site in the area, and a complete chemical analysis was made of about 35% of the collected samples by means of different techniques (ICP-AES, ICP-MS, ion chromatography, selective electrode, and elemental analyser). The time series of PM10 chemical composition allowed us to apply a source contribution model (principal component analysis) followed by a multilinear regression analysis, so that PM10 sources were identified and their contributions to bulk ambient PM10, as well as to bulk ambient concentrations of the identified key components (Zn, As, Pb, Cs, Tl), were quantified on a daily basis. The contribution of the sources identified as the manufacture and use of ceramic glaze components, including the manufacture of ceramic frits, accounted for more than 65, 75, 58, 53, and 53% of ambient Zn, As, Pb, Cs and Tl levels, respectively (with the exception of the Tl contribution at one of the sites). The important emission reductions of these sources during the study period had an impact on ambient key component levels, such that there was a high
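
    The receptor-modelling workflow described here (principal component analysis followed by multilinear regression on the factor scores) can be sketched generically; the snippet below uses invented data and variable names and is a stand-in for the idea, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    # X: days x species matrix of PM10 component concentrations (hypothetical).
    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(200, 10))
    pm10 = X.sum(axis=1) + rng.normal(0.5, 0.1, 200)

    # 1) PCA on standardized data identifies the main emission sources.
    Z = (X - X.mean(0)) / X.std(0)
    scores = PCA(n_components=3).fit_transform(Z)

    # 2) Regress bulk PM10 on the factor scores; each term is then a daily
    #    source contribution, the intercept the unexplained mass.
    reg = LinearRegression().fit(scores, pm10)
    contributions = scores * reg.coef_  # days x sources
    print(reg.intercept_, contributions.mean(axis=0))
    ```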

  7. Two-dimensional analytical solutions for chemical transport in aquifers. Part 1. Simplified solutions for sources with constant concentration. Part 2. Exact solutions for sources with constant flux rate

    International Nuclear Information System (INIS)

    Shan, C.; Javandel, I.

    1996-05-01

    Analytical solutions are developed for modeling solute transport in a vertical section of a homogeneous aquifer. Part 1 of the series presents a simplified analytical solution for cases in which a constant-concentration source is located at the top (or the bottom) of the aquifer. The following transport mechanisms are considered: advection (in the horizontal direction), transverse dispersion (in the vertical direction), adsorption, and biodegradation. In the simplified solution, longitudinal dispersion is assumed to be insignificant relative to advection and is neglected. Example calculations are given to show the movement of the contamination front, the development of concentration profiles, the mass transfer rate, and an application to determine the vertical dispersivity. The analytical solution developed in this study can be a useful tool in designing an appropriate monitoring system and an effective groundwater remediation method
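
    A minimal sketch of the kind of closed-form profile Part 1 describes is given below. It assumes the textbook limiting case: steady advection at velocity v in x, transverse dispersion D_z in z, a constant-concentration boundary at the top of the aquifer, linear sorption via a retardation factor R, and first-order biodegradation λ, combined in the complementary-error-function form such solutions typically take. It is illustrative only, not the paper's exact solution.

    ```python
    import numpy as np
    from scipy.special import erfc

    def conc_ratio(x, z, v=1.0, Dz=0.01, lam=0.0, R=1.0):
        """C/C0 below a constant-concentration source at z = 0 (illustrative).

        Advection in x, transverse dispersion in z; first-order decay acts
        over the retarded travel time t = R*x/v.
        """
        x = np.maximum(np.asarray(x, dtype=float), 1e-12)  # avoid x = 0
        spread = erfc(z / (2.0 * np.sqrt(Dz * x / v)))
        decay = np.exp(-lam * R * x / v)
        return spread * decay

    print(conc_ratio(x=100.0, z=2.0, v=0.5, Dz=0.005, lam=1e-3))
    ```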

  8. Open-Source Radiation Exposure Extraction Engine (RE3) with Patient-Specific Outlier Detection.

    Science.gov (United States)

    Weisenthal, Samuel J; Folio, Les; Kovacs, William; Seff, Ari; Derderian, Vana; Summers, Ronald M; Yao, Jianhua

    2016-08-01

    We present an open-source, picture archiving and communication system (PACS)-integrated radiation exposure extraction engine (RE3) that provides study-, series-, and slice-specific data for automated monitoring of computed tomography (CT) radiation exposure. RE3 was built using open-source components and seamlessly integrates with the PACS. RE3 calculations of dose-length product (DLP) from the Digital Imaging and Communications in Medicine (DICOM) headers showed high agreement (R² = 0.99) with the vendor dose pages. For study-specific outlier detection, RE3 constructs robust, automatically updating multivariable regression models to predict DLP in the context of patient gender and age, scan length, water-equivalent diameter (Dw), and scanned body volume (SBV). As proof of concept, the model was trained on 811 CT chest, abdomen + pelvis (CAP) exams and 29 outliers were detected. The continuous variables used in the outlier detection model were scan length (R² = 0.45), Dw (R² = 0.70), SBV (R² = 0.80), and age (R² = 0.01). The categorical variables were gender (male average 1182.7 ± 26.3 and female 1047.1 ± 26.9 mGy cm) and pediatric status (pediatric average 710.7 ± 73.6 mGy cm and adult 1134.5 ± 19.3 mGy cm).
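
    The study-specific outlier check amounts to a multivariable regression of DLP on patient and scan covariates, with exams flagged when they deviate strongly from the prediction. A minimal stand-in on hypothetical data (RE3's actual model is robust and automatically updating; the 3-sigma cutoff here is an assumption) might look like:

    ```python
    import numpy as np

    # Columns: scan length (cm), water-equivalent diameter Dw (cm),
    # scanned body volume SBV (L), age (y), male (0/1). Hypothetical data.
    rng = np.random.default_rng(1)
    X = np.column_stack([rng.uniform(30, 70, 500), rng.uniform(20, 40, 500),
                         rng.uniform(10, 60, 500), rng.uniform(1, 90, 500),
                         rng.integers(0, 2, 500)])
    dlp = X @ np.array([8.0, 20.0, 6.0, 0.5, 120.0]) + rng.normal(0, 60, 500)

    # Ordinary least squares with an intercept term.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, dlp, rcond=None)

    # Flag exams whose DLP deviates strongly from the model's prediction.
    resid = dlp - A @ coef
    z = (resid - resid.mean()) / resid.std()
    outliers = np.flatnonzero(np.abs(z) > 3)  # assumed 3-sigma cutoff
    print(len(outliers), "potential outliers")
    ```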

  9. Source apportionment of elevated wintertime PAHs by compound-specific radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    R. J. Sheesley

    2009-05-01

    Full Text Available Natural-abundance radiocarbon analysis facilitates distinct source apportionment between contemporary biomass/biofuel (14C "alive") and fossil fuel (14C "dead") combustion. Here, the first compound-specific radiocarbon analysis (CSRA) of atmospheric polycyclic aromatic hydrocarbons (PAHs) was demonstrated for a set of samples collected in Lycksele, Sweden, a small town with frequent episodes of severe atmospheric pollution in winter. Renewed interest in residential wood combustion (RWC) means that this type of seasonal pollution is of increasing concern in many areas. Five individual/paired PAH isolates from three pooled fortnight-long filter collections were analyzed by CSRA: phenanthrene, fluoranthene, pyrene, benzo[b+k]fluoranthene, and indeno[cd]pyrene plus benzo[ghi]perylene; phenanthrene was the only compound also analyzed in the gas phase. The measured Δ14C for PAHs spanned from −138.3‰ to 58.0‰. A simple isotopic mass balance model was applied to estimate the fraction-biomass (fbiomass) contribution, which was constrained to 71–87% for the individual PAHs. Indeno[cd]pyrene plus benzo[ghi]perylene had an fbiomass of 71%, while fluoranthene and phenanthrene (gas phase) had the highest biomass contribution at 87%. The fbiomass of the total organic carbon (TOC, defined as the carbon remaining after removal of inorganic carbon) was estimated to be 77%, which falls within the range for the PAHs. These CSRA data on atmospheric PAHs establish that RWC is the dominant source of atmospheric PAHs in this region of the boreal zone, with some variation in the RWC contributions to specific PAHs.
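
    The "simple isotopic mass balance model" is a two-endmember mixing calculation. In the sketch below the fossil endmember is radiocarbon-dead (−1000‰) by definition, while the biomass endmember (+100‰ here) depends on the harvest-year atmosphere and is an assumed value, not the paper's:

    ```python
    def f_biomass(delta_sample, delta_bio=100.0, delta_fossil=-1000.0):
        """Two-endmember 14C mass balance: fraction of biomass-derived carbon."""
        return (delta_sample - delta_fossil) / (delta_bio - delta_fossil)

    # Measured Delta14C of -138.3 permil (the low end reported above):
    print(round(f_biomass(-138.3), 2))  # ~0.78, i.e. ~78% biomass carbon
    ```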

  10. A new DG nanoscale TFET based on MOSFETs by using source gate electrode: 2D simulation and an analytical potential model

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2017-08-01

    This paper proposes and investigates a double-gate (DG) MOSFET that emulates a tunnel field-effect transistor (M-TFET): through work-function engineering, the double-gate MOSFET is made to behave as a tunneling field-effect transistor. In the proposed structure, in addition to the main gate, another gate over the source region, with zero applied voltage and a proper work function, converts the source region from N+ to P+. We examine the impact of varying the source-gate work function and the source doping on the device parameters. The simulation results indicate that the M-TFET is well suited to switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; the electric field is then obtained by differentiating the potential profile. The analytical results are validated against the SILVACO ATLAS device simulator.

  11. Analytic model of the stress waves propagation in thin wall tubes, seeking the location of a harmonic point source in its surface

    International Nuclear Information System (INIS)

    Boaratti, Mario Francisco Guerra

    2006-01-01

    Leaks in pressurized tubes generate acoustic waves that propagate through the tube walls and can be captured by accelerometers or acoustic emission sensors. Knowledge of how these walls vibrate, or equivalently, how these acoustic waves propagate in the material, is fundamental to detecting and localizing the leak source. In this work an analytic model, based on the equations of motion of a cylindrical shell, was implemented to understand the behavior of the tube surface excited by a point source. Since the cylindrical surface is closed in the circumferential direction, waves beginning their trajectory meet waves that have already travelled around the shell, in both the clockwise and counterclockwise directions, generating constructive and destructive interference. After sufficient propagation time, peaks and valleys form on the shell surface, which can be visualized through a graphic representation of the analytic solution. The theoretical results were confirmed by measurements on an experimental setup composed of a steel tube terminated in a sand box, simulating the condition of an infinite tube. To determine the location of the point source on the surface, an inverse-solution process was adopted: given the signals from the sensors placed on the tube surface, the theoretical model is used to determine where the source that generated these signals is located. (author)

  12. Analytical solution for the transient response of a fluid/saturated porous medium halfspace system subjected to an impulsive line source

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng; Jing, Liping; Li, Yongqiang

    2018-05-01

    In this paper, transient wave propagation is investigated within a fluid/saturated porous medium halfspace system with a planar interface that is subjected to a cylindrical P-wave line source. Assuming the permeability coefficient is sufficiently large, analytical solutions for the transient response of the fluid/saturated porous medium halfspace system are developed. Moreover, the analytical solutions are presented in simple closed forms wherein each term represents a transient physical wave, especially the expressions for head waves. The methodology utilised to determine where the head wave can emerge within the system is also given. The wave fields within the fluid and porous medium are first defined considering the behaviour of two compressional waves and one tangential wave in the saturated porous medium and one compressional wave in the fluid. Substituting these wave fields into the interface continuity conditions, the analytical solutions in the Laplace domain are then derived. To transform the solutions into the time domain, a suitable distortion of the contour is provided to change the integration path of the solution, after which the analytical solutions in the Laplace domain are transformed into the time domain by employing Cagniard's method. Numerical examples are provided to illustrate some interesting features of the fluid/saturated porous medium halfspace system. In particular, the interface wave and head waves that propagate along the interface between the fluid and saturated porous medium can be observed.

  13. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996, both to verify declared activities and to detect undeclared ones. The environmental sampling program has brought a new series of analytical challenges and driven a need for advances in verification technology. Environmental swipe samples often contain extremely low concentrations of analyte (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method for determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass-separated and analyzed using magnetic sector instruments because of their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material actually detected to material present) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing the ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, thereby increasing the sensitivity of environmental sampling

  14. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-10-01

    This paper presents an analytical threshold voltage model for back-gated fully depleted (FD) recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed at the front and back surfaces of the channel by solving the two-dimensional (2D) Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. The strong-inversion criterion is applied to the front surface potential as well as to the back one in order to find separate threshold voltages for the front and back channels of the device. The device threshold voltage is taken to be that of the surface offering the lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters, such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2D device simulator from SILVACO.

  15. An analytical approach for a nodal formulation of a two-dimensional fixed-source neutron transport problem in heterogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Basso Barichello, Liliane; Dias da Cunha, Rudnei [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Inst. de Matematica; Becker Picoloto, Camila [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Tres, Anderson [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Matematica Aplicada

    2015-05-15

    A nodal formulation of a fixed-source two-dimensional neutron transport problem, in Cartesian geometry, defined in a heterogeneous medium, is solved by an analytical approach. Explicit expressions, in terms of the spatial variables, are derived for averaged fluxes in each region in which the domain is subdivided. The procedure is an extension of an analytical discrete ordinates method, the ADO method, for the solution of the two-dimensional homogeneous medium case. The scheme is developed from the discrete ordinates version of the two-dimensional transport equation along with the level symmetric quadrature scheme. As usual for nodal schemes, relations between the averaged fluxes and the unknown angular fluxes at the contours are introduced as auxiliary equations. Numerical results are in agreement with results available in the literature.
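
    The discrete-ordinates starting point referred to here is the standard 2D equation written per quadrature direction; with isotropic scattering and a level-symmetric set {(μ_m, η_m), w_m} whose weights are normalized to sum to one, it reads (conventional notation, not the paper's):

    \[
    \mu_m \frac{\partial \psi_m}{\partial x} + \eta_m \frac{\partial \psi_m}{\partial y}
    + \sigma_t \psi_m(x,y)
    = \sigma_s \sum_{n=1}^{M} w_n \psi_n(x,y) + Q(x,y),
    \qquad m = 1, \dots, M.
    \]

    Nodal schemes integrate this equation over each region to obtain balance relations among region-averaged fluxes, which the auxiliary equations mentioned above then close.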

  16. Bioaccumulation of photoprotective compounds in copepods: environmental triggers and sources of intra-specific variability

    Science.gov (United States)

    Zagarese, H. E.; García, P.; Diéguez, M. D.; Ferraro, M. A.

    2012-12-01

    Ultraviolet radiation (UVR) and temperature are two globally important abiotic factors affecting freshwater ecosystems. Planktonic organisms have developed a battery of counteracting mechanisms to minimize the risk of being damaged by UVR, which follow three basic principles: avoid, protect, repair. Copepods are among the most successful zooplankton groups. They are highly adaptable animals, capable of displaying flexible behaviors, physiologies, and life strategies. In particular, they are well equipped to cope with harmful UVR. Their arsenal includes vertical migration, accumulation of photoprotective compounds, and photorepair. The preference for a particular strategy is affected by a plethora of environmental (extrinsic) parameters, such as the existence of a depth refuge, the risk of visual predation, and temperature. Temperature modifies the environment (e.g., the lake thermal structure) and animal metabolism (e.g., swimming speed, bioaccumulation of photoprotective compounds). In addition, the relative weight of UVR-coping strategies is influenced by the organism's (intrinsic) characteristics (e.g., inter- and intra-specific variability). The UV-absorbing compounds mycosporine-like amino acids (MAAs) are widely distributed among freshwater copepods. Animals are unable to synthesize MAAs and therefore depend on external sources to accumulate these compounds. Although copepods may acquire MAAs from their food, for the few centropagid species investigated so far the main sources of MAAs are microbial (most likely prokaryotic) organisms living in close association with the copepods. Boeckella gracilipes is a common centropagid copepod in Patagonian lakes. We suspected that its occurrence in different types of lakes, hydrologically unconnected but in close geographical proximity, could have resulted in different microbial-copepod associations (i.e., different MAA sources) that could translate into intra-specific differences in the accumulation

  17. Analytical characterization of ch14.18: a mouse-human chimeric disialoganglioside-specific therapeutic antibody.

    Science.gov (United States)

    Soman, Gopalan; Kallarakal, Abraham T; Michiel, Dennis; Yang, Xiaoyi; Saptharish, Nirmala; Jiang, Hengguang; Giardina, Steve; Gilly, John; Mitra, George

    2012-01-01

    Ch14.18 is a mouse-human chimeric monoclonal antibody to the disialoganglioside (GD2) glycolipid. In the clinic, this antibody has been shown to be effective in the treatment of children with high-risk neuroblastoma, either alone or in combination therapy. Extensive product characterization is a prerequisite to addressing the potential issues of product variability associated with process changes and manufacturing scale-up. Charge heterogeneity, glycosylation profile, molecular state and aggregation, interaction (affinity) with Fcγ receptors, and functional or biological activities are a few of the critical characterization attributes for assessing product comparability for this antibody. In this article, we describe the in-house development and qualification of imaged capillary isoelectric focusing to assess charge heterogeneity; analytical size exclusion chromatography with online static and dynamic light scattering (DLS) and batch-mode DLS for aggregate detection; biosensor (surface plasmon resonance)-based Fcγ receptor-antibody interaction kinetics; N-glycoprofiling with PNGase F digestion, 2-aminobenzoic acid labeling, and high-performance liquid chromatography; and N-glycan analysis using capillary electrophoresis. In addition, we studied selected biological activity assays, such as complement-dependent cytotoxicity. The consistency and reproducibility of the assays are established by comparing intra-day and inter-day assay results. Applications of the methodologies to address stability or changes in product characteristics are also reported. The study results reveal that the ch14.18 clinical product, formulated in phosphate-buffered saline at a concentration of 5 mg/ml and stored at 2-8°C, is stable for more than five years.

  18. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions from scalp EEG. However, scalp EEG reveals only limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We propose a novel feature extraction algorithm based on supervised factor analysis that models data from source-space EEG. To this end, we computed features from the source dipoles confined to the Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus may lead to new strategies for BCI-based neurorehabilitation.
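
    As a rough, generic stand-in for the pipeline described (not the authors' supervised factor-analysis algorithm, in which the class labels enter the factor model itself), one can reduce source-space features with plain factor analysis and classify movement direction with a linear discriminant:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Hypothetical data: trials x (source-dipole time features), 4 directions.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(160, 300))
    y = np.repeat(np.arange(4), 40)  # movement-direction labels

    clf = make_pipeline(FactorAnalysis(n_components=10),
                        LinearDiscriminantAnalysis())
    print(cross_val_score(clf, X, y, cv=5).mean())  # chance is ~0.25 here
    ```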

  19. Design specification for the European Spallation Source neutron generating target element

    International Nuclear Information System (INIS)

    Aguilar, A.; Sordo, F.; Mora, T.; Mena, L.; Mancisidor, M.; Aguilar, J.; Bakedano, G.; Herranz, I.; Luna, P.; Magan, M.; Vivanco, R.; Jimenez-Villacorta, F.; Sjogreen, K.; Oden, U.; Perlado, J.M.

    2017-01-01

    The paper addresses some of the most relevant issues concerning the thermal hydraulics and radiation damage of the neutron generation target to be built at the European Spallation Source, as recently approved after a critical design review. The target unit consists of a set of tungsten blocks placed inside a wheel of 2.5 m diameter which rotates at some 0.5 Hz in order to distribute the heat generated by the incoming protons, which reach the target in the radial direction. The spallation material elements are composed of an array of tungsten pieces which rest on a rotating steel support (the cassette) and are distributed in a cross-flow configuration. The thermal, mechanical and radiation effects resulting from the impact of a 2 GeV proton pulse are analysed in detail, as is the inventory of spallation products. The current design is found to conform to specifications and to be robust enough to deal with several accident scenarios.

  20. Design specification for the European Spallation Source neutron generating target element

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar, A. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Sordo, F., E-mail: fernando.sordo@essbilbao.org [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Mora, T. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Mena, L. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Mancisidor, M.; Aguilar, J.; Bakedano, G.; Herranz, I.; Luna, P. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Magan, M.; Vivanco, R. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); Jimenez-Villacorta, F. [Consorcio ESS-BILBAO. Parque Tecnológico Bizkaia. Poligono Ugaldeguren III, Pol. A, 7B, 48170 Zamudio (Spain); Sjogreen, K.; Oden, U. [European Spallation Source ERIC, P.O Box 176, SE-221 00 Lund (Sweden); Perlado, J.M. [Instituto de Fusión Nuclear, José Gutiérrez Abascal, 2, 28006 Madrid (Spain); and others

    2017-06-01

    The paper addresses some of the most relevant issues concerning the thermal hydraulics and radiation damage of the neutron generation target to be built at the European Spallation Source, as recently approved after a critical design review. The target unit consists of a set of tungsten blocks placed inside a wheel of 2.5 m diameter which rotates at some 0.5 Hz in order to distribute the heat generated by the incoming protons, which reach the target in the radial direction. The spallation material elements are composed of an array of tungsten pieces which rest on a rotating steel support (the cassette) and are distributed in a cross-flow configuration. The thermal, mechanical and radiation effects resulting from the impact of a 2 GeV proton pulse are analysed in detail, as is the inventory of spallation products. The current design is found to conform to specifications and to be robust enough to deal with several accident scenarios.

  1. The Relative Importance of Specific Self-Efficacy Sources in Pretraining Self-Efficacy Beliefs

    Science.gov (United States)

    Howardson, Garett N.; Behrend, Tara S.

    2015-01-01

    Self-efficacy is clearly important for learning. Research identifying the most important sources of self-efficacy beliefs, however, has been somewhat limited to date in that different disciplines focus largely on different sources of self-efficacy. Whereas education researchers focus on Bandura's original sources of "enactive mastery,"…

  2. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  3. Plastics from household waste as a source of heavy metal pollution: An inventory study using INAA as the analytical technique

    International Nuclear Information System (INIS)

    Bode, P.; De Bruin, M.; Aalbers, Th.G.; Meyer, P.J.

    1990-01-01

    An inventory study of the levels of cadmium in the plastic component of household waste was carried out using INAA as the analytical technique. With a protocol of 2-h irradiation, 2-d decay, and 1-h measurement, adequate sensitivities could be obtained for Cd, and also for a group of other metals: Cr, Co, Ni, Cu, Sr, Zn, As, Se, Mo, Sn, Sb, Ba, and Hg. Red-, orange-, and yellow-colored plastics either contain Cd at high levels (over 1000 mg/kg) or have relatively low Cd concentrations (<50 mg/kg). High concentrations were also occasionally found for Sr, Se, Ba, Sb, and Hg. INAA proved to be well suited to routine use for such analyses because it requires no destruction step and offers adequate sensitivity, high accuracy, and multielement results

  4. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    Science.gov (United States)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since WSTAMP was introduced at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and has been redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics have been implemented for analyzing large portfolios of multi-attribute data and for measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  5. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World’s Largest Open Source Data Sets

    Directory of Open Access Journals (Sweden)

    J. Piburn

    2017-10-01

    Full Text Available Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since WSTAMP was introduced at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and has been redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics have been implemented for analyzing large portfolios of multi-attribute data and for measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  6. Analytical pyrolysis and thermally assisted hydrolysis and methylation of EUROSOIL humic acid samples: a key to their source

    NARCIS (Netherlands)

    Buurman, P.; Nierop, K.G.J.; Kaal, J.; Senesi, S.I.

    2009-01-01

    Humic acids have been widely investigated by spectroscopic methods, especially NMR and FTIR, and they are known to show significant differences according to their origin. Low-resolution methods such as NMR and FTIR, however, cannot easily distinguish different input sources or establish relations

  7. DEVELOPMENT OF SAMPLING AND ANALYTICAL METHODS FOR THE MEASUREMENT OF NITROUS OXIDE FROM FOSSIL FUEL COMBUSTION SOURCES

    Science.gov (United States)

    The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable for characterizing nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...

  8. Analytical Subthreshold Current and Subthreshold Swing Models for a Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFET with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2017-08-01

    Two-dimensional (2D) analytical models for the subthreshold current and subthreshold swing of the back-gated fully depleted recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET) are presented. The surface potential is determined by solving the 2D Poisson equation in both the channel and buried-oxide (BOX) regions, considering suitable boundary conditions. To obtain closed-form expressions for the subthreshold characteristics, the virtual cathode potential is expressed in terms of the minimum of the front and back surface potentials. The effect of various device parameters, such as the gate oxide and Si film thicknesses, the thickness of the source/drain penetration into the BOX, and the applied back-gate bias voltage, on the subthreshold current and subthreshold swing has been analyzed. The validity of the proposed models is established using the Silvaco ATLAS™ 2D device simulator.
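
    The link between the virtual cathode potential and the subthreshold characteristics is the standard one: the subthreshold current varies exponentially with the virtual cathode potential ψ_min, so the swing follows from the sensitivity of ψ_min to the gate voltage. In conventional notation (an outline of the usual route, not the paper's exact derivation):

    \[
    I_{\mathrm{sub}} \propto \exp\!\left( \frac{q\, \psi_{\min}}{kT} \right)
    \;\;\Longrightarrow\;\;
    S = \left[ \frac{\partial \log_{10} I_{\mathrm{sub}}}{\partial V_{\mathrm{GS}}} \right]^{-1}
    = \ln(10)\, \frac{kT}{q} \left( \frac{\partial \psi_{\min}}{\partial V_{\mathrm{GS}}} \right)^{-1}.
    \]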

  9. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Geographic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL; Piburn, Jesse O [ORNL; Sorokine, Alexandre [ORNL; Myers, Aaron T [ORNL; White, Devin A [ORNL

    2015-01-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  10. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents, followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from any extracted TNPP. 4NP is then analysed by LC-MS/MS using the electrospray ionisation (ESI) mode. This two-step analytical procedure not only ensures TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. The use of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and to cross-contamination of extrusion equipment at the converter level when TNPP-containing laminates are processed on the same machine beforehand.

  11. Sources of traffic and visitors' preferences regarding online public reports of quality: web analytics and online survey results.

    Science.gov (United States)

    Bardach, Naomi S; Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-05-01

    In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Analytics data were gathered from each website: the number of unique visitors, the method of arrival for each unique visitor, and the search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included the type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percentage of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). The survey view rate was 42.48% (49,560/116,657 invited), resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, the proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225/427); the majority of those choosing a

  12. Sources of Traffic and Visitors’ Preferences Regarding Online Public Reports of Quality: Web Analytics and Online Survey Results

    Science.gov (United States)

    Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-01-01

    Background In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. Objective To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Methods Websites were recruited from a national group of online public reports of hospital or physician quality. Analytics data were gathered from each website: the number of unique visitors, the method of arrival for each unique visitor, and the search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included the type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. Results There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percentage of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). The survey view rate was 42.48% (49,560/116,657 invited), resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, the proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225

  13. Guiding health promotion efforts with urban Inuit: a community-specific perspective on health information sources and dissemination strategies.

    Science.gov (United States)

    McShane, Kelly E; Smylie, Janet K; Hastings, Paul D; Martin, Carmel M

    2006-01-01

    To develop a community-specific perspective of health information sources and dissemination strategies of urban Inuit to better guide health promotion efforts. Through a collaborative partnership with the Tungasuvvingat Inuit Family Resource Centre, a series of key informant interviews and focus groups were conducted to gather information on specific sources of health information, strategies of health information dissemination, and overall themes in health information processes. Distinct patterns of health information sources and dissemination strategies emerged from the data. Major themes included: the importance of visual learning, community Elders, and cultural interpreters; community cohesion; and the Inuit and non-Inuit distinction. The core sources of health information are family members and sources from within the Inuit community. The principal dissemination strategy for health information was direct communication, either through one-on-one interactions or in groups. This community-specific perspective of health information sources and dissemination strategies shows substantial differences from current mainstream models of health promotion and knowledge translation. Health promotion efforts need to acknowledge the distinct health information processes of this community, and should strive to integrate existing health information sources and strategies of dissemination with those of the community.

  14. Concurrent sourcing as a mechanism for safeguarding specific investments from opportunism

    DEFF Research Database (Denmark)

    Mols, Niels Peter

    This paper identifies when concurrent sourcing is an effective safeguard. Concurrent sourcing shortens the period that a buyer needs in order to internalize production and thus shortens the period in which an external supplier is able to hold up a buyer. Concurrent sourcing also allows for short-run expansion of production and reduces the costs of lost customers. However, when complementarities and diseconomies of scale make concurrent sourcing an efficient choice for a buyer, the same complementarities and diseconomies of scale also weaken the threat that the internal production unit may

  15. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    Science.gov (United States)

    Jernström, J.; Eriksson, M.; Simon, R.; Tamborini, G.; Bildstein, O.; Marquez, R. Carlos; Kehl, S. R.; Hamilton, T. F.; Ranebo, Y.; Betti, M.

    2006-08-01

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. The composition and elemental distribution in the particles were studied with synchrotron-radiation-based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy-dispersive X-ray detector and a wavelength-dispersive system, as well as a secondary ion mass spectrometer, were used to examine the particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a pure Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and more heterogeneously distributed. All of the particles were identified as nuclear fuel fragments of exploded weapon components. Since they contain plutonium with a low 240Pu/239Pu atomic ratio (less than 0.065), which corresponds to weapons-grade plutonium or a detonation with low fission yield, the particles were identified as originating from the safety test and the low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of 137Cs (239+240Pu/137Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average 241Am/239Pu atomic ratio in the six particles was 3.7 × 10⁻³ ± 0.2 × 10⁻³ (February 2006), which indicated that the plutonium in the different particles had a similar age.

  16. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jernstroem, J. [Laboratory of Radiochemistry, Department of Chemistry, P.O. Box 55, FI-00014 University of Helsinki (Finland)]. E-mail: jussi.jernstrom@helsinki.fi; Eriksson, M. [IAEA-MEL, International Atomic Energy Agency - Marine Environment Laboratory, 4 Quai Antoine 1er, MC 98000 Monaco (Monaco); Simon, R. [Institute for Synchrotron Radiation, Forschungszentrum Karlsruhe GmbH, D-76021 Karlsruhe (Germany); Tamborini, G. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Bildstein, O. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Marquez, R. Carlos [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Kehl, S.R. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94551-0808 (United States); Hamilton, T.F. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94551-0808 (United States); Ranebo, Y. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Betti, M. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany)]. E-mail: maria.betti@ec.europa.eu

    2006-08-15

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. Composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy-dispersive X-ray detector and a wavelength-dispersive system, as well as a secondary ion mass spectrometer, was used to examine the particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a pure Pu matrix, and particles in which the plutonium is included, more heterogeneously distributed, in a Si/O-rich matrix. All of the particles were identified as nuclear fuel fragments of exploded weapon components. Because they contained plutonium with a low {sup 240}Pu/{sup 239}Pu atomic ratio, less than 0.065, corresponding to weapons-grade plutonium or a detonation with low fission yield, the particles were identified as originating from the safety test and the low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of {sup 137}Cs ({sup 239+240}Pu/{sup 137}Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average {sup 241}Am/{sup 239}Pu atomic ratio in the six particles was 3.7 x 10{sup -3} {+-} 0.2 x 10{sup -3} (February 2006), which indicated that the plutonium in the different particles had a similar age.

  17. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part II: Data Sources from Specific Library Applications

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the second part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  18. 3D material cytometry (3DMaC): a very high-replicate, high-throughput analytical method using microfabricated, shape-specific, cell-material niches.

    Science.gov (United States)

    Parratt, Kirsten; Jeong, Jenny; Qiu, Peng; Roy, Krishnendu

    2017-08-08

    Studying cell behavior within 3D material niches is key to understanding cell biology in health and diseases, and developing biomaterials for regenerative medicine applications. Current approaches to studying these cell-material niches have low throughput and can only analyze a few replicates per experiment, resulting in reduced measurement assurance and analytical power. Here, we report 3D material cytometry (3DMaC), a novel high-throughput method based on microfabricated, shape-specific 3D cell-material niches and imaging cytometry. 3DMaC achieves rapid and highly multiplexed analyses of very high replicate numbers ("n" of 10⁴-10⁶) of 3D biomaterial constructs. 3DMaC overcomes current limitations of low "n", low-throughput, and "noisy" assays, to provide rapid and simultaneous analyses of potentially hundreds of parameters in 3D biomaterial cultures. The method is demonstrated here for a set of 85 000 events containing twelve distinct cell-biomaterial micro-niches along with robust, customized computational methods for high-throughput analytics with potentially unprecedented statistical power.

  19. Preservatives and neutralizing substances in milk: analytical sensitivity of official specific and nonspecific tests, microbial inhibition effect, and residue persistence in milk

    Directory of Open Access Journals (Sweden)

    Livia Cavaletti Corrêa da Silva

    2015-09-01

    Full Text Available Milk fraud has been a recurring problem in Brazil; thus, it is important to know the effect of the most frequently used preservatives and neutralizing substances as well as the detection capability of official tests. The objective of this study was to evaluate the analytical sensitivity of the tests described in legislation and of nonspecific microbial inhibition tests, and to investigate the effect of such substances on microbial growth inhibition and the persistence of detectable residues after 24/48 h of refrigeration. Batches of raw milk, free from any contaminant, were divided into aliquots and mixed with different concentrations of formaldehyde, hydrogen peroxide, sodium hypochlorite, chlorine, chlorinated alkaline detergent, or sodium hydroxide. The analytical sensitivity of the official tests was 0.005%, 0.003%, and 0.013% for formaldehyde, hydrogen peroxide, and hypochlorite, respectively. Chlorine and chlorinated alkaline detergent were not detected by regulatory tests. In the tests for neutralizing substances, sodium hydroxide could not be detected when acidity was accurately neutralized. The yogurt culture test gave results similar to those obtained by official tests for the detection of specific substances. Concentrations of 0.05% of formaldehyde, 0.003% of hydrogen peroxide and 0.013% of sodium hypochlorite significantly reduced (P

  20. Strand Invasion Based Amplification (SIBA®): a novel isothermal DNA amplification technology demonstrating high specificity and sensitivity for a single molecule of target analyte.

    Science.gov (United States)

    Hoser, Mark J; Mansukoski, Hannu K; Morrical, Scott W; Eboigbodin, Kevin E

    2014-01-01

    Isothermal nucleic acid amplification technologies offer significant advantages over polymerase chain reaction (PCR) in that they do not require thermal cycling or sophisticated laboratory equipment. However, non-target-dependent amplification has limited the sensitivity of isothermal technologies, and complex probes are usually required to distinguish between non-specific and target-dependent amplification. Here, we report a novel isothermal nucleic acid amplification technology, Strand Invasion Based Amplification (SIBA). SIBA technology is resistant to non-specific amplification, is able to detect a single molecule of target analyte, and does not require target-specific probes. The technology relies on the recombinase-dependent insertion of an invasion oligonucleotide (IO) into the double-stranded target nucleic acid. The duplex regions peripheral to the IO insertion site dissociate, thereby enabling target-specific primers to bind. A polymerase then extends the primers onto the target nucleic acid, leading to exponential amplification of the target. The primers are not substrates for the recombinase and are, therefore, unable to extend the target template in the absence of the IO. The inclusion of 2'-O-methyl RNA in the IO ensures that it is not extendible and does not take part in the extension of the target template. These characteristics ensure that the technology is resistant to non-specific amplification since primer dimers or mis-priming cannot amplify exponentially. Consequently, SIBA is highly specific and able to distinguish closely-related species with single-molecule sensitivity in the absence of complex probes or sophisticated laboratory equipment. Here, we describe this technology in detail and demonstrate its use for the detection of Salmonella.

  1. Strand Invasion Based Amplification (SIBA®): a novel isothermal DNA amplification technology demonstrating high specificity and sensitivity for a single molecule of target analyte.

    Directory of Open Access Journals (Sweden)

    Mark J Hoser

    Full Text Available Isothermal nucleic acid amplification technologies offer significant advantages over polymerase chain reaction (PCR) in that they do not require thermal cycling or sophisticated laboratory equipment. However, non-target-dependent amplification has limited the sensitivity of isothermal technologies, and complex probes are usually required to distinguish between non-specific and target-dependent amplification. Here, we report a novel isothermal nucleic acid amplification technology, Strand Invasion Based Amplification (SIBA). SIBA technology is resistant to non-specific amplification, is able to detect a single molecule of target analyte, and does not require target-specific probes. The technology relies on the recombinase-dependent insertion of an invasion oligonucleotide (IO) into the double-stranded target nucleic acid. The duplex regions peripheral to the IO insertion site dissociate, thereby enabling target-specific primers to bind. A polymerase then extends the primers onto the target nucleic acid, leading to exponential amplification of the target. The primers are not substrates for the recombinase and are, therefore, unable to extend the target template in the absence of the IO. The inclusion of 2'-O-methyl RNA in the IO ensures that it is not extendible and does not take part in the extension of the target template. These characteristics ensure that the technology is resistant to non-specific amplification since primer dimers or mis-priming cannot amplify exponentially. Consequently, SIBA is highly specific and able to distinguish closely-related species with single-molecule sensitivity in the absence of complex probes or sophisticated laboratory equipment. Here, we describe this technology in detail and demonstrate its use for the detection of Salmonella.

  2. Specific factors influencing information system/information and communication technology sourcing strategies in healthcare facilities.

    Science.gov (United States)

    Potančok, Martin; Voříšek, Jiří

    2016-09-01

    Healthcare facilities use a number of information system/information and communication technologies. Each healthcare facility faces a need to choose sourcing strategies most suitable to ensure provision of information system/information and communication technology services, processes and resources. Currently, it is possible to observe an expansion of sourcing possibilities in healthcare informatics, which creates new requirements for sourcing strategies. Thus, the aim of this article is to identify factors influencing information system/information and communication technology sourcing strategies in healthcare facilities. The identification was based on qualitative research, namely, a case study. This study provides a set of internal and external factors with their impact levels. The findings also show that not enough attention is paid to these factors during decision-making. © The Author(s) 2015.

  3. Two-step source tracing strategy of Yersinia pestis and its historical epidemiology in a specific region.

    Directory of Open Access Journals (Sweden)

    Yanfeng Yan

    Full Text Available Source tracing of pathogens is critical for the control and prevention of infectious diseases. Genome sequencing by high-throughput technologies is now feasible and popular, leading to a burst of deciphered bacterial genome sequences. Utilizing this flood of genomic data for source tracing of pathogens in outbreaks is promising, but challenging as well. Here, we used Yersinia pestis genomes from a plague outbreak at Xinghai county of China in 2009 as an example to develop a simple two-step strategy for rapid source tracing of the outbreak. The first step was to define the phylogenetic position of the outbreak strains in a whole-species tree, and the next step was to provide a detailed relationship across the outbreak strains and their suspected relatives. Through this strategy, we observed that the Xinghai plague outbreak was caused by Y. pestis circulating in the local plague focus, where the majority of historical plague epidemics in the Qinghai-Tibet Plateau may originate from. The analytical strategy developed here will be of great help in fighting against outbreaks of emerging infectious diseases, by pinpointing the source of pathogens rapidly with genomic epidemiological data and microbial forensics information.

  4. Reducing mortality risk by targeting specific air pollution sources: Suva, Fiji.

    Science.gov (United States)

    Isley, C F; Nelson, P F; Taylor, M P; Stelcer, E; Atanacio, A J; Cohen, D D; Mani, F S; Maata, M

    2018-01-15

    Health implications of air pollution vary depending upon pollutant sources. This work determines the value, in terms of reduced mortality, of reducing ambient particulate matter (PM2.5: effective aerodynamic diameter 2.5 μm or less) concentrations due to different emission sources. Suva, a Pacific Island city with substantial input from combustion sources, is used as a case study. Elemental concentrations were determined by ion beam analysis for PM2.5 samples from Suva spanning one year. Sources of PM2.5 were quantified by positive matrix factorisation. A review of recent literature was carried out to delineate the mortality risk associated with these sources. Risk factors were then applied for Suva, to calculate the possible mortality reduction that may be achieved through reduction in pollutant levels. Higher risk ratios for black carbon and sulphur resulted in mortality predictions for PM2.5 from fossil fuel combustion, road vehicle emissions and waste burning that surpass predictions for these sources based on the health risk of PM2.5 mass alone. Predicted mortality for Suva from fossil fuel smoke exceeds the national toll from road accidents in Fiji. The greatest benefit for Suva, in terms of reduced mortality, is likely to be accomplished by reducing emissions from fossil fuel combustion (diesel), vehicles and waste burning. Copyright © 2017. Published by Elsevier B.V.
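
    The source apportionment step described above can be illustrated compactly. The study used positive matrix factorisation (PMF), which minimises uncertainty-weighted residuals; the sketch below substitutes scikit-learn's plain non-negative matrix factorisation as a simplified analogue, and the element list, sample count and number of factors are all assumptions, with synthetic data standing in for the Suva measurements.

```python
# Simplified analogue of PMF-style source apportionment: decompose the
# concentration matrix X into G (contributions) x F (source profiles).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Hypothetical data: 365 daily PM2.5 samples x 8 measured elements
# (e.g. BC, S, Na, Cl, Fe, Zn, Pb, K), concentrations in ng/m^3.
X = rng.gamma(shape=2.0, scale=50.0, size=(365, 8))

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # source contributions per sample (365 x 4)
F = model.components_        # source profiles (4 x 8)

# Mean reconstructed mass attributed to each source over the year.
source_mass = G.mean(axis=0) * F.sum(axis=1)
for i, s in enumerate(source_mass / source_mass.sum()):
    print(f"source {i}: {s:.1%} of reconstructed PM2.5 mass")
```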

  5. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    International Nuclear Information System (INIS)

    Caballero-Casero, N.; Lunar, L.; Rubio, S.

    2016-01-01

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent, and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC-MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. Main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk of mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied and are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  6. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Casero, N.; Lunar, L.; Rubio, S., E-mail: qa1rubrs@uco.es

    2016-02-18

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent, and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC-MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. Main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk of mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied and are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  7. Analytical model of nanoscale junctionless transistors towards controlling of short channel effects through source/drain underlap and channel thickness engineering

    Science.gov (United States)

    Roy, Debapriya; Biswas, Abhijit

    2018-01-01

    We develop a 2D analytical subthreshold model for nanoscale double-gate junctionless transistors (DGJLTs) with gate-source/drain underlap. The model is validated against a well-calibrated TCAD simulation deck obtained by comparison with experimental data in the literature. To analyze and control short-channel effects, we calculate the threshold voltage, drain-induced barrier lowering (DIBL) and subthreshold swing of DGJLTs using our model and compare them with the corresponding simulated values at a channel length of 20 nm, with channel thickness tSi ranging from 5 to 10 nm, gate-source/drain underlap (LSD) values of 0-7 nm and source/drain doping concentrations (NSD) ranging from 5 to 12 × 10¹⁸ cm⁻³. As tSi is reduced from 10 to 5 nm, DIBL drops from 42.5 to 0.42 mV/V at NSD = 10¹⁹ cm⁻³ and LSD = 5 nm, in contrast to a decrease from 71 to 4.57 mV/V without underlap. For a lower tSi, DIBL increases marginally with increasing NSD. The subthreshold swing falls more rapidly with thinning of the channel than with increasing LSD or decreasing NSD.
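
    The two short-channel metrics quoted above have simple operational definitions, and the arithmetic can be checked directly. The threshold voltages and bias points below are hypothetical illustrations, not values from the paper's analytical model.

```python
import math

def dibl_mv_per_v(vth_at_low_vd, vth_at_high_vd, vd_low=0.05, vd_high=1.0):
    """Drain-induced barrier lowering in mV/V: threshold-voltage
    roll-off per volt of additional drain bias."""
    return (vth_at_low_vd - vth_at_high_vd) / (vd_high - vd_low) * 1e3

# e.g. Vth = 0.42 V at Vd = 0.05 V and 0.38 V at Vd = 1.0 V gives
# ~42.1 mV/V, the same order as the 42.5 mV/V quoted above.
print(dibl_mv_per_v(0.42, 0.38))

def subthreshold_swing_mv_per_dec(vgs1, id1, vgs2, id2):
    """Subthreshold swing in mV/decade from two points on the
    subthreshold I-V characteristic."""
    return (vgs2 - vgs1) / (math.log10(id2) - math.log10(id1)) * 1e3

print(subthreshold_swing_mv_per_dec(0.20, 1e-9, 0.30, 1e-8))  # 100 mV/dec
```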

  8. Surface-specific analytical techniques

    International Nuclear Information System (INIS)

    Riviere, J.C.

    1982-01-01

    The following methods are discussed: electron excitation (Auger electron spectroscopy; scanning Auger microscopy; electron energy-loss spectroscopy; appearance potential spectroscopy; electron-induced luminescence; electron-stimulated desorption); photon excitation (X-ray photoelectron spectroscopy; X-ray excited Auger electron spectroscopy; synchrotron radiation photoelectron spectroscopy; ultraviolet photoelectron spectroscopy; photoelectron spectromicroscopy; ellipsometry); ion excitation (ion-excited Auger electron spectroscopy; proton-excited Auger electron spectroscopy; ion neutralization spectroscopy; ion beam spectrochemical analysis; glow discharge optical spectroscopy; static secondary ion mass spectrometry; ion scattering spectroscopy; glow discharge mass spectrometry); resolution and sensitivity. (U.K.)

  9. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
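
    As a flavor of what one cell of such a notebook might contain, the sketch below applies a standard robust-threshold spike detection to a synthetic trace. It is a generic illustration, not the authors' published pipeline; the sampling rate, injected spike amplitude and threshold multiplier are arbitrary choices, and the noise-level estimator is the common median-absolute-deviation rule.

```python
import numpy as np

fs = 20_000                                   # sampling rate, Hz (assumed)
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, fs * 2)          # 2 s of synthetic noise
spike_idx = rng.choice(trace.size, 40, replace=False)
trace[spike_idx] += 8.0                       # inject synthetic "spikes"

# Robust noise estimate: sigma ~ median(|x|) / 0.6745, then threshold
# at k * sigma (k = 4 here).
sigma = np.median(np.abs(trace)) / 0.6745
threshold = 4.0 * sigma

# Upward threshold crossings mark candidate spike onsets.
crossings = np.flatnonzero((trace[1:] > threshold) & (trace[:-1] <= threshold))
print(f"detected {crossings.size} events at threshold {threshold:.2f}")
```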

  10. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  11. Seaweed as source of energy. 1: effect of a specific bacterial strain on biogas production

    Energy Technology Data Exchange (ETDEWEB)

    Sreenivasa R.P.; Tarwade, S.J.; Sarma, K.S.R.

    1980-09-01

    Only certain marine bacteria are capable of digesting the special seaweed polysaccharides, agar and alginic acid; they can bring about the biodegradation of these substances and utilise them as a carbon source, producing organics that methane bacteria in turn convert to methane. When such a bacterial strain was used in conjunction with cow dung as a source of methane bacteria in a seaweed digester, production of biogas from seaweed was accelerated. Adding a small amount of Ulva to the seaweed digester increased the output of gas. (Refs. 4).

  12. Use of GSR particle analysis program on an analytical SEM to identify sources of emission of airborne particles

    International Nuclear Information System (INIS)

    Chan, Y.C.; Trumper, J.; Bostrom, T.

    2002-01-01

    Full text: High concentrations of airborne particles, in particular PM10 (particulate matter <10 μm ...) ... but has been little used in Australia for airborne particulates. Two sets of 15 mm PM10 samples were collected in March and April 2000 from two sites in Brisbane, one within a suburb and one next to an arterial road. The particles were collected directly onto double-sided carbon tapes with a cascade impactor attached to a high-volume PM10 sampler. The carbon tapes were analysed in a JEOL 840 SEM equipped with a Be-window energy-dispersive X-ray detector and a Moran Scientific microanalysis system. An automated Gun Shot Residue (GSR) program was used together with backscattered electron imaging to characterise and analyse individual particulates. About 6,000 particles in total were analysed for each set of impactor samples. Owing to limitations of useful pixel size, only particles larger than about 0.5 μm could be analysed. The size, shape and estimated elemental composition (from Na to Pb) of the particles were subjected to non-hierarchical cluster analysis, and the characteristics of the clusters were related to their possible sources of emission. Both sample sets produced similar particle clusters. The particles could be classified into three main categories: non-spherical (58% of the total number of analysed particles, shape factor >1.1), spherical (15%) and 'carbonaceous' (27%, i.e. with an unexplained percentage of elemental mass >75%). Non-spherical particles were mainly sea salt and soil particles, with small amounts of iron, lead and mineral dust. The spherical particles were mainly sea salt particles and fly ash, with small amounts of iron, lead and secondary sulphate dust. The carbonaceous particles included carbon material mixed with secondary aerosols, roadside dust, sea salt or industrial dust. The arterial road sample also contained more roadside dust and less secondary aerosol than the suburb sample. Current limitations with this method are the minimum particle size
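
    The non-hierarchical cluster analysis step is easy to reproduce in outline. The sketch below groups synthetic per-particle features (diameter, shape factor, elemental mass fractions) with k-means; the feature distributions, element list and cluster count are placeholders rather than the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 6000  # roughly the number of particles analysed per sample set

features = np.column_stack([
    rng.lognormal(0.5, 0.6, n),           # diameter, um
    1.0 + rng.exponential(0.3, n),        # shape factor (1 = spherical)
    rng.dirichlet([2, 2, 1, 1, 0.2], n),  # mass fractions: Na, Cl, Si, Fe, Pb
])

X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Cluster sizes hint at the dominant particle classes (sea salt, soil,
# fly ash, ...), which are then matched to candidate emission sources.
print(np.bincount(labels) / n)
```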

  13. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  14. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  15. Attributing the human disease burden of foodborne infections to specific sources.

    NARCIS (Netherlands)

    Pires, S.M.; Evers, E.G.; van Pelt, W.; Ayers, T.; Scallan, E.; Angulo, F.J.; Havelaar, A.H.; Hald, T.

    2009-01-01

    Foodborne diseases are an important cause of human illness worldwide. Humans acquire these infections from a variety of sources and routes of transmission. Many efforts have been made in the last decades to prevent and control foodborne diseases, particularly foodborne zoonoses. However, information

  16. Premature Deaths Attributed to Source-Specific BC Emissions in Six Urban US Regions

    Czech Academy of Sciences Publication Activity Database

    Turner, M.D.; Henze, D.K.; Capps, S.; Hakami, A.; Zhao, S.; Resler, Jaroslav; Carmichael, G.; Stanier, C.; Baek, J.; Sandu, A.; Russell, A.G.; Nenes, A.; Pinder, R.; Napelenok, S.; Bash, J.; Percell, P.; Chai, T.

    2015-01-01

    Roč. 10, č. 11 (2015), Article 114014 ISSN 1748-9326 Grant - others: NASA Applied Sciences Program (US) NNX09AN77G Institutional support: RVO:67985807 Keywords: air quality * health impact * source apportionment * adjoint * particulate matter * black carbon Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 4.134, year: 2015

  17. Specific power reduction of an ion source due to heating and cathode sputtering of electrodes

    International Nuclear Information System (INIS)

    Hamilton, G.U.; Semashko, N.N.

    The potentialities and limitations of the water-cooled ion-optical system of an ion source designed for continuous operation of a high-power neutral beam injector are determined. The following problems are analyzed: thermal expansion and deformation of the electrodes, electrode sputtering as a result of bombardment, and heat transfer to a turbulent flow of water.

  18. Attributing the Human Disease Burden of Foodborne Infections to Specific Sources

    DEFF Research Database (Denmark)

    Pires, Sara Monteiro; Evers, Eric G.; van Pelt, Wilfrid

    2009-01-01

    Foodborne diseases are an important cause of human illness worldwide. Humans acquire these infections from a variety of sources and routes of transmission. Many efforts have been made in the last decades to prevent and control foodborne diseases, particularly foodborne zoonoses. However...

  20. Study of cold and hot sources in a research reactor. (Physics, specifications, operation, utilization)

    International Nuclear Information System (INIS)

    Safieh, J.

    1982-10-01

    A brief description of the reactor, sources and experimental channels (ORPHEE being taken as the example) is first given. The first part deals with the hot neutron source, essentially a graphite block brought to a temperature of 1500 K by nuclear heating. The present study focused on the determination, with the code MERCURE IV, of the heat sources generated in the graphite block. From these results the spatial distribution of temperatures has been calculated with two different methods. Mechanical and thermal stresses have been calculated for the hot points. Then, the outlet neutron spectrum is determined by means of the code APOLLO. Finally, the operation of the device is presented and the risks and safety measures are given. The second part deals with cold neutron sources, comprising mainly a cold moderator (liquid hydrogen at 20.4 K). The helium coolant circuit liquefies the hydrogen by means of heat exchange in a condenser. Cold neutron yield calculations are developed by means of the code THERMOS in plane and cylindrical geometries. The heat sources generated by nuclear radiation are calculated. A detailed description of the device and its coolant circuit is given, and a risk analysis is finally presented. The third part deals with the role of thermal, cold and hot neutrons in the study of matter and its dynamics. The technical means needed to obtain a monochromatic beam for diffraction experiments are recalled, emphasizing the advantages of these neutrons relative to X radiation. Cold neutron guides are then discussed. Finally, the efficiency of two neutron guides is calculated. 78 refs [fr]

  1. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  2. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman speaks without bias and prejudice for the public good; technical jargon with unclear definitions exists within the radioactive nomenclature; and the scientific community keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State health agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  3. Governance and asset specificity as facilitators and sources of innovation and value creation

    OpenAIRE

    Sunde, Per Anders

    2007-01-01

    Drawing on transaction cost theory and relational exchange theory, this dissertation examines the different performance effects of formal and relational governance mechanisms, project specific investments, and the interaction between governance and project specific investments in inter-firm innovation projects. The following four performance dimensions are studied: goal attainment, value creation potential for the customer and the contractor, and innovative performance. The model and hypo...

  4. Suitability Evaluation of Specific Shallow Geothermal Technologies Using a GIS-Based Multi Criteria Decision Analysis Implementing the Analytic Hierarchic Process

    Directory of Open Access Journals (Sweden)

    Francesco Tinti

    2018-02-01

    Full Text Available The exploitation potential of shallow geothermal energy is usually defined in terms of site-specific ground thermal characteristics. While true, this assumption limits the complexity of the analysis, since feasibility studies involve many other components that must be taken into account when calculating the effective market viability of a geothermal technology or the economic value of a shallow geothermal project. In addition, the results of a feasibility study are not simply the sum of the various factors since some components may be conflicting while others will be of a qualitative nature only. Different approaches are therefore needed to evaluate the suitability of an area for shallow geothermal installation. This paper introduces a new GIS platform-based multicriteria decision analysis method aimed at comparing as many different shallow geothermal relevant factors as possible. Using the Analytic Hierarchic Process Tool, a geolocalized Suitability Index was obtained for a specific technological case: the integrated technologies developed within the GEOTeCH Project. A suitability map for the technologies in question was drawn up for Europe.
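
    The Analytic Hierarchy Process step at the core of the method can be sketched in a few lines: criterion weights are the principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. The 4x4 matrix and its criteria below are invented for illustration and are not the GEOTeCH Project's actual comparisons.

```python
import numpy as np

# Pairwise comparisons for four hypothetical criteria, e.g. ground
# thermal properties, drilling cost, climate, heating/cooling demand.
A = np.array([
    [1,   3,   5,   1],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1,   2,   4,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue lambda_max
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()               # normalised criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.90                         # Saaty's random index RI = 0.90 for n = 4
print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # CR < 0.10 is conventionally acceptable
```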

  5. Final report of the inter institutional project ININ-CNSNS 'Source Terms specific for the CNLV'

    International Nuclear Information System (INIS)

    Anaya M, R.A.

    1991-02-01

    The purpose of the inter-institutional ININ-CNSNS project 'Source Terms Specific for the CNLV' is to implement the Source Term Code Package (STCP) on the ININ CYBER computer (CDC 180-830), to carry out the corresponding installation and operation tests using the sample-problem data, and finally to release the package once analysis of the results shows this to be appropriate. This report presents the results of simulating the sequences 'Loss of external power' (station blackout) and 'Total loss of AC power with failure of the RCIC and success of the HPCS', both with data from the Laguna Verde plant. (Author)

  6. Source-specific speciation profiles of PM2.5 for heavy metals and their anthropogenic emissions in China.

    Science.gov (United States)

    Liu, Yayong; Xing, Jia; Wang, Shuxiao; Fu, Xiao; Zheng, Haotian

    2018-08-01

    Heavy metals are of concern for their adverse effects on human health and their long-term burden on biogeochemical cycling in ecosystems. In this study, a provincial-level emission inventory of 13 heavy metals (V, Cr, Mn, Co, Ni, Cu, Zn, As, Cd, Sn, Sb, Ba and Pb) from 10 anthropogenic sources was developed for China, based on the 2015 national emission inventory of primary particulate matter and source-category-specific speciation profiles collected from 50 previous studies measured in China. Uncertainties associated with the speciation profiles were also evaluated. Our results suggest that total emissions of the 13 heavy metals in China were about 58,000 tons in 2015. Iron production is the dominant source of heavy metals, contributing 42% of total emissions. Emissions of heavy metals vary significantly at the regional scale, with the largest amounts concentrated in northern and eastern China. In particular, high emissions of Cr, Co, Ni, As and Sb (contributing 8%-18% of the national emissions) are found in Shandong, which has a large industrial production capacity. Uncertainty analysis suggested that the implementation of province-specific source profiles in this study significantly reduced the emission uncertainties from (-89%, 289%) to (-99%, 91%), particularly for coal combustion. However, source profiles for industry sectors such as non-metallic mineral manufacturing are quite limited, resulting in relatively high uncertainty. High-resolution emission inventories of heavy metals are essential not only for studies of their distribution, deposition and transport, but also for the design of policies to redress critical atmospheric environmental hazards at local and regional scales. Detailed investigation of source-specific profiles in China is still needed to achieve more accurate estimates of heavy metals in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
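
    The inventory construction described above reduces, source by source, to multiplying primary-PM emissions by measured metal mass fractions. The sketch below shows that accounting step with invented numbers; the actual study used the 2015 national PM inventory and profiles from 50 Chinese measurement studies.

```python
# Hypothetical primary PM2.5 emissions (tonnes/yr) and metal mass
# fractions of emitted PM2.5 for two source categories.
pm_emissions_t = {
    "iron_production": 1_200_000,
    "coal_combustion": 2_500_000,
}
profiles = {
    "iron_production": {"Zn": 4.0e-3, "Pb": 1.5e-3, "Cr": 6.0e-4},
    "coal_combustion": {"Zn": 8.0e-4, "Pb": 5.0e-4, "Cr": 3.0e-4},
}

# Speciated emission = PM emission x mass fraction, summed over sources.
metal_emissions = {}
for source, pm in pm_emissions_t.items():
    for metal, frac in profiles[source].items():
        metal_emissions[metal] = metal_emissions.get(metal, 0.0) + pm * frac

for metal, tonnes in sorted(metal_emissions.items()):
    print(f"{metal}: {tonnes:,.0f} t/yr")
```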

  7. Premature deaths attributed to source-specific BC emissions in six urban US regions

    International Nuclear Information System (INIS)

    Turner, Matthew D; Henze, Daven K; Capps, Shannon L; Hakami, Amir; Zhao, Shunliu; Resler, Jaroslav; Carmichael, Gregory R; Stanier, Charles O; Baek, Jaemeen; Sandu, Adrian; Russell, Armistead G; Nenes, Athanasios; Pinder, Rob W; Napelenok, Sergey L; Bash, Jesse O; Percell, Peter B; Chai, Tianfeng

    2015-01-01

    Recent studies have shown that exposure to particulate black carbon (BC) has significant adverse health effects and may be more detrimental to human health than exposure to PM2.5 as a whole. Mobile source BC emission controls, mostly on diesel-burning vehicles, have successfully decreased mobile source BC emissions to less than half of what they were 30 years ago. Quantification of the benefits of previous emissions controls conveys the value of these regulatory actions and provides a method by which future control alternatives could be evaluated. In this study we use the adjoint of the Community Multiscale Air Quality (CMAQ) model to estimate highly-resolved spatial distributions of benefits related to emission reductions for six urban regions within the continental US. Emissions from outside each of the six chosen regions account for between 7% and 27% of the premature deaths attributed to exposure to BC within the region. While we estimate that nonroad mobile and onroad diesel emissions account for the largest number of premature deaths attributable to exposure to BC, onroad gasoline is shown to have more than double the benefit per unit emission relative to that of nonroad mobile and onroad diesel. Within the region encompassing New York City and Philadelphia, reductions in emissions from large industrial combustion sources that are not classified as EGUs (i.e., non-EGU) are estimated to have up to triple the benefits per unit emission relative to reductions to onroad diesel sectors, and provide similar benefits per unit emission to that of onroad gasoline emissions in the region. While onroad mobile emissions have been decreasing in the past 30 years and a majority of vehicle emission controls that regulate PM focus on diesel emissions, our analysis shows the most efficient target for stricter controls is actually onroad gasoline emissions. (letter)

  8. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely. [fr]

  9. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  10. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.
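
    Survalytics itself is an Android (Java) module; the sketch below only illustrates, in Python with boto3, the storage pattern the abstract describes: writing one survey response as an item to a DynamoDB table. The table name, key schema and attribute names are hypothetical.

```python
import time
import uuid

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("survalytics_responses")  # hypothetical table

# One experience-sampling answer, keyed by a random response id
# (assumed partition key) and stamped with the collection time.
table.put_item(Item={
    "response_id": str(uuid.uuid4()),
    "device_id": "anonymized-device-123",
    "question_id": "daily_glucose_mg_dl",
    "answer": "104",
    "epoch_s": int(time.time()),
})
```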

  11. Sources of pre-analytical variations in yield of DNA extracted from blood samples: analysis of 50,000 DNA samples in EPIC.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

    Full Text Available The European Prospective Investigation into Cancer and nutrition (EPIC) is a long-term, multi-centric prospective study in Europe investigating the relationships between cancer and nutrition. This study has served as a basis for a number of Genome-Wide Association Studies (GWAS) and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples have been extracted using an automated DNA extraction platform. Here we have evaluated the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as center of subject recruitment, age, gender, body-mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were observed with degree of haemolysis and with center of subject recruitment. Age, gender, body-mass index, cancer case or control status and tobacco consumption also significantly impacted DNA yield. Feedback from laboratories which have analyzed DNA with different SNP genotyping technologies demonstrates that the vast majority of samples (approximately 88%) performed adequately in different types of assays. To our knowledge this study is the largest to date to evaluate the sources of pre-analytical variations in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.

  12. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.
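
    The model lends itself to a tiny discrete-time simulation: in each slot, every Bernoulli source emits a cell with probability p, and the multiplexer serves one cell per slot. All parameters below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sources, p, slots = 8, 0.1, 100_000   # 8 sources, P(cell) = 0.1 per slot

queue, occupancy = 0, []
for _ in range(slots):
    queue += rng.binomial(n_sources, p)  # cells arriving this slot
    if queue > 0:
        queue -= 1                       # serve one cell per slot
    occupancy.append(queue)

print("offered load:", n_sources * p)           # 0.8 of capacity
print("mean queue length:", np.mean(occupancy))
```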

  13. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    Science.gov (United States)

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites reflect the water quality status of rivers, surface water quality management based on monitoring sections or sites can be effective. To improve the water quality of rivers, quantifying the contribution ratios of pollutant sources to a specific section is necessary. Because the physical and chemical processes of nutrient pollutants in water bodies are complex, it is difficult to compute these contribution ratios quantitatively. However, water quality models have proved to be effective tools to estimate surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Then, contribution ratios were analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with the pollutant load ratios of the different sources in the watershed, an analysis of contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, is more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.
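
    The accounting idea behind the added module, attributing the load seen at a downstream section to each upstream source, can be illustrated with a zero-out experiment on a toy first-order-decay transport model. QUAL2Kw itself resolves far more processes; the loads, travel times and decay rate below are invented.

```python
import math

# Hypothetical sources: (total-nitrogen load in kg/d, travel time to
# the monitored section in days); k is a first-order loss rate (1/d).
sources = {
    "tributary_A": (500.0, 2.0),
    "town_WWTP": (200.0, 1.0),
    "diffuse_farmland": (300.0, 3.0),
}
k = 0.15

def load_at_section(active):
    """Load reaching the section from the given set of active sources."""
    return sum(load * math.exp(-k * t)
               for name, (load, t) in sources.items() if name in active)

total = load_at_section(set(sources))
for name in sources:
    without = load_at_section(set(sources) - {name})  # zero out one source
    print(f"{name}: {(total - without) / total:.1%} of the load at the section")
```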

  14. Experimental research on specific activity of 24Na using Chinese reference man phantom irradiated by 252Cf neutrons source

    International Nuclear Information System (INIS)

    Wang Yuexing; Yang Yifang; Lu Yongjie; Zhang Jianguo; Xing Hongchuan

    2011-01-01

    Objective: To investigate the specific activity of 24Na per unit neutron fluence, A_B/Φ, in blood produced when the Chinese reference man is irradiated by a 252Cf neutron source, and to analyze the effect on it of neutrons scattered from the ground, walls and ceiling of the irradiation site. Methods: A 252Cf neutron source of 3×10⁸ n/s and an anthropomorphic phantom were used for the experiments. The phantom was made of a 4 mm thick perspex outer shell filled with a liquid tissue-equivalent substitute. The phantom dimensions fit the data of the Chinese reference man. The weight ratios of H, N, O and C in the substitute equal those of the reference man. The distances from the source to the long axis of the phantom were 1.1, 2.1, 3.1 and 4.1 m, respectively. Both the neutron source and the xiphisternum position of the phantom were 1.6 m above the floor. Results: The average specific activity of 24Na per unit neutron fluence was related to the irradiation distance d, and its maximum value, A_B/ΦM, deduced from the experimental data, was about 1.85×10⁻⁷ Bq·cm²·g⁻¹. Conclusions: The A_B/ΦM corresponds to that of a phantom irradiated by plane-parallel beams, and the value is about 3% higher than that reported for the BOMAB phantom in the literature. It was shown that floor- and wall-scattered neutrons in the irradiation site contribute significantly to the specific activity of 24Na, but relatively little to the induced neutron doses. Consequently, when the specific activity of 24Na is used to assess the accidental neutron dose received by an individual, the contribution of scattered neutrons at the accident site will lead the dose to be overestimated and needs to be corrected. (authors)

  15. An analytic uranium sources model

    International Nuclear Information System (INIS)

    Singer, C.E.

    2001-01-01

    This document presents a method for estimating uranium resources as a continuous function of extraction costs and describing the uncertainty in the resulting fit. The estimated functions provide convenient extrapolations of currently available data on uranium extraction cost and can be used to predict the effect of resource depletion on future uranium supply costs. As such, they are a useful input for economic models of the nuclear energy sector. The method described here pays careful attention to minimizing built-in biases in the fitting procedure and defines ways to describe the uncertainty in the resulting fits in order to render the procedure and its results useful to the widest possible variety of potential users. (author)
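
    The kind of fit the document describes can be sketched with an ordinary least-squares curve fit. The power-law form, the four data points and the reading of the covariance matrix below are all assumptions for illustration; they are not the document's actual model, data or bias-minimizing procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

cost = np.array([40.0, 80.0, 130.0, 260.0])   # extraction cost, $/kgU
resource = np.array([0.6, 2.0, 3.5, 7.0])     # Mt U recoverable below cost

def power_law(c, a, b):
    return a * c**b

(a, b), cov = curve_fit(power_law, cost, resource, p0=(0.1, 1.0))
perr = np.sqrt(np.diag(cov))                  # 1-sigma parameter errors
print(f"resource(c) ~ {a:.3g} * c^{b:.2f}, sigma = {perr.round(3)}")

# Extrapolated resource available below 200 $/kgU under this fit.
print(power_law(200.0, a, b))
```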

  16. Source-specific social support and circulating inflammatory markers among white-collar employees.

    Science.gov (United States)

    Nakata, Akinori; Irie, Masahiro; Takahashi, Masaya

    2014-06-01

    Despite known beneficial effects of social support on cardiovascular health, the pathway through which sources of support (supervisor, coworkers, family/friends) influence inflammatory markers is not completely understood. We investigated the independent and moderating associations between social support and inflammatory markers. A total of 137 male white-collar employees underwent a blood draw for measurement of high-sensitivity C-reactive protein (hs-CRP), interleukin-6 (IL-6), tumor necrosis factor alpha (TNF-α), monocyte and leukocyte counts, and completed a questionnaire on social support. Multivariable linear regression analyses controlling for covariates revealed that supervisor support was inversely associated with IL-6 (β = -0.24, p ...) markers. Social support from the immediate supervisor may be a potential mechanism through which social support exerts beneficial effects on inflammatory markers in working men.

  17. Seaweed as source of energy. I: effect of a specific bacterial strain on biogas production

    Energy Technology Data Exchange (ETDEWEB)

    Rao, P.S.; Tarwade, S.J.; Sarma, K.S.R.

    1980-01-01

    Biogas was produced from seaweed by making use of alginate-digesting marine bacteria that were isolated from decomposing seaweed and can digest seaweed carbohydrates (agar and alginic acid). Laboratory digesters containing 100 g seaweed were inoculated with 50 mL broth cultures of different seaweed-derived bacterial strains, and the maximum amount of degradation obtained was 28% (compared with 13% for a bacteria-free digestion). Cow dung was added as a source of methanogenic bacteria, and the amount of biogas produced was more than double the amount obtained when seaweed and cow dung were digested in the absence of the seaweed-derived bacteria. Adding a small amount of Ulva to the seaweed digester increased the production of biogas.

  18. Valve-specific, analytic-phenomenological modelling of spray dispersion in zero-dimensional simulation; Ventilspezifische, analytisch-phaenomenologische Modellierung der Sprayausbreitung fuer die nulldimensionale Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schuerg, F.; Arndt, S. [Robert Bosch GmbH, Stuttgart (Germany); Weigand, B. [Stuttgart Univ. (Germany). Inst. fuer Thermodynamik der Luft- und Raumfahrt

    2007-07-01

    Spray-guided combustion processes for gasoline direct injection offer a great fuel saving potential. The quality of mixture formation has a direct impact on combustion and emissions, and ultimately on the technical feasibility of the consumption advantage. Therefore, it is very important to select the optimal mixture formation strategy. A systematic optimization of the mixture formation process based on experiments or three-dimensional computational fluid dynamics requires tremendous effort. An efficient alternative is the application-oriented, zero-dimensional numerical simulation of mixture formation. With a systemic model formulation in terms of global thermodynamic and fluid mechanical balance equations, the presented simulation model considers all relevant aspects of the mixture formation process. A comparison with measurements in a pressure/temperature chamber using laser-induced exciplex fluorescence tomography revealed a very satisfactory agreement between simulation and experiment. The newly developed, analytic-phenomenological spray propagation model precisely captures the injector-specific mixture formation characteristics of an annular-orifice injector in terms of penetration and volume. Vaporization rate and mean air/fuel ratio, as the key quantities of mixture formation, are correctly reproduced. Thus, the simulation model is suited to numerically assess the quality and to optimize the strategy of mixture formation. (orig.)
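
    For readers unfamiliar with zero-dimensional spray modelling, the sketch below shows the general flavour of such models using a widely cited Hiroyasu-type penetration correlation; it is not the valve-specific, analytic-phenomenological model developed in the paper, and all constants are generic assumptions:

```python
# Minimal 0-D sketch of spray tip penetration: linear growth before
# breakup, ~t^0.5 afterwards (Hiroyasu-Arai-style correlation).
# Pressures, densities and the nozzle diameter are illustrative only.
import math

def penetration(t, dp=10e6, rho_f=750.0, rho_a=20.0, d0=150e-6):
    """Spray tip penetration [m] at time t [s] after start of injection."""
    v0 = 0.39 * math.sqrt(2.0 * dp / rho_f)           # initial jet velocity
    t_b = 28.65 * rho_f * d0 / math.sqrt(rho_a * dp)  # breakup time
    if t < t_b:
        return v0 * t
    return 2.95 * (dp / rho_a) ** 0.25 * math.sqrt(d0 * t)

for t_us in (50, 100, 500, 1000):
    print(f"t = {t_us:5d} us  ->  S = {penetration(t_us * 1e-6) * 1e3:.1f} mm")
```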

  19. The Source and Impact of Specific Parameters that Enhance Well-Being in Daily Life.

    Science.gov (United States)

    Stewart, William C; Reynolds, Kelly E; Jones, Lydia J; Stewart, Jeanette A; Nelson, Lindsay A

    2016-08-01

    The purpose of this study was to review four parameters (forgiveness, gratitude, hope and empathy) frequently noted when evaluating well-being. We reviewed clinical studies from 1966 to the present and included 63 articles. All four parameters were shown to generally improve an individual's well-being. These parameters demonstrated a positive influence on more specific societal issues, including improvement in social relationships, delinquent behavior and physical health. These parameters were generally derived from training and religion. This study suggests that these parameters may improve general well-being, promote pro-social and positive relational behavior, and demonstrate positive health effects.

  20. Developing a model of source-specific interpersonal conflict in health care.

    Science.gov (United States)

    Guidroz, Ashley M; Wang, Mo; Perez, Lisa M

    2012-02-01

    Nurses work in complex social environments, and conflict may arise with fellow coworkers, their supervisor, physicians or the patients and families they care for. Although much research has documented the negative effects of conflict on nurses, no research to date has examined the comparative effect that conflict from all four sources can have on nurses. The purpose of this study is to test a model of workplace conflict in which the negative effects of conflict on nurses are experienced via emotional exhaustion. We tested the mediator model by analysing cross-sectional data collected within one hospital (N1=182) and cross-validating those results in a second hospital (N2=161). The pattern of results was largely consistent across the two samples, indicating support for a mediated model of workplace conflict with physicians, supervisors and patients. Conflict with other nurses, however, did not have a relationship with either emotional exhaustion or other personal and organizational outcomes. The theoretical and practical implications of the current findings, as well as the limitations and future research directions, are discussed. Copyright © 2011 John Wiley & Sons, Ltd.

  1. Overview of plant specific source terms and their impact on risk

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    2004-01-01

    Probabilistic risk assessment and safety assessment focus on systems and measures to prevent core meltdown, and integrate many aspects of design and operation. They provide a mapping of initiating-event frequencies onto plant damage states through plant systems analysis, utilize fault tree and event tree logic models, and may include 'external event' analyses such as fire, flood, wind and seismic events. Percent contributions of sequences to the core damage frequency are shown for the following plants, taken as examples: ZION, EDISON, OCONEE 3, SEABROOK, SIZEWELL B, MILLSTONE 3, RINGHALS 2. The presentation includes a comparison of the following initiating event frequencies: loss of off-site power; small LOCA; large LOCA; steam generator tube rupture; loss of feedwater; turbine trip; reactor trip. Consequence analysis deals with dispersion and depletion of radioactivity in the atmosphere, health effects, and factors in the off-site emergency plan, analyzed with codes that address the weather conditions; it provides mapping of source terms and risk diagrams for early fatalities and for latent cancer fatalities
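
    As a toy illustration of how a PSA maps initiating-event frequencies onto core damage frequency, the sketch below multiplies each initiator frequency by an assumed conditional mitigation-failure probability and reports percent contributions; all numbers are invented and unrelated to the plants listed above:

```python
# Toy event-tree calculation: CDF = sum over initiators of
# (initiating-event frequency) * (conditional failure probability).
# Frequencies and probabilities are illustrative placeholders.
initiators = {  # event: (frequency [1/yr], mitigation failure probability)
    "loss of off-site power":       (5e-2, 4e-4),
    "small LOCA":                   (3e-3, 2e-3),
    "steam generator tube rupture": (7e-3, 1e-3),
}
cdf = sum(f * p for f, p in initiators.values())
for name, (f, p) in initiators.items():
    print(f"{name:30s} {100 * f * p / cdf:5.1f} % of CDF")
print(f"total CDF ~ {cdf:.2e} per reactor-year")
```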

  2. The Application of the Analytic Hierarchy Process and a New Correlation Algorithm to Urban Construction and Supervision Using Multi-Source Government Data in Tianjin

    Directory of Open Access Journals (Sweden)

    Shaoyi Wang

    2018-02-01

    As the era of big data approaches, big data has attracted increasing amounts of attention from researchers. Various types of studies have been conducted and these studies have focused particularly on the management, organization, and correlation of data and calculations using data. Most studies involving big data address applications in scientific, commercial, and ecological fields. However, the application of big data to government management is also needed. This paper examines the application of multi-source government data to urban construction and supervision in Tianjin, China. The analytic hierarchy process and a new approach called the correlation degree algorithm are introduced to calculate the degree of correlation between different approval items in one construction project and between different construction projects. The results show that more than 75% of the construction projects and their approval items are highly correlated. The results of this study suggest that most of the examined construction projects are well supervised, have relatively high probabilities of satisfying the relevant legal requirements, and observe their initial planning schemes.
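
    The weighting step of the analytic hierarchy process mentioned above can be sketched in a few lines: priority weights are the principal eigenvector of a pairwise comparison matrix, followed by a consistency check. The matrix below is a made-up example, not data from the Tianjin study:

```python
# Hedged AHP sketch: derive priority weights from a pairwise comparison
# matrix via its principal eigenvector and check consistency.
import numpy as np

A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])    # pairwise importance judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # Saaty random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```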

  3. Molecular property diagnostic suite (MPDS): Development of disease-specific open source web portals for drug discovery.

    Science.gov (United States)

    Nagamani, S; Gaur, A S; Tanneeru, K; Muneeswaran, G; Madugula, S S; Consortium, Mpds; Druzhilovskiy, D; Poroikov, V V; Sastry, G N

    2017-11-01

    Molecular property diagnostic suite (MPDS) is a Galaxy-based open source drug discovery and development platform. MPDS web portals are designed for several diseases, such as tuberculosis, diabetes mellitus, and other metabolic disorders, specifically aimed to evaluate and estimate the drug-likeness of a given molecule. MPDS consists of three modules, namely data libraries, data processing, and data analysis tools which are configured and interconnected to assist drug discovery for specific diseases. The data library module encompasses vast information on chemical space, wherein the MPDS compound library comprises 110.31 million unique molecules generated from public domain databases. Every molecule is assigned with a unique ID and card, which provides complete information for the molecule. Some of the modules in the MPDS are specific to the diseases, while others are non-specific. Importantly, a suitably altered protocol can be effectively generated for another disease-specific MPDS web portal by modifying some of the modules. Thus, the MPDS suite of web portals shows great promise to emerge as disease-specific portals of great value, integrating chemoinformatics, bioinformatics, molecular modelling, and structure- and analogue-based drug discovery approaches.

  4. Detailed Source-Specific Molecular Composition of Ambient Aerosol Organic Matter Using Ultrahigh Resolution Mass Spectrometry and 1H NMR

    Directory of Open Access Journals (Sweden)

    Amanda S. Willoughby

    2016-06-01

    Organic aerosols (OA) are universally regarded as an important component of the atmosphere that have far-ranging impacts on climate forcing and human health. Many of these impacts are related to OA molecular characteristics. Despite the acknowledged importance, current uncertainties related to the source apportionment of molecular properties and environmental impacts make it difficult to confidently predict the net impacts of OA. Here we evaluate the specific molecular compounds as well as bulk structural properties of total suspended particulates in ambient OA collected from key emission sources (marine, biomass burning, and urban) using ultrahigh resolution mass spectrometry (UHR-MS) and proton nuclear magnetic resonance spectroscopy (1H NMR). UHR-MS and 1H NMR show that OA within each source is structurally diverse, and the molecular characteristics are described in detail. Principal component analysis (PCA) revealed that (1) aromatic nitrogen species are distinguishing components for these biomass burning aerosols; (2) these urban aerosols are distinguished by having formulas with high O/C ratios and fewer aromatic and condensed aromatic formulas; and (3) these marine aerosols are distinguished by lipid-like compounds of likely marine biological origin. This study provides a unique qualitative approach for enhancing the chemical characterization of OA necessary for molecular source apportionment.
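
    As a hedged illustration of the PCA step, the sketch below runs a principal component analysis on a synthetic table of bulk molecular descriptors (O/C, H/C, N/C and an aromaticity-like index); it only mimics the kind of UHR-MS-derived variables used for source discrimination in the study:

```python
# Illustrative PCA on invented molecular-formula descriptors to separate
# aerosol sources; this is not the study's actual dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows: samples (marine, biomass burning, urban); columns: descriptors
X = np.array([[0.45, 1.85, 0.02, 0.05],   # marine-like
              [0.38, 1.20, 0.12, 0.45],   # biomass-burning-like
              [0.72, 1.55, 0.05, 0.15]])  # urban-like
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)
print("explained variance:", pca.explained_variance_ratio_)
print("sample scores:\n", scores)
```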

  5. Adolescents show sex-specific preferences on media when pornography is a major source of sexual knowledge

    DEFF Research Database (Denmark)

    Rasmussen, Anna Lund; Svarrer, Rebekka; Lauszus, Finn Friis

    2017-01-01

    …with focus on pornography and what media was used. Pornography was divided according to five media subcategories. Knowledge on sexually transmitted infection (STI), pregnancy and abortion and their associations with pornography were explored. Results: Pornography was reported as the second largest source… photographs; thus, these magazines constituted a major source of knowledge for adolescent girls. Girls knew the gestational age of legal abortion in Denmark and had their knowledge from non-explicit magazines, while this was not the case for boys (p=0.004). Pupils who stated their knowledge on sex from these magazines knew… the first sign of pregnancy (menostasia), the correct facts of legal abortion, and STI. Conclusions: Pornography in different media is used by the vast majority of adolescents and its use is sex-specific. Knowledge on STI, pregnancy, and legal abortion was variably associated with the type of media…

  6. Determination of the specific resistance of individual freestanding ZnO nanowires with the low energy electron point source microscope

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Dirk Henning; Beyer, Andre; Voelkel, Berthold; Goelzhaeuser, Armin [Physik Supramolekularer Systeme, Universitaet Bielefeld (Germany); Schlenker, Eva; Bakin, Andrey; Waag, Andreas [Institut fuer Halbleitertechnik, Technische Universitaet Braunschweig (Germany)

    2008-07-01

    A low energy electron point source (LEEPS) microscope is used to determine the electrical conductivity of individual freestanding ZnO nanowires in UHV. The nanowires were contacted with a manipulation tip, and I-V curves were taken at different wire lengths. From those, the specific resistance was calculated and separated from the contact resistance. By comparing the specific resistances of ZnO nanowires with diameters between 48 and 1100 nm, a large surface contribution for the thin nanowires was found. A geometric model for separating surface and bulk contributions is given. The results of electrical transport measurements on vapor-phase-grown ZnO nanowires are discussed, as well as the size dependence of the wire resistance.
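
    The separation of contact and specific resistance implied by the abstract can be illustrated with a simple resistance-versus-length fit, R(L) = R_contact + (ρ/A)·L; all values below are invented for demonstration:

```python
# Sketch of the length-dependent resistance method: measure total
# resistance at several contact positions along a wire and fit a line;
# the intercept is the contact resistance, the slope gives rho/A.
import numpy as np

d = 100e-9                     # wire diameter [m] (assumed)
A = np.pi * (d / 2) ** 2       # cross-sectional area
L = np.array([1.0, 2.0, 3.0, 4.0]) * 1e-6   # contacted lengths [m]
R = np.array([2.1, 3.2, 4.0, 5.1]) * 1e6    # measured resistances [ohm]

slope, intercept = np.polyfit(L, R, 1)
rho = slope * A                # specific resistance [ohm*m]
print(f"contact resistance ~ {intercept / 1e6:.2f} Mohm")
print(f"specific resistance ~ {rho:.2e} ohm*m ({rho * 100:.2e} ohm*cm)")
```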

  7. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  8. Sources and transformation of dissolved and particulate organic nitrogen in the North Pacific Subtropical Gyre indicated by compound-specific δ15N analysis of amino acids

    Science.gov (United States)

    Yamaguchi, Yasuhiko T.; McCarthy, Matthew D.

    2018-01-01

    This study explores the use of compound-specific nitrogen isotopes of amino acids (δ15NAA) in coupled dissolved and particulate organic nitrogen (DON, PON) samples as a new approach to examine the relative sources, transformation processes, and potential coupling of these two major forms of N cycling in the ocean water column. We measured δ15NAA distributions in high-molecular-weight dissolved organic nitrogen (HMW DON) and suspended PON in the North Pacific Subtropical Gyre (NPSG) from surface to mesopelagic depths. A new analytical approach achieved far greater δ15NAA measurement precision for DON than earlier work, allowing us to resolve previously obscured differences in δ15NAA signatures, both with depth and between ON pools. We propose that the δ15N value of total hydrolysable amino acids (THAA) represents a proxy for proteinaceous ON δ15N values in DON and PON. Together with bulk δ15N values, this allows δ15N values and changes in bulk, proteinaceous, and 'other' N to be directly evaluated. These novel measurements suggest three main conclusions. First, the δ15NAA signatures of both surface and mesopelagic HMW DON suggest mainly heterotrophic bacterial sources, with mesopelagic HMW DON bearing signatures of far more degraded material compared to surface material. These results contrast with a previous proposal that HMW DON δ15NAA patterns are essentially 'pre-formed' by cyanobacteria in the surface ocean, undergoing little change with depth. Second, the different δ15NAA values and patterns of HMW DON vs. suspended PON in the surface NPSG suggest that the sources and cycling of these two N reservoirs are surprisingly decoupled. Based on molecular δ15N signatures, we propose a new hypothesis that production of surface HMW DON is ultimately derived from subsurface nitrate, while PON in the mixed layer is strongly linked to N2 fixation and N recycling. In contrast, the comparative δ15NAA signatures of HMW DON vs. suspended PON in the mesopelagic also suggest a…

  9. Ground Deformation and Sources geometry of the 2016 Central Italy Earthquake Sequence Investigated through Analytical and Numerical Modeling of DInSAR Measurements and Structural-Geological Data

    Science.gov (United States)

    Solaro, G.; Bonano, M.; Boncio, P.; Brozzetti, F.; Castaldo, R.; Casu, F.; Cirillo, D.; Cheloni, D.; De Luca, C.; De Nardis, R.; De Novellis, V.; Ferrarini, F.; Lanari, R.; Lavecchia, G.; Manunta, M.; Manzo, M.; Pepe, A.; Pepe, S.; Tizzani, P.; Zinno, I.

    2017-12-01

    The 2016 Central Italy seismic sequence started on 24th August with a MW 6.1 event, in which the intra-Apennine WSW-dipping Vettore-Gorzano extensional fault system released a destructive earthquake, causing 300 casualties and extensive damage to the town of Amatrice and its surroundings. We generated several interferograms using ALOS and Sentinel-1 A and B constellation data acquired on both ascending and descending orbits to show that most of the displacement is characterized by two main subsiding lobes of about 20 cm on the fault hanging-wall. By inverting the generated interferograms following the Okada analytical approach, the modelling results account for two sources related to the main shock and the most energetic aftershock. Through finite element numerical modelling that jointly exploits DInSAR deformation measurements and structural-geological data, we reconstructed the 3D source of the Amatrice 2016 normal-fault earthquake, which fits the main shock well. The inversion shows that the co-seismic displacement area was partitioned on two distinct en echelon fault planes, which at the main event hypocentral depth (8 km) merge into one single WSW-dipping surface. Slip peaks were higher along the southern half of the Vettore fault, lower along the northern half of the Gorzano fault and null in the relay zone between the two faults; field evidence of co-seismic surface rupture is consistent with the reconstructed scenario. The following seismic sequence was characterized by numerous aftershocks located southeast and northwest of the epicenter, which decreased in frequency and magnitude until the end of October, when a MW 5.9 event occurred on 26th October about 25 km to the NW of the previous mainshock. Then, on 30th October, a third large event of magnitude MW 6.5 nucleated below the town of Norcia, striking the area between the two preceding events and filling the gap between the previous ruptures. Also in this case, we exploit a large dataset of DInSAR and GPS measurements to investigate…
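
    A schematic sketch of an inversion workflow in the spirit of the Okada approach follows; `okada_los` is a hypothetical stand-in forward model (a toy point-source decay), since a full Okada (1985) rectangular-dislocation solution is too long to reproduce here, and all parameters are invented:

```python
# Skeleton of a geodetic source inversion: fit a forward model of
# line-of-sight (LOS) displacement to DInSAR observations with
# nonlinear least squares. A real study would replace `okada_los`
# with a full elastic half-space dislocation solution.
import numpy as np
from scipy.optimize import least_squares

def okada_los(params, xy):
    """Placeholder forward model: LOS displacement from source params."""
    x0, y0, depth, slip = params
    r2 = (xy[:, 0] - x0) ** 2 + (xy[:, 1] - y0) ** 2 + depth ** 2
    return slip * depth / r2 ** 1.5   # toy point-source decay, not Okada

xy = np.random.default_rng(0).uniform(-20, 20, size=(200, 2))  # km
truth = np.array([2.0, -3.0, 8.0, 50.0])
obs = okada_los(truth, xy) + np.random.default_rng(1).normal(0, 0.1, 200)

fit = least_squares(lambda p: okada_los(p, xy) - obs,
                    x0=[0.0, 0.0, 5.0, 10.0])
print("recovered parameters:", np.round(fit.x, 2))
```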

  10. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

    The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For providing adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  11. Stingless bees (Melipona scutellaris) learn to associate footprint cues at food sources with a specific reward context.

    Science.gov (United States)

    Roselino, Ana Carolina; Rodrigues, André Vieira; Hrncir, Michael

    2016-10-01

    Foraging insects leave chemical footprints on flowers that subsequent foragers may use as indicators for recent flower visits and, thus, potential resource depletion. Accordingly, foragers should reject food sources presenting these chemical cues. Contrasting this assumption, experimental studies in stingless bees (Apidae, Meliponini), so far, demonstrated an attractive effect of footprints. These findings lead to doubts about the meaning of these chemical cues in natural foraging contexts. Here, we asked whether foragers of stingless bees (Melipona scutellaris) use footprints according to the previously experienced reward level of visited food sources. Bees were trained to artificial flower patches, at which the reward of a flower either decreased or, alternatively, increased after a visit by a forager. Individuals were allowed a total of nine foraging bouts to the patch, after which their preference for visited or unvisited flowers was tested. In the choice tests, bees trained under the decreasing reward context preferred unvisited flowers, whereas individuals trained under the increasing reward context preferred visited flowers. Foragers without experience chose randomly between visited and unvisited flowers. These results demonstrate that M. scutellaris learns to associate unspecific footprint cues at food sources with differential, specific reward contexts, and uses these chemical cues accordingly for their foraging decisions.

  12. Specific absorbed fractions of energy at various ages from internal photon sources: 3, Five-year-old

    International Nuclear Information System (INIS)

    Cristy, M.; Eckerman, K.F.

    1987-04-01

    Specific absorbed fractions (PHI's) in various organs of the body (target organs) from sources of monoenergetic photons in various other organs (source organs) are tabulated. In this volume PHI-values are tabulated for a five-year-old or 19-kg person. These PHI-values can be used in calculating the photon component of the dose-equivalent rate in a given target organ from a given radionuclide that is present in a given source organ. The International Commission on Radiological Protection recognizes that the endosteal, or bone surface, cells are the tissue at risk for bone cancer. We have applied the dosimetry methods developed for beta-emitting radionuclides deposited in bone to follow the transport of secondary electrons that were freed by photon interactions through the microscopic structure of the skeleton. With these methods we can estimate PHI in the endosteal cells and can better estimate PHI in the active marrow; the latter is overestimated with other methods at photon energies below 200 keV. 12 refs., 2 tabs
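
    As a minimal sketch of how such specific absorbed fractions (Φ, written PHI above) enter a photon dose calculation, the snippet below uses a generic MIRD-style summation; the nuclide data and SAF values are invented placeholders:

```python
# MIRD-style sketch: the photon dose rate to a target organ is the
# source-organ activity times the sum over photon energies of
# yield * energy * specific absorbed fraction (SAF, in 1/kg).
A = 1.0e6        # activity in source organ [Bq]
photons = [      # (yield per decay, energy [J], SAF [1/kg]) - invented
    (0.85, 2.2e-14, 1.1e-3),
    (0.10, 5.6e-14, 0.9e-3),
]
dose_rate = A * sum(y * e * saf for y, e, saf in photons)  # Gy/s
print(f"photon dose rate to target: {dose_rate:.2e} Gy/s")
```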

  13. Long-term effects of total and source-specific particulate air pollution on incident cardiovascular disease in Gothenburg, Sweden.

    Science.gov (United States)

    Stockfelt, Leo; Andersson, Eva M; Molnár, Peter; Gidhagen, Lars; Segersson, David; Rosengren, Annika; Barregard, Lars; Sallsten, Gerd

    2017-10-01

    Long-term exposure to air pollution increases cardiopulmonary morbidity and mortality, but it is not clear which components of air pollution are the most harmful, nor which time window of exposure is most relevant. Further studies at low exposure levels have also been called for. We analyzed two Swedish cohorts to investigate the effects of total and source-specific particulate matter (PM) on incident cardiovascular disease for different time windows of exposure. Two cohorts initially recruited to study predictors of cardiovascular disease (the PPS cohort and the GOT-MONICA cohort) were followed from 1990 to 2011. We collected data on residential addresses and assigned each individual yearly total and source-specific PM and nitrogen oxides (NOx) exposures based on dispersion models. Using multivariable Cox regression models with time-dependent exposure, we studied the association between three different time windows (lag 0, lag 1-5, and exposure at study start) of residential PM and NOx exposure, and incidence of ischemic heart disease, stroke, heart failure and atrial fibrillation. During the study period, there were 2266 new-onset cases of ischemic heart disease, 1391 of stroke, 925 of heart failure and 1712 of atrial fibrillation. The majority of cases were in the PPS cohort, where participants were older. Exposure levels during the study period were moderate (median: 13 µg/m³ for PM10 and 9 µg/m³ for PM2.5), and similar in both cohorts. Road traffic and residential heating were the largest local sources of PM air pollution, and long-distance transportation the largest PM source in total. In the PPS cohort, there were positive associations between PM in the last five years and both ischemic heart disease (HR: 1.24 [95% CI: 0.98-1.59] per 10 µg/m³ of PM10, and HR: 1.38 [95% CI: 1.08-1.77] per 5 µg/m³ of PM2.5) and heart failure. In the GOT-MONICA cohort, there were positive but generally non-significant associations between PM and stroke (HR: 1…

  14. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement and the specificity of the antigen-antibody reaction. Substances down to a concentration of some picograms per ml serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test tube chemistry). The main field of application at the moment is in endocrinology; further possibilities of application are in pharmaceutical research, environmental protection, forensic medicine, and for general analytic purposes. Radioactive sources are used only in vitro in the nanocurie range, i.e. radiation exposure is negligible.

  15. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  16. STEPS: source term estimation based on plant status phase 0 - the technical specifications of the containment module

    International Nuclear Information System (INIS)

    Vamanu, D.V.

    1998-01-01

    In the framework of Project RODOS (Real-Time On-Line Decision Support System for Nuclear Emergencies in Europe), the European Atomic Energy Community (EAEC) of the Commission of the European Communities has commissioned the development of a unified concept, body of knowledge, models and software package meant to assist the evaluation of the source term of severe nuclear accidents in light water reactors of the types prevailing on the Continent. Code-named STEPS, for 'Source Term Estimation based on Plant Status', the project has evolved as Contract RODOS D (FI4P-CT96-0048) between the EAEC and a consortium of expert European centres, including the Commissariat a l'Energie Atomique, Institut de Protection et de Surete Nucleaire (CEA-DPI-SEAC) as Coordinator, and Forschungszentrum Karlsruhe GmbH (FZK-INR), the Finnish Centre for Radiation and Nuclear Safety (STUK-NSD), the Technical Research Centre of Finland, Energy, Nuclear Energy (VTT-ET-NE), and Eidgenossische Technische Hochschule - ETH Zurich, Centre of Excellence (ETH-CERS) as Contractors. For Phase 0 of the project, an IFIN-HH expert was assigned by ETH-CERS to develop the technical specifications of the delivery component of the intended STEPS package, the CONTAINMENT Module. Sponsored by ETH-CERS headquarters in Zurich, the work was done on the premises and with the logistic support of CEA DPI-SEAC at Fontenay-aux-Roses, with the feedback processing and computer code development subsequently performed in Bucharest. The technical specifications of the STEPS CONTAINMENT Module were guided by specific terms of reference, including: (i) the capability of the software to function as a source-term interface between targeted nuclear power plants and the RODOS system; and (ii) the comparable capability of the system to be operated as a stand-alone assessment and decision support tool for a comprehensive variety of plants and nuclear emergency classes. On the technical side, the specifications had to focus on the possible…

  17. Source-specific sewage pollution detection in urban river waters using pharmaceuticals and personal care products as molecular indicators.

    Science.gov (United States)

    Kiguchi, Osamu; Sato, Go; Kobayashi, Takashi

    2016-11-01

    Source-specific elucidation of domestic sewage pollution caused by various effluent sources in urban river water, as conducted for this study, demands knowledge of the relation between concentrations of pharmaceuticals and personal care products (PPCPs) used as molecular indicators (caffeine, carbamazepine, triclosan) and water-quality concentrations of total nitrogen (T-N) and total phosphorus (T-P). River water and wastewater samples from the Asahikawa River Basin in northern Japan were analyzed using derivatization-gas chromatography/mass spectrometry. Caffeine, used as an indicator of domestic sewage in the Asahikawa River Basin, was more ubiquitous (92-100%) than either carbamazepine or triclosan, and its concentration was higher than that of any other target compound used to assess the basin: the caffeine concentrations detected in wastewater effluents and the strongly positive linear correlation between caffeine and T-N or T-P (R² > 0.759) reflect the contribution of septic tank system effluents to the lower Asahikawa River Basin. Results of relative molecular indicators combining different molecular indicators (caffeine/carbamazepine and triclosan/carbamazepine) and cluster analysis better reflect the contribution of sewage than results obtained using concentrations of the respective molecular indicators and cluster analysis. Relative molecular indicators used with water quality parameters (e.g., the caffeine/T-N ratio) in this study provide results more clearly, relatively, and quantitatively than results obtained using molecular indicators alone. Moreover, the caffeine/T-N ratio reflects variations of caffeine flux from effluent sources. These results strongly suggest that relative molecular indicators are also useful indicators, reflecting differences in spatial contributions of domestic sources for PPCPs in urban areas.

  18. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  19. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform. With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSense…

  20. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    Science.gov (United States)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that:
    - compile use cases generated from specific community needs to cross-analyze heterogeneous data;
    - compile sources of analytics tools, in particular, to satisfy the needs of the above data users;
    - examine gaps between needs and sources;
    - examine gaps between needs and community expertise;
    - document specific data analytics expertise needed to perform Earth science data analytics;
    and seek graduate data analytics/data science student internship opportunities.

  1. Compound-Specific Isotope Analysis (CSIA Application for Source Apportionment and Natural Attenuation Assessment of Chlorinated Benzenes

    Directory of Open Access Journals (Sweden)

    Luca Alberti

    2017-11-01

    In light of the complex management of chlorobenzene (CB) contaminated sites at which a hydraulic barrier (HB) for plume containment is emplaced, compound-specific stable isotope analysis (CSIA) has been applied for source apportionment, for investigating the relation between the upgradient and downgradient of the HB, and to target potential CB biodegradation processes. The isotope signature of all the components potentially involved in the degradation processes has been expressed using the concentration-weighted average δ13C of CBs + benzene (δ13Csum). Upgradient of the HB, average δ13Csum values of −25.6‰ and −29.4‰ were measured for plumes within the eastern and western sectors, respectively. Similar values were observed for the potential sources, with δ13Csum values of −26.5‰ for contaminated soils and −29.8‰ for the processing water pipeline in the eastern and western sectors, respectively, allowing these potential sources to be apportioned to the respective contaminant plumes. Downgradient of the HB, similar CB concentrations but enriched δ13Csum values between −24.5‰ and −25.9‰ were measured. Moreover, contaminated soils showed a similar δ13Csum signature of −24.5‰, thus suggesting that the plumes likely originate from past activities located downgradient of the HB. Within the industrial property, significant δ13C enrichments were measured for 1,2,4-trichlorobenzene (TCB), 1,2-dichlorobenzene (DCB), 1,3-DCB, and 1,4-DCB, thus suggesting an important role for anaerobic biodegradation. Further degradation of monochlorobenzene (MCB) and benzene was also demonstrated. CSIA was confirmed to be an effective approach for site characterization, revealing the proper functioning of the HB and demonstrating the important role of natural attenuation processes in reducing the contamination upgradient of the HB.
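
    The concentration-weighted average isotope signature δ13Csum used above is a simple mixing calculation; a minimal sketch with invented concentrations and δ13C values:

```python
# Concentration-weighted average delta13C over the CB + benzene pool:
# delta_sum = sum(C_i * delta_i) / sum(C_i). Values are illustrative.
import numpy as np

conc = np.array([120., 45., 30., 15.])             # compounds, ug/L
delta13c = np.array([-26.1, -25.0, -24.3, -27.8])  # per mil vs VPDB

delta_sum = np.sum(conc * delta13c) / np.sum(conc)
print(f"delta13Csum = {delta_sum:.1f} per mil")
```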

  2. Overview of the relations earthquake source parameters and the specification of strong ground motion for design purposes

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1977-08-01

    One of the most important steps in the seismic design process is the specification of the appropriate ground motion to be input into the design analysis. From the point of view of engineering design analysis, the important parameters are peak ground acceleration, spectral shape and peak spectral levels. In a few cases, ground displacement is a useful parameter. The earthquake is usually specified by giving its magnitude and either the epicentral distance or the distance of the closest point on the causative fault to the site. Typically, the appropriate ground motion parameters are obtained using the specified magnitude and distance in equations obtained from regression analysis among the appropriate variables. Two major difficulties with such an approach are: magnitude is not the best parameter to use to define the strength of an earthquake, and little near-field data is available to establish the appropriate form for the attenuation of the ground motion with distance, source size and strength. These difficulties are important for designing a critical facility, i.e., one for which a very low risk of exceeding the design ground motion is required. Examples of such structures are nuclear power plants, schools and hospitals. For such facilities, a better understanding of the relation between the ground motion and the important earthquake source parameters could be very useful for several reasons

  3. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  4. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometry students.

  5. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemically measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved

  6. What limits working memory capacity? Evidence for modality-specific sources to the simultaneous storage of visual and auditory arrays.

    Science.gov (United States)

    Fougnie, Daryl; Marois, René

    2011-11-01

    There is considerable debate on whether working memory (WM) storage is mediated by distinct subsystems for auditory and visual stimuli (Baddeley, 1986) or whether it is constrained by a single, central capacity-limited system (Cowan, 2006). Recent studies have addressed this issue by measuring the dual-task cost during the concurrent storage of auditory and visual arrays (e.g., Cocchini, Logie, Della Sala, MacPherson, & Baddeley, 2002; Fougnie & Marois, 2006; Saults & Cowan, 2007). However, studies have yielded widely different dual-task costs, which have been taken to support both modality-specific and central capacity-limit accounts of WM storage. Here, we demonstrate that the controversies regarding such costs mostly stem from how these costs are measured. Measures that compare combined dual-task capacity with the higher single-task capacity support a single, central WM store when there is a large disparity between the single-task capacities (Experiment 1) but not when the single-task capacities are well equated (Experiment 2). In contrast, measures of the dual-task cost that normalize for differences in single-task capacity reveal evidence for modality-specific stores, regardless of single-task performance. Moreover, these normalized measures indicate that dual-task cost is much smaller if the tasks do not involve maintaining bound feature representations in WM (Experiment 3). Taken together, these experiments not only resolve a discrepancy in the field and clarify how to assess the dual-task cost but also indicate that WM capacity can be constrained both by modality-specific and modality-independent sources of information processing.
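
    The two ways of scoring dual-task cost contrasted in the abstract can be made concrete with a small calculation; the capacity estimates below are invented, and the formulas are only one plausible reading of the contrast described:

```python
# Two scorings of dual-task working-memory cost (illustrative numbers).
k_vis_single, k_aud_single = 3.0, 2.0   # single-task capacities (items)
k_vis_dual, k_aud_dual = 2.4, 1.6       # capacities under dual-task

# central-store style comparison: combined dual-task capacity vs. the
# higher of the two single-task capacities
combined_cost = (k_vis_dual + k_aud_dual) - max(k_vis_single, k_aud_single)

# normalized (modality-specific) comparison: average proportional loss
# of each modality relative to its own single-task capacity
norm_cost = 1 - 0.5 * (k_vis_dual / k_vis_single + k_aud_dual / k_aud_single)

print(f"combined - best single: {combined_cost:+.2f} items")
print(f"normalized dual-task cost: {norm_cost:.0%}")
```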

  7. Laser-induced plasmas as an analytical source for quantitative analysis of gaseous and aerosol systems: Fundamentals of plasma-particle interactions

    Science.gov (United States)

    Diwakar, Prasoon K.

    2009-11-01

    Laser-induced breakdown spectroscopy (LIBS) is a relatively new analytical diagnostic technique which has gained serious attention in the recent past due to its simplicity, robustness, portability, and multi-element analysis capabilities. LIBS has been used successfully for analysis of elements in different media including solids, liquids and gases. From 1963, when the first breakdown study was reported, to 1983, when the first LIBS experiments were reported, the technique came a long way, but the majority of the fundamental understanding of the processes that occur has developed in the last few years, which has propelled LIBS in the direction of being a well-established analytical technique. This study, which mostly focuses on LIBS involving aerosols, has been able to unravel some of the mysteries and provide knowledge that will be valuable to the LIBS community as a whole. LIBS processes can be broken down into three basic steps, namely plasma formation, analyte introduction, and plasma-analyte interactions. In this study, these three steps have been investigated in laser-induced plasma, focusing mainly on the plasma-particle interactions. Understanding plasma-particle interactions and the fundamental processes involved is important in advancing laser-induced breakdown spectroscopy as a reliable and accurate analytical technique. Critical understanding of plasma-particle interactions includes study of the plasma evolution, analyte atomization, and particle dissociation and diffusion. In this dissertation, temporal and spatial studies have been done to understand the fundamentals of the LIBS processes, including the breakdown of gases by the laser pulse, plasma inception mechanisms, plasma evolution, analyte introduction and plasma-particle interactions and their influence on the LIBS signal. Spectral measurements were performed in a laser-induced plasma and the results reveal localized perturbations in the plasma properties in the vicinity of the analyte species, for…

  8. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definition and classification; the sample and its treatment; gravimetry, on the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, on precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibria, on electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments covering basic operations in chemical experimentation.

  9. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definition and classification; the sample and its treatment; gravimetry, on the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, on precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibria, on electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments covering basic operations in chemical experimentation.

  10. Sodium intakes of US children and adults from foods and beverages by location of origin and by specific food source.

    Science.gov (United States)

    Drewnowski, Adam; Rehm, Colin D

    2013-05-28

    Sodium intakes, from foods and beverages, of 22,852 persons in the National Health and Nutrition Examination Surveys (NHANES 2003-2008) were examined by specific food source and by food location of origin. Analyses were based on a single 24-h recall. Separate analyses were conducted for children (6-11 years of age), adolescents (12-19), and adults (20-50 and ≥51 years). Grouping of like foods (e.g., food sources) used a scheme proposed by the National Cancer Institute, which divides foods/beverages into 96 food subgroups (e.g., pizza, yeast breads or cold cuts). Food locations of origin were stores (e.g., grocery, convenience and specialty stores), quick-service restaurant/pizza (QSR), full-service restaurant (FSR), school, or other. Food locations of sodium were also evaluated by race/ethnicity amongst adults. Stores provided between 58.1% and 65.2% of dietary sodium, whereas QSR and FSR together provided between 18.9% and 31.8% depending on age. The proportion of sodium from QSR varied from 10.1% to 19.9%, whereas that from FSR varied from 3.4% to 13.3%. School meals provided 10.4% of sodium for 6-11 year olds and 6.0% for 12-19 year olds. Pizza from QSR, the top away from home food item, provided 5.4% of sodium in adolescents. QSR pizza, chicken, burgers and Mexican dishes combined provided 7.8% of total sodium in adult diets. Most sodium came from foods purchased in stores. Food manufacturers, restaurants, and grocery stores all have a role to play in reducing the amount of sodium in the American diet.

  11. Targeting the Sources of Fecal Contamination using Dog-, Human-, and Ruminant- Specific Markers in the Lake Herrick Watershed, Georgia.

    Science.gov (United States)

    Saintil, T.; Radcliffe, D. E.; Rasmussen, T. C.; Habteselassie, M.; Sowah, R.; Kannan, A.

    2016-12-01

    The Lake Herrick Watershed is about 1.5 km² and covers portions of the University of Georgia's East campus, the Oconee Forest, and residential and commercial land use. Lake Herrick, a recreational site on the University of Georgia campus, was closed in 2002 due to fecal contamination. Subsequent monitoring confirmed persistent contamination, which led to a permanent closure to swimming, boating, and fishing. While fecal coliform abundance is a standard metric for determining human health risks, Geldreich (1970) showed that fecal abundance does not necessarily correlate with the presence of pathogens. Nor does it identify pollution sources, which are needed to mitigate health risks. Two inflow tributaries and the outlet stream were monitored for discharge, fecal coliform, forms of nitrogen and phosphorus, and other water-quality data to quantify lake influent and effluent bacteria loads. Fecal sources were identified using the human HF183 genetic marker (Seurinck et al., 2005), the ruminant BacR marker (Reischer et al., 2006), and the dog mitochondrial DNA (mtDNA) marker (Tambalo et al., 2012). Preliminary results confirm high concentrations of E. coli and Enterococci, above the State's limit of 124 MPN/100 mL, in both baseflows and stormflows. The findings also suggest that the E. coli and Enterococci loads from the inlet tributaries are on average higher than the bacteria loads leaving the outlet stream. The human markers were detectable at all three sites, but most of the samples were not quantifiable. The ruminant markers were quantifiable at both inlets, but no ruminant markers were found at the outlet. The dog markers were detectable but not quantifiable at both inlets, and no dog markers were detected at the outlet. Statistical analyses will be used to establish relationships between the nutrient data, the fecal concentrations, and the gene-specific markers.

  12. Branched GDGTs in Lacustrine Environments: Tracing Allochthonous and Autochthonous Sources Using Compound-Specific Stable Carbon Isotope Analysis

    Science.gov (United States)

    Weber, Y.; S Sinninghe Damsté, J.; Lehmann, M. F.; Niemann, H.; Schubert, C. J.

    2015-12-01

    allochthonous (i.e., soil) source. Our data demonstrate the great potential of compound-specific C isotope analysis to constrain the origin of brGDGTs in lake sediments, possibly allowing the identification of freshwater environments that are particularly suited for brGDGT-based paleoenvironmental reconstructions.

  13. Radiological source tracking in oil/gas, medical and other industries: requirements and specifications for passive RFID technology

    Energy Technology Data Exchange (ETDEWEB)

    Dowla, Farid U. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-01

    Subsurface sensors that employ radioisotopes, such as 241Am-Be and 137Cs, for reservoir characterization must be tracked for safety and security reasons. Other radiological sources are also widely used in medicine. The radiological source containers, in both applications, are small, mobile, and used widely worldwide. The nuclear sources pose radiological dispersal device (RDD) security risks. Security concerns with the industrial use of radionuclide sources are in fact quite high, as it is estimated that each year hundreds of sealed sources go missing, either lost or stolen. Risk mitigation efforts include enhanced regulations, source-use guidelines, and research and development on electronic tracking of sources. This report summarizes the major elements of the requirements and operational concepts for tracking nuclear sources, with the goal of developing automated electronic tagging and locating systems.

  14. WEB MAPPING ARCHITECTURES BASED ON OPEN SPECIFICATIONS AND FREE AND OPEN SOURCE SOFTWARE IN THE WATER DOMAIN

    Directory of Open Access Journals (Sweden)

    C. Arias Muñoz

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  15. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    Science.gov (United States)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.
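
    As a concrete example of the open-specification interoperability both records describe, the sketch below builds a standard OGC WMS 1.3.0 GetMap request; the server URL and layer name are hypothetical placeholders:

```python
# Build a standards-compliant OGC WMS 1.3.0 GetMap URL. Any compliant
# server (e.g., GeoServer, MapServer) can answer such a request.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "hydrography:river_basins",   # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "44.0,8.0,47.0,12.0",           # minlat,minlon,maxlat,maxlon
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "https://example.org/geoserver/wms?" + urlencode(params)
print(url)  # fetch with any HTTP client to retrieve the rendered map
```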

  16. ZebrafishMiner: an open source software for interactive evaluation of domain-specific fluorescence in zebrafish

    Directory of Open Access Journals (Sweden)

    Reischl Markus

    2017-09-01

    High-throughput microscopy makes it possible to observe the morphology of zebrafish on a large scale in order to quantify genetic, toxic or drug effects. Image acquisition is done by automated microscopy, and images are evaluated automatically by image processing pipelines tailored specifically to the requirements of the scientific question. The transfer of such algorithms to other projects, however, is complex due to missing guidelines and a lack of mathematical or programming knowledge. In this work, we implement an image processing pipeline for automatic fluorescence quantification in user-defined domains of zebrafish embryos and larvae of different ages. The pipeline is capable of detecting embryos and larvae in image stacks and quantifying domain activity. To make this protocol available to the community, we developed an open source software package called "ZebrafishMiner" which guides the user through all steps of the processing pipeline and makes the algorithms available and easy to handle. We implemented all routines in a MATLAB-based graphical user interface (GUI) that gives the user control over all image processing parameters. The software is shipped with a 30-page manual and three tutorial datasets that guide the user step by step. It can be downloaded at https://sourceforge.net/projects/scixminer/.
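
    The core operation of such a pipeline, segmenting the animal and integrating fluorescence inside a user-defined domain, can be sketched in a few lines of Python. This is an illustrative analogue, not ZebrafishMiner itself (which is MATLAB-based); the file name and domain choice are assumptions:

        # Illustrative sketch of domain-specific fluorescence quantification:
        # segment the embryo by Otsu thresholding, restrict the mask to a
        # user-defined domain, and report integrated/mean intensity.
        import numpy as np
        from skimage import io, filters, morphology

        image = io.imread("embryo_gfp.tif").astype(float)  # hypothetical image

        # Segment the embryo and drop small spurious objects.
        mask = image > filters.threshold_otsu(image)
        mask = morphology.remove_small_objects(mask, min_size=500)

        # User-defined domain: here simply the upper half of the frame.
        domain = np.zeros_like(mask)
        domain[: image.shape[0] // 2, :] = True
        roi = mask & domain

        print("integrated intensity:", image[roi].sum())
        print("mean intensity:", image[roi].mean())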

  17. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  18. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  19. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this review was to report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.
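
    The metrics the review notes are missing from IBMWA (sensitivity, specificity, odds ratios, a confusion matrix) are straightforward to obtain with open source tooling. The following sketch uses synthetic data rather than the validated health datasets from the study:

        # Sketch: computing the validation metrics absent from IBMWA's
        # predictive output with scikit-learn. Data are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        pred = LogisticRegression().fit(X_tr, y_tr).predict(X_te)

        tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
        print("sensitivity:", tp / (tp + fn))
        print("specificity:", tn / (tn + fp))
        print("diagnostic odds ratio:", (tp * tn) / (fp * fn))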

  20. A literature review of methods of analysis of organic analytes in radioactive wastes with an emphasis on sources from the United Kingdom

    International Nuclear Information System (INIS)

    Clauss, S.A.; Bean, R.M.

    1993-09-01

    This report, compiled by Pacific Northwest Laboratory (PNL), examines literature originating from the United Kingdom (UK) nuclear industry relating to the analysis of organic constituents of radioactive waste. Additionally, secondary references from the UK and other countries, including the United States, have been reviewed. The purpose of this literature review was to find analytical methods that would apply to the mixed-waste matrices found at Hanford.

  1. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through homogeneous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on the paraboloid, including polar properties, center of a section, axes of plane section, and generators of the hyperbolic paraboloid. The book also touches on homogeneous coordi...

  2. Rapid diagnostic tests as a source of DNA for Plasmodium species-specific real-time PCR

    Directory of Open Access Journals (Sweden)

    Van Esbroeck Marjan

    2011-03-01

    Background: This study describes the use of malaria rapid diagnostic tests (RDTs) as a source of DNA for Plasmodium species-specific real-time PCR. Methods: First, the best method to recover DNA from RDTs was investigated and then the applicability of this DNA extraction method was assessed on 12 different RDT brands. Finally, two RDT brands (OptiMAL Rapid Malaria Test and SDFK60 malaria Ag Plasmodium falciparum/Pan test) were comprehensively evaluated on a panel of clinical samples submitted for routine malaria diagnosis at ITM. DNA amplification was done with the 18S rRNA real-time PCR targeting the four Plasmodium species. Results of PCR on RDT were compared to those obtained by PCR on whole blood samples. Results: Best results were obtained by isolating DNA from the proximal part of the nitrocellulose component of the RDT strip with a simple DNA elution method. The PCR on RDT showed a detection limit of 0.02 asexual parasites/μl, which was identical to the same PCR on whole blood. For all 12 RDT brands tested, DNA was detected except for one brand when a low parasite density sample was applied. In RDTs with a plastic seal covering the nitrocellulose strip, DNA extraction was hampered. PCR analysis on clinical RDT samples demonstrated correct identification for single species infections for all RDT samples with asexual parasites of P. falciparum (n = 60), Plasmodium vivax (n = 10), Plasmodium ovale (n = 10) and Plasmodium malariae (n = 10). Samples with only gametocytes were detected in all OptiMAL and in 10 of the 11 SDFK60 tests. None of the negative samples (n = 20) gave a signal by PCR on RDT. With PCR on RDT, higher Ct-values were observed than with PCR on whole blood, with a mean difference of 2.68 for OptiMAL and 3.53 for SDFK60. Mixed infections were correctly identified with PCR on RDT in 4/5 OptiMAL tests and 2/5 SDFK60 tests. Conclusions: RDTs are a reliable source of DNA for Plasmodium real-time PCR. This study demonstrates the...

  3. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.
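
    The flavor of such a verification test can be sketched compactly: compare a Monte Carlo tally against a closed-form solution for a problem simple enough to solve analytically. The toy case below (uncollided flux from an isotropic point source in an infinite, purely absorbing medium) is our own illustration, not one of the problems in the MCNP6 suite:

        # Verify a Monte Carlo estimate against the analytic uncollided flux
        # phi(R) = exp(-Sigma*R) / (4*pi*R^2) for an isotropic point source
        # in an infinite purely absorbing medium. Parameters are arbitrary.
        import numpy as np

        SIGMA = 1.0      # total macroscopic cross section (1/cm)
        R = 2.0          # detector shell radius (cm)
        N = 1_000_000    # source histories

        rng = np.random.default_rng(42)
        # Distance to first collision is exponentially distributed.
        path = rng.exponential(scale=1.0 / SIGMA, size=N)

        # A history contributes at R only if it travels past R uncollided.
        mc_flux = (path > R).mean() / (4.0 * np.pi * R**2)
        an_flux = np.exp(-SIGMA * R) / (4.0 * np.pi * R**2)

        print(f"Monte Carlo: {mc_flux:.6e}")
        print(f"analytic:    {an_flux:.6e}")
        print(f"relative error: {abs(mc_flux - an_flux) / an_flux:.2%}")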

  4. Apportioning sources of organic matter in streambed sediments: An integrated molecular and compound-specific stable isotope approach

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Richard J., E-mail: Richard.J.Cooper@uea.ac.uk [School of Environmental Sciences, University of East Anglia, Norwich Research Park, Norwich NR4 7TJ (United Kingdom); Pedentchouk, Nikolai; Hiscock, Kevin M.; Disdle, Paul [School of Environmental Sciences, University of East Anglia, Norwich Research Park, Norwich NR4 7TJ (United Kingdom); Krueger, Tobias [IRI THESys, Humboldt University, 10099 Berlin (Germany); Rawlins, Barry G. [British Geological Survey, Keyworth, Nottingham NG12 5GG (United Kingdom)

    2015-07-01

    We present a novel application for quantitatively apportioning sources of organic matter in streambed sediments via a coupled molecular and compound-specific isotope analysis (CSIA) of long-chain leaf wax n-alkane biomarkers using a Bayesian mixing model. Leaf wax extracts of 13 plant species were collected from across two environments (aquatic and terrestrial) and four plant functional types (trees, herbaceous perennials, and C₃ and C₄ graminoids) from the agricultural River Wensum catchment, UK. Seven isotopic (δ¹³C₂₇, δ¹³C₂₉, δ¹³C₃₁, δ¹³C₂₇₋₃₁, δ²H₂₇, δ²H₂₉, and δ²H₂₇₋₂₉) and two n-alkane ratio (average chain length (ACL), carbon preference index (CPI)) fingerprints were derived, which successfully differentiated 93% of individual plant specimens by plant functional type. The δ²H values were the strongest discriminators of plants originating from different functional groups, with trees (δ²H₂₇₋₂₉ = −208‰ to −164‰) and C₃ graminoids (δ²H₂₇₋₂₉ = −259‰ to −221‰) providing the largest contrasts. The δ¹³C values provided strong discrimination between C₃ (δ¹³C₂₇₋₃₁ = −37.5‰ to −33.8‰) and C₄ (δ¹³C₂₇₋₃₁ = −23.5‰ to −23.1‰) plants, but neither δ¹³C nor δ²H values could uniquely differentiate aquatic and terrestrial species, emphasizing a stronger plant physiological/biochemical rather than environmental control over isotopic differences. ACL and CPI complemented isotopic discrimination, with significantly longer chain lengths recorded for trees and terrestrial plants compared with herbaceous perennials and aquatic species, respectively. Application of a comprehensive Bayesian mixing model for 18 streambed sediments collected between September 2013 and March 2014 revealed considerable temporal variability in the...
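
    The essence of such a Bayesian mixing model is to infer source proportions whose mixed fingerprint matches the sediment measurement, with uncertainty carried through to the posterior. A minimal sketch follows, using a random-walk Metropolis sampler and invented fingerprint values; the study's actual model, tracers and data are richer than this:

        # Minimal Bayesian isotope mixing model: three sources, two tracers
        # (e.g. d13C and d2H of n-alkanes). All numbers are invented.
        import numpy as np

        rng = np.random.default_rng(1)

        src_mean = np.array([[-35.0, -185.0],   # trees (assumed)
                             [-36.0, -240.0],   # C3 graminoids (assumed)
                             [-23.3, -210.0]])  # C4 graminoids (assumed)
        mix_obs = np.array([-33.0, -215.0])     # sediment fingerprint (assumed)
        sigma = np.array([1.0, 8.0])            # tracer SDs (assumed)

        def log_post(p12):
            """Log posterior over the first two proportions (uniform prior)."""
            p = np.append(p12, 1.0 - p12.sum())
            if (p < 0.0).any():
                return -np.inf                  # proposal left the simplex
            pred = p @ src_mean                 # linear mixing of fingerprints
            return -0.5 * np.sum(((mix_obs - pred) / sigma) ** 2)

        # Random-walk Metropolis over the mixing proportions.
        p12 = np.array([1.0 / 3.0, 1.0 / 3.0])
        lp = log_post(p12)
        samples = []
        for step in range(60_000):
            prop = p12 + rng.normal(scale=0.05, size=2)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                p12, lp = prop, lp_prop
            if step >= 10_000:                  # discard burn-in
                samples.append(np.append(p12, 1.0 - p12.sum()))

        post = np.array(samples)
        for name, mean, sd in zip(["trees", "C3 graminoids", "C4 graminoids"],
                                  post.mean(axis=0), post.std(axis=0)):
            print(f"{name:14s} proportion: {mean:.2f} +/- {sd:.2f}")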

  5. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    For 50 years, philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source for surpassing Cartesian epistemology and for developing virtue ethics. Recently, J. Haldane has inaugurated a program of “analytical Thomism” whose main result to date has been his “theory of mind/world identity”. Nevertheless, none of Thomas’s admirers has yet found a way of assimilating his metaphysics of being.

  6. Formative assessment and learning analytics

    NARCIS (Netherlands)

    Tempelaar, D.T.; Heck, A.; Cuypers, H.; van der Kooij, H.; van de Vrie, E.; Suthers, D.; Verbert, K.; Duval, E.; Ochoa, X.

    2013-01-01

    Learning analytics seeks to enhance the learning process through systematic measurements of learning related data, and informing learners and teachers of the results of these measurements, so as to support the control of the learning process. Learning analytics has various sources of information,

  7. Dynamic Metabolic Profiles and Tissue-Specific Source Effects on the Metabolome of Developing Seeds of Brassica napus.

    Directory of Open Access Journals (Sweden)

    Helin Tan

    Canola (Brassica napus) is one of several important oil-producing crops, and the physiological processes, enzymes, and genes involved in oil synthesis in canola seeds have been well characterized. However, relatively little is known about the dynamic metabolic changes that occur during oil accumulation in seeds, as well as the mechanistic origins of metabolic changes. To explore the metabolic changes that occur during oil accumulation, we isolated metabolites from both seed and silique wall and identified and characterized them by using gas chromatography coupled with mass spectrometry (GC-MS). The results showed that a total of 443 metabolites were identified from four developmental stages. Dozens of these metabolites were differentially expressed during seed ripening, including 20 known to be involved in seed development. To investigate the contribution of tissue-specific carbon sources to the biosynthesis of these metabolites, we examined the metabolic changes of silique walls and seeds under three treatments: leaf-detachment (Ld), phloem-peeling (Pe), and selective silique darkening (Sd). Our study demonstrated that the oil content was independent of leaf photosynthesis and phloem transport during oil accumulation, but required the metabolic influx from the silique wall. Notably, Sd treatment resulted in seed senescence, which eventually led to a severe reduction of the oil content. Sd treatment also caused a significant accumulation of fatty acids (FA), organic acids and amino acids. Furthermore, an unexpected accumulation of sugar derivatives and organic acids was observed in the Pe- and Sd-treated seeds. Consistent with this, the expression of a subset of genes involved in FA metabolism, sugar and oil storage was significantly altered in Pe and Sd treated seeds. Taken together, our studies suggest the metabolite profiles of canola seeds varied dynamically during the course of oil accumulation, which may provide a new insight into the mechanisms...

  8. Asp- and Glu-specific novel dipeptidyl peptidase 11 of Porphyromonas gingivalis ensures utilization of proteinaceous energy sources.

    Science.gov (United States)

    Ohara-Nemoto, Yuko; Shimoyama, Yu; Kimura, Shigenobu; Kon, Asako; Haraga, Hiroshi; Ono, Toshio; Nemoto, Takayuki K

    2011-11-04

    Porphyromonas gingivalis and Porphyromonas endodontalis, asaccharolytic black-pigmented anaerobes, are predominant pathogens of human chronic and periapical periodontitis, respectively. They incorporate di- and tripeptides from the environment as carbon and energy sources. In the present study we cloned a novel dipeptidyl peptidase (DPP) gene of P. endodontalis ATCC 35406, designated as DPP11. The DPP11 gene encoded 717 amino acids with a molecular mass of 81,090 Da and was present as a 75-kDa form with an N terminus of Asp(22). A homology search revealed the presence of a P. gingivalis orthologue, PGN0607, that has been categorized as an isoform of authentic DPP7. P. gingivalis DPP11 was exclusively cell-associated as a truncated 60-kDa form, and the gene ablation retarded cell growth. DPP11 specifically removed dipeptides from oligopeptides with the penultimate N-terminal Asp and Glu and has a P2-position preference to hydrophobic residues. Optimum pH was 7.0, and the k(cat)/K(m) value was higher for Asp than Glu. Those activities were lost by substitution of Ser(652) in P. endodontalis and Ser(655) in P. gingivalis DPP11 to Ala, and they were consistently decreased with increasing NaCl concentration. Arg(670) is a unique amino acid completely conserved in all DPP11 members distributed in the genera Porphyromonas, Bacteroides, and Parabacteroides, whereas this residue is converted to Gly in all authentic DPP7 members. Substitution analysis suggested that Arg(670) interacts with an acidic residue of the substrate. Considered to preferentially utilize acidic amino acids, DPP11 ensures efficient degradation of oligopeptide substrates in these Gram-negative anaerobic rods.

  9. Asp- and Glu-specific Novel Dipeptidyl Peptidase 11 of Porphyromonas gingivalis Ensures Utilization of Proteinaceous Energy Sources*

    Science.gov (United States)

    Ohara-Nemoto, Yuko; Shimoyama, Yu; Kimura, Shigenobu; Kon, Asako; Haraga, Hiroshi; Ono, Toshio; Nemoto, Takayuki K.

    2011-01-01

    Porphyromonas gingivalis and Porphyromonas endodontalis, asaccharolytic black-pigmented anaerobes, are predominant pathogens of human chronic and periapical periodontitis, respectively. They incorporate di- and tripeptides from the environment as carbon and energy sources. In the present study we cloned a novel dipeptidyl peptidase (DPP) gene of P. endodontalis ATCC 35406, designated as DPP11. The DPP11 gene encoded 717 amino acids with a molecular mass of 81,090 Da and was present as a 75-kDa form with an N terminus of Asp22. A homology search revealed the presence of a P. gingivalis orthologue, PGN0607, that has been categorized as an isoform of authentic DPP7. P. gingivalis DPP11 was exclusively cell-associated as a truncated 60-kDa form, and the gene ablation retarded cell growth. DPP11 specifically removed dipeptides from oligopeptides with the penultimate N-terminal Asp and Glu and has a P2-position preference to hydrophobic residues. Optimum pH was 7.0, and the kcat/Km value was higher for Asp than Glu. Those activities were lost by substitution of Ser652 in P. endodontalis and Ser655 in P. gingivalis DPP11 to Ala, and they were consistently decreased with increasing NaCl concentration. Arg670 is a unique amino acid completely conserved in all DPP11 members distributed in the genera Porphyromonas, Bacteroides, and Parabacteroides, whereas this residue is converted to Gly in all authentic DPP7 members. Substitution analysis suggested that Arg670 interacts with an acidic residue of the substrate. Considered to preferentially utilize acidic amino acids, DPP11 ensures efficient degradation of oligopeptide substrates in these Gram-negative anaerobic rods. PMID:21896480

  10. "Using recruitment source timing and diagnosticity to enhance applicants' occupation-specific human capital": Correction to Campion, Ployhart, and Campion (2017).

    Science.gov (United States)

    2017-05-01

    Reports an error in "Using Recruitment Source Timing and Diagnosticity to Enhance Applicants' Occupation-Specific Human Capital" by Michael C. Campion, Robert E. Ployhart and Michael A. Campion ( Journal of Applied Psychology , Advanced Online Publication, Feb 02, 2017, np). In the article, the following headings were inadvertently set at the wrong level: Method, Participants and Procedure, Measures, Occupation specific human capital, Symbolic jobs, Relevant majors, Occupation-specific capital hotspots, Source timing, Source diagnosticity, Results, and Discussion. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2017-04566-001.) This study proposes that reaching applicants through more diagnostic recruitment sources earlier in their educational development (e.g., in high school) can lead them to invest more in their occupation-specific human capital (OSHC), thereby making them higher quality candidates. Using a sample of 78,157 applicants applying for jobs within a desirable professional occupation in the public sector, results indicate that applicants who report hearing about the occupation earlier, and applicants who report hearing about the occupation through more diagnostic sources, have higher levels of OSHC upon application. Additionally, source timing and diagnosticity affect the likelihood of candidates applying for jobs symbolic of the occupation, selecting relevant majors, and attending educational institutions with top programs related to the occupation. These findings suggest a firm's recruiting efforts may influence applicants' OSHC investment strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Analytical solutions of electric potential and impedance for a multilayered spherical volume conductor excited by time-harmonic electric current source: application in brain EIT

    International Nuclear Information System (INIS)

    Xiao Chunyan; Lei Yinzhao

    2005-01-01

    A model of a multilayered spherical volume conductor with four electrodes is built. In this model, a time-harmonic electric current is injected into the sphere through a pair of drive electrodes, and electric potential is measured by the other pair of measurement electrodes. By solving the boundary value problem of the electromagnetic field, the analytical solutions of electric potential and impedance in the whole conduction region are derived. The theoretical values of electric potential on the surface of the sphere are in good accordance with the experimental results. The analytical solutions are then applied to the simulation of the forward problem of brain electrical impedance tomography (EIT). The results show that, for a real human head, the imaginary part of the electric potential is not small enough to be ignored at above 20 kHz, and there exists an approximate linear relationship between the real and imaginary parts of the electric potential when the electromagnetic parameters of the innermost layer keep unchanged. Increase in the conductivity of the innermost layer leads to a decrease of the magnitude of both real and imaginary parts of the electric potential on the scalp. However, the increase of permittivity makes the magnitude of the imaginary part of the electric potential increase while that of the real part decreases, and vice versa

  12. Defining a roadmap for harmonizing quality indicators in Laboratory Medicine: a consensus statement on behalf of the IFCC Working Group "Laboratory Error and Patient Safety" and EFLM Task and Finish Group "Performance specifications for the extra-analytical phases".

    Science.gov (United States)

    Sciacovelli, Laura; Panteghini, Mauro; Lippi, Giuseppe; Sumarac, Zorica; Cadamuro, Janne; Galoro, César Alex De Olivera; Pino Castro, Isabel Garcia Del; Shcolnik, Wilson; Plebani, Mario

    2017-08-28

    Improving the quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on the implementation of an efficient tool for obtaining meaningful information on the risk of errors developing throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group "Performance specifications for the extra-analytical phases" (TFG-PSEP) for defining performance specifications for extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performance and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 in order to bring together all the experts and interested parties to achieve a consensus for effective harmonization of quality indicators (QIs). A general agreement was achieved, and the main outcomes have been the release of a new version of the model of quality indicators (MQI), the approval of a criterion for establishing performance specifications, and the definition of the type of information that should be provided within the report to the clinical laboratories participating in the QIs project.

  13. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Seong

    1993-02-15

    This book comprises nineteen chapters describing an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis with the mechanism of precipitation, the range and calculation of results, the general principles of volumetric analysis, sedimentation methods and titration curves, acid-base balance, acid-base titration curves, complex and firing reactions, an introduction to electrochemical analysis, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarographic spectrophotometry, atomic spectrometry, solvent extraction, and chromatography, together with experiments.

  14. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters describing an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis with the mechanism of precipitation, the range and calculation of results, the general principles of volumetric analysis, sedimentation methods and titration curves, acid-base balance, acid-base titration curves, complex and firing reactions, an introduction to electrochemical analysis, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarographic spectrophotometry, atomic spectrometry, solvent extraction, and chromatography, together with experiments.

  15. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  16. The source of pretreatment serum prostate-specific antigen in clinically localized prostate cancer--T, N, or M?

    International Nuclear Information System (INIS)

    Zagars, Gunar K.; Kavadi, Vivek S.; Pollack, Alan; Eschenbach, Andrew C. von; Sands, M. Elizabeth

    1995-01-01

    Purpose: Prostate-specific antigen (PSA) is an important marker for prostate cancer and has been shown to be secreted from the primary tumor and from metastases. However, the relative contribution of the primary and micrometastatic disease to the serum level of PSA in patients with clinically localized disease has not been delineated. This study addresses the source of pretreatment serum PSA in patients with clinically localized disease. Methods and Materials: The fall in serum PSA level following radical prostatectomy (280 patients; 105 T1, 165 T2, 10 T3) or definitive radiotherapy (427 patients; 122 T1, 147 T2, 158 T3/T4) was analyzed with the assumption that any fall in PSA following local treatment reflects the fraction of PSA produced in the prostate and its primary tumor. Results: Serum PSA level became undetectable in 277 of the 280 (99%) patients within 6 months of radical prostatectomy. The three patients who did not achieve undetectable levels had postsurgical values ≤ 0.9 ng/ml. Following definitive radiotherapy, nadir serum PSA values were between ≤ 0.3 and 20.3 ng/ml, with mean and median values of 1.9 and 1.2 ng/ml, respectively. Nadir PSA was undetectable in 52 patients (12%). Four patients' PSA did not fall but rose from the start; each developed metastatic disease within 9 months, and in each case metastases appeared to contribute to the pretreatment serum PSA. In the remaining patients, the factor by which PSA fell to its nadir was larger the higher the pretreatment PSA level. We present arguments that this is most consistent with the hypothesis that virtually all detectable pretreatment serum PSA derives from the primary tumor. Confirmatory evidence that little of the pretreatment serum PSA came from metastases was obtained by extrapolating the rising PSA profile in 97 patients back to pretreatment time. Back-extrapolated PSA contributed a mean of 7% and a median of 5% to the pretreatment serum value. Because such back-extrapolated values...

  17. Material-specific Conversion Factors for Different Solid Phantoms Used in the Dosimetry of Different Brachytherapy Sources

    Directory of Open Access Journals (Sweden)

    Sedigheh Sina

    2015-07-01

    Introduction: Based on Task Group No. 43 (TG-43U1) recommendations, a water phantom is proposed as the reference phantom for the dosimetry of brachytherapy sources. The experimental determination of TG-43 parameters is usually performed in water-equivalent solid phantoms. The purpose of this study was to determine conversion factors for equalizing solid phantoms to water. Materials and Methods: TG-43 parameters of low- and high-energy brachytherapy sources (i.e., Pd-103, I-125 and Cs-137) were obtained in different phantoms, using Monte Carlo simulations. The brachytherapy sources were simulated at the center of different phantoms including water, solid water, poly(methyl methacrylate), polystyrene and polyethylene. Dosimetric parameters such as the dose rate constant, radial dose function and anisotropy function of each source were compared in different phantoms. Then, conversion factors were obtained to make phantom parameters equivalent to those of water. Results and Conclusion: Polynomial coefficients of conversion factors were obtained for all sources, allowing g(r) values obtained in different phantom materials to be quantitatively compared with, and converted to, the radial dose function in water.
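
    The conversion-factor idea reduces to fitting a smooth function to the ratio of the radial dose function in water to that in the phantom. A brief sketch with invented g(r) values (not the paper's Monte Carlo results) illustrates the polynomial fit:

        # Fit a polynomial conversion factor CF(r) = g_water(r) / g_phantom(r)
        # so phantom measurements can be converted to water. Values invented.
        import numpy as np

        r = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])                # cm
        g_water   = np.array([1.03, 1.00, 0.93, 0.85, 0.77, 0.69])  # assumed
        g_phantom = np.array([1.02, 1.00, 0.91, 0.82, 0.73, 0.64])  # assumed

        coeffs = np.polyfit(r, g_water / g_phantom, deg=2)
        print("polynomial coefficients (highest power first):", coeffs)

        # Convert a phantom measurement at r = 2.5 cm to water-equivalent g(r).
        r_q = 2.5
        g_q = np.interp(r_q, r, g_phantom) * np.polyval(coeffs, r_q)
        print("water-equivalent g(2.5 cm):", round(g_q, 3))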

  18. Using plant growth modeling to analyse C source-sink relations under drought: inter and intra specific comparison

    Directory of Open Access Journals (Sweden)

    Benoit ePallas

    2013-11-01

    The ability to assimilate C and allocate NSC (non-structural carbohydrates) to the most appropriate organs is crucial to maximize plant ecological or agronomic performance. Such C source and sink activities are differentially affected by environmental constraints. Under drought, plant growth is generally more sink than source limited, as organ expansion or appearance rate is affected earlier and more strongly than C assimilation. This favors plant survival and recovery, but not always agronomic performance, as NSC are stored rather than used for growth owing to a modified metabolism in source and sink leaves. Such interactions between plant C and water balance are complex, and plant modeling can help analyze their impact on plant phenotype. This paper addresses the impact of trade-offs between C sink and source activities on plant production under drought, combining experimental and modeling approaches. Two contrasting monocotyledonous species (rice, oil palm) were studied. Experimentally, the sink limitation of plant growth under moderate drought was confirmed, as well as the modifications in NSC metabolism in source and sink organs. Under severe stress, when the C source became limiting, plant NSC concentration decreased. Two plant models dedicated to oil palm and rice morphogenesis were used to perform a sensitivity analysis and further explore how to optimize C sink and source drought sensitivity to maximize plant growth. Modeling results highlighted that optimal drought sensitivity depends both on drought type and species, and that modeling is a great opportunity to analyse such complex processes. Further modeling needs, and more generally the challenge of using models to support complex trait breeding, are discussed.
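
    The sink-limitation argument can be made concrete with a deliberately minimal carbon-balance toy model (ours, not the rice or oil palm models used in the paper): if sink activity is assumed more drought-sensitive than source activity, moderate drought leaves surplus assimilate that accumulates as NSC while growth declines:

        # Toy source-sink carbon balance under drought. All rates and
        # sensitivities are invented to illustrate the qualitative behavior.
        def simulate(drought, days=30, s_source=0.5, s_sink=1.5):
            """Final biomass and NSC pool for a drought index in [0, 1]."""
            biomass, nsc = 1.0, 0.1
            for _ in range(days):
                supply = 0.10 * biomass * max(0.0, 1.0 - s_source * drought)  # source
                demand = 0.12 * biomass * max(0.0, 1.0 - s_sink * drought)    # sink
                growth = min(demand, supply + nsc)     # growth limited by available C
                nsc = max(0.0, nsc + supply - growth)  # surplus stored as NSC
                biomass += growth
            return biomass, nsc

        for d in (0.0, 0.3, 0.6):
            b, n = simulate(d)
            print(f"drought={d:.1f}  biomass={b:.2f}  NSC={n:.3f}")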

  19. Analytical solution of neutron transport equation in an annular reactor with a rotating pulsed source; Resolucao analitica da equacao de transporte de neutrons em um reator anelar com fonte pulsada rotativa

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, Paulo Cleber Mendonca

    2002-12-01

    In this study, an analytical solution of the neutron transport equation in an annular reactor is presented for a short, rotating neutron source of the type S(x)δ(x − Vt), where V is the speed of the annular pulsed reactor. The study is an extension of a previous study by Williams [12] carried out with a pulsed source of the type S(x)δ(t). In the new concept of an annular pulsed reactor designed to produce a continuous high flux, the core consists of a subcritical annular geometry pulsed by a rotating modulator, producing a local super prompt critical condition and thereby giving origin to a rotating neutron pulse. An analytical solution is obtained by opening up the annular geometry and applying one-energy-group transport theory in one dimension, using the applied mathematical techniques of Laplace transforms and complex variables. The general solution for the flux consists of a fundamental mode, a finite number of harmonics and a transient integral. A condition which limits the number of harmonics, depending upon the circumference of the annular geometry, has been obtained. The inverse Laplace transform technique is used to analyse instability conditions in the annular reactor core. A regenerator parameter, in conjunction with the perimeter of the ring and the nuclear properties, is used to obtain stable and unstable harmonics and to verify whether these exist. It is found that the solution does not present instability under the conditions stated in the new concept of the annular pulsed reactor. (author)

  20. Technical evaluation of the proposed changes in the technical specifications for emergency power sources for the Big Rock Point nuclear power plant

    International Nuclear Information System (INIS)

    Latorre, V.R.

    1979-12-01

    The technical evaluation is presented for the proposed changes to the Technical Specifications for emergency power sources for the Big Rock Point nuclear power plant. The criteria used to evaluate the acceptability of the changes include those delineated in IEEE Std-308-1974, and IEEE Std-450-1975 as endorsed by US NRC Regulatory Guide 1.129

  1. An analytical method for assessing stage-specific drug activity in Plasmodium vivax malaria: implications for ex vivo drug susceptibility testing.

    Directory of Open Access Journals (Sweden)

    Douglas H Kerlin

    The emergence of highly chloroquine (CQ) resistant P. vivax in Southeast Asia has created an urgent need for an improved understanding of the mechanisms of drug resistance in these parasites, the development of robust tools for defining the spread of resistance, and the discovery of new antimalarial agents. The ex vivo Schizont Maturation Test (SMT), originally developed for the study of P. falciparum, has been modified for P. vivax. We retrospectively analysed the results from 760 parasite isolates assessed by the modified SMT to investigate the relationship between parasite growth dynamics and parasite susceptibility to antimalarial drugs. Previous observations of the stage-specific activity of CQ against P. vivax were confirmed, and shown to have profound consequences for interpretation of the assay. Using a nonlinear model we show that increased duration of the assay and a higher proportion of ring stages in the initial blood sample were associated with decreased effective concentration (EC₅₀) values of CQ, and identify a threshold where these associations no longer hold. Thus, the starting composition of parasites in the SMT and the duration of the assay can have a profound effect on the calculated EC₅₀ for CQ. Our findings indicate that EC₅₀ values do not truly reflect the sensitivity of the parasite to CQ when the assay lasts less than 34 hours or when the proportion of ring stage parasites at the start of the assay does not exceed 66%. Application of this threshold modelling approach suggests that similar issues may occur for susceptibility testing of amodiaquine and mefloquine. The statistical methodology which has been developed also provides a novel means of detecting stage-specific drug activity for new antimalarials.
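
    The EC₅₀ itself comes from fitting a sigmoidal concentration-response model to growth-inhibition data. A minimal Python sketch with synthetic data (not the study's isolates, and a generic Hill model rather than the authors' specific nonlinear model) looks like this:

        # Estimate EC50 by fitting a Hill dose-response curve. Data synthetic.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, top, ec50, slope):
            """Fraction of control schizont maturation at drug concentration c."""
            return top / (1.0 + (c / ec50) ** slope)

        conc = np.array([1, 5, 10, 50, 100, 500, 1000.0])  # nM, synthetic
        resp = np.array([0.98, 0.95, 0.88, 0.55, 0.30, 0.06, 0.02])

        params, _ = curve_fit(hill, conc, resp, p0=[1.0, 50.0, 1.0])
        print(f"EC50 = {params[1]:.1f} nM, Hill slope = {params[2]:.2f}")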

  2. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph

    2016-01-01

    Background: A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  3. Radiocarbon Analysis to Calculate New End-Member Values for Biomass Burning Source Samples Specific to the Bay Area

    Science.gov (United States)

    Yoon, S.; Kirchstetter, T.; Fairley, D.; Sheesley, R. J.; Tang, X.

    2017-12-01

    Elemental carbon (EC), also known as black carbon or soot, is an important particulate air pollutant that contributes to climate forcing through absorption of solar radiation and to adverse human health impacts through inhalation. Both fossil fuel combustion and biomass burning, via residential firewood burning, agricultural burning, wild fires, and controlled burns, are significant sources of EC. Our ability to successfully control ambient EC concentrations requires understanding the contribution of these different emission sources. Radiocarbon (14C) analysis has been increasingly used as an apportionment tool to distinguish between EC from fossil fuel and biomass combustion sources. However, there are uncertainties associated with this method, including: 1) uncertainty associated with the isolation of EC to be used for radiocarbon analysis (e.g., inclusion of organic carbon, blank contamination, recovery of EC, etc.); and 2) uncertainty associated with the radiocarbon signature of the end member. The objective of this research project is to utilize laboratory experiments to evaluate some of these uncertainties, particularly for EC sources that significantly impact the San Francisco Bay Area. Source samples of EC only and a mix of EC and organic carbon (OC) were produced for this study to represent known emission sources and to approximate the mixing of EC and OC that would be present in the atmosphere. These samples include a combination of methane flame soot, various wood smoke samples (i.e. cedar, oak, sugar pine, pine at various ages, etc.), meat cooking, and smoldering cellulose smoke. EC fractions were isolated using a Sunset Laboratory thermal optical transmittance carbon analyzer. For 14C analysis, samples were sent to Woods Hole Oceanographic Institution for isotope analysis using accelerator mass spectrometry. End member values and uncertainties for the EC isolation utilizing this method will be reported.
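
    The reason end-member values matter is the two end-member mass balance used in 14C source apportionment: fossil carbon is 14C-dead, so the biomass fraction follows directly from the sample's radiocarbon content. A small sketch with typical assumed end-member values (not the study's measured results):

        # Two end-member isotope mass balance for EC source apportionment.
        # End-member values below are common assumptions, not study results.
        fm_fossil = 0.0     # fraction modern of fossil carbon (14C-dead)
        fm_biomass = 1.10   # assumed contemporary biomass end member (post-bomb)

        def biomass_fraction(fm_sample):
            """Fraction of EC from biomass burning by isotope mass balance."""
            return (fm_sample - fm_fossil) / (fm_biomass - fm_fossil)

        for fm in (0.15, 0.40, 0.75):
            print(f"fraction modern {fm:.2f} -> biomass fraction "
                  f"{biomass_fraction(fm):.2f}")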

  4. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classical analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology of instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z...

  5. FEASIBILITY OF INVESTMENT IN BUSINESS ANALYTICS

    Directory of Open Access Journals (Sweden)

    Mladen Varga

    2007-12-01

    Trends in data processing for decision support show that business users need business analytics, i.e. analytical applications which incorporate a variety of business-oriented data analysis techniques and task-specific knowledge. The paper discusses the feasibility of investment in two models of implementing business analytics: custom development and packaged analytical applications. The consequences of both models are illustrated by two examples of business analytics implementation in Croatia.

  6. Validation of an analytical method based on the high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals in soil.

    Science.gov (United States)

    Frentiu, Tiberiu; Ponta, Michaela; Hategan, Raluca

    2013-03-01

    The aim of this paper was the validation of a new analytical method based on high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals (Ag, Cd, Co, Cr, Cu, Ni, Pb and Zn) in soil after microwave-assisted digestion in aqua regia. Determinations were performed on the ContrAA 300 (Analytik Jena) air-acetylene flame spectrometer equipped with a xenon short-arc lamp as a continuum radiation source for all elements, a double monochromator consisting of a prism pre-monochromator and an echelle grating monochromator, and a charge-coupled device as detector. For validation, a method-performance study was conducted involving the establishment of the analytical performance of the new method (limits of detection and quantification, precision and accuracy). Moreover, the Bland and Altman statistical method was used in analyzing the agreement between the proposed assay and inductively coupled plasma optical emission spectrometry as the standardized method for multielemental determination in soil. The limits of detection in soil samples (3σ criterion) in the high-resolution continuum source flame atomic absorption spectrometry method were (mg/kg): 0.18 (Ag), 0.14 (Cd), 0.36 (Co), 0.25 (Cr), 0.09 (Cu), 1.0 (Ni), 1.4 (Pb) and 0.18 (Zn), close to those in inductively coupled plasma optical emission spectrometry: 0.12 (Ag), 0.05 (Cd), 0.15 (Co), 1.4 (Cr), 0.15 (Cu), 2.5 (Ni), 2.5 (Pb) and 0.04 (Zn). Accuracy was checked by analyzing 4 certified reference materials, and good agreement at the 95% confidence level was found for both methods, with recoveries in the range of 94-106% in atomic absorption and 97-103% in optical emission. Repeatability, found by analyzing real soil samples, was in the range 1.6-5.2% in atomic absorption, similar to the 1.9-6.1% in optical emission spectrometry. The Bland and Altman method showed no statistically significant difference between the two spectrometric...
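
    The Bland and Altman agreement analysis used here reduces to the mean difference between paired measurements and its 95% limits of agreement. A short sketch with synthetic paired concentrations (not the study's data):

        # Bland-Altman agreement between two methods on paired measurements.
        # The paired concentrations below are synthetic placeholders.
        import numpy as np

        aas = np.array([12.1, 35.4, 8.9, 60.2, 22.8, 45.0, 15.3])  # HR-CS FAAS, mg/kg
        icp = np.array([12.6, 34.1, 9.4, 61.5, 22.0, 46.2, 14.8])  # ICP-OES, mg/kg

        diff = aas - icp
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)  # half-width of 95% limits of agreement

        print(f"mean bias: {bias:.2f} mg/kg")
        print(f"95% limits of agreement: {bias - loa:.2f} to {bias + loa:.2f} mg/kg")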

  7. Development of an Analytic Method for Sulfur Compounds in Aged Garlic Extract with the Use of a Postcolumn High Performance Liquid Chromatography Method with Sulfur-Specific Detection.

    Science.gov (United States)

    Matsutomo, Toshiaki; Kodera, Yukihiro

    2016-02-01

    Garlic and its processed preparations contain numerous sulfur compounds that are difficult to analyze in a single run using HPLC. The aim of this study was to develop a rapid and convenient sulfur-specific HPLC method to analyze sulfur compounds in aged garlic extract (AGE). We modified a conventional postcolumn HPLC method by employing a hexaiodoplatinate reagent. Identification and structural analysis of sulfur compounds were conducted by LC-mass spectrometry (LC-MS) and nuclear magnetic resonance. The production mechanisms of cis-S-1-propenylcysteine (cis-S1PC) and S-allylmercaptocysteine (SAMC) were examined by model reactions. Our method has the following advantages: less interference from nonsulfur compounds, high sensitivity, good correlation coefficients (r > 0.98), and high resolution that can separate >20 sulfur compounds, including several isomers, in garlic preparations in a single run. This method was adapted for LC-MS analysis. We identified cis-S1PC and γ-glutamyl-S-allyl-mercaptocysteine in AGE. The results of model reactions suggest that cis-S1PC is produced from trans-S1PC through an isomerization reaction and that SAMC is produced by a reaction involving S-allylcysteine/S1PC and diallyldisulfide during the aging period. We developed a rapid postcolumn HPLC method for both qualitative and quantitative analyses of sulfur compounds, and this method helped elucidate a potential mechanism of cis-S1PC and SAMC action in AGE. © 2016 American Society for Nutrition.

  8. Application of ''Confirm tank T is an appropriate feed source for High-Level waste feed batch X'' to specific feed batches

    International Nuclear Information System (INIS)

    JO, J.

    1999-01-01

    This document addresses the characterization needs of tanks as set forth in the Data Quality Objectives for TWRS Privatization Phase I: Confirm Tank T is an Appropriate Feed Source for High-Level Waste Feed Batch X (Crawford et al. 1998). The primary purpose of this document is to collect existing data and identify the data needed to determine whether or not the feed source(s) are appropriate for a specific batch. To answer these questions, the existing tank data must be collected and a detailed review performed. If the existing data are insufficient to complete a full comparison, additional data must be obtained from the feed source(s). Additional information requirements need to be identified and formally documented, then the source tank waste must be sampled or resampled and analyzed. Once the additional data are obtained, the data shall be incorporated into the existing database for the source tank and a reevaluation of the data against the Data Quality Objective (DQO) must be made

  9. Application of ''Confirm tank T is an appropriate feed source for Low-Activity waste feed batch X'' to specific feed batches

    International Nuclear Information System (INIS)

    JO, J.

    1999-01-01

    This document addresses the characterization needs of tanks as set forth in the ''Confirm Tank T is an Appropriate Feed Source for Low-Activity Waste Feed Batch X'' Data Quality Objective (DQO) (Certa and Jo 1998). The primary purpose of this document is to collect existing data and identify the data needed to determine whether or not the feed source(s) are appropriate for a specific batch before transfer is made to the feed staging tanks. To answer these questions, the existing tank data must be collected and a detailed review performed. If the existing data are insufficient to complete a full comparison, additional data must be obtained from the feed source(s). Additional information requirements need to be identified and formally documented, then the source tank waste must be sampled or resampled and analyzed. Once the additional data are obtained, the data shall be incorporated into the existing database for the source tank and a reevaluation of the data against the DQO must be made

  10. Analytical evaluation of different carbon sources and growth stimulators on the biomass and lipid production of Chlorella vulgaris – Implications for biofuels

    International Nuclear Information System (INIS)

    Josephine, A.; Niveditha, C.; Radhika, A.; Shali, A. Brindha; Kumar, T.S.; Dharani, G.; Kirubagaran, R.

    2015-01-01

    The key challenges in lipid production from marine microalgae include the selection of an appropriate strain, optimization of the culture conditions and enhancement of the biolipid yield. This study is aimed at evaluating the optimal harvest time and the effects of Chlorella growth factor (CGF) extract, carbon sources and phytohormones on biomass and lipid production in Chlorella vulgaris. CGF, extracted from Chlorella using hot water, has been reported to possess various medicinal properties. However, in the present study, for the first time in C. vulgaris, CGF was found to be the best growth stimulator, enhancing the biomass level (1.208 kg m⁻³) significantly on day 5. Gibberellin and citrate augmented the biomass by 0.935 kg m⁻³ and 1.025 kg m⁻³. Combinations of CGF and phytohormones were more effective than combinations of CGF and carbon sources. Analysis of fatty acid methyl esters indicated that the ratio of saturated to unsaturated fatty acids is higher with cytokinin, abscisic acid and CGF, and these are also rich in short-chain carbon atoms, ideal criteria for biodiesel. Nitrogen starvation favoured the synthesis of more unsaturated than saturated fatty acids. This study shows that CGF enhances biomass and lipid significantly and thus can be used for large-scale biomass production. - Highlights: • Optimization studies revealed the 7th day to be the ideal period for harvesting Chlorella vulgaris. • Chlorella growth factor extract acted as the chief growth-promoting factor for C. vulgaris. • Chlorella growth factor with carbon sources or phytohormones was not more effective than Chlorella growth factor extract alone. • Cytokinin treatment increased the saturated fatty acid level, although the biomass production was not significant.

  11. Analytical developments in the measurements of boron, nitrate, phosphate and sulphate isotopes and case examples of discrimination of nitrogen and sulphur sources in pollution studies

    International Nuclear Information System (INIS)

    Aggarwal, J.; Sheppard, D.S.; Robinson, B.W.

    1998-01-01

    Methods are documented for the analysis of B isotopes, and of O and N isotopes in nitrates. B isotopes can be measured by negative ion thermal ionisation mass spectrometry. Nitrate is recovered from groundwaters by ion exchange and the resulting silver nitrate combusted for stable isotope gas analysis. Oxygen isotope analysis of phosphates can be carried out by generating and analysing CO₂ gas from the combustion of silver phosphate produced from aqueous samples. Sulphate in ground and surface waters can be separated and concentrated by ion exchange and precipitated as barium sulphate. This is reacted with graphite to yield CO₂ and CO, the latter being spark-discharged to CO₂, and the total CO₂ measured for oxygen isotope analysis. Barium sulphide from this reaction is converted to silver sulphide, which is reacted with cuprous oxide to give SO₂ gas for sulphur isotope measurements. A case study of the semi-rural Manakau area in New Zealand was conducted to see if nitrate isotopes could be used to detect the source of nitrate contamination (groundwater NO₃⁻-N). Nitrogen isotope (+4 to +12 per mille) coupled with oxygen isotope measurements (+5 to +9 per mille) demonstrated that the nitrogen is not sourced from fertilisers but from some combination of septic tank and animal waste. For the case study of sulphate isotope use, the sulphur and oxygen isotopic compositions of sulphate in river and lake water from seven major catchments of New Zealand were determined. The isotope analyses have allowed the distinction between natural (geological, geothermal and volcanic) and anthropogenic (fertiliser) sulphur sources. (author)

  12. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  13. Control of Orphan Sources and Other Radioactive Material in the Metal Recycling and Production Industries. Specific Safety Guide (Arabic Edition)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-09-01

    Accidents involving orphan sources and other radioactive material in the metal recycling and production industries have resulted in serious radiological accidents as well as in harmful environmental, social and economic impacts. This Safety Guide provides recommendations, the implementation of which should prevent such accidents and provide confidence that scrap metal and recycled products are safe. Contents: 1. Introduction; 2. Protection of people and the environment; 3. Responsibilities; 4. Monitoring for radioactive material; 5. Response to the discovery of radioactive material; 6. Remediation of contaminated areas; 7. Management of recovered radioactive material; Annex I: Review of events involving radioactive material in the metal recycling and production industries; Annex II: Categorization of radioactive sources; Annex III: Some examples of national and international initiatives.

  14. Control of Orphan Sources and Other Radioactive Material in the Metal Recycling and Production Industries. Specific Safety Guide

    International Nuclear Information System (INIS)

    2014-01-01

    Accidents involving orphan sources and other radioactive material in the metal recycling and production industries have resulted in serious radiological accidents as well as in harmful environmental, social and economic impacts. This Safety Guide provides recommendations, the implementation of which should prevent such accidents and provide confidence that scrap metal and recycled products are safe. Contents: 1. Introduction; 2. Protection of people and the environment; 3. Responsibilities; 4. Monitoring for radioactive material; 5. Response to the discovery of radioactive material; 6. Remediation of contaminated areas; 7. Management of recovered radioactive material; Annex I: Review of events involving radioactive material in the metal recycling and production industries; Annex II: Categorization of radioactive sources; Annex III: Some examples of national and international initiatives

  15. Control of Orphan Sources and Other Radioactive Material in the Metal Recycling and Production Industries. Specific Safety Guide (Arabic Edition)

    International Nuclear Information System (INIS)

    2014-01-01

    Accidents involving orphan sources and other radioactive material in the metal recycling and production industries have resulted in serious radiological accidents as well as in harmful environmental, social and economic impacts. This Safety Guide provides recommendations, the implementation of which should prevent such accidents and provide confidence that scrap metal and recycled products are safe. Contents: 1. Introduction; 2. Protection of people and the environment; 3. Responsibilities; 4. Monitoring for radioactive material; 5. Response to the discovery of radioactive material; 6. Remediation of contaminated areas; 7. Management of recovered radioactive material; Annex I: Review of events involving radioactive material in the metal recycling and production industries; Annex II: Categorization of radioactive sources; Annex III: Some examples of national and international initiatives

  16. Control of Orphan Sources and Other Radioactive Material in the Metal Recycling and Production Industries. Specific Safety Guide

    International Nuclear Information System (INIS)

    2012-01-01

    Accidents involving orphan sources and other radioactive material in the metal recycling and production industries have resulted in serious radiological accidents as well as in harmful environmental, social and economic impacts. This Safety Guide provides recommendations, the implementation of which should prevent such accidents and provide confidence that scrap metal and recycled products are safe. Contents: 1. Introduction; 2. Protection of people and the environment; 3. Responsibilities; 4. Monitoring for radioactive material; 5. Response to the discovery of radioactive material; 6. Remediation of contaminated areas; 7. Management of recovered radioactive material; Annex I: Review of events involving radioactive material in the metal recycling and production industries; Annex II: Categorization of radioactive sources; Annex III: Some examples of national and international initiatives.

  17. Control of Orphan Sources and Other Radioactive Material in the Metal Recycling and Production Industries. Specific Safety Guide (Spanish Edition)

    International Nuclear Information System (INIS)

    2013-01-01

    Accidents involving orphan sources and other radioactive material in the metal recycling and production industries have resulted in serious radiological accidents as well as in harmful environmental, social and economic impacts. This Safety Guide provides recommendations, the implementation of which should prevent such accidents and provide confidence that scrap metal and recycled products are safe. Contents: 1. Introduction; 2. Protection of people and the environment; 3. Responsibilities; 4. Monitoring for radioactive material; 5. Response to the discovery of radioactive material; 6. Remediation of contaminated areas; 7. Management of recovered radioactive material; Annex I: Review of events involving radioactive material in the metal recycling and production industries; Annex II: Categorization of radioactive sources; Annex III: Some examples of national and international initiatives

  18. An analytical model for the distribution of CO2 sources and sinks, fluxes, and mean concentration within the roughness sub-layer

    Science.gov (United States)

    Siqueira, M. B.; Katul, G. G.

    2009-12-01

    A one-dimensional analytical model that predicts foliage CO2 uptake rates, turbulent fluxes, and mean concentration throughout the roughness sub-layer (RSL), a layer that extends from the ground surface up to 5 times the canopy height (h), is proposed. The model combines the mean continuity equation for CO2 with first-order closure principles for turbulent fluxes and simplified physiological and radiative transfer schemes for foliage uptake. This combination results in a second-order ordinary differential equation in which soil respiration (RE) is imposed as the lower boundary condition and the CO2 concentration well above the RSL as the upper boundary condition. An inverse version of the model was tested against data sets from two contrasting ecosystems: a tropical forest (TF, h=40 m) and a managed irrigated rice canopy (RC, h=0.7 m), with good agreement noted between modeled and measured mean CO2 concentration profiles within the entire RSL. Sensitivity analysis on the model parameters revealed a plausible scaling regime between them and a dimensionless parameter defined by the ratio between external (RE) and internal (stomatal conductance) characteristics controlling the CO2 exchange process. The model can be used to infer the thickness of the RSL for CO2 exchange, the inequality in zero-plane displacement between CO2 and momentum, and its consequences on modeled CO2 fluxes. A simplified version of the solution is well suited for incorporation into large-scale climate models. Furthermore, the model framework can be used to estimate a priori the relative contributions from the soil surface and the atmosphere to canopy-air CO2 concentration, thereby making it synergetic with stable isotope studies. [Figure: normalized leaf area density profiles and modeled versus ensemble-averaged measured CO2 concentration profiles for TF and RC.]
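
    A minimal sketch of the flux-profile logic described above, assuming a prescribed (concentration-independent) foliage uptake so that the two boundary conditions can be met by direct integration; the paper's physiological coupling instead turns this into a two-point boundary-value problem. All numbers are illustrative assumptions.

    ```python
    # Steady-state continuity dF/dz = S(z) with first-order closure
    # F = -K dC/dz, a soil-respiration flux RE at the ground and a fixed
    # concentration C_top well above the canopy. Uptake S(z) is prescribed
    # here rather than coupled to physiology; values are illustrative only.
    import numpy as np

    h     = 40.0                            # canopy height [m], TF-like
    z     = np.linspace(0.0, 5 * h, 2001)   # RSL: ground to 5h
    RE    = 5.0                             # soil respiration [umol m-2 s-1]
    C_top = 380.0                           # CO2 well above the RSL [ppm-like]
    K     = 0.5 + 2.0 * z                   # eddy diffusivity [m2 s-1], assumed

    # Uniform foliage sink inside the canopy, sized so the canopy as a whole
    # is a net sink (total uptake exceeds soil respiration).
    S = np.where(z <= h, -10.0 / h, 0.0)    # sink density [umol m-3 s-1]

    # F(z) = RE + integral_0^z S dz'   (cumulative trapezoid rule)
    F = RE + np.concatenate(([0.0],
            np.cumsum(0.5 * (S[1:] + S[:-1]) * np.diff(z))))

    # C(z) = C_top + integral_z^ztop F/K dz'
    g = F / K
    I = np.concatenate(([0.0],
            np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(z))))
    C = C_top + (I[-1] - I)

    print(f"near-ground CO2 offset from C_top: {C[0] - C_top:.1f}")
    print(f"net flux at top of RSL: {F[-1]:.1f} umol m-2 s-1")
    ```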

  19. Highly macroscopically degenerated single-point ground states as source of specific heat capacity anomalies in magnetic frustrated systems

    Science.gov (United States)

    Jurčišinová, E.; Jurčišin, M.

    2018-04-01

    Anomalies of the specific heat capacity are investigated in the framework of the exactly solvable antiferromagnetic spin-1/2 Ising model in an external magnetic field on the geometrically frustrated tetrahedron recursive lattice. It is shown that the Schottky-type anomaly in the behavior of the specific heat capacity is related to the existence of unique highly macroscopically degenerated single-point ground states which are formed on the borders between neighboring plateau-like ground states. It is also shown that the very existence of these single-point ground states with large residual entropies predicts the appearance of another anomaly in the behavior of the specific heat capacity at low temperatures, namely, the field-induced double-peak structure, which exists, and should be observed experimentally, along with the Schottky-type anomaly in various frustrated magnetic systems.
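
    To illustrate the mechanism on the smallest frustrated unit, the sketch below computes the exact specific heat of a single spin-1/2 antiferromagnetic Ising tetrahedron in a field by enumerating its 16 states; the paper's full recursive-lattice treatment is not reproduced, and the parameter values are assumptions.

    ```python
    # Exact specific heat of one antiferromagnetic spin-1/2 Ising tetrahedron
    # in a field, via full enumeration of the 2^4 states. At the assumed
    # boundary field h = 1.5 J the all-up state is degenerate with the four
    # one-flip states (a degenerate single-point ground state), which drives
    # the low-temperature anomaly. Units: J = k_B = 1.
    import itertools, math

    J = 1.0          # antiferromagnetic coupling: E = +J * sum_<ij> s_i s_j
    h = 1.5          # field at the plateau boundary of this cluster (assumed)
    pairs = list(itertools.combinations(range(4), 2))   # all 6 bonds

    def specific_heat(T):
        beta = 1.0 / T
        Z = E_avg = E2_avg = 0.0
        for spins in itertools.product((-0.5, 0.5), repeat=4):
            E = J * sum(spins[i] * spins[j] for i, j in pairs) - h * sum(spins)
            w = math.exp(-beta * E)
            Z += w; E_avg += w * E; E2_avg += w * E * E
        E_avg /= Z; E2_avg /= Z
        return beta**2 * (E2_avg - E_avg**2)   # C = (<E^2> - <E>^2) / T^2

    for T in (0.05, 0.1, 0.2, 0.5, 1.0):
        print(f"T = {T:4.2f}  C = {specific_heat(T):.4f}")
    ```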

  20. Identification of heavy metals sources in the Mexico city atmosphere, using the proton induced x-ray analytical technique and multifactorial statistics techniques

    International Nuclear Information System (INIS)

    Hernandez M, B.

    1997-01-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained through the PIXE technique; to identify suitable statistical methods for treating the metal concentration data, measured in total suspended particles (PST), found in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. The results obtained are intended to serve as a basis for decision making and for the control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)

  1. Source identification of an unknown spill (2002) from Canal Ste-Anne-de-Bellevue, Quebec by the multi-criterion analytical approach and lab simulation of the spill sample

    International Nuclear Information System (INIS)

    Wang, Z.; Hollebone, B.; Yang, C.; Fingas, M.F.; Landriault, M.; Environment Canada, Ottawa, ON

    2004-01-01

    This study characterized the chemical composition of a variety of laboratory oil samples in order to determine the type, nature and sources of 3 unknown oil samples from an oil spill that occurred in Canal Ste-Anne-de-Bellevue, Quebec in 2002. An integrated multi-criterion approach using gas chromatography/mass spectrometry and gas chromatography/flame ionization detection was applied to characterize the laboratory samples. Results of chemical fingerprinting were presented. The distribution patterns of hydrocarbons in the spill and suspected source samples were recognized and compared. The study also involved an analysis of oil characteristic biomarkers and the extended suite of parent and alkylated polycyclic aromatic hydrocarbons. Several diagnostic ratios of source-specific marker compounds for fingerprint interpretation were determined. The major components in the suspected source samples were then identified. 40 refs., 5 tabs., 10 figs
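
    The diagnostic-ratio comparison used in such fingerprinting work can be sketched as follows; the ratio names, values and the 5% repeatability criterion are assumptions for illustration, not data from the study.

    ```python
    # Minimal sketch of fingerprint matching on diagnostic biomarker ratios:
    # a spill is provisionally matched to a suspected source when every shared
    # ratio agrees within a repeatability criterion. All names, values and the
    # threshold below are illustrative assumptions.

    CRITERION = 0.05   # max relative difference per ratio (assumed)

    spill  = {"C29/C30 hopane": 0.64, "Ts/Tm": 1.10, "C2-dbt/C2-phen": 0.42}
    source = {"C29/C30 hopane": 0.66, "Ts/Tm": 1.07, "C2-dbt/C2-phen": 0.43}

    def matches(a, b, criterion=CRITERION):
        """True if every shared diagnostic ratio agrees within the criterion."""
        for key in a.keys() & b.keys():
            mean = 0.5 * (a[key] + b[key])
            if abs(a[key] - b[key]) / mean > criterion:
                return False
        return True

    print("provisional match" if matches(spill, source) else "non-match")
    ```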

  2. Integrated site-specific quantification of faecal bacteria and detection of DNA markers in faecal contamination source tracking as a microbial risk tracking tool in urban Lake ecosystems

    Science.gov (United States)

    Donde, Oscar Omondi; Tian, Cuicui; Xiao, Bangding

    2017-11-01

    The presence of faecal-derived pathogens in water is responsible for several infectious diseases and deaths worldwide. As a solution, sources of faecal pollution in waters must be accurately assessed, properly determined and strictly controlled. However, this has remained challenging due to the overlapping characteristics of different members of the faecal coliform bacteria and the inadequacy of information on the contribution of seasonality and weather conditions to tracking the possible sources of pollution. There are continued efforts to improve Faecal Contamination Source Tracking (FCST) techniques such as Microbial Source Tracking (MST). This study aimed to contribute to MST by evaluating the efficacy of combining site-specific quantification of faecal contamination indicator bacteria with detection of DNA markers, while accounting for the effects of seasonality and weather conditions, in tracking the major sources of faecal contamination in a freshwater system (Donghu Lake, China). The results showed that the use of the cyd gene in addition to the lacZ and uidA genes differentiates E. coli from other closely related faecal bacteria. The use of selective media increases pollution source tracking accuracy. BSA addition boosts PCR detection and increases FCST efficiency. Seasonality and weather variability also influence the detection limit for DNA markers.

  3. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma-optical emission spectrometry), ICP/MS (inductively coupled plasma-mass spectrometry), TIMS (thermal ionization mass spectrometry) and GD/OES (glow discharge optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the excellent range of analytical tools at our disposal. (authors)

  4. Analytical and physical electrochemistry

    CERN Document Server

    Girault, Hubert H

    2004-01-01

    The study of electrochemistry is pertinent to a wide variety of fields, including bioenergetics, environmental sciences, and engineering sciences. In addition, electrochemistry plays a fundamental role in specific applications as diverse as the conversion and storage of energy and the sequencing of DNA.Intended both as a basic course for undergraduate students and as a reference work for graduates and researchers, Analytical and Physical Electrochemistry covers two fundamental aspects of electrochemistry: electrochemistry in solution and interfacial electrochemistry. By bringing these two subj

  5. Microplastics in the aquatic and terrestrial environment: sources (with a specific focus on personal care products), fate and effects.

    Science.gov (United States)

    Duis, Karen; Coors, Anja

    2016-01-01

    Due to the widespread use and durability of synthetic polymers, plastic debris occurs in the environment worldwide. In the present work, information on sources and fate of microplastic particles in the aquatic and terrestrial environment, and on their uptake and effects, mainly in aquatic organisms, is reviewed. Microplastics in the environment originate from a variety of sources. Quantitative information on the relevance of these sources is generally lacking, but first estimates indicate that abrasion and fragmentation of larger plastic items and materials containing synthetic polymers are likely to be most relevant. Microplastics are ingested and, mostly, excreted rapidly by numerous aquatic organisms. So far, there is no clear evidence of bioaccumulation or biomagnification. In laboratory studies, the ingestion of large amounts of microplastics mainly led to a lower food uptake and, consequently, reduced energy reserves and effects on other physiological functions. Based on the evaluated data, the lowest microplastic concentrations affecting marine organisms exposed via water are much higher than levels measured in marine water. In lugworms exposed via sediment, effects were observed at microplastic levels that were higher than those in subtidal sediments but in the same range as maximum levels in beach sediments. Hydrophobic contaminants are enriched on microplastics, but the available experimental results and modelling approaches indicate that the transfer of sorbed pollutants by microplastics is not likely to contribute significantly to bioaccumulation of these pollutants. Prior to being able to comprehensively assess possible environmental risks caused by microplastics a number of knowledge gaps need to be filled. However, in view of the persistence of microplastics in the environment, the high concentrations measured at some environmental sites and the prospective of strongly increasing concentrations, the release of plastics into the environment should be

  6. Monitoring of seismic events from a specific source region using a single regional array: A case study

    Science.gov (United States)

    Gibbons, S. J.; Kværna, T.; Ringdal, F.

    2005-07-01

    In the monitoring of earthquakes and nuclear explosions using a sparse worldwide network of seismic stations, it is frequently necessary to make reliable location estimates using a single seismic array. It is also desirable to screen out routine industrial explosions automatically in order that analyst resources are not wasted upon detections which can, with a high level of confidence, be associated with such a source. The Kovdor mine on the Kola Peninsula of NW Russia is the site of frequent industrial blasts which are well recorded by the ARCES regional seismic array at a distance of approximately 300 km. We describe here an automatic procedure for identifying signals which are likely to result from blasts at the Kovdor mine and, wherever possible, for obtaining single-array locations for such events. Carefully calibrated processing parameters were chosen using measurements from confirmed events at the mine over a one-year period for which the operators supplied Ground Truth information. Phase arrival times are estimated using an autoregressive method, and slowness and azimuth are estimated using broadband f-k analysis in fixed frequency bands and time-windows fixed relative to the initial P-onset time. We demonstrate the improvement to slowness estimates resulting from the use of fixed frequency bands. Events can be located using a single array if, in addition to the P-phase, at least one secondary phase is found with both an acceptable slowness estimate and a valid onset-time estimate. We evaluate the on-line system over a twelve-month period; every event known to have occurred at the mine was detected by the process, and 32 out of 53 confirmed events were located automatically. The remaining events were classified as “very likely” Kovdor events and were subsequently located by an analyst. The false alarm rate is low; only 84 very likely Kovdor events were identified during the whole of 2003 and none of these were subsequently located at a large distance from

  7. Development and analytical characterization of a Grimm-type glow discharge ion source operated with high gas flow rates and coupled to a mass spectrometer with high mass resolution

    International Nuclear Information System (INIS)

    Beyer, Claus; Feldmann, Ingo; Gilmour, Dave; Hoffmann, Volker; Jakubowski, Norbert

    2002-01-01

    A Grimm-type glow discharge ion source has been developed and was coupled to a commercial inductively coupled plasma mass spectrometer (ICP-MS) with high mass resolution (Axiom, ThermoElemental, Winsford, UK) by exchanging only the front plate of the ICP-MS interface system. In addition to high discharge powers of up to 70 W, which are typical for a Grimm-type design, this source could be operated with relatively high gas flow rates of up to 240 ml min⁻¹. In combination with a high discharge voltage, the signal intensities reach a constant level within the first 20 s after the discharge has started. An analytical characterization of this source is given, based on a calibration using the steel standard reference materials NIST 1261A-1265A. The sensitivity for the investigated elements, measured with a resolution of 4000, is in the range of 500-6000 cps per μg g⁻¹, and a relative standard deviation (R.S.D.) of the measured isotopes relative to Fe of less than 8% for the major and minor components of the sample has been achieved. Limits of detection at ng g⁻¹ levels could be obtained

  8. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.
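
    As an illustration of the kind of bibliometric signal such a platform extracts, the sketch below counts a term's yearly frequency in titles and flags step changes; the records and threshold are invented placeholders, not Helios code or project data.

    ```python
    # Minimal sketch: count how often a term appears in titles per year and
    # flag year-over-year jumps as candidate "breakthrough" points. The
    # records below are invented placeholders.
    from collections import Counter

    records = [
        (2009, "progress in cdte thin film solar cells"),
        (2010, "cdte module stability"),
        (2011, "record efficiency cdte devices"),
        (2011, "cdte back contact engineering"),
        (2012, "perovskite absorbers"),
    ]

    def yearly_counts(records, term):
        counts = Counter(year for year, title in records if term in title)
        return dict(sorted(counts.items()))

    def step_changes(counts, factor=2.0):
        """Years where the count at least doubles relative to the prior year."""
        years = sorted(counts)
        return [y for prev, y in zip(years, years[1:])
                if counts[y] >= factor * counts[prev]]

    c = yearly_counts(records, "cdte")
    print(c, "step-change years:", step_changes(c))
    ```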

  9. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was three- to sevenfold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95% uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
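
    The variance decomposition underlying these estimates can be sketched as follows; the duplicate concentrations and the 10% analytical CV are invented for illustration.

    ```python
    # Minimal sketch of the variance decomposition used in the study: the
    # total CV is estimated from genuine duplicate hair bundles, and the
    # pre-analytical component follows by subtracting the analytical CV in
    # quadrature: CV_pre = sqrt(CV_T^2 - CV_A^2). Values are invented.
    import math

    duplicates = [(0.80, 1.30), (2.1, 3.4), (0.30, 0.52), (5.0, 7.9)]  # ng/mg
    cv_analytical = 0.10   # assumed 10% analytical CV

    # CV from duplicates: sqrt(mean((d/m)^2) / 2), d = difference, m = pair mean
    ratios_sq = [((a - b) / (0.5 * (a + b))) ** 2 for a, b in duplicates]
    cv_total = math.sqrt(sum(ratios_sq) / len(ratios_sq) / 2.0)

    cv_pre = math.sqrt(max(cv_total**2 - cv_analytical**2, 0.0))
    print(f"CV_T = {cv_total:.2f}, CV_pre = {cv_pre:.2f}")
    ```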

  10. Nuclear analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  11. Nuclear analytical chemistry

    International Nuclear Information System (INIS)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection

  12. Compound-specific stable carbon isotopic composition of petroleum hydrocarbons as a tool for tracing the source of oil spills

    International Nuclear Information System (INIS)

    Li Yun; Xiong Yongqiang; Yang Wanying; Xie Yueliang; Li Siyuan; Sun Yongge

    2009-01-01

    With the increasing demand for and consumption of crude oils, oil spill accidents happen frequently during the transportation of crude oils and oil products, and the environmental hazard they pose has become increasingly serious in China. The exact identification of the source of spilled oil can act as forensic evidence in the investigation and handling of oil spill accidents. In this study, a weathering simulation experiment demonstrated that the mass loss of crude oils caused by short-term weathering mainly occurs within the first 24 h after a spill, and is dominated by the depletion of low-molecular-weight hydrocarbons (<C18 n-alkanes). Short-term weathering has no significant effect on δ13C values of individual n-alkanes (C12-C33), suggesting that a stable carbon isotope profile of n-alkanes can be a useful tool for tracing the source of an oil spill, particularly for weathered oils or those with a relatively low concentration or absence of sterane and terpane biomarkers

  13. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  14. Clinical presentation of women with pelvic source varicose veins in the perineum as a first step in the development of a disease-specific patient assessment tool.

    Science.gov (United States)

    Gibson, Kathleen; Minjarez, Renee; Ferris, Brian; Neradilek, Moni; Wise, Matthew; Stoughton, Julianne; Meissner, Mark

    2017-07-01

    Pelvic venous incompetence can cause symptomatic varicose veins in the perineum, buttock, and thigh. Presentation, symptom severity, and response to treatment of pelvic source varicose veins are not well defined. Currently available tools to measure the severity of lower extremity venous disease and its effects on quality of life may be inadequate to assess disease severity in these patients. The purpose of this study was to evaluate the histories, demographics, and clinical presentations of women with pelvic source varicose veins and to compare these data with a population of women with nonpelvic source varicose veins. A total of 72 female patients with symptomatic pelvic source varicose veins were prospectively followed up. Age, weight, height, parity, and birth weights of offspring were recorded. Both pelvic source varicose veins and saphenous incompetence were identified by duplex ultrasound. Patients were queried as to their primary symptoms, activities that made their symptoms worse, and the time when their symptoms were most prominent. Severity of disease was objectively evaluated using the revised Venous Clinical Severity Score (rVCSS) and a 10-point numeric pain rating scale (NPRS). Compared with women without a pelvic source of varicose veins (N = 1163), patients with pelvic source varicose veins were younger (mean, 44.6 ± 8.6 vs 52.6 ± 12.9 years; P < …). Women with pelvic source varicose veins are a unique subset of patients. They are younger and thinner than those with nonpelvic source varicose veins, have larger infants than the general U.S. population, and show an inverse correlation between age and pain. As the majority of premenopausal patients have increased symptoms during menses, this may be due to hormonal influence. As it is poorly associated with patient-reported discomfort, the rVCSS is a poor tool for evaluating pelvic source varicose veins. A disease-specific tool for the evaluation of pelvic source varicose veins is critically needed, and this study is a first step

  15. Specification and estimation of sources of bias affecting neurological studies in PET/MR with an anatomical brain phantom

    Energy Technology Data Exchange (ETDEWEB)

    Teuho, J., E-mail: jarmo.teuho@tyks.fi [Turku PET Centre, Turku (Finland); Johansson, J. [Turku PET Centre, Turku (Finland); Linden, J. [Turku PET Centre, Turku (Finland); Department of Mathematics and Statistics, University of Turku, Turku (Finland); Saunavaara, V.; Tolvanen, T.; Teräs, M. [Turku PET Centre, Turku (Finland)

    2014-01-11

    Selection of reconstruction parameters has an effect on image quantification in PET, with an additional contribution from the scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed by using an anatomical brain phantom, to identify and measure the amount of bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated by using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis for measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of measurement and comparison of 10 anatomical VOIs, where the HRRT was considered the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map that ignores the bone. Thus, further improvements for the PET/MR reconstruction and attenuation correction could be achieved by optimization of RAMLA-specific reconstruction parameters and implementation of bone in the attenuation template. -- Highlights: • Comparison between PET, PET/CT and PET/MR was performed with a novel brain phantom. • The performance of reconstruction and attenuation correction in PET/MR was studied. • A recently developed brain phantom was found feasible for PET/MR imaging. • Contrast reduction

  16. Specification and estimation of sources of bias affecting neurological studies in PET/MR with an anatomical brain phantom

    Science.gov (United States)

    Teuho, J.; Johansson, J.; Linden, J.; Saunavaara, V.; Tolvanen, T.; Teräs, M.

    2014-01-01

    Selection of reconstruction parameters has an effect on image quantification in PET, with an additional contribution from the scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed by using an anatomical brain phantom, to identify and measure the amount of bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated by using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis for measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of measurement and comparison of 10 anatomical VOIs, where the HRRT was considered the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map that ignores the bone. Thus, further improvements for the PET/MR reconstruction and attenuation correction could be achieved by optimization of RAMLA-specific reconstruction parameters and implementation of bone in the attenuation template.

  17. Substrate specificity of glucose dehydrogenase and carbon source utilization pattern of pantoea dispersa strain P2 and its radiation induced mutants

    International Nuclear Information System (INIS)

    Lee, Young Keun; Murugesan, Senthilkumar

    2009-01-01

    Mineral phosphate solubilizing Pantoea dispersa strain P2 produced 5.5 mM and 42.6 mM of gluconic acid after 24 h and 72 h of incubation, respectively. Strain P2 exhibited a glucose dehydrogenase (GDH) specific activity of 0.32 IU mg⁻¹ protein. We studied the substrate specificity of GDH as well as the carbon source utilization pattern of strain P2. GDH of strain P2 did not use ribose as a substrate. Utilization of lactose, with a specific activity of 0.65 IU mg⁻¹ protein, indicated that the enzyme belongs to the type B GDH isozyme. Arabinose, galactose, ribose, sucrose and xylose did not induce the synthesis of the GDH enzyme, while mannose induced the synthesis of GDH with the highest specific activity of 0.58 IU mg⁻¹ protein. Through radiation mutagenesis, the substrate specificity of GDH was modified in order to utilize a wide range of the sugars available in root exudates. Ribose, originally not a substrate for GDH of strain P2, was utilized as a substrate by mutants P2-M5 and P2-M6, with specific activities of 0.44 and 0.57 IU mg⁻¹ protein, respectively. The specific activity of GDH on media containing lactose and galactose was also improved, to 1.2 and 0.52 IU mg⁻¹ protein in P2-M5 and P2-M6, respectively. Based on the carbon source availability in root exudates, the mutants can be selected and utilized as efficient biofertilizers under P-deficient soil conditions

  18. Substrate specificity of glucose dehydrogenase and carbon source utilization pattern of pantoea dispersa strain P2 and its radiation induced mutants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Keun; Murugesan, Senthilkumar [Korea Atomic Energy Research Institute, Jeongeup (Korea, Republic of)

    2009-06-15

    Mineral phosphate solubilizing Pantoea dispersa strain P2 produced 5.5 mM and 42.6 mM of gluconic acid after 24 h and 72 h of incubation, respectively. Strain P2 exhibited a glucose dehydrogenase (GDH) specific activity of 0.32 IU mg⁻¹ protein. We studied the substrate specificity of GDH as well as the carbon source utilization pattern of strain P2. GDH of strain P2 did not use ribose as a substrate. Utilization of lactose, with a specific activity of 0.65 IU mg⁻¹ protein, indicated that the enzyme belongs to the type B GDH isozyme. Arabinose, galactose, ribose, sucrose and xylose did not induce the synthesis of the GDH enzyme, while mannose induced the synthesis of GDH with the highest specific activity of 0.58 IU mg⁻¹ protein. Through radiation mutagenesis, the substrate specificity of GDH was modified in order to utilize a wide range of the sugars available in root exudates. Ribose, originally not a substrate for GDH of strain P2, was utilized as a substrate by mutants P2-M5 and P2-M6, with specific activities of 0.44 and 0.57 IU mg⁻¹ protein, respectively. The specific activity of GDH on media containing lactose and galactose was also improved, to 1.2 and 0.52 IU mg⁻¹ protein in P2-M5 and P2-M6, respectively. Based on the carbon source availability in root exudates, the mutants can be selected and utilized as efficient biofertilizers under P-deficient soil conditions.

  19. Radiation dose of cardiac dual-source CT: The effect of tailoring the protocol to patient-specific parameters

    International Nuclear Information System (INIS)

    Alkadhi, Hatem; Stolzmann, Paul; Scheffel, Hans; Desbiolles, Lotus; Baumueller, Stephan; Plass, Andre; Genoni, Michele; Marincek, Borut; Leschka, Sebastian

    2008-01-01

    Objective: To determine the radiation doses and image quality of different dual-source computed tomography coronary angiography (CTCA) protocols tailored to the heart rate (HR) and body mass index (BMI) of the patients. Materials and methods: Two hundred consecutive patients (68 women; mean age 61 ± 9 years) underwent either helical CTCA with retrospective ECG-gating or sequential CTCA with prospective ECG-triggering: 50 patients (any BMI, any HR) were examined with a standard, non-tailored protocol (helical CTCA, 120 kV, 330 mAs), whereas the other 150 patients were examined with a tailored protocol: 40 patients (group A, BMI ≤ 25 kg/m², HR ≤ 70 bpm) with sequential CTCA (100 kV, 190 mAs_ref), 43 patients (group B, BMI ≤ 25 kg/m², HR > 70 bpm) with helical CTCA (100 kV, 220 mAs), 28 patients (group C, BMI > 25 kg/m², HR ≤ 70 bpm) with sequential CTCA (120 kV, 330 mAs_ref), and 39 patients (group D, BMI > 25 kg/m², HR > 70 bpm) with helical CTCA (120 kV, 330 mAs). Effective radiation dose estimates were calculated from the dose-length-product for each patient. Image quality was classified as diagnostic or non-diagnostic in each coronary segment. Results: Image quality was diagnostic in 2403/2460 (98%) and non-diagnostic in 57/2460 (2%) of all coronary segments. No significant differences in image quality were found among the five CTCA protocols (p = 0.78). The non-tailored helical CTCA protocol was associated with a radiation dose of 9.0 ± 1.0 mSv, significantly higher than that of sequential CTCA (group A: 1.3 ± 0.3 mSv, p < …), but not significantly different from that of tailored helical CTCA in patients with HR > 70 bpm (group D: 8.5 ± 0.9 mSv, p = 0.51). Conclusions: Dual-source CTCA is associated with radiation doses ranging between 1.3 and 9.0 mSv, depending on the protocol used. Tailoring of the CTCA protocol to the HR and BMI of the individual patient results in dose reductions of up to 86%, while maintaining a diagnostic image quality of the examination
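
    The dose estimate mentioned in the abstract follows the standard conversion E = k × DLP. The sketch below uses an assumed chest conversion coefficient and DLP values chosen only to land in the reported dose range; none of these numbers come from the study.

    ```python
    # Minimal sketch: effective dose from the dose-length-product, E = k * DLP,
    # with k a region-specific conversion coefficient. For chest/cardiac CT a
    # value around 0.014-0.017 mSv mGy^-1 cm^-1 is commonly used; the exact k
    # and the DLP values below are assumptions for illustration.

    K_CHEST = 0.014   # mSv per (mGy*cm), assumed conversion coefficient

    def effective_dose_msv(dlp_mgy_cm, k=K_CHEST):
        return k * dlp_mgy_cm

    for protocol, dlp in [("sequential, 100 kV", 95), ("helical, 120 kV", 640)]:
        print(f"{protocol}: DLP {dlp} mGy*cm -> E = {effective_dose_msv(dlp):.1f} mSv")
    ```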

  20. Alu polymerase chain reaction: A method for rapid isolation of human-specific sequences from complex DNA sources

    International Nuclear Information System (INIS)

    Nelson, D.L.; Ledbetter, S.A.; Corbo, L.; Victoria, M.F.; Ramirez-Solis, R.; Webster, T.D.; Ledbetter, D.H.; Caskey, C.T.

    1989-01-01

    Current efforts to map the human genome are focused on individual chromosomes or smaller regions and frequently rely on the use of somatic cell hybrids. The authors report the application of the polymerase chain reaction to direct amplification of human DNA from hybrid cells containing regions of the human genome in rodent cell backgrounds using primers directed to the human Alu repeat element. They demonstrate Alu-directed amplification of a fragment of the human HPRT gene from both hybrid cell and cloned DNA and identify through sequence analysis the Alu repeats involved in this amplification. They also demonstrate the application of this technique to identify the chromosomal locations of large fragments of the human X chromosome cloned in a yeast artificial chromosome and the general applicability of the method to the preparation of DNA probes from cloned human sequences. The technique allows rapid gene mapping and provides a simple method for the isolation and analysis of specific chromosomal regions

  1. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
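
    A minimal sketch of the monitoring procedure: compute monthly medians of patient results and flag months whose deviation from the long-term median exceeds the allowable bias. All values, and the 2% bias limit, are illustrative assumptions.

    ```python
    # Compare each monthly median of patient results against the long-term
    # median and flag months whose percentage deviation exceeds an allowable
    # bias derived from biological variation. Data are invented placeholders.
    from statistics import median

    monthly_results = {
        "2015-01": [4.2, 4.3, 4.1, 4.4, 4.3],
        "2015-02": [4.3, 4.2, 4.4, 4.3, 4.2],
        "2015-03": [4.6, 4.7, 4.5, 4.8, 4.6],   # drifting month
    }
    allowable_bias_pct = 2.0    # assumed desirable bias specification

    grand = median(v for vals in monthly_results.values() for v in vals)
    for month, vals in monthly_results.items():
        dev = 100.0 * (median(vals) - grand) / grand
        flag = "FLAG" if abs(dev) > allowable_bias_pct else "ok"
        print(f"{month}: median {median(vals):.2f}, deviation {dev:+.1f}% {flag}")
    ```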

  2. Indigenous Saccharomyces cerevisiae yeasts as a source of biodiversity for the selection of starters for specific fermentations

    Directory of Open Access Journals (Sweden)

    Capece Angela

    2014-01-01

    Long-standing studies on wine yeasts have led to the wide diffusion of fermentations inoculated with commercial starters, mainly of Saccharomyces. Although the use of starter cultures has improved the reproducibility of wine quality, the main drawback of this practice is the lack of the typical traits of wines produced by spontaneous fermentation. These findings have pushed wine researchers and winemakers towards the selection of autochthonous strains as starter cultures. The objective of this study was to investigate the biodiversity of 167 S. cerevisiae yeasts isolated from spontaneous fermentation of grapes. The genetic variability of the isolates was evaluated by PCR amplification of the inter-δ region with primer pair δ2/δ12. The same isolates were investigated for characteristics of oenological interest, such as resistance to sulphur dioxide, ethanol and copper, and hydrogen sulphide production. On the basis of the technological and molecular results, 20 strains were chosen and tested in inoculated fermentations at laboratory scale. The experimental wines were analyzed for the content of some by-products correlated with wine aroma, such as higher alcohols, acetaldehyde, ethyl acetate and acetic acid. One selected strain was used as a starter culture to perform fermentation at cellar level. The selection program followed during this research project represents an optimal combination of two different trends in modern winemaking: the use of S. cerevisiae starter cultures and starter culture selection for specific fermentations.

  3. Compound specific stable isotopes as probes for distinguishing the sources of biomolecules in terrestrial and extraterrestrial materials

    Science.gov (United States)

    Engel, M. H.; Macko, S. A.

    2003-04-01

    Life on Earth consists of orderly arrangements of several key types of organic compounds (amino acids, sugars, fatty acids, nucleic bases) that are the building blocks of proteins, carbohydrates, lipids and nucleotides. Subsequent to death, macromolecules are commonly broken down to their molecular constituents or other similar scale components. Thus, in ancient terrestrial and extraterrestrial materials, it is far more likely to expect the presence of simple compounds such as amino acids rather than the proteins from which they were possibly derived. Given that amino acids, for example, are common components of all extinct and extant organisms, the challenge has been to develop methods for distinguishing their sources. Stable isotopes are powerful probes for determining the origins of organic matter. Amino acid constituents of all organisms on Earth exhibit characteristic stable isotope compositions owing to fractionations associated with their biosynthesis. These fractionations are distinct from those observed for amino acids formed by abiotic processes. Thus it should be possible to use isotopes as probes for determining whether amino acids in ancient rocks on Earth are biotic or abiotic, based on their relative isotopic compositions. Also, owing to differences in the isotope compositions of precursors, amino acids in extraterrestrial materials such as carbonaceous meteorites are moderately to substantially enriched in the heavy isotopes of C, N and H relative to terrestrial amino acids. Assuming that the isotope compositions of the gaseous components of, for example, the Martian atmosphere were distinct from Earth at such time when organic molecules may have formed, it should be possible to distinguish these components from terrestrial contaminants by determining their isotope compositions and/or those of their respective enantiomers. Also, if life as we know it existed on another planet such as Mars, fractionations characteristic of biosynthesis should be

  4. Opinions on Drug Interaction Sources in Anticancer Treatments and Parameters for an Oncology-Specific Database by Pharmacy Practitioners in Asia

    Directory of Open Access Journals (Sweden)

    2010-01-01

    Cancer patients undergoing chemotherapy are particularly susceptible to drug-drug interactions (DDIs). Practitioners should keep themselves updated with the most current DDI information, particularly involving new anticancer drugs (ACDs). Databases can be useful for obtaining up-to-date DDI information in a timely and efficient manner. Our objective was to investigate the DDI information sources of pharmacy practitioners in Asia and their views on the usefulness of an oncology-specific database for ACD interactions. A qualitative, cross-sectional survey was conducted to collect information on the respondents' practice characteristics, sources of DDI information and parameters useful in an ACD interaction database. The response rate was 49%. Electronic databases (70%), drug interaction textbooks (69%) and drug compendia (64%) were most commonly used. The majority (93%) indicated that a database catering to ACD interactions would be useful. Essential parameters that should be included in the database were the mechanism and severity of the detected interaction, and the presence of a management plan (98% each). This study has improved our understanding of the usefulness of various DDI information sources for ACD interactions among pharmacy practitioners in Asia. An oncology-specific DDI database targeting ACD interactions is definitely attractive for clinical practice.

  5. Production of sealed 60Co and 192Ir sources of high specific activity in the nuclear reactor RA

    International Nuclear Information System (INIS)

    Dobrijevic, R.; Vucina, J.

    1998-01-01

    A review is given of the development of 60Co and 192Ir production in the nuclear reactor RA at the Vinca Institute. The experience gained showed that this reactor was suitable for obtaining these and some other radionuclides. One possibility for its re-start is that the performances of the reactor remain the same (power 6.5 MW, maximum neutron flux up to 6×10¹³ n cm⁻² s⁻¹). By applying new techniques of target preparation, 60Co for sterilization units with a specific activity of 1.11 TBq/g could be produced. The maximal activity of 192Ir would be about 1.48 TBq, which is satisfactory for gamma radiography sources. An increase of the flux to 10¹⁴ n cm⁻² s⁻¹ would enable the production of 60Co with specific activities of about 3.335 TBq/g. This is satisfactory for radiation therapy sources with activities up to 111 TBq and for gamma radiography sources with activities of about 0.37 TBq. In the case of 192Ir, radiation therapy sources with activities of about 0.37 TBq could be obtained. The maximal achievable activity of 192Ir would be about 3.7 TBq. (author)
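
    The specific activities quoted here follow from the standard activation relation; the sketch below evaluates it for a Co-59 target, neglecting target burn-up and self-shielding, with assumed cross-section, flux and irradiation times.

    ```python
    # Specific activity of a target irradiated at thermal flux phi:
    #   A(t) = (N_A / M) * sigma * phi * (1 - exp(-lambda * t))   [Bq per gram]
    # This neglects burn-up and self-shielding, so it overestimates somewhat;
    # cross-section, flux and times below are assumptions for illustration.
    import math

    N_A    = 6.022e23                 # Avogadro's number [1/mol]
    M      = 59.0                     # molar mass of Co-59 [g/mol]
    sigma  = 37.0e-24                 # thermal (n,gamma) cross-section [cm^2]
    phi    = 1.0e14                   # thermal neutron flux [n cm^-2 s^-1]
    t_half = 5.27 * 365.25 * 86400    # Co-60 half-life [s]
    lam    = math.log(2) / t_half

    def specific_activity_tbq_per_g(t_seconds):
        a_bq = (N_A / M) * sigma * phi * (1.0 - math.exp(-lam * t_seconds))
        return a_bq / 1e12

    for years in (0.5, 1, 2, 5):
        t = years * 365.25 * 86400
        print(f"{years:>3} y irradiation: {specific_activity_tbq_per_g(t):6.2f} TBq/g")
    ```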

  6. Bacterial indicator occurrence and the use of an F+ specific RNA coliphage assay to identify fecal sources in Homosassa Springs, Florida

    Science.gov (United States)

    Griffin, Dale W.; Stokes, Rodger; Rose, J.B.; Paul, J.H.

    2000-01-01

    A microbiological water quality study of Homosassa Springs State Wildlife Park (HSSWP) and surrounding areas was undertaken. Samples were collected in November of 1997 (seven sites) and again in November of 1998 (nine sites). Fecal bacterial concentrations (total and fecal coliforms, Clostridium perfringens, and enterococci) were measured as relative indicators of fecal contamination. F+-specific coliphage genotyping was performed to determine the source of fecal contamination at the study sites. Bacterial levels were considerably higher at most sites in the 1997 sampling compared to the 1998 sampling, probably because of the greater rainfall that year. In November of 1997, 2 of the 7 sites were in violation of all indicator standards and guidance levels. In November of 1998, 1 of 9 sites was in violation of all indicator standard and guidance levels. The highest concentrations of all fecal indicators were found at a station downstream of the animal holding pens in HSSWP. The lowest levels of indicators were found at the Homosassa Main Spring vent. Levels of fecal indicators downstream of HSSWP (near the point of confluence with the river) were equivalent to those found in the Southeastern Fork and areas upstream of the park influences. F+ specific RNA coliphage analysis indicated that fecal contamination at all sites that tested positive was from animal sources (mammals and birds). These results suggest that animal (indigenous and those in HSSWP) and not human sources influenced microbial water quality in the area of Homosassa River covered by this study.

  7. Population of computational rabbit-specific ventricular action potential models for investigating sources of variability in cellular repolarisation.

    Directory of Open Access Journals (Sweden)

    Philip Gemmell

    Variability is observed at all levels of cardiac electrophysiology. Yet, the underlying causes and importance of this variability are generally unknown, and difficult to investigate with current experimental techniques. The aim of the present study was to generate populations of computational ventricular action potential models that reproduce experimentally observed intercellular variability of repolarisation (represented by action potential duration) and to identify its potential causes. A systematic exploration of the effects of simultaneously varying the magnitude of six transmembrane current conductances (transient outward, rapid and slow delayed rectifier K(+), inward rectifying K(+), L-type Ca(2+), and Na(+)/K(+) pump currents) in two rabbit-specific ventricular action potential models (Shannon et al. and Mahajan et al.) at multiple cycle lengths (400, 600, 1,000 ms) was performed. This was accomplished with distributed computing software specialised for multi-dimensional parameter sweeps and grid execution. An initial population of 15,625 parameter sets was generated for both models at each cycle length. Action potential durations of these populations were compared to experimentally derived ranges for rabbit ventricular myocytes. 1,352 parameter sets for the Shannon model and 779 parameter sets for the Mahajan model yielded action potential durations within the experimental range, demonstrating that a wide array of ionic conductance values can be used to simulate a physiological rabbit ventricular action potential. Furthermore, by using clutter-based dimension reordering, a technique that allows visualisation of multi-dimensional spaces in two dimensions, the interaction of current conductances and their relative importance to the ventricular action potential at different cycle lengths were revealed. Overall, this work represents an important step towards a better understanding of the role that variability in current conductances may play in

  8. Population of computational rabbit-specific ventricular action potential models for investigating sources of variability in cellular repolarisation.

    Science.gov (United States)

    Gemmell, Philip; Burrage, Kevin; Rodriguez, Blanca; Quinn, T Alexander

    2014-01-01

    Variability is observed at all levels of cardiac electrophysiology. Yet, the underlying causes and importance of this variability are generally unknown, and difficult to investigate with current experimental techniques. The aim of the present study was to generate populations of computational ventricular action potential models that reproduce experimentally observed intercellular variability of repolarisation (represented by action potential duration) and to identify its potential causes. A systematic exploration of the effects of simultaneously varying the magnitude of six transmembrane current conductances (transient outward, rapid and slow delayed rectifier K(+), inward rectifying K(+), L-type Ca(2+), and Na(+)/K(+) pump currents) in two rabbit-specific ventricular action potential models (Shannon et al. and Mahajan et al.) at multiple cycle lengths (400, 600, 1,000 ms) was performed. This was accomplished with distributed computing software specialised for multi-dimensional parameter sweeps and grid execution. An initial population of 15,625 parameter sets was generated for both models at each cycle length. Action potential durations of these populations were compared to experimentally derived ranges for rabbit ventricular myocytes. 1,352 parameter sets for the Shannon model and 779 parameter sets for the Mahajan model yielded action potential duration within the experimental range, demonstrating that a wide array of ionic conductance values can be used to simulate a physiological rabbit ventricular action potential. Furthermore, by using clutter-based dimension reordering, a technique that allows visualisation of multi-dimensional spaces in two dimensions, the interaction of current conductances and their relative importance to the ventricular action potential at different cycle lengths were revealed. Overall, this work represents an important step towards a better understanding of the role that variability in current conductances may play in experimentally
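
    A minimal sketch of the population-of-models workflow described in these two records: enumerate a 5-point grid over six conductance scaling factors (5⁶ = 15,625 combinations) and keep the sets whose simulated action potential duration falls within an experimental range. The AP model itself is replaced by a placeholder, and the grid and APD range are assumptions.

    ```python
    # Generate the 5**6 = 15,625 conductance-scaling combinations, "run" each,
    # and accept parameter sets whose APD falls in an assumed experimental
    # range. run_model() is a placeholder for a real rabbit ventricular AP
    # simulation (e.g. the Shannon or Mahajan model).
    import itertools, random

    CONDUCTANCES = ["g_to", "g_Kr", "g_Ks", "g_K1", "g_CaL", "g_NaK"]
    SCALES = (0.5, 0.75, 1.0, 1.25, 1.5)       # assumed grid per conductance
    APD_RANGE = (140.0, 230.0)                  # assumed experimental APD90 [ms]

    def run_model(params, cycle_length_ms=400):
        """Placeholder: return APD90 [ms]; a real study runs the AP model."""
        random.seed(hash(tuple(sorted(params.items()))) & 0xFFFFFFFF)
        return random.uniform(100.0, 280.0)

    accepted = []
    for scales in itertools.product(SCALES, repeat=len(CONDUCTANCES)):
        params = dict(zip(CONDUCTANCES, scales))
        if APD_RANGE[0] <= run_model(params) <= APD_RANGE[1]:
            accepted.append(params)

    print(f"{len(accepted)} of {5 ** len(CONDUCTANCES)} parameter sets accepted")
    ```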

  9. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager (MAM), an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  10. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  11. Distinguishing between old and modern permafrost sources in the northeast Siberian land-shelf system with compound-specific δ2H analysis

    Science.gov (United States)

    Vonk, Jorien E.; Tesi, Tommaso; Bröder, Lisa; Holmstrand, Henry; Hugelius, Gustaf; Andersson, August; Dudarev, Oleg; Semiletov, Igor; Gustafsson, Örjan

    2017-08-01

    Pleistocene ice complex permafrost deposits contain roughly a quarter of the organic carbon (OC) stored in permafrost (PF) terrain. When permafrost thaws, its OC is remobilized into the (aquatic) environment where it is available for degradation, transport or burial. Aquatic or coastal environments contain sedimentary reservoirs that can serve as archives of past climatic change. As permafrost thaw is increasing throughout the Arctic, these reservoirs are important locations to assess the fate of remobilized permafrost OC. We here present compound-specific deuterium (δ2H) analysis on leaf waxes as a tool to distinguish between OC released from thawing Pleistocene permafrost (ice complex deposits; ICD) and from thawing Holocene permafrost (from near-surface soils). Bulk geochemistry (%OC; δ13C; %total nitrogen, TN) was analyzed as well as the concentrations and δ2H signatures of long-chain n-alkanes (C21 to C33) and mid- to long-chain n-alkanoic acids (C16 to C30) extracted from both ICD-PF samples (n = 9) and modern vegetation and O-horizon (topsoil-PF) samples (n = 9) from across the northeast Siberian Arctic. Results show that these topsoil-PF samples have higher %OC, higher OC / TN values and more depleted δ13C-OC values than ICD-PF samples, suggesting that the former samples trace a fresher soil and/or vegetation source. Whereas the two investigated sources differ on the bulk geochemical level, they are, however, virtually indistinguishable when using leaf wax concentrations and ratios. However, on the molecular isotope level, leaf wax biomarker δ2H values are statistically different between topsoil PF and ICD PF. For example, the mean δ2H value of C29 n-alkane was -246 ± 13 ‰ (mean ± SD) for topsoil PF and -280 ± 12 ‰ for ICD PF. With a dynamic isotopic range (difference between two sources) of 34 to 50 ‰, the isotopic fingerprints of individual, abundant, biomarker molecules from leaf waxes can thus serve as endmembers to distinguish between
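
    Given the two endmember means reported above, apportioning sedimentary leaf-wax OC between the two permafrost pools is a linear two-endmember mixing calculation. The sketch below is illustrative only: the sample value is hypothetical, and a real application would propagate the reported ±12-13 ‰ spreads into the fraction estimate.

```python
# Two-endmember isotope mixing using the reported C29 n-alkane d2H means.
D2H_TOPSOIL_PF = -246.0   # permil (Holocene near-surface permafrost)
D2H_ICD_PF = -280.0       # permil (Pleistocene ice complex deposits)

def fraction_topsoil(d2h_sample):
    """Fraction of leaf-wax OC from the topsoil-PF endmember (0..1)."""
    return (d2h_sample - D2H_ICD_PF) / (D2H_TOPSOIL_PF - D2H_ICD_PF)

print(round(fraction_topsoil(-263.0), 2))  # hypothetical sediment value -> 0.5
```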

  12. What are Segments in Google Analytics

    Science.gov (United States)

    Segments find all sessions that meet a specific condition. You can then apply this segment to any report in Google Analytics (GA). Segments are a way of identifying sessions and users while filters identify specific events, like pageviews.
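
    The session-versus-event distinction is easy to blur, so a toy sketch may help; this is plain Python over a made-up session list, not the Google Analytics API.

```python
# Toy illustration of GA's segment/filter distinction (not the GA API).
sessions = [
    {"id": 1, "events": [{"type": "pageview", "page": "/home"},
                         {"type": "purchase", "value": 30}]},
    {"id": 2, "events": [{"type": "pageview", "page": "/docs"}]},
]

# Segment: whole sessions that meet a condition (here: contain a purchase).
segment = [s for s in sessions
           if any(e["type"] == "purchase" for e in s["events"])]

# Filter: individual events of a given kind, regardless of session.
pageviews = [e for s in sessions for e in s["events"]
             if e["type"] == "pageview"]

print(len(segment), "session(s) in segment;", len(pageviews), "pageview event(s)")
```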

  13. 561 SOURCE SPECIFIC QUANTIFICATION, CHARACTERISATION ...

    African Journals Online (AJOL)

    Osondu

    2013-09-02

    Sep 2, 2013 ... Lapai town lacks data on quantity of waste generated and their characteristics for efficient and sustainable ... crude open dump sites, burning without air and water pollution control, the breeding of flies and vermin, and the ...

  14. 561 SOURCE SPECIFIC QUANTIFICATION, CHARACTERISATION ...

    African Journals Online (AJOL)

    Osondu

    2013-09-02

    Sep 2, 2013 ... the least with 2%. For efficient and sustainable solid waste management in Lapai it is recommended that Lapai ... and for residential, industrial and commercial places. .... Rice/Cement sacks. 18% ... now an environmental nuisance and health risk. A typical .... http://www.nyc.gov/html/dos/pdf/wprr/wprro6.pdf,.

  15. Analytical admittance characterization of high mobility channel

    Energy Technology Data Exchange (ETDEWEB)

    Mammeri, A. M.; Mahi, F. Z., E-mail: fati-zo-mahi2002@yahoo.fr [Institute of Science and Technology, University of Bechar (Algeria); Varani, L. [Institute of Electronics of the South (IES - CNRS UMR 5214), University of Montpellier (France)

    2015-03-30

    In this contribution, we investigate the small-signal admittance of high-electron-mobility-transistor field-effect channels, under branching of the current between channel and gate, by using an analytical model. The analytical approach takes into account the linearization of the 2D Poisson equation and the drift current along the channel. The analytical equations describe the dependence of the admittance at the source and drain terminals on frequency and on the geometrical transistor parameters.

  16. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The requirement of the ISO/IEC 17025 standard for testing and calibration laboratories to quantify the uncertainty of measurement results, and the fact that this standard is used as a basis for the development and implementation of quality management systems in many laboratories performing nuclear analytical measurements, triggered the demand for specific guidance to cover uncertainty issues of nuclear analytical methods. The demand was recognized by the IAEA and a series of examples was worked out by a group of consultants in 1998. The diversity and complexity of the topics addressed delayed the publication of a technical guidance report, but the exchange of views among the experts was also beneficial and led to numerous improvements and additions with respect to the initial version. This publication is intended to assist scientists using nuclear analytical methods in assessing and quantifying the sources of uncertainty of their measurements. The numerous examples provide a tool for applying the principles elaborated in the GUM and EURACHEM/CITAC publications to their specific fields of interest and for complying with the requirements of current quality management standards for testing and calibration laboratories. It also provides a means for the worldwide harmonization of approaches to uncertainty quantification and thereby contributes to enhanced comparability and competitiveness of nuclear analytical measurements
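
    For orientation, the central GUM relation that such guidance elaborates is the law of propagation of uncertainty: for a measurand y = f(x1, ..., xN) with uncorrelated input estimates,

```latex
u_c(y) = \sqrt{\sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)},
\qquad U = k \, u_c(y),
```

    where the u(x_i) are the Type A or Type B standard uncertainties of the inputs and k is the coverage factor (conventionally k = 2 for a coverage probability of about 95 %).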

  17. Visual Analytics for Heterogeneous Geoscience Data

    Science.gov (United States)

    Pan, Y.; Yu, L.; Zhu, F.; Rilee, M. L.; Kuo, K. S.; Jiang, H.; Yu, H.

    2017-12-01

    Geoscience data obtained from diverse sources have been routinely leveraged by scientists to study various phenomena. The principal data sources include observations and model simulation outputs. These data are characterized by spatiotemporal heterogeneity originated from different instrument design specifications and/or computational model requirements used in data generation processes. Such inherent heterogeneity poses several challenges in exploring and analyzing geoscience data. First, scientists often wish to identify features or patterns co-located among multiple data sources to derive and validate certain hypotheses. Heterogeneous data make it a tedious task to search such features in dissimilar datasets. Second, features of geoscience data are typically multivariate. It is challenging to tackle the high dimensionality of geoscience data and explore the relations among multiple variables in a scalable fashion. Third, there is a lack of transparency in traditional automated approaches, such as feature detection or clustering, in that scientists cannot intuitively interact with their analysis processes and interpret results. To address these issues, we present a new scalable approach that can assist scientists in analyzing voluminous and diverse geoscience data. We expose a high-level query interface that allows users to easily express their customized queries to search features of interest across multiple heterogeneous datasets. For identified features, we develop a visualization interface that enables interactive exploration and analytics in a linked-view manner. Specific visualization techniques, ranging from scatter plots to parallel coordinates, are employed in each view to allow users to explore various aspects of features. Different views are linked and refreshed according to user interactions in any individual view. In such a manner, a user can interactively and iteratively gain insight into the data through a variety of visual analytics operations. We

  18. Do site-specific radiocarbon measurements reflect localized distributions of 14C in biota inhabiting a wetland with point contamination sources?

    Science.gov (United States)

    Yankovich, T; King-Sharp, K J; Benz, M L; Carr, J; Killey, R W D; Beresford, N A; Wood, M D

    2013-12-01

    Duke Swamp is a wetland ecosystem that receives (14)C via a groundwater pathway originating from a waste management area on Atomic Energy of Canada Limited's Chalk River Laboratories site. This groundwater reaches the surface of the swamp, resulting in relatively high (14)C levels over an area of 146 m(2). The objective of this study was to quantify (14)C concentrations in flora and fauna inhabiting areas of Duke Swamp over the gradient of (14)C activity concentrations in moss to determine whether (14)C specific activities in receptor biota reflect the localized nature of the groundwater source in the swamp. Representative receptor plants and animals, and corresponding air and soil samples were collected at six sites in Duke Swamp with (14)C specific activities in air that ranged from 1140 to 45,900 Bq/kg C. In general, it was found that specific activities of (14)C in biota tissues reflected those measured in environmental media collected from the same sampling site. The findings demonstrate that mosses could be used in monitoring programs to ensure protection of biota in areas with elevated (14)C, negating the need to capture and euthanize higher organisms. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  20. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  1. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: (1) analytical chemistry and the environment; (2) environmental radiochemistry; (3) automated instrumentation; (4) advances in analytical mass spectrometry; (5) Fourier transform spectroscopy; (6) analytical chemistry of plutonium; (7) nuclear analytical chemistry; (8) chemometrics; and (9) nuclear fuel technology.

  2. Specific features of diffuse reflection of human face skin for laser and non-laser sources of visible and near-IR light

    International Nuclear Information System (INIS)

    Dolotov, L E; Sinichkin, Yu P; Tuchin, Valerii V; Al'tshuler, G B; Yaroslavskii, I V

    2011-01-01

    The specific features of diffuse reflection from different areas of human face skin for laser and non-laser sources of visible and near-IR light have been investigated to localise the closed-eye (eyelid) region. In the visible spectral range the reflection from the eyelid skin surface can be differentiated by measuring the slope of the spectral dependence of the effective optical density of skin in the wavelength range from 650 to 700 nm. In the near-IR spectral range the reflectances of the skin surface at certain wavelengths, normalised to the forehead skin reflectance, can be used as a criterion for differentiating the eyelid skin. In this case, a maximum discrimination is obtained when measuring the skin reflectances at laser wavelengths of 1310 and 1470 nm, which correspond to the spectral ranges of maximum and minimum water absorption. (optical technologies in biophysics and medicine)

  3. Impact of Glycerol as Carbon Source onto Specific Sugar and Inducer Uptake Rates and Inclusion Body Productivity in E. coli BL21(DE3)

    Directory of Open Access Journals (Sweden)

    Julian Kopp

    2017-12-01

    Full Text Available The Gram-negative bacterium E. coli is the host of choice for the production of a multitude of recombinant proteins. Generally, cultivation is easy, media are cheap, and a high product titer can be obtained. However, harsh induction procedures using isopropyl β-d-1-thiogalactopyranoside as inducer are often reported to cause stress reactions, leading to a phenomenon known as "metabolic" or "product burden". Such high-level expression of recombinant proteins mainly results in decreased growth rates and cell lysis at elevated induction times. Therefore, approaches tend to use "soft" or "tunable" induction with lactose to reduce the stress level of the production host. The usage of glucose as energy source in combination with lactose as induction reagent causes catabolite repression effects on lactose uptake kinetics and, as a consequence, reduced product titer. Glycerol, as an alternative carbon source, is already known to have a positive impact on product formation when coupled with glucose and lactose in auto-induction systems, and has been reported to show no signs of repression when cultivated concomitantly with lactose. In recent research activities, the impact of different products on lactose uptake using glucose as carbon source was highlighted, and a mechanistic model for glucose-lactose induction systems showed correlations between the specific substrate uptake rate for glucose or glycerol (qs,C) and the maximum specific lactose uptake rate (qs,lac,max). In this study, we investigated the mechanism of glycerol uptake when using the inducer lactose. We were able to show that a product-producing strain has significantly higher inducer uptake rates than a non-producer strain. Additionally, it was shown that glycerol has beneficial effects on cell viability and on productivity of the recombinant protein compared to glucose.

  4. Compound-specific amino acid δ15N patterns in marine algae: Tracer potential for cyanobacterial vs. eukaryotic organic nitrogen sources in the ocean

    Science.gov (United States)

    McCarthy, Matthew D.; Lehman, Jennifer; Kudela, Raphael

    2013-02-01

    Stable nitrogen isotopic analysis of individual amino acids (δ15N-AA) has unique potential to elucidate the complexities of food webs, track heterotrophic transformations, and understand diagenesis of organic nitrogen (ON). While δ15N-AA patterns of autotrophs have been shown to be generally similar, prior work has also suggested that differences may exist between cyanobacteria and eukaryotic algae. However, δ15N-AA patterns in differing oceanic algal groups have never been closely examined. The overarching goals of this study were first to establish a more quantitative understanding of algal δ15N-AA patterns, and second to examine whether δ15N-AA patterns have potential as a new tracer for distinguishing prokaryotic vs. eukaryotic N sources. We measured δ15N-AA from prokaryotic and eukaryotic phytoplankton cultures and used a complementary set of statistical approaches (simple normalization, regression-derived fractionation factors, and multivariate analyses) to test for variations. A generally similar δ15N-AA pattern was confirmed for all algae; however, significant AA-specific variation was also consistently identified between the two groups. The relative δ15N fractionation of Glx (glutamine + glutamic acid combined) vs. total proteinaceous N appeared substantially different, which we hypothesize could be related to differing enzymatic forms. In addition, several other AA (most notably glycine and leucine) appeared to have strong biomarker potential. Finally, we observed that overall patterns of δ15N values in algae correspond well with the Trophic vs. Source-AA division now commonly used to describe variable AA δ15N changes with trophic transfer, suggesting a common mechanistic basis. Overall, these results show that autotrophic δ15N-AA patterns can differ between major algal evolutionary groupings for many AA. The statistically significant multivariate results represent a first approach for testing ideas about relative eukaryotic vs. prokaryotic

  5. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that ''quality is achieved and maintained by those who have been assigned the responsibility for performing the work.'' The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA or commercial sources. Discussion of project specific PE materials and evaluations will be described in section 9.0 and Appendix A

  6. Distinguishing between old and modern permafrost sources in the northeast Siberian land–shelf system with compound-specific δ2H analysis

    Directory of Open Access Journals (Sweden)

    J. E. Vonk

    2017-08-01

    Full Text Available Pleistocene ice complex permafrost deposits contain roughly a quarter of the organic carbon (OC) stored in permafrost (PF) terrain. When permafrost thaws, its OC is remobilized into the (aquatic) environment where it is available for degradation, transport or burial. Aquatic or coastal environments contain sedimentary reservoirs that can serve as archives of past climatic change. As permafrost thaw is increasing throughout the Arctic, these reservoirs are important locations to assess the fate of remobilized permafrost OC. We here present compound-specific deuterium (δ2H) analysis on leaf waxes as a tool to distinguish between OC released from thawing Pleistocene permafrost (ice complex deposits; ICD) and from thawing Holocene permafrost (from near-surface soils). Bulk geochemistry (%OC; δ13C; %total nitrogen, TN) was analyzed as well as the concentrations and δ2H signatures of long-chain n-alkanes (C21 to C33) and mid- to long-chain n-alkanoic acids (C16 to C30) extracted from both ICD-PF samples (n = 9) and modern vegetation and O-horizon (topsoil-PF) samples (n = 9) from across the northeast Siberian Arctic. Results show that these topsoil-PF samples have higher %OC, higher OC / TN values and more depleted δ13C-OC values than ICD-PF samples, suggesting that the former samples trace a fresher soil and/or vegetation source. Whereas the two investigated sources differ on the bulk geochemical level, they are, however, virtually indistinguishable when using leaf wax concentrations and ratios. However, on the molecular isotope level, leaf wax biomarker δ2H values are statistically different between topsoil PF and ICD PF. For example, the mean δ2H value of C29 n-alkane was −246 ± 13 ‰ (mean ± SD) for topsoil PF and −280 ± 12 ‰ for ICD PF. With a dynamic isotopic range (difference between two sources) of 34 to 50 ‰, the isotopic fingerprints of individual, abundant, biomarker molecules from leaf waxes can

  7. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space are still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image-, signal-, and genomics-based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  8. Carbon sources in suspended particles and surface sediments from the Beaufort Sea revealed by molecular lipid biomarkers and compound-specific isotope analysis

    Directory of Open Access Journals (Sweden)

    I. Tolosa

    2013-03-01

    Full Text Available Molecular lipid biomarkers (hydrocarbons, alcohols, sterols and fatty acids) and compound-specific isotope analysis of suspended particulate organic matter (SPM) and surface sediments of the Mackenzie Shelf and slope (southeast Beaufort Sea, Arctic Ocean) were studied in summer 2009. The concentrations of the molecular lipid markers, characteristic of known organic matter sources, were grouped and used as proxies to evaluate the relative importance of fresh algal, detrital algal, fossil, C3 terrestrial plants, bacterial and zooplankton material in the organic matter (OM) of this area. Fossil and detrital algal contributions were the major fractions of the freshwater SPM from the Mackenzie River with ~34% each of the total molecular biomarkers. Fresh algal, C3 terrestrial, bacterial and zooplanktonic components represented much lower percentages, 17, 10, 4 and 80%, with a minor contribution of fossil and C3 terrestrial biomarkers. Characterization of the sediments revealed a major sink of refractory algal material mixed with some fresh algal material, fossil hydrocarbons and a small input of C3 terrestrial sources. In particular, the sediments from the shelf and at the mouth of the Amundsen Gulf presented the highest contribution of detrital algal material (60–75%), whereas those from the slope contained the highest proportion of fossil (40%) and C3 terrestrial plant material (10%). Overall, considering that the detrital algal material is marine derived, autochthonous sources contributed more than allochthonous sources to the OM lipid pool. Using the ratio of an allochthonous biomarker (normalized to total organic carbon, TOC) found in the sediments to those measured at the river mouth water, we estimated that the fraction of terrestrial material preserved in the sediments accounted for 30–40% of the total carbon in the inner shelf sediments, 17% in the outer shelf and Amundsen Gulf and up to 25% in the slope sediments. These estimates are low

  9. Optimizing RDF Data Cubes for Efficient Processing of Analytical Queries

    DEFF Research Database (Denmark)

    Jakobsen, Kim Ahlstrøm; Andersen, Alex B.; Hose, Katja

    2015-01-01

    data warehouses and data cubes. Today, external data sources are essential for analytics and, as the Semantic Web gains popularity, more and more external sources are available in native RDF. With the recent SPARQL 1.1 standard, performing analytical queries over RDF data sources has finally become...
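
    As a concrete illustration of the kind of analytical query meant here, the sketch below runs a SPARQL 1.1 aggregation over an RDF graph with rdflib; the file name and the ex: cube vocabulary are hypothetical placeholders, not the paper's benchmark.

```python
# Minimal SPARQL 1.1 aggregation over an RDF data cube, run with rdflib.
from rdflib import Graph

g = Graph()
g.parse("observations.ttl", format="turtle")  # hypothetical cube dump

ANALYTICAL_QUERY = """
PREFIX ex: <http://example.org/cube#>
SELECT ?region (SUM(?amount) AS ?total)
WHERE {
    ?obs ex:region ?region ;
         ex:salesAmount ?amount .
}
GROUP BY ?region
ORDER BY DESC(?total)
"""

for region, total in g.query(ANALYTICAL_QUERY):
    print(region, total)
```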

  10. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    Science.gov (United States)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems

  11. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use (dissolved pressurization gases); detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  12. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  13. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  14. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  15. Source-specific workplace social support and high-sensitivity C-reactive protein levels among Japanese workers: A 1-year prospective cohort study.

    Science.gov (United States)

    Eguchi, Hisashi; Shimazu, Akihito; Kawakami, Norito; Inoue, Akiomi; Tsutsumi, Akizumi

    2016-08-01

    This study investigated the prospective association between source-specific workplace social support and high-sensitivity C-reactive protein (hs-CRP) levels in workers in Japan. We conducted a 1-year prospective cohort study with 1,487 men and 533 women aged 18-65 years. Participants worked at two manufacturing worksites in Japan and were free of major illness. We used multivariable linear regression analyses to evaluate the prospective association between supervisor and coworker support at baseline, and hs-CRP levels at follow-up. We conducted the analyses separately for men and women. For women, high supervisor support at baseline was significantly associated with lower hs-CRP levels at follow-up (β = -0.109, P < 0.05), whereas coworker support at baseline was not significantly associated with hs-CRP levels at follow-up. Associations between supervisor and coworker support and hs-CRP levels were not significant for men. Supervisor support may have beneficial effects on inflammatory markers in working women. Am. J. Ind. Med. 59:676-684, 2016. © 2016 Wiley Periodicals, Inc.

  16. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  17. Planning for Low End Analytics Disruptions in Business School Curricula

    Science.gov (United States)

    Rienzo, Thomas; Chen, Kuanchin

    2018-01-01

    Analytics is getting a great deal of attention in both industrial and academic venues. Organizations of all types are becoming more serious about transforming data from a variety of sources into insight, and analytics is the key to that transformation. Academic institutions are rapidly responding to the demand for analytics talent, with hundreds…

  18. Toxic effects of two sources of dietborne cadmium on the juvenile cobia, Rachycentron canadum L. and tissue-specific accumulation of related minerals.

    Science.gov (United States)

    Liu, Kang; Chi, Shuyan; Liu, Hongyu; Dong, Xiaohui; Yang, Qihui; Zhang, Shuang; Tan, Beiping

    2015-08-01

    In the present study, juvenile cobia, Rachycentron canadum L. were fed diets contaminated by two different sources of cadmium: squid viscera meal (SVM-Cd, organic form) and cadmium chloride (CdCl2-Cd, inorganic form). The Cd concentrations in fish diet were approximately 3.0, 5.0 and 10.0 mg Cd kg(-1) for both inorganic and organic forms. In the control diet (0.312 mg Cd kg(-1) diet; Cd mainly came from fish meal), no cadmium was added. The experiment lasted for 16 weeks and a statistically significant inverse relationship was observed between specific growth rate (SGR) and the concentration of dietary Cd. The SGR of cobia fed a diet with SVM-Cd increased at the lowest doses and decreased with the increasing level of dietary SVM. Fish fed diet contaminated with SVM-Cd had significantly higher SGR than those fed diets contaminated with CdCl2-Cd among the high Cd level diet treatments. The dietary Cd levels also significantly affected the survival rate of the fish. Among the hematological characteristics and plasma constituents, glutamic-pyruvic transaminase activities and alkaline phosphatase activities in serum and liver increased and hepatic superoxide dismutase activity decreased with the increasing dietary Cd levels. The cobia fed diet contaminated by a high level of CdCl2-Cd had significantly higher ALP activity than cobia fed diet contaminated by a high level of SVM-Cd. The results from these studies indicate no differences in toxicity response to dietborne SVM-Cd and CdCl2-Cd at a low level of Cd. However, at a higher level, cobia was more sensitive to dietborne CdCl2-Cd than SVM-Cd. Based on quadratic regression of SGR, the optimal dietary Cd concentration was 3.617 mg kg(-1), with SVM (126 mg Cd kg(-1) in SVM) as the Cd source that stimulated the growth of cobia, and the optimal SVM addition level was determined to be 26.7 g kg(-1) diet in the present study. Cd accumulations in the kidney of cobia fed both types of Cd were higher than in other tissues, and the order of Cd accumulation in tissues

  19. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: a flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  20. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed

  1. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six point framework that can be applied to a variety of library settings for effective system-based, data-driven management. Library Improvement Through Data Analytics includes: the basics of statistical concepts; recommended data sources for various library functions and processes, and guidance for using census, university, or government data in analysis; techniques for cleaning data; matching data to appropriate data analysis methods; and how to make descriptive statistics m...

  2. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  3. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  4. Effect of nitrogen source and acclimatization on specific growth rates of microalgae determined by a high-throughput in vivo microplate autofluorescence method

    DEFF Research Database (Denmark)

    Podevin, Mike; De Francisci, Davide; Holdt, Susan Løvstad

    2015-01-01

    SGRs of the second and third cultivations. ANOVA of SGRs in the acclimatized second and third cultivations revealed preferences for nitrogen sources among most of the algae; C. vulgaris preferred sodium nitrate over other nitrogen sources, A. protothecoides adapted to urea after no growth in the first

  5. Role of analytical chemistry in environmental monitoring

    International Nuclear Information System (INIS)

    Kayasth, S.; Swain, K.

    2004-01-01

    Basic aspects of pollution and the role of analytical chemistry in environmental monitoring are highlighted and exemplified, with emphasis on trace elements. Sources and pathways of natural and especially man-made polluting substances as well as physico-chemical characteristics are given. Attention is paid to adequate sampling in various compartments of the environment comprising both lithosphere and biosphere. Trace analysis is dealt with using a variety of analytical techniques, including criteria for the choice of suitable techniques, as well as aspects of analytical quality assurance and control. Finally, some data on trace element levels in soil and water samples from India are presented. (author)

  6. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  7. Toxic effects of two sources of dietborne cadmium on the juvenile cobia, Rachycentron canadum L. and tissue-specific accumulation of related minerals

    International Nuclear Information System (INIS)

    Liu, Kang; Chi, Shuyan; Liu, Hongyu; Dong, Xiaohui; Yang, Qihui; Zhang, Shuang; Tan, Beiping

    2015-01-01

    Highlights: • CdCl2–Cd showed a higher toxicity than SVM-Cd for cobia. • Cd accumulation in cobia fed diets contaminated with SVM-Cd was higher than in cobia fed diets contaminated with CdCl2–Cd. • Cd accumulation in tissues of cobia fed both types of Cd was kidney > liver > intestine > gill > muscle. • Dietborne Cd decreased the Fe concentration in kidney and liver, and Ca concentrations in vertebra and scale. - Abstract: In the present study, juvenile cobia, Rachycentron canadum L. were fed diets contaminated by two different sources of cadmium: squid viscera meal (SVM-Cd, organic form) and cadmium chloride (CdCl2–Cd, inorganic form). The Cd concentrations in fish diet were approximately 3.0, 5.0 and 10.0 mg Cd kg−1 for both inorganic and organic forms. In the control diet (0.312 mg Cd kg−1 diet; Cd mainly came from fish meal), no cadmium was added. The experiment lasted for 16 weeks and a statistically significant inverse relationship was observed between specific growth rate (SGR) and the concentration of dietary Cd. The SGR of cobia fed a diet with SVM-Cd increased at the lowest doses and decreased with the increasing level of dietary SVM. Fish fed diet contaminated with SVM-Cd had significantly higher SGR than those fed diets contaminated with CdCl2–Cd among the high Cd level diet treatments. The dietary Cd levels also significantly affected the survival rate of the fish. Among the hematological characteristics and plasma constituents, glutamic-pyruvic transaminase activities and alkaline phosphatase activities in serum and liver increased and hepatic superoxide dismutase activity decreased with the increasing dietary Cd levels. The cobia fed diet contaminated by a high level of CdCl2–Cd had significantly higher ALP activity than cobia fed diet contaminated by a high level of SVM-Cd. The results from these studies indicate no differences in toxicity response to dietborne SVM-Cd and CdCl2–Cd at a low level of Cd. However, at a higher level, cobia was

  8. Toxic effects of two sources of dietborne cadmium on the juvenile cobia, Rachycentron canadum L. and tissue-specific accumulation of related minerals

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Kang [Laboratory of Aquatic Animal Nutrition and Feed, College of Fisheries, Guangdong Ocean University, Zhanjiang, Guangdong (China); Guangdong Yuehai Feed Group Co., Ltd., Zhanjiang, Guangdong (China); Chi, Shuyan; Liu, Hongyu; Dong, Xiaohui; Yang, Qihui; Zhang, Shuang [Laboratory of Aquatic Animal Nutrition and Feed, College of Fisheries, Guangdong Ocean University, Zhanjiang, Guangdong (China); Tan, Beiping, E-mail: bptan@126.com [Laboratory of Aquatic Animal Nutrition and Feed, College of Fisheries, Guangdong Ocean University, Zhanjiang, Guangdong (China)

    2015-08-15

    Highlights: • CdCl{sub 2}–Cd showed a higher toxicity than SVM-Cd for cobia. • Cd accumulation in cobia fed diets contaminated SVM-Cd was higher than in cobia fed diets contaminated CdCl{sub 2}–Cd. • Cd accumulation in tissues of cobia fed both types of Cd was kidney > liver > intestine > gill > muscle. • Dietborne Cd decreased the Fe concentration in kidney and liver, Ca concentrations in vertebra and scale. - Abstract: In the present study, juvenile cobia, Rachycentron canadum L. were fed diets contaminated by two different sources of cadmium: squid viscera meal (SVM-Cd, organic form) and cadmium chloride (CdCl{sub 2}–Cd, inorganic form). The Cd concentrations in fish diet were approximately 3.0, 5.0 and 10.0 mg Cd kg{sup −1} for both inorganic and organic forms. In the control diet (0.312 mg Cd kg{sup −1} diet; Cd mainly came from fish meal), no cadmium was added. The experiment lasted for 16 weeks and a statistically significant inverse relationship was observed between specific growth rate (SGR) and the concentration of dietary Cd. The SGR of cobia fed a diet with SVM-Cd increased at the lowest doses and decreased with the increasing level of dietary SVM. Fish fed diet contaminated SVM-Cd had significantly higher SGR than those fed diets contaminated CdCl{sub 2}–Cd among the high Cd level diet treatments. The dietary Cd levels also significantly affected the survival rate of the fish. Among the hematological characteristics and plasma constituents, glutamic-pyruvic transaminase activities and alkaline phosphatase activities in serum and liver increased and hepatic superoxide dismutase activity decreased with the increasing dietary Cd levels. The cobia fed diet contaminated by a high level of CdCl{sub 2}–Cd had significantly higher ALP activity than cobia fed diet contaminated by a high level of SVM-Cd. The results from these studies indicate no differences in toxicity response to dietborne SVM-Cd and CdCl{sub 2}–Cd at a low level of Cd

  9. Land-use regression with long-term satellite-based greenness index and culture-specific sources to model PM2.5 spatial-temporal variability.

    Science.gov (United States)

    Wu, Chih-Da; Chen, Yu-Cheng; Pan, Wen-Chi; Zeng, Yu-Ting; Chen, Mu-Jean; Guo, Yue Leon; Lung, Shih-Chun Candice

    2017-05-01

    This study utilized a long-term satellite-based vegetation index, and considered culture-specific emission sources (temples and Chinese restaurants) with Land-use Regression (LUR) modelling to estimate the spatial-temporal variability of PM2.5 using data from Taipei metropolis, which exhibits typical Asian city characteristics. Annual average PM2.5 concentrations from 2006 to 2012 of 17 air quality monitoring stations established by the Environmental Protection Administration of Taiwan were used for model development. PM2.5 measurements from 2013 were used for external data verification. Monthly Normalized Difference Vegetation Index (NDVI) images coupled with buffer analysis were used to assess the spatial-temporal variations of greenness surrounding the monitoring sites. The distribution of temples and Chinese restaurants were included to represent the emission contributions from incense and joss money burning, and gas cooking, respectively. Spearman correlation coefficient and stepwise regression were used for LUR model development, and 10-fold cross-validation and external data verification were applied to verify the model reliability. The results showed a strongly negative correlation (r: -0.71 to -0.77) between NDVI and PM2.5, while temples (r: 0.52 to 0.66) and Chinese restaurants (r: 0.31 to 0.44) were positively correlated to PM2.5 concentrations. With an adjusted model R2 of 0.89, a cross-validated adj-R2 of 0.90, and an externally validated R2 of 0.83, the high explanatory power of the resultant model was confirmed. Moreover, the averaged NDVI within a 1750 m circular buffer (p < 0.01), the number of Chinese restaurants within a 1750 m buffer (p < 0.01), and the number of temples within a 750 m buffer (p = 0.06) were selected as important predictors during the stepwise selection procedures. According to the partial R2, NDVI explained 66% of PM2.5 variation and was the dominant variable in the developed model. We suggest future studies
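
    Structurally, the model above is an ordinary least-squares fit of annual PM2.5 on buffer-aggregated predictors. The sketch below mirrors that structure with fabricated toy numbers for four hypothetical sites; only the choice of predictors follows the study.

```python
# Toy LUR fit: PM2.5 regressed on buffer-aggregated predictors.
# All numeric values are fabricated placeholders, not the study's data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: NDVI (1750 m buffer), Chinese restaurants (1750 m), temples (750 m)
X = np.array([[0.42,  35,  4],
              [0.18, 120, 11],
              [0.55,  12,  2],
              [0.30,  80,  7]], dtype=float)
y = np.array([14.2, 28.9, 11.5, 22.1])   # annual mean PM2.5, toy values

lur = LinearRegression().fit(X, y)
print("coefficients:", lur.coef_)        # NDVI coefficient expected negative
print("R^2:", lur.score(X, y))
```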

  10. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
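
    For readers unfamiliar with the mechanics, the core AHP computation is extracting a priority vector from a pairwise comparison matrix via its principal eigenvector and checking judgement consistency; a minimal sketch with an arbitrary example matrix follows.

```python
# Sketch of the core AHP computation on an arbitrary 3x3 example matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # pairwise judgements (Saaty's 1-9 scale)
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # priority vector (weights sum to 1)

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1) # consistency index
RI = 0.58                            # Saaty's random index for n = 3
print("weights:", w, "consistency ratio:", CI / RI)  # CR < 0.1 is acceptable
```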

  11. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  12. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    This paper is intended as an introduction to the concept of an analytic birepresentation (S,T) of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined, and certain constraints for them, called the minimality conditions of (S,T), are established

  13. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist

  14. Learning analytics dashboard applications

    NARCIS (Netherlands)

    Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L.

    2013-01-01

    This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare with 15 related

  15. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  16. Analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  17. Analytical mass spectrometry. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  18. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. Need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as assay system-specific reference intervals in China.
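
    Stripped of the study's refinements (the modified Box-Cox formula and the latent abnormal values exclusion step), the parametric derivation reduces to a Box-Cox transform, central 95 % limits in the transformed space, and a back-transform. The sketch below runs on synthetic data and is an illustration of the general method, not the study's exact procedure.

```python
# Sketch of a parametric reference interval via Box-Cox (synthetic data).
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

values = np.random.default_rng(0).lognormal(mean=1.0, sigma=0.3, size=500)

transformed, lam = stats.boxcox(values)          # fit lambda, transform
mu, sd = transformed.mean(), transformed.std(ddof=1)
lower, upper = inv_boxcox(np.array([mu - 1.96 * sd, mu + 1.96 * sd]), lam)
print(f"reference interval: {lower:.2f} - {upper:.2f}")
```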

  19. DCODE: A Distributed Column-Oriented Database Engine for Big Data Analytics

    OpenAIRE

    Liu, Yanchen; Cao, Fang; Mortazavi, Masood; Chen, Mengmeng; Yan, Ning; Ku, Chi; Adnaik, Aniket; Morgan, Stephen; Shi, Guangyu; Wang, Yuhu; Fang, Fan

    2015-01-01

    Part 10: Big Data and Text Mining; International audience; We propose a novel Distributed Column-Oriented Database Engine (DCODE) for efficient analytic query processing that combines advantages of both column storage and parallel processing. In DCODE, we enhance an existing open-source columnar database engine by adding the capability for handling queries over a cluster. Specifically, we studied parallel query execution and optimization techniques such as horizontal partitioning, exchange op...
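
    The core idea named above can be illustrated with a toy sketch (not DCODE code): rows are horizontally partitioned by key, each worker computes a partial aggregate, and a merge step, the job an exchange operator coordinates in a distributed engine, combines the partials:

        # Toy horizontal partitioning with parallel partial aggregation.
        from concurrent.futures import ThreadPoolExecutor

        def partition(rows, n):
            """Hash-partition (key, value) rows across n workers."""
            parts = [[] for _ in range(n)]
            for key, value in rows:
                parts[hash(key) % n].append((key, value))
            return parts

        def partial_sum(part):
            acc = {}
            for key, value in part:
                acc[key] = acc.get(key, 0) + value
            return acc

        rows = [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5)]
        with ThreadPoolExecutor(max_workers=4) as pool:
            partials = list(pool.map(partial_sum, partition(rows, 4)))

        merged = {}                      # exchange/merge step at the coordinator
        for acc in partials:
            for key, value in acc.items():
                merged[key] = merged.get(key, 0) + value
        print(merged)                    # per-key sums: a->4, b->7, c->4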

  20. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) was developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that combines location and time stamp. The indexing allows quick access to data sets that are part of global data layers and permits retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is its ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
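
    The indexing scheme sketched in the abstract can be illustrated as follows (a hypothetical layout, not IBM's actual key format): each layer halves the cell size of the previous one, and a cell key combines layer, grid location, and timestamp:

        # Hedged sketch of a doubling-resolution grid key, assuming a 1-degree
        # coarsest cell; BASE_DEG and the key layout are illustrative choices.
        BASE_DEG = 1.0

        def cell_key(lat, lon, layer, timestamp):
            res = BASE_DEG / (2 ** layer)        # resolution doubles per layer
            row = int((lat + 90.0) // res)
            col = int((lon + 180.0) // res)
            return (layer, row, col, timestamp)

        print(cell_key(40.7, -74.0, layer=10, timestamp="2015-06-01T00:00Z"))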

  1. The analytic nodal method in cylindrical geometry

    International Nuclear Information System (INIS)

    Prinsloo, Rian H.; Tomasevic, Djordje I.

    2008-01-01

    Nodal diffusion methods have been used extensively in nuclear reactor calculations, specifically for their performance advantage, but also for their superior accuracy. More specifically, the Analytic Nodal Method (ANM), utilising the transverse integration principle, has been applied to numerous reactor problems with much success. In this work, a nodal diffusion method is developed for cylindrical geometry. Application of this method to three-dimensional (3D) cylindrical geometry has never been satisfactorily addressed and we propose a solution which entails the use of conformal mapping. A set of 1D equations with an adjusted, geometrically dependent, inhomogeneous source is obtained. This work describes the development of the method and associated test code, as well as its application to realistic reactor problems. Numerical results are given for the PBMR-400 MW benchmark problem, as well as for a 'cylindrisized' version of the well-known 3D LWR IAEA benchmark. Results highlight the improved accuracy and performance over finite-difference core solutions and investigate the applicability of nodal methods to 3D PBMR type problems. Results indicate that cylindrical nodal methods definitely have a place within PBMR applications, yielding performance advantage factors of 10 and 20 for 2D and 3D calculations, respectively, and advantage factors of the order of 1000 in the case of the LWR problem
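
    For orientation, the transverse-integration principle mentioned above reduces the multidimensional diffusion equation to coupled one-dimensional equations of the generic form (a standard nodal-method sketch, with notation assumed here; in the cylindrical formulation described in the abstract, the conformal mapping contributes the adjusted, geometry-dependent inhomogeneous source on the right-hand side):

        -D \frac{d^2 \bar{\phi}_x(x)}{dx^2} + \Sigma_r \bar{\phi}_x(x) = \bar{Q}_x(x) - \bar{L}_x(x),

    where \bar{\phi}_x is the transverse-integrated flux, \bar{Q}_x the transverse-integrated source, and \bar{L}_x the transverse-leakage term.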

  2. Analytical control in metallurgical processes

    International Nuclear Information System (INIS)

    Coedo, A.G.; Dorado, M.T.; Padilla, I.

    1998-01-01

    This paper illustrates the role of analysis in enabling the metallurgical industry to meet quality demands: for example, for the steel industry, the demands of the automotive, aerospace, power generation and tinplate packaging industries, and environmental issues near steel plants. Although chemical analysis technology continues to advance, achieving improved speed, precision and accuracy at lower levels of detection, the competitiveness of manufacturing industry continues to drive property demands at least at the same rate. Narrower specification ranges, lower levels of residual elements and economic pressures prescribe faster process routes, all of which lead to increased demands on the analytical function. These demands are illustrated by examples from several market sectors in which customer issues are considered together with their analytical implications. (Author) 5 refs

  3. Analytic Reflected Lightcurves for Exoplanets

    Science.gov (United States)

    Haggard, Hal M.; Cowan, Nicolas B.

    2018-04-01

    The disk-integrated reflected brightness of an exoplanet changes as a function of time due to orbital and rotational motion coupled with an inhomogeneous albedo map. We have previously derived analytic reflected lightcurves for spherical harmonic albedo maps in the special case of a synchronously-rotating planet on an edge-on orbit (Cowan, Fuentes & Haggard 2013). In this letter, we present analytic reflected lightcurves for the general case of a planet on an inclined orbit, with arbitrary spin period and non-zero obliquity. We do so for two different albedo basis maps: bright points (δ-maps) and spherical harmonics (Y_l^m-maps). In particular, we use Wigner D-matrices to express a harmonic lightcurve for an arbitrary viewing geometry as a non-linear combination of harmonic lightcurves for the simpler edge-on, synchronously rotating geometry. These solutions will enable future exploration of the degeneracies and information content of reflected lightcurves, as well as fast calculation of lightcurves for mapping exoplanets based on time-resolved photometry. To these ends we make available Exoplanet Analytic Reflected Lightcurves (EARL), a simple open-source code that allows rapid computation of reflected lightcurves.
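
    As a rough numerical companion to these analytic solutions (this is not the EARL code), one can brute-force a reflected lightcurve by integrating a Lambertian reflectance kernel over the lit and visible hemisphere; the simplified geometry below assumes a fixed, non-rotating two-hemisphere albedo map and an observer in the orbital plane:

        # Brute-force reflected lightcurve sketch under simplifying assumptions.
        import numpy as np

        n = 200
        theta = np.linspace(0, np.pi, n)             # colatitude
        phi = np.linspace(0, 2 * np.pi, 2 * n)       # longitude
        TH, PH = np.meshgrid(theta, phi, indexing="ij")
        dA = np.sin(TH) * (np.pi / n) * (np.pi / n)  # surface area element

        albedo = 0.1 + 0.8 * (PH < np.pi)            # bright/dark hemispheres

        def flux(orbital_phase):
            sun = np.array([np.cos(orbital_phase), np.sin(orbital_phase), 0.0])
            obs = np.array([1.0, 0.0, 0.0])          # observer in the orbital plane
            normal = np.stack([np.sin(TH) * np.cos(PH),
                               np.sin(TH) * np.sin(PH),
                               np.cos(TH)])
            lit = np.clip(np.tensordot(sun, normal, axes=1), 0, None)
            seen = np.clip(np.tensordot(obs, normal, axes=1), 0, None)
            return np.sum(albedo * lit * seen * dA)

        print(np.round([flux(p) for p in np.linspace(0, 2 * np.pi, 8)], 3))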

  4. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  5. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning the massive data are the key to the data engine. The ultimate goal of underst...

  6. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them in 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  7. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  8. DETERMINING HOT SPOTS OF FECAL CONTAMINATION IN A TROPICAL WATERSHED BY COMBINING LAND-USE INFORMATION AND METEOROLOGICAL DATA WITH SOURCE-SPECIFIC ASSAYS

    Science.gov (United States)

    Microbial source tracking (MST) assays have been mostly employed in temperate climates. However, their value as monitoring tools in tropical and subtropical regions is unknown since the geographic and temporal stability of the assays has not been extensively tested. The objective...

  9. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications

  10. High Relative Abundance of Biofuel Sourced Ethanol in Precipitation in the US and Brazil Determined Using Compound Specific Stable Carbon Isotopes

    Science.gov (United States)

    Shimizu, M. S.; Felix, J. D. D.; Casas, M.; Avery, G. B., Jr.; Kieber, R. J.; Mead, R. N.; Willey, J. D.; Lane, C.

    2017-12-01

    Ethanol biofuel production and consumption have increased exponentially over the last two decades to help reduce greenhouse gas emissions. Currently, 85% of global ethanol production and consumption occurs in the US and Brazil. Increasing biofuel ethanol usage in these two countries enhances emissions of uncombusted ethanol to the atmosphere, contributing to poor air quality. Although measurements of ethanol in air and precipitation reveal elevated ethanol concentrations in densely populated cities, other sources such as natural vegetation can also contribute emissions to the atmosphere. Previous modeling studies indicated up to 12% of atmospheric ethanol is from anthropogenic emissions. Only one gas-phase study, in southern Florida, has attempted to constrain the two sources through direct isotopic measurements. The current study used a stable carbon isotope method to constrain sources of ethanol in rainwater from the US and Brazil. A method was developed using solid phase microextraction (SPME) with subsequent analysis by gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS). Stable carbon isotope signatures (δ13C) of vehicle ethanol emission sources for both the US (-9.8‰) and Brazil (-12.7‰) reflected C4 plants as feedstock (corn and sugarcane) for biofuel production. An isotope mixing model using biofuel from vehicles (C4 plants) and biogenic (C3 plants) end-members was implemented to estimate ethanol source apportionment in the rain. We found that the stable carbon isotope ratio of ethanol in the rain ranged between -22.6‰ and -12.7‰. Our results suggest that the contribution of biofuel to atmospheric ethanol can be higher than previously estimated. As biofuel usage increases globally, it is essential to determine the relative abundance of anthropogenic ethanol in other areas of the world.
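
    The two-end-member mixing model mentioned above reduces to one line of arithmetic; in the sketch below the biofuel end-member is the quoted US vehicle-emission value, while the biogenic C3 end-member of -27 per mil is an assumed illustration, not a value from the abstract:

        # Two end-member isotope mixing sketch (end-member values hedged above).
        def biofuel_fraction(delta_sample, delta_biofuel=-9.8, delta_biogenic=-27.0):
            """Fraction of rainwater ethanol attributable to the biofuel end-member."""
            return (delta_sample - delta_biogenic) / (delta_biofuel - delta_biogenic)

        for d in (-22.6, -18.0, -12.7):     # span of observed rainwater values
            print(f"d13C = {d:6.1f} per mil -> biofuel fraction = {biofuel_fraction(d):.2f}")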

  11. A global multicenter study on reference values: 2. Exploration of sources of variation across the countries.

    Science.gov (United States)

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Shimizu, Yoshihisa; Xia, Liangyu; Hoffmann, Mariza; Shah, Swarup; Matsha, Tandi; Wassung, Janette; Smit, Francois; Ruzhanskaya, Anna; Straseski, Joely; Bustos, Daniel N; Kimura, Shogo; Takahashi, Aki

    2017-04-01

    The intent of this study, based on a global multicenter study of reference values (RVs) for serum analytes, was to explore biological sources of variation (SVs) of the RVs among 12 countries around the world. As described in the first part of this paper, RVs of 50 major serum analytes from 13,396 healthy individuals living in 12 countries were obtained. Analyzed in this study were 23 clinical chemistry analytes and 8 analytes measured by immunoturbidimetry. Multiple regression analysis was performed for each gender, country by country, analyte by analyte, by setting four major SVs (age, BMI, and levels of drinking and smoking) as a fixed set of explanatory variables. For analytes with skewed distributions, log-transformation was applied. The association of each source of variation with RVs was expressed as the partial correlation coefficient (r_p). Obvious gender and age-related changes in the RVs were observed in many analytes, almost consistently between countries. Compilation of age-related variations of RVs after adjusting for between-country differences revealed distinctive patterns specific to each analyte. Judged from the r_p, BMI-related changes were observed for many nutritional and inflammatory markers in almost all countries. However, the slope of linear regression of BMI vs. RV differed greatly among countries for some analytes. Alcohol- and smoking-related changes were observed less conspicuously, in a limited number of analytes. The features of sex, age, alcohol, and smoking-related changes in RVs of the analytes were largely comparable worldwide. The finding of differences in BMI-related changes among countries in some analytes is quite relevant to understanding ethnic differences in susceptibility to nutritionally related diseases. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
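
    A minimal sketch of this source-of-variation analysis (illustrative synthetic data, not the study's): regress an analyte on age, BMI, drinking and smoking levels, then express each association as a partial correlation r_p recovered from the regression t-statistics:

        # Partial correlations from a multiple regression: r_p = t / sqrt(t^2 + df).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        X = np.column_stack([rng.uniform(20, 65, n),    # age
                             rng.normal(24, 3, n),      # BMI
                             rng.integers(0, 4, n),     # drinking level
                             rng.integers(0, 4, n)])    # smoking level
        y = 0.02 * X[:, 1] + rng.normal(0, 0.2, n)      # toy analyte with a BMI effect

        fit = sm.OLS(y, sm.add_constant(X)).fit()
        df = fit.df_resid
        for name, t in zip(["age", "BMI", "drink", "smoke"], fit.tvalues[1:]):
            print(f"{name:5s} r_p = {t / np.sqrt(t**2 + df):+.3f}")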

  12. Analytical chemistry: Principles and techniques

    International Nuclear Information System (INIS)

    Hargis, L.G.

    1988-01-01

    Although this text seems to have been intended for use in a one-semester course in undergraduate analytical chemistry, it includes the range of topics usually encountered in a two-semester introductory course in chemical analysis. The material is arranged logically for use in a two-semester course: the first 12 chapters contain the subjects most often covered in the first term, and the next 10 chapters pertain to the second (instrumental) term. Overall breadth and level of treatment are standard for an undergraduate text of this sort, and the only major omission is that of kinetic methods (which is a common omission in analytical texts). In the first 12 chapters coverage of the basic material is quite good. The emphasis on the underlying principles of the techniques rather than on specifics and design of instrumentation is welcomed. This text may be more useful for the instrumental portion of an analytical chemistry course than for the solution chemistry segment. The instrumental analysis portion is appropriate for an introductory textbook

  13. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from the Kawerau geothermal field during 1998-2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is clear indication that the present-day heat source relates to young volcanism within the field. However, being at the margins of the explored reservoir, little is presently known of the characteristics of that heat source. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system. In the latter study the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data is accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  14. Programming system for analytic geometry

    International Nuclear Information System (INIS)

    Raymond, Jacques

    1970-01-01

    After having outlined the characteristics of computing centres which do not suit engineering tasks, notably the time required for the different steps performed when developing software (assembly, compilation, link editing, loading, running), and identified constraints specific to engineering, the author identifies the characteristics a programming system should have to suit engineering tasks. He discusses existing conversational systems, their programming languages, and their main drawbacks. Then, he presents a system which aims at facilitating programming and at addressing problems of analytic geometry and trigonometry

  15. Jet substructure with analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Mrinal [University of Manchester, Consortium for Fundamental Physics, School of Physics and Astronomy, Manchester (United Kingdom); Fregoso, Alessandro; Powling, Alexander [University of Manchester, School of Physics and Astronomy, Manchester (United Kingdom); Marzani, Simone [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom)

    2013-11-15

    We consider the mass distribution of QCD jets after the application of jet-substructure methods, specifically the mass-drop tagger, pruning, trimming and their variants. In contrast to most current studies employing Monte Carlo methods, we carry out analytical calculations at the next-to-leading order level, which are sufficient to extract the dominant logarithmic behaviour for each technique, and compare our findings to exact fixed-order results. Our results should ultimately lead to a better understanding of these jet-substructure methods which in turn will influence the development of future substructure tools for LHC phenomenology. (orig.)

  16. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear-Hile's result concerning the version of the famous Bishop's theorem for one-dimensional analytic structures in two directions: for n-dimensional complex analytic manifolds, n>1, and for generalized analytic manifolds. 14 refs

  17. Use of compound-specific stable carbon isotope ratio measurements of asphaltene-bound polycyclic aromatic hydrocarbons (PAHs) as a novel aid to source apportionment of environmental PAHs

    Energy Technology Data Exchange (ETDEWEB)

    C. Sun; C. Snape; M. Cooper; W. Ivwurie [University of Nottingham, Nottingham (United Kingdom). Nottingham Energy & Fuel Centre

    2005-07-01

    In this study, the PAHs from hydropyrolysis of asphaltenes from different primary sources (e.g. crude oil, low and high temperature coal tars) were characterized by their molecular distributions and ¹³C/¹²C isotope ratios. It was found that for all oil samples, the molecular and isotopic profiles for their asphaltene-derived PAHs are both similar to those observed for their contained free aromatics, with ¹³C-isotopic values varying from -25 to -27‰ for the Nigerian and -27 to -30‰ for North Sea oil samples. For low and high temperature coal tar samples, however, similar molecular but different isotopic profiles were observed for their asphaltene-bound PAHs. The free aromatics are significantly isotopically lighter (by nearly 3‰) than their asphaltene-derived counterparts, which have isotopic values typically between -22 and -23‰ for all coal tar samples examined, and this leads to a larger isotopic difference of up to 7‰ between the two sources of PAHs than that already observed between their free aromatics (3‰). Applying these results to samples previously examined in an area where unambiguous source apportionment could not be conducted for the PAHs due to likely biodegradation, it was found that the bound PAHs released from the asphaltenes recovered from the soil samples in this area are extremely similar to low temperature tar as the source, in terms of both their molecular (highly alkylated) and isotopic profiles. The free PAHs are much less alkyl-substituted, confirming that the aromatics detected in this area have been subjected to intensive environmental degradation with alkylated aromatic constituents being preferentially removed from their initial matrix.

  18. Optimization of a Laboratory-Developed Test Utilizing Roche Analyte-Specific Reagents for Detection of Staphylococcus aureus, Methicillin-Resistant S. aureus, and Vancomycin-Resistant Enterococcus Species

    OpenAIRE

    Mehta, Maitry S.; Paule, Suzanne M.; Hacek, Donna M.; Thomson, Richard B.; Kaul, Karen L.; Peterson, Lance R.

    2008-01-01

    Nasal and perianal swab specimens were tested for detection of Staphylococcus aureus and vancomycin-resistant Enterococcus species (VRE) using a laboratory-developed real-time PCR test and microbiological cultures. The real-time PCR and culture results for S. aureus were similar. PCR had adequate sensitivity, but culture was more specific for the detection of VRE.

  19. Scientific Opinion on the substantiation of a health claim related to fat-free yogurts and fermented milks with live yogurt cultures complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin

    DEFF Research Database (Denmark)

    Tetens, Inge

    2015-01-01

    substantiation of a health claim related to fat-free yogurts and fermented milks with live yogurt cultures complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims and maintenance of lean body mass in the context...... of an energy-restricted diet. The Panel considers that the food that is the subject of the claim, fat-free yogurts and fermented milks complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims, is sufficiently characterised...... and effect relationship has not been established between the consumption of fat-free yogurts and fermented milks with live yogurt cultures complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims and maintenance of lean...

  20. Scientific Opinion on the substantiation of a health claim related to fat-free yogurts and fermented milks complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims

    DEFF Research Database (Denmark)

    Tetens, Inge

    2015-01-01

    substantiation of a health claim related to fat-free yogurts and fermented milks complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims and reduction of body and visceral fat while maintaining lean body mass in the context...... of an energy-restricted diet. The food that is the subject of the claim is fat-free yogurts and fermented milks complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims. The Panel considers that fat-free yogurts......-free yogurts and fermented milks complying with the specifications “fat free”, “low in sugars”, “high protein”, “source of calcium” and “source of vitamin D” for nutrition claims and reduction of body and visceral fat mass while maintaining lean body mass in the context of an energy-restricted diet....

  1. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    Science.gov (United States)

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis using the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting

  2. Quantifying sources of bias in longitudinal data linkage studies of child abuse and neglect: measuring impact of outcome specification, linkage error, and partial cohort follow-up.

    Science.gov (United States)

    Parrish, Jared W; Shanahan, Meghan E; Schnitzer, Patricia G; Lanier, Paul; Daniels, Julie L; Marshall, Stephen W

    2017-12-01

    Health informatics projects combining statewide birth populations with child welfare records have emerged as a valuable approach to conducting longitudinal research on child maltreatment. The potential bias resulting from linkage misspecification, partial cohort follow-up, and outcome misclassification in these studies has been largely unexplored. This study integrated epidemiological survey and novel administrative data sources to establish the Alaska Longitudinal Child Abuse and Neglect Linkage (ALCANLink) project. Using these data we evaluated and quantified the impact of linkage misspecification and reliance on single-source maltreatment ascertainment on reported maltreatment risk and effect estimates. The ALCANLink project integrates the 2009-2011 Alaska Pregnancy Risk Assessment Monitoring System (PRAMS) sample with multiple administrative databases through 2014, including one novel administrative source to track out-of-state emigration. For this project we limited our analysis to the 2009 PRAMS sample. We report on the impact of linkage quality, cohort follow-up, and multisource outcome ascertainment on the incidence proportion of reported maltreatment before age 6 and hazard ratios of selected characteristics that are often available in birth cohort linkage studies of maltreatment. Failure to account for out-of-state emigration biased the incidence proportion by 12% (from 28.3% to 25.2%, weighted), and the hazard ratio (HR) by as much as 33% for some risk factors. Overly restrictive linkage parameters biased the incidence proportion downwards by 43% and the HR by as much as 27% for some factors. Multi-source linkages, on the other hand, were of little benefit for improving reported maltreatment ascertainment. Using the ALCANLink data, which included a novel administrative data source, we were able to observe and quantify bias to both the incidence proportion and HR in a birth cohort linkage study of reported child maltreatment. Failure to account for out
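
    The direction of the emigration bias quantified above can be reproduced with a toy calculation (all numbers hypothetical, not the study's): children who leave the state can no longer appear in its maltreatment records, so keeping them in the denominator deflates the incidence proportion.

        # Toy denominator correction for out-of-state emigration.
        cohort = 10_000
        reported = 2_520       # children with a maltreatment report by age 6
        emigrated = 1_100      # children who left the state before age 6

        naive = reported / cohort                  # emigrants wrongly kept at risk
        adjusted = reported / (cohort - emigrated)
        print(f"naive    = {naive:.3f}")           # understates risk
        print(f"adjusted = {adjusted:.3f}")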

  3. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  4. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  5. Ion source

    International Nuclear Information System (INIS)

    1977-01-01

    The specification of a set of point-shaped electrodes of non-corrodible material that can hold a liquid film of uniform thickness is described. Contained in a jacket, this set forms an ion source. The electrode is made of tungsten with a glassy carbon layer for insulation and an outer layer of aluminium-oxide ceramic material

  6. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  7. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economy, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management but also to practitioners working on this subject.

  8. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  9. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  10. News for analytical chemists

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Karlberg, Bo

    2009-01-01

    welfare. In conjunction with the meeting of the steering committee in Tallinn, Estonia, in April, Mihkel Kaljurand and Mihkel Koel of Tallinn University of Technology organised a successful symposium attended by 51 participants. The symposium illustrated the scientific work of the steering committee...... directed to various topics of analytical chemistry. Although affected by the global financial crisis, the Euroanalysis Conference will be held on 6 to 10 September in Innsbruck, Austria. For next year, the programme for the analytical section of the 3rd European Chemistry Congress is in preparation...

  11. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish--Fisher expansion and o

  12. Supercritical fluid analytical methods

    International Nuclear Information System (INIS)

    Smith, R.D.; Kalinoski, H.T.; Wright, B.W.; Udseth, H.R.

    1988-01-01

    Supercritical fluids are providing the basis for new and improved methods across a range of analytical technologies. New methods are being developed to allow the detection and measurement of compounds that are incompatible with conventional analytical methodologies. Characterization of process and effluent streams for synfuel plants requires instruments capable of detecting and measuring high-molecular-weight compounds, polar compounds, or other materials that are generally difficult to analyze. The purpose of this program is to develop and apply new supercritical fluid techniques for extraction, separation, and analysis. These new technologies will be applied to previously intractable synfuel process materials and to complex mixtures resulting from their interaction with environmental and biological systems

  13. Compound-specific C- and H-isotope compositions of enclosed organic matter in carbonate rocks: Implications for source identification of sedimentary organic matter and paleoenvironmental reconstruction

    International Nuclear Information System (INIS)

    Xiong Yongqiang; Wang Yanmei; Wang Yongquan; Xu Shiping

    2007-01-01

    The Bohai Bay Basin is one of the most important oil-producing provinces in China. Molecular organic geochemical characteristics of Lower Paleozoic source rocks in this area have been investigated by analyzing chemical and isotopic compositions of solvent extracts and acid-released organic matter from the Lower Paleozoic carbonate rocks in the Jiyang Sub-basin of the Bohai Bay Basin. The results indicate that enclosed organic matter in carbonate rocks has not been recognizably altered by post-depositional processes. Two end-member compositions are suggested for early organic matter trapped in the Lower Paleozoic carbonate rocks: (1) a source dominated by aquatic organisms and deposited in a relatively deep marine environment and (2) a relatively high saline, evaporative marine depositional environment. In contrast, chemical and isotopic compositions of solvent extracts from these Lower Paleozoic carbonate rocks are relatively complicated, not only inheriting original characteristics of their precursors, but also overprinted by various post-depositional alterations, such as thermal maturation, biodegradation and mixing. Therefore, the integration of both organic matter characteristics can provide more useful information on the origin of organic matter present in carbonate rocks and the environments of their deposition

  14. Human-Induced Pluripotent Stem Cell-Derived Mesenchymal Stem Cells as an Individual-Specific and Renewable Source of Adult Stem Cells.

    Science.gov (United States)

    Sequiera, Glen Lester; Saravanan, Sekaran; Dhingra, Sanjiv

    2017-01-01

    This chapter deals with the use of human-induced pluripotent stem cells (hiPSCs) as a candidate for differentiation into mesenchymal stem cells (MSCs). This would help establish a steady source of human MSCs and avoid the problems associated with procuring MSCs from different healthy individuals or patients: limited extraction potential, batch-to-batch variation, and differences between sources such as bone marrow or adipose tissue. The procedures described herein allow for a guided and reliable approach to the regular maintenance of hiPSCs and their subsequent differentiation into MSCs using the prescribed medium. Subsequently, an easy protocol for the successive isolation and purification of the hiPSC-differentiated MSCs is outlined, which is carried out through passaging; the cells can be further sorted by flow cytometry. Further, the maintenance and expansion of the resultant hiPSC-differentiated MSCs using appropriate characterization techniques, i.e., reverse-transcription PCR and immunostaining, is also elaborated. The protocol is described with beginner researchers in mind, who typically have access only to the standard consumables and medium components found in a general laboratory.

  15. Compound-specific C- and H-isotope compositions of enclosed organic matter in carbonate rocks: Implications for source identification of sedimentary organic matter and paleoenvironmental reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong Yongqiang [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)], E-mail: xiongyq@gig.ac.cn; Wang Yanmei; Wang Yongquan; Xu Shiping [State Key Laboratory of Organic Geochemistry, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou 510640 (China)

    2007-11-15

    The Bohai Bay Basin is one of the most important oil-producing provinces in China. Molecular organic geochemical characteristics of Lower Paleozoic source rocks in this area have been investigated by analyzing chemical and isotopic compositions of solvent extracts and acid-released organic matter from the Lower Paleozoic carbonate rocks in the Jiyang Sub-basin of the Bohai Bay Basin. The results indicate that enclosed organic matter in carbonate rocks has not been recognizably altered by post-depositional processes. Two end-member compositions are suggested for early organic matter trapped in the Lower Paleozoic carbonate rocks: (1) a source dominated by aquatic organisms and deposited in a relatively deep marine environment and (2) a relatively high saline, evaporative marine depositional environment. In contrast, chemical and isotopic compositions of solvent extracts from these Lower Paleozoic carbonate rocks are relatively complicated, not only inheriting original characteristics of their precursors, but also overprinted by various post-depositional alterations, such as thermal maturation, biodegradation and mixing. Therefore, the integration of both organic matter characteristics can provide more useful information on the origin of organic matter present in carbonate rocks and the environments of their deposition.

  16. Guided Text Search Using Adaptive Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Symons, Christopher T [ORNL; Senter, James K [ORNL; DeNap, Frank A [ORNL

    2012-10-01

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques facilitating individualized investigative search over an ever-changing set of analytical questions against an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records then drive semi-supervised machine learning algorithms that re-rank the unlabeled search records so that potentially relevant records move to the top of the listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system helps address the information overload analysts face in the search and investigative analysis of textual information.
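
    A drastically simplified sketch of this interaction-driven re-ranking loop (not Gryffin's actual algorithm; the data and labels are invented): records the analyst touches become labels, a text classifier is fit to them, and the remaining records are re-ordered by predicted relevance:

        # Label-and-re-rank sketch with scikit-learn.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression

        docs = ["pipeline valve failure report", "grid substation outage notice",
                "annual picnic announcement", "transformer maintenance log",
                "holiday schedule memo", "pump station incident summary"]
        labels = {0: 1, 2: 0}    # analyst marked doc 0 relevant, doc 2 irrelevant

        X = TfidfVectorizer().fit_transform(docs)
        idx = sorted(labels)
        clf = LogisticRegression().fit(X[idx], [labels[i] for i in idx])

        unlabeled = [i for i in range(len(docs)) if i not in labels]
        ranked = sorted(unlabeled, key=lambda i: -clf.predict_proba(X[i])[0, 1])
        print([docs[i] for i in ranked])   # most promising records first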

  17. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  18. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of masses of information, discovering and disseminating actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open-source developer essentials toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  19. Antigen-specific and nonspecific mediators of T cell/B cell cooperation. III. Characterization of the nonspecific mediator(s) from different sources.

    Science.gov (United States)

    Harwell, L; Kappler, J W; Marrack, P

    1976-05-01

    T cell-containing lymphoid populations produce a nonantigen-specific mediator(s) (NSM) which can replace T cell helper function in vitro in the response of B cells to sheep red blood cells (SRBC), but not to the hapten-protein conjugate trinitrophenyl-keyhole limpet hemocyanin (TNP-KLH). NSM produced under three conditions: 1) stimulation of KLH-primed cells with KLH; 2) allogeneic stimulation of normal spleen cells; and 3) stimulation of normal spleen cells with Con A (but not PHA) are indistinguishable on the basis of their biologic activity and m.w., estimated as 30 to 40,000 daltons by G-200 chromatography. Production of NSM is dependent on the presence of T cells. The action of NSM on B cells responding to SRBC in the presence of 2-mercaptoethanol is unaffected by severe macrophage depletion. Extensive absorption of NSM with SRBC failed to remove its activity, confirming its nonantigen-specific nature.

  20. Source apportionment and health risk assessment among specific age groups during haze and non-haze episodes in Kuala Lumpur, Malaysia.

    Science.gov (United States)

    Sulong, Nor Azura; Latif, Mohd Talib; Khan, Md Firoz; Amil, Norhaniza; Ashfold, Matthew J; Wahab, Muhammad Ikram Abdul; Chan, Kok Meng; Sahani, Mazrura

    2017-12-01

    This study aims to determine PM2.5 concentrations and their composition during haze and non-haze episodes in Kuala Lumpur. In order to investigate the origin of the measured air masses, the Numerical Atmospheric-dispersion Modelling Environment (NAME) and the Global Fire Assimilation System (GFAS) were applied. Source apportionment of PM2.5 was determined using Positive Matrix Factorization (PMF). The carcinogenic and non-carcinogenic health risks were estimated using the United States Environmental Protection Agency (USEPA) method. PM2.5 samples were collected from the centre of the city using a high-volume air sampler (HVS). The results showed that the mean PM2.5 concentrations collected during pre-haze, haze and post-haze periods were 24.5 ± 12.0 μg m⁻³, 72.3 ± 38.0 μg m⁻³ and 14.3 ± 3.58 μg m⁻³, respectively. The highest concentration of PM2.5 during the haze episode was five times the World Health Organisation (WHO) guideline. Inorganic compositions of PM2.5, including trace elements and water-soluble ions, were determined using inductively coupled plasma-mass spectrometry (ICP-MS) and ion chromatography (IC), respectively. The major trace elements identified were K, Al, Ca, Mg and Fe, which accounted for approximately 93%, 91% and 92% of the overall metal portions recorded during pre-haze, haze and post-haze periods, respectively. For water-soluble ions, secondary inorganic aerosols (SO₄²⁻, NO₃⁻ and NH₄⁺) contributed around 12%, 43% and 16% of the overall PM2.5 mass during pre-haze, haze and post-haze periods, respectively. During haze periods, the predominant sources identified using PMF were secondary inorganic aerosol (SIA) and biomass burning, where the NAME simulations indicate the importance of fires in Sumatra, Indonesia. The main sources during pre-haze and post-haze periods were mixed SIA and road dust, and mineral dust, respectively. The highest non-carcinogenic health risk during the haze episode was estimated among the infant group (HI=1
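
    The USEPA-style non-carcinogenic screen referred to above boils down to a hazard quotient per pollutant (average daily dose over a reference dose), summed into a hazard index; HI > 1 flags potential risk. All parameter values below are placeholders, not the study's:

        # Hedged hazard-index sketch; concentrations in ug/m3, RfD in mg/kg/day.
        def hazard_quotient(c_ug_m3, inh_m3_day, bw_kg, rfd_mg_kg_day,
                            ef_days=350, ed_years=6, at_days=6 * 365):
            """Average daily dose (mg/kg/day) divided by the reference dose."""
            add = (c_ug_m3 * 1e-3 * inh_m3_day * ef_days * ed_years) / (bw_kg * at_days)
            return add / rfd_mg_kg_day

        # Infant-like parameters (hypothetical): low body weight raises the dose.
        hq = [hazard_quotient(c, inh_m3_day=5.4, bw_kg=9.0, rfd_mg_kg_day=rfd)
              for c, rfd in [(0.02, 3e-4), (0.5, 3.5e-3)]]
        print("HI =", round(sum(hq), 2))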

  1. Characterizing the Sources and Processing of Submicron Aerosols at a Coastal Site near Houston, TX, with a Specific Focus on the Impact of Regional Shipping Emissions

    Science.gov (United States)

    Schulze, B.; Wallace, H. W., IV; Bui, A.; Flynn, J. H., III; Erickson, M. H.; Griffin, R. J.

    2017-12-01

    The Texas Gulf Coast region historically has been influenced heavily by regional shipping emissions. However, the effects of the recent establishment of the North American Emissions Control Area (ECA) on aerosol properties in this region are presently unknown. In order to understand better the current sources and processing mechanisms influencing coastal aerosol near Houston, a high-resolution time-of-flight aerosol mass spectrometer (HR-ToF-AMS) was deployed for three weeks at a coastal location during May-June 2016. Total mass loadings of organic and inorganic non-refractory aerosol components during onshore flow periods were similar to those published before establishment of the regulations. Using estimated methanesulfonic acid (MSA) mass loadings and published biogenic MSA:non-sea-salt-sulfate (nss-SO4) ratios, we determined that over 70% of nss-SO4 over the Gulf was from anthropogenic sources, predominantly shipping emissions. Mass spectral analysis indicated that for periods with similar backward-trajectory-averaged meteorological conditions, air masses influenced by shipping emissions have an increased mass fraction of ions related to carboxylic acids and a significantly larger oxygen-to-carbon (O:C) ratio than air masses that stay within the ECA boundary, suggesting that shipping emissions impact marine organic aerosol (OA) oxidation state. Amine fragment mass loadings were positively correlated with anthropogenic nss-SO4 during onshore flow, implying anthropogenic-biogenic interaction in marine OA production. Five OA factors were resolved by positive matrix factorization, corresponding to a hydrocarbon-like OA, a semi-volatile OA, and three different oxygenated organic aerosols ranked by their O:C ratio (OOA-1, OOA-2, and OOA-3). OOA-1 constituted the majority of OA mass during a period likely influenced by aqueous-phase processing and may be linked to local glyoxal/methylglyoxal-related sources. OOA-2 was produced within the Houston urban region and was
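
    The MSA-based apportionment logic described above is a short calculation: a published biogenic MSA:nss-SO4 ratio converts the measured MSA into an estimate of biogenic sulfate, and the remainder is attributed to anthropogenic sources, chiefly shipping. The numbers below are illustrative, not the campaign's:

        # Back-of-envelope anthropogenic nss-SO4 fraction (values assumed).
        msa = 0.05              # measured MSA, ug/m3
        nss_so4 = 1.8           # measured non-sea-salt sulfate, ug/m3
        biogenic_ratio = 0.07   # assumed biogenic MSA:nss-SO4 mass ratio

        biogenic_so4 = msa / biogenic_ratio
        anthro_frac = 1.0 - biogenic_so4 / nss_so4
        print(f"anthropogenic nss-SO4 fraction ~ {anthro_frac:.0%}")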

  2. Integrated Array/Metadata Analytics

    Science.gov (United States)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence primitive or absent in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman, which already implements SQL/MDA.
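
    The integration gap described here can be made concrete with a small sketch: metadata lives in a relational store, the arrays live beside it, and every analysis must stitch the two together by hand, which is the step SQL/MDA folds into the query language itself (the schema and data below are invented):

        # Hand-rolled array/metadata join that SQL/MDA would express in one query.
        import sqlite3
        import numpy as np

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE scenes (id INTEGER PRIMARY KEY, sensor TEXT, cloud REAL)")
        db.executemany("INSERT INTO scenes VALUES (?,?,?)",
                       [(1, "landsat", 0.10), (2, "modis", 0.80), (3, "landsat", 0.05)])

        arrays = {i: np.random.default_rng(i).random((64, 64)) for i in (1, 2, 3)}

        # Intent: "average each array WHERE sensor='landsat' AND cloud < 0.2".
        ids = [row[0] for row in db.execute(
            "SELECT id FROM scenes WHERE sensor = ? AND cloud < ?", ("landsat", 0.2))]
        print({i: float(arrays[i].mean()) for i in ids})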

  3. Analytic theory of the gyrotron

    International Nuclear Information System (INIS)

    Lentini, P.J.

    1989-06-01

    An analytic theory is derived for a gyrotron operating in the linear gain regime. The gyrotron is a coherent source of microwave and millimeter wave radiation based on an electron beam emitting at cyclotron resonance Ω in a strong, uniform magnetic field. Relativistic equations of motion and first-order perturbation theory are used. Results are obtained in both laboratory and normalized variables. An expression for cavity threshold gain is derived in the linear regime. An analytic expression for the electron phase angle in momentum space shows that the effect of the RF field is to form bunches at a phase equal to the unperturbed transit phase plus a correction term which varies as the sine of the input phase angle. The expression for the phase angle is plotted, and bunching effects in phase (0) and out of phase (-π) with respect to the RF field are evident for detunings leading to gain and absorption, respectively. For exact resonance, field frequency ω = Ω, a bunch also forms at a phase of -π/2. This beam yields the same energy exchange with the RF field as an unbunched (nonrelativistic) beam. 6 refs., 10 figs
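
    Stated in symbols (the notation here is assumed for illustration, not taken from the paper), the bunching relation described above reads

        \theta \approx \theta_0 + \varepsilon \sin\theta_{\mathrm{in}},

    where \theta_0 = \Omega t is the unperturbed transit phase, \theta_{\mathrm{in}} the electron's input phase angle, and \varepsilon a small coefficient proportional to the RF field amplitude.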

  4. Analytical system availability techniques

    NARCIS (Netherlands)

    Brouwers, J.J.H.; Verbeek, P.H.J.; Thomson, W.R.

    1987-01-01

    Analytical techniques are presented to assess the probability distributions and related statistical parameters of loss of production from equipment networks subject to random failures and repairs. The techniques are based on a theoretical model for system availability, which was further developed

  5. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  6. Analytical procedures. Pt. 1

    International Nuclear Information System (INIS)

    Weber, G.

    1985-01-01

    In analytical (Boolean) procedures there is necessarily a close relationship between the safety assessment and the reliability assessment of technical facilities. The paper gives an overview of the organization of models, fault trees, the probabilistic evaluation of systems, evaluation with minimal cuts or minimal paths in the presence of statistically dependent components, and systems liable to suffer different kinds of outages. (orig.) [de

  7. Ada & the Analytical Engine.

    Science.gov (United States)

    Freeman, Elisabeth

    1996-01-01

    Presents a brief history of Ada Byron King, Countess of Lovelace, focusing on her primary role in the development of the Analytical Engine--the world's first computer. Describes the Ada Project (TAP), a centralized World Wide Web site that serves as a clearinghouse for information related to women in computing, and provides a Web address for…

  8. User Behavior Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Moore, Juston Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-28

    User Behaviour Analytics is the tracking, collecting and assessing of user data and activities. The goal is to detect misuse of user credentials by developing models of the normal behaviour of user credentials within a computer network and detecting outliers with respect to their baseline.
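
    As an illustration only (the report does not specify its models), a per-credential baseline can be as simple as a z-score over daily event counts, flagging counts far from the user's own mean; all names, numbers, and the threshold below are invented:

    ```python
    from statistics import mean, stdev

    # Toy stand-in for per-user daily authentication counts.
    daily_counts = {
        "alice": [5, 6, 5, 7, 6, 40],   # day 5 is a burst
        "bob":   [2, 3, 2, 2, 3, 3],
    }

    def outliers(history, z_max=2.0):
        """Flag days whose count lies more than z_max standard deviations
        from that user's own baseline mean (threshold is illustrative)."""
        flagged = []
        for user, counts in history.items():
            mu, sigma = mean(counts), stdev(counts)
            for day, c in enumerate(counts):
                if sigma > 0 and abs(c - mu) / sigma > z_max:
                    flagged.append((user, day, c))
        return flagged

    print(outliers(daily_counts))  # [('alice', 5, 40)]
    ```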

  9. Of the Analytical Engine

    Indian Academy of Sciences (India)

    cloth will be woven all of one colour; but there will be a damask pattern upon it ... mathematical view of the Analytical Engine, and illustrate by example some of its ... be to verify the number of the card given it by subtracting its number from 2 3 ...

  10. Limitless Analytic Elements

    Science.gov (United States)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.

  11. Social Learning Analytics

    Science.gov (United States)

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  12. History of analytic geometry

    CERN Document Server

    Boyer, Carl B

    2012-01-01

    Designed as an integrated survey of the development of analytic geometry, this study presents the concepts and contributions from before the Alexandrian Age through the eras of the great French mathematicians Fermat and Descartes, and on through Newton and Euler to the "Golden Age," from 1789 to 1850.

  13. Analytics for Customer Engagement

    NARCIS (Netherlands)

    Bijmolt, Tammo H. A.; Leeflang, Peter S. H.; Block, Frank; Eisenbeiss, Maik; Hardie, Bruce G. S.; Lemmens, Aurelie; Saffert, Peter

    In this article, we discuss the state of the art of models for customer engagement and the problems that are inherent to calibrating and implementing these models. The authors first provide an overview of the data available for customer analytics and discuss recent developments. Next, the authors

  14. European Analytical Column

    DEFF Research Database (Denmark)

    Karlberg, B.; Grasserbauer, M.; Andersen, Jens Enevold Thaulov

    2009-01-01

    for European analytical chemistry. During the period 2002–07, Professor Grasserbauer was Director of the Institute for Environment and Sustainability, Joint Research Centre of the European Commission (EC), Ispra, Italy. There is no doubt that many challenges exist at the present time for all of us representing...

  15. Analytical Chemistry Laboratory

    Science.gov (United States)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Caltech.

  16. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  17. service line analytics in the new era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.

  18. Nodewise analytical calculation of the transfer function

    International Nuclear Information System (INIS)

    Makai, Mihaly

    1994-01-01

    The space dependence of neutron noise has so far been investigated mostly in homogeneous core models. Application of core diagnostic methods to locate a malfunction requires, however, that the transfer function be calculated for real, inhomogeneous cores. A code suitable for such a purpose must be able to handle complex arithmetic and a delta-function source. Further requirements are analytical dependence in one spatial variable and fast execution. The present work describes the TIDE program written to fulfil the above requirements. The core is subdivided into homogeneous, square assemblies. An analytical solution is given, which is a generalisation of the inhomogeneous response matrix method. (author)

  19. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions on the use of suitable and promising open-source distributed data processing software platforms are given.

  20. Designing a Marketing Analytics Course for the Digital Age

    Science.gov (United States)

    Liu, Xia; Burns, Alvin C.

    2018-01-01

    Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…

  1. Analytic solutions of a class of nonlinearly dynamic systems

    International Nuclear Information System (INIS)

    Wang, M-C; Zhao, X-S; Liu, X

    2008-01-01

    In this paper, the homotopy perturbation method (HPM) is applied to solve a coupled system of two nonlinear differential equations: a first-order Lotka-Volterra-like model and a Bratu-type equation with a source term. The analytic approximate solutions are derived. Furthermore, comparison of the analytic approximate solutions obtained by the HPM with the exact solutions reveals that the present method works efficiently
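
    The record does not reproduce the equations themselves; for orientation only, a first-order Lotka-Volterra pair and a Bratu-type equation with a source term f take the standard forms below (our notation, an assumption about the class of equations rather than a quote from the paper):

    ```latex
    \dot{x} = x(\alpha - \beta y), \qquad
    \dot{y} = y(\delta x - \gamma), \qquad
    u''(x) + \lambda\, e^{u(x)} = f(x)
    ```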

  2. Streamlining Smart Meter Data Analytics

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    2015-01-01

    Today smart meters are increasingly used worldwide. Smart meters are advanced meters capable of measuring customer energy consumption at a fine-grained time interval, e.g., every 15 minutes. The data are very sizable, and might be from different sources, along with other socio-economic metrics such as the geographic information of meters, the information about users and their property, geographic location and others, which makes the data management very complex. On the other hand, data-mining and the emerging cloud computing technologies make the collection, management, and analysis of the so-called big data possible. This can improve energy management, e.g., help utilities improve the management of energy and services, and help customers save money. In this regard, the paper focuses on building an innovative software solution to streamline smart meter data analytics, aiming at dealing...
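
    The paper's own pipeline is not included in this record; purely to illustrate the fine-grained readings described above, the following sketch rolls 15-minute interval readings up to daily figures with pandas (the interval matches the abstract; the data and column names are invented):

    ```python
    import numpy as np
    import pandas as pd

    # Two days of toy 15-minute readings (kWh per interval) for one meter.
    idx = pd.date_range("2015-01-01", periods=2 * 96, freq="15min")
    readings = pd.Series(np.random.gamma(2.0, 0.05, size=idx.size),
                         index=idx, name="kwh")

    # Roll the fine-grained intervals up to daily totals and daily peaks.
    summary = pd.DataFrame({
        "daily_kwh": readings.resample("D").sum(),
        "peak_interval_kwh": readings.resample("D").max(),
    })
    print(summary)
    ```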

  3. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities and of data acquisition quantities are constantly pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) and others are capable of delivering high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or another singular aspect of the sample. There is a high need for digital image processing methods enabling the analytical scientist, routinely confronted with such amounts of data, to get rapid insight into the composition of the sample examined, to filter the relevant data and to integrate the information of numerous separate multispectral images to get the complete picture. Sophisticated image processing methods like classification and fusion provide possible solution approaches to this challenge. Classification is a treatment by multivariate statistical means in order to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. The overall aim of this thesis is therefore to evaluate the possibilities of both techniques regarding the task of analytical image processing and to find solutions for the integration and condensation of multispectral analytical image data in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)

  4. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  5. Sampling and analytical procedures for the determination of VOCs released into air from natural and anthropogenic sources: A comparison between SPME (Solid Phase Micro Extraction) and ST (Solid Trap) methods

    International Nuclear Information System (INIS)

    Tassi, F.; Capecchiacci, F.; Buccianti, A.; Vaselli, O.

    2012-01-01

    In the present study, two sampling and analytical methods for VOC determination in fumarolic exhalations related to hydrothermal-magmatic reservoirs in volcanic and geothermal areas and biogas released from waste landfills were compared: (a) Solid Traps (STs), consisting of three-phase (Carboxen B, Carboxen C and Carbosieve S111) absorbent stainless steel tubes and (b) Solid Phase Micro Extraction (SPME) fibers, composed of DiVinylBenzene (DVB), Carboxen and PolyDimethylSiloxane. These techniques were applied to pre-concentrate VOCs discharged from: (i) low-to-high temperature fumaroles collected at Vulcano Island, Phlegrean Fields (Italy), and Nisyros Island (Greece), (ii) recovery wells in a solid waste disposal site located near Florence (Italy). A glass condensing system cooled with water was used to collect the dry fraction of the fumarolic gases, in order to allow more efficient VOC absorption avoiding any interference by water vapor and acidic gases, such as SO₂, H₂S, HF and HCl, typically present at relatively high concentrations in these fluids. Up to 37 organic species, in the range of 40–400 m/z, were determined by coupling gas chromatography to mass spectrometry (GC–MS). This study shows that the VOC compositions of fumaroles and biogas determined via SPME and ST are largely consistent and can be applied to the analysis of VOCs in gases released from different natural and anthropogenic environments. The SPME method is rapid and simple and more appropriate for volcanic and geothermal emissions, where VOCs are present at relatively high concentrations and prolonged gas sampling may be hazardous for the operator. The ST method, allowing the collection of large quantities of sample, is to be preferred to analyze the VOC composition of fluids from diffuse emissions and air, where these compounds are present at relatively low concentrations.

  6. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that encompass the science domain and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  7. Trypanosoma cruzi infection induces a massive extrafollicular and follicular splenic B-cell response which is a high source of non-parasite-specific antibodies.

    Science.gov (United States)

    Bermejo, Daniela A; Amezcua Vesely, María C; Khan, Mahmood; Acosta Rodríguez, Eva V; Montes, Carolina L; Merino, Maria C; Toellner, Kai Michael; Mohr, Elodie; Taylor, Dale; Cunningham, Adam F; Gruppi, Adriana

    2011-01-01

    Acute infection with Trypanosoma cruzi, the aetiological agent of Chagas' disease, results in parasitaemia and polyclonal lymphocyte activation. It has been reported that polyclonal B-cell activation is associated with hypergammaglobulinaemia and a delayed parasite-specific antibody response. In the present study we analysed the development of the B-cell response within the different microenvironments of the spleen during acute T. cruzi infection. We observed massive germinal centre (GC) and extrafollicular (EF) responses at the peak of infection. However, the EF foci were evident from day 3 post-infection (p.i.) and, early in the infection, they mainly provided IgM. The EF foci response reached its peak at 11 days p.i. and extended from the red pulp into the periarteriolar lymphatic sheath. The GCs were detected from day 8 p.i. At the peak of parasitaemia, CD138(+) B220(+) plasma cells in EF foci, red pulp and the T-cell zone expressed IgM and all the IgG isotypes. In spite of the substantial B-cell response, most of the antibodies produced by splenic cells did not target the parasite, and parasite-specific IgG isotypes could be detected in sera only after 18 days p.i. We also observed that the bone marrow of infected mice presented a strong reduction in CD138(+) B220(+) cells compared with that of normal mice. Hence, in acute infection with T. cruzi, the spleen appears to be the most important lymphoid organ that lodges plasma cells and the main producer of antibodies. The development of the B-cell response during T. cruzi infection shows features that are particular to T. cruzi and other protozoan infections but different from those of other infections or immunization with model antigens.

  8. Developments in analytical instrumentation

    Science.gov (United States)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  9. Biodiesel Analytical Methods: August 2002--January 2004

    Energy Technology Data Exchange (ETDEWEB)

    Van Gerpen, J.; Shanks, B.; Pruszko, R.; Clements, D.; Knothe, G.

    2004-07-01

    Biodiesel is an alternative fuel for diesel engines that is receiving great attention worldwide. The material contained in this book is intended to provide the reader with information about biodiesel engines and fuels, analytical methods used to measure fuel properties, and specifications for biodiesel quality control.

  10. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  11. Data analytics in the ATLAS Distributed Computing

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2015-01-01

    The ATLAS data analytics effort is focused on creating systems which provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system, various monitoring services, etc.); providing a platform to execute arbitrary data mining and machine learning algorithms over aggregated data; satisfying a variety of use cases for different user roles; and hosting new third-party analytics services on a scalable compute platform. We describe the implemented system where: data sources are existing RDBMS (Oracle) and Flume collectors; a Hadoop cluster is used to store the data; native Hadoop and Apache Pig scripts are used for data aggregation; and R for in-depth analytics. Part of the data is indexed in ElasticSearch so both simpler investigations and complex dashboards can be made ...
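
    The ATLAS scripts themselves are not part of this record; the aggregation stage delegated to native Hadoop and Pig is, in shape, a GROUP BY over log records, which this self-contained Python toy (sites, statuses, and counts all invented) illustrates:

    ```python
    from collections import Counter

    # Hypothetical file-transfer log records: (site, status).
    records = [
        ("CERN", "done"), ("BNL", "failed"), ("CERN", "done"),
        ("BNL", "failed"), ("BNL", "done"), ("CERN", "failed"),
    ]

    # GROUP BY (site, status): the shape of the aggregation the Pig scripts
    # perform over warehoused PanDA/Rucio/transfer data before R takes over.
    for (site, status), n in sorted(Counter(records).items()):
        print(f"{site:5s} {status:7s} {n}")
    ```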

  12. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  13. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Full Text Available Analytical chemistry, like other areas of science, has experienced a big change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Now, nanotechnology is increasingly proving to be a powerful ally of analytical chemistry to achieve its objectives and to simplify analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly for sample preparation techniques (magnetic solid phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotubes, graphene, polymers, octadecylsilane) and its automation, microextraction techniques, enantioseparation and chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation and chemosensors. Also, some selected articles recently published (2010-2016) have been reviewed and discussed.

  14. Multifunctional nanoparticles: Analytical prospects

    International Nuclear Information System (INIS)

    Dios, Alejandro Simon de; Diaz-Garcia, Marta Elena

    2010-01-01

    Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic nanoparticles, quantum dots, gold nanoparticles, carbon and inorganic nanotubes, as well as silica, titania and gadolinium oxide nanoparticles, are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  15. Analytical chemists and dinosaurs

    International Nuclear Information System (INIS)

    Brooks, R.R.

    1987-01-01

    The role of the analytical chemist in the development of the extraterrestrial impact theory for mass extinctions at the terminal Cretaceous Period is reviewed. High iridium concentrations in Cretaceous/Tertiary boundary clays have been linked to a terrestrial impact from an iridium-rich asteroid or large meteorite some 65 million years ago. Other evidence in favour of the occurrence of such an impact has been provided by the detection of shocked quartz grains originating from impact and of amorphous carbon particles similar to soot, derived presumably from worldwide wildfires at the terminal Cretaceous. Further evidence provided by the analytical chemist involves the determination of isotopic ratios such as ¹⁴⁴Nd/¹⁴³Nd, ¹⁸⁷Os/¹⁸⁶Os, and ⁸⁷Sr/⁸⁶Sr. Countervailing arguments put forward by the gradualist school (mainly palaeontological) as opposed to the catastrophists (mainly chemists and geochemists) are also presented and discussed

  16. Hermeneutical and analytical jurisprudence

    Directory of Open Access Journals (Sweden)

    Spaić Bojan

    2014-01-01

    Full Text Available The article examines the main strands of development in jurisprudence in the last few decades from the standpoint of the metatheoretical differentiation between the analytical and hermeneutical perspectives in the study of law. The author claims that recent jurisprudential accounts can rarely be positioned within the traditional dichotomy of natural law theories and legal positivism, and that this dichotomy cannot account for the differences between contemporary conceptions of law. As an alternative, the differences between the analytical and hermeneutical traditions in philosophy are explained, as they have crucially influenced post-Hartian strands in Anglo-American philosophy and post-Kelsenian strands in continental philosophy of law. Finally, the influence of hermeneutical philosophy and legal theory is examined with regard to the development of a hermeneutical theory of law and the development of legal hermeneutics.

  17. Analytical chemists and dinosaurs

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, R R

    1987-05-01

    The role of the analytical chemist in the development of the extraterrestrial impact theory for mass extinctions at the terminal Cretaceous Period is reviewed. High iridium concentrations in Cretaceous/Tertiary boundary clays have been linked to a terrestrial impact from an iridium-rich asteroid or large meteorite some 65 million years ago. Other evidence in favour of the occurrence of such an impact has been provided by the detection of shocked quartz grains originating from impact and of amorphous carbon particles similar to soot, derived presumably from worldwide wildfires at the terminal Cretaceous. Further evidence provided by the analytical chemist involves the determination of isotopic ratios such as ¹⁴⁴Nd/¹⁴³Nd, ¹⁸⁷Os/¹⁸⁶Os, and ⁸⁷Sr/⁸⁶Sr. Countervailing arguments put forward by the gradualist school (mainly palaeontological) as opposed to the catastrophists (mainly chemists and geochemists) are also presented and discussed.

  18. Communication Theoretic Data Analytics

    OpenAIRE

    Chen, Kwang-Cheng; Huang, Shao-Lun; Zheng, Lizhong; Poor, H. Vincent

    2015-01-01

    Widespread use of the Internet and social networks invokes the generation of big data, which is proving to be useful in a number of applications. To deal with explosively growing amounts of data, data analytics has emerged as a critical technology related to computing, signal processing, and information networking. In this paper, a formalism is considered in which data is modeled as a generalized social network and communication theory and information theory are thereby extended to data analy...

  19. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  20. Analytic chemistry of molybdenum

    International Nuclear Information System (INIS)

    Parker, G.A.

    1983-01-01

    Electrochemical, colorimetric, gravimetric, spectroscopic, and radiochemical methods for the determination of molybdenum are summarized in this book. Some laboratory procedures are described in detail while literature citations are given for others. The reader is also referred to older comprehensive reviews of the analytical chemistry of molybdenum. Contents, abridged: Gravimetric methods. Titrimetric methods. Colorimetric methods. X-ray fluorescence. Voltammetry. Catalytic methods. Molybdenum in non-ferrous alloys. Molybdenum compounds

  1. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  2. Introduction to analytical mechanics

    CERN Document Server

    Gamalath, KAILW

    2011-01-01

    INTRODUCTION TO ANALYTICAL MECHANICS is an attempt to introduce the modern treatment of classical mechanics so that the transition to many fields in physics can be made with the least difficulty. This book deals with the formulation of Newtonian mechanics, Lagrangian dynamics, conservation laws relating to symmetries, Hamiltonian dynamics, Hamilton's principle, Poisson brackets, canonical transformations, which are invaluable in formulating quantum mechanics, and the Hamilton-Jacobi equation, which provides the transition to wave mechanics.

  3. Inorganic Analytical Chemistry

    DEFF Research Database (Denmark)

    Berg, Rolf W.

    The book is a treatise on inorganic analytical reactions in aqueous solution. It covers about half of the elements in the periodic table, i.e. the most important ones: H, Li, B, C, N, O, Na, Mg, Al, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Br, Sr, Mo, Ag, Cd, Sn, Sb, I, Ba, W,...

  4. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution.The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  5. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  6. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA for Environmental Risk Management

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-12-01

    Full Text Available With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  7. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  8. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment and a detector whose sensitive part is located inside the containment, additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought one by one onto the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  9. Road Transportable Analytical Laboratory (RTAL) system

    International Nuclear Information System (INIS)

    1993-01-01

    The goal of this contractual effort is the development and demonstration of a Road Transportable Analytical Laboratory (RTAL) system to meet the unique needs of the Department of Energy (DOE) for rapid, accurate analysis of a wide variety of hazardous and radioactive contaminants in soil, groundwater, and surface waters. This laboratory system will be designed to provide the field and laboratory analytical equipment necessary to detect and quantify radionuclides, organics, heavy metals and other inorganics, and explosive materials. The planned laboratory system will consist of a set of individual laboratory modules deployable independently or as an interconnected group to meet each DOE site's specific needs

  10. Analytic posteriors for Pearson's correlation coefficient.

    Science.gov (United States)

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  11. Analytic posteriors for Pearson's correlation coefficient

    OpenAIRE

    Ly, A.; Marsman, M.; Wagenmakers, E.-J.

    2018-01-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  12. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR) spectroscopy and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for new techniques such as microprobe XRF and XAS, FTIR microscopy and HAXPS. The talk will cover mainly two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of characteristic X-rays emitted following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the area of microprobe XRF imaging and trace-level compositional characterisation of any sample. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at Indus-2 synchrotron

  13. MERRA Analytic Services

    Science.gov (United States)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.)
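
    As a sketch of the MapReduce pattern described above (not the vCDS or MERRA/AS API; the data and the averaging operation are invented for illustration), a map step produces a partial result per storage block and a reduce step combines the partials:

    ```python
    from functools import reduce

    # Toy stand-in for array data already partitioned into storage blocks.
    blocks = [[280.1, 281.4, 279.8], [282.0, 280.9], [278.5, 279.9, 281.2, 280.3]]

    def map_block(block):
        """Map step: a (sum, count) partial result, computed where the block lives."""
        return (sum(block), len(block))

    def combine(a, b):
        """Reduce step: merge two partial results."""
        return (a[0] + b[0], a[1] + b[1])

    total, count = reduce(combine, map(map_block, blocks))
    print(f"global mean: {total / count:.2f}")
    ```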

  14. Analytical elements of mechanics

    CERN Document Server

    Kane, Thomas R

    2013-01-01

    Analytical Elements of Mechanics, Volume 1, is the first of two volumes intended for use in courses in classical mechanics. The books aim to provide students and teachers with a text consistent in content and format with the author's ideas regarding the subject matter and teaching of mechanics, and to disseminate these ideas. The book opens with a detailed exposition of vector algebra, and no prior knowledge of this subject is required. This is followed by a chapter on the topic of mass centers, which is presented as a logical extension of concepts introduced in connection with centroids. A

  15. Analytical chemistry in space

    CERN Document Server

    Wainerdi, Richard E

    1970-01-01

    Analytical Chemistry in Space presents an analysis of the chemical constitution of space, particularly the particles in the solar wind, of the planetary atmospheres, and the surfaces of the moon and planets. Topics range from space engineering considerations to solar system atmospheres and recovered extraterrestrial materials. Mass spectroscopy in space exploration is also discussed, along with lunar and planetary surface analysis using neutron inelastic scattering. This book is comprised of seven chapters and opens with a discussion on the possibilities for exploration of the solar system by

  16. Analytical chemistry experiment

    International Nuclear Information System (INIS)

    Park, Seung Jo; Paeng, Seong Gwan; Jang, Cheol Hyeon

    1992-08-01

    This book deals with analytical chemistry experiments in eight chapters. It covers general precautions to observe during experiments; the handling, storage and classification of reagents; the handling of glassware; general operations during experiments such as heating, cooling, filtering, distillation, extraction, evaporation and drying; glass working, including how to cut and bend glass tubing; volumetric analysis covering neutralization and precipitation titrations; gravimetric analysis covering solubility products, filtering and washing; and microorganism experiments covering the necessary tools, sterilization, disinfection and incubation, together with appendixes.

  17. Analytic aspects of convexity

    CERN Document Server

    Colesanti, Andrea; Gronchi, Paolo

    2018-01-01

    This book presents the proceedings of the international conference Analytic Aspects in Convexity, which was held in Rome in October 2016. It offers a collection of selected articles, written by some of the world’s leading experts in the field of Convex Geometry, on recent developments in this area: theory of valuations; geometric inequalities; affine geometry; and curvature measures. The book will be of interest to a broad readership, from those involved in Convex Geometry, to those focusing on Functional Analysis, Harmonic Analysis, Differential Geometry, or PDEs. The book is addressed to PhD students and researchers, interested in Convex Geometry and its links to analysis.

  18. Local analytic geometry

    CERN Document Server

    Abhyankar, Shreeram Shankar

    1964-01-01

    This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from

  19. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Full Text Available Biosensors offer considerable promise for attaining analytic information in a faster, simpler and cheaper manner compared to conventional assays. The biosensing approach is rapidly advancing, and applications ranging from metabolite, biological/chemical warfare agent, food pathogen and adulterant detection to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts, coupling micromachining and nanofabrication, may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With a gradual increase in commercialization, a wide range of new biosensors are thus expected to reach the market in the coming years.

  20. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective for the user/reader but also offers exhaustive information on the evaluated procedures.